Beware the ‘short head’: PISA’s Resilient Students’ Measure

 

This post takes a closer look at the PISA concept of ‘resilient students’ – essentially a measure of disadvantaged high attainment amongst 15 year-olds – and how this varies from country to country.

The measure was addressed briefly in my recent review of the evidence base for excellence gaps in England, but there was not space on that occasion to provide a thoroughgoing review.

The post is organised as follows:

  • A definition of the measure and explanation of how it has changed since the concept was first introduced.
  • A summary of key findings, including selected international comparisons, and of trends over recent PISA cycles.
  • A brief review of OECD and related research material about the characteristics of resilient learners.

I have not provided background about the nature of PISA assessments, but this can be found in previous posts about the mainstream PISA 2012 results and PISA 2012 Problem Solving.

 

Defining the resilient student

In 2011, the OECD published ‘Against the Odds: Disadvantaged students who succeed in school’, which introduced the notion of PISA as a study of resilience. It uses PISA 2006 data throughout and foregrounds science, as did the entire PISA 2006 cycle.

There are two definitions of resilience in play: an international benchmark and a country-specific measure to inform discussion of effective policy levers in different national settings.

The international benchmark relates to the top third of PISA performers (i.e. above the 67th percentile) across all countries after accounting for socio-economic background. The resilient population comprises students in this group who also fall within the bottom third of the socio-economic background distribution in their particular jurisdiction.

Hence the benchmark comprises an international dimension of performance and a national/jurisdictional dimension of disadvantage.

This cohort is compared with disadvantaged low achievers, a population similarly derived, except that their performance is in the bottom third across all countries, after accounting for socio-economic background.

The national benchmark applies the same national measure relating to socio-economic background, but the measure of performance is the top third of the national/jurisdictional performance distribution for the relevant PISA test.

The basis for determining socio-economic background is the PISA Index of Economic, Social and Cultural Status (ESCS).

‘Against the Odds’ describes it thus:

‘The indicator captures students’ family and home characteristics that describe their socio-economic background. It includes information about parental occupational status and highest educational level, as well as information on home possessions, such as computers, books and access to the Internet.’

Further details are provided in the original PISA 2006 Report (p333).
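For concreteness, here is a minimal sketch in Python of how the original international benchmark might be computed from student-level data. The DataFrame and column names are hypothetical, and the residual adjustment is a simplification: the OECD's own calculation works with plausible values and sampling weights.

```python
import numpy as np
import pandas as pd

# Hypothetical student-level data: one row per student, with columns
# 'country', 'escs' (the ESCS index) and 'score' (PISA test score).
# df = pd.read_csv("pisa2006_students.csv")

def resilient_share(df, perf_cut=2/3, escs_cut=1/3):
    """Share of resilient students among disadvantaged students, by country,
    under the original 'top third / bottom third' benchmark. Pass
    perf_cut=0.75 and escs_cut=0.25 for the later quartile-based measure."""
    # 'After accounting for socio-economic background': residualise scores
    # on ESCS across the pooled international sample.
    slope, intercept = np.polyfit(df["escs"], df["score"], 1)
    adjusted = df["score"] - (intercept + slope * df["escs"])

    # International dimension: adjusted performance above the chosen
    # percentile across students from ALL countries.
    high = adjusted > adjusted.quantile(perf_cut)

    # National dimension: bottom third of ESCS within each jurisdiction.
    disadvantaged = df.groupby("country")["escs"].transform(
        lambda s: s <= s.quantile(escs_cut)
    ).astype(bool)

    resilient = high & disadvantaged
    # Proportion of disadvantaged students who are resilient, by country.
    return resilient[disadvantaged].groupby(df.loc[disadvantaged, "country"]).mean()
```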

Rather confusingly, the parameters of the international benchmark were subsequently changed.

PISA 2009 Results: Overcoming Social Background – Equity in Learning Opportunities and Outcomes Volume II describes the new methodology in this fashion:

‘A student is classified as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter across students from all countries after accounting for socio-economic background. The share of resilient students among all students has been multiplied by 4 so that the percentage values presented here reflect the proportion of resilient students among disadvantaged students (those in the bottom quarter of the PISA index of social, economic and cultural status).’

No reason is given for this shift to a narrower measure of both attainment and disadvantage, nor is the impact on results discussed.

The new methodology is seemingly retained in PISA 2012 Results: Excellence through Equity: Giving every student the chance to succeed – Volume II:

‘A student is classified as resilient if he or she is in the bottom quarter of the PISA index of economic, social and cultural status (ESCS) in the country of assessment and performs in the top quarter of students among all countries, after accounting for socio-economic status.’

However, multiplication by four is dispensed with.

This should mean that the outcomes from PISA 2009 and 2012 are broadly comparable, give or take some straightforward multiplication. However, the 2006 results foreground science, while the focus in 2009 is reading, shifting to maths in 2012.

Although there is some commonality between these different test-specific results (see below), there is also some variation, notably in terms of differential outcomes for boys and girls.
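The rescaling arithmetic is worth making explicit. A minimal sketch (the function and argument names are mine):

```python
def share_among_disadvantaged(share_among_all, bottom_fraction):
    """Convert a resilience share expressed over ALL students into a share
    over DISADVANTAGED students only. With the quartile definition
    (bottom_fraction=0.25) this is the 'multiplied by 4' step in the PISA
    2009 report; with the original tertiles (bottom_fraction=1/3) the
    equivalent factor is 3."""
    return share_among_all / bottom_fraction

# Example: if 7% of all students are resilient under the quartile
# definition, then 28% of disadvantaged students are resilient.
print(share_among_disadvantaged(0.07, 0.25))  # 0.28
```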

 

PISA 2006 results

The chart reproduced below compares national percentages of resilient students and disadvantaged low achievers in science using the original international benchmark. It shows the proportion of resilient learners amongst disadvantaged students.

 

[OECD chart: percentages of resilient students and disadvantaged low achievers in science, PISA 2006]

Conversely, the data table supplied alongside the chart shows the proportion of resilient students amongst all learners. Results have to be multiplied by three on this occasion (since the indicator is based on ‘top third attainment, bottom third disadvantage’).

I have not reproduced the entire dataset, but have instead created a subset of 14 jurisdictions in which my readership may be particularly interested, namely: Australia, Canada, Finland, Hong Kong, Ireland, Japan, New Zealand, Poland, Shanghai, Singapore, South Korea, Taiwan, the UK and the US. I have also included the OECD average.

I have retained this grouping throughout the analysis, even though some of the jurisdictions do not appear throughout – in particular, Shanghai and Singapore are both omitted from the 2006 data.

Chart 1 shows these results.

 

Chart 1: PISA resilience in science for selected jurisdictions by gender (PISA 2006 data)

 

All the jurisdictions in my sample are relatively strong performers on this measure. Only the United States falls consistently below the OECD average.

Hong Kong has the highest percentage of resilient learners – almost 75% of its disadvantaged students achieve the benchmark. Finland is also a very strong performer, while other jurisdictions achieving over 50% include Canada, Japan, South Korea and Taiwan.

The UK is just above the OECD average, but the US is ten points below. The proportion of disadvantaged resilient students in Hong Kong is almost twice the proportion in the UK and two and a half times the proportion in the US.

Most of the sample shows relatively little variation between their proportions of male and female resilient learners. Females have a slight lead across the OECD as a whole, but males are in the ascendancy in eight of these jurisdictions.

The largest gap – some 13 percentage points in favour of boys – can be found in Hong Kong. The largest advantage in favour of girls – 6.9 percentage points – is evident in Poland. In the UK males are ahead by slightly over three percentage points.

The reproduced OECD chart also shows that there is a relatively strong relationship between the proportion of resilient students and of disadvantaged low achievers. Jurisdictions with the largest proportions of resilient students typically have the smallest proportions of disadvantaged low achievers.

In Hong Kong, the proportion of disadvantaged students who are low achievers is 6.3%, set against an OECD average of 25.8%. Conversely, in the US, this proportion reaches 37.8% – and is 26.7% in the UK. Of this sample, only the US has a bigger proportion of disadvantaged low achievers than of disadvantaged resilient students.

 

‘Against the Odds’ examines the relationship between resiliency in science, reading and maths, but does so using the national benchmark, so the figures are not comparable with those above. I have, however, provided a chart comparing performance in my sample of jurisdictions.

 


Chart 2: Students resilient in science who are resilient in other subjects, national benchmark of resilience, PISA 2006

 

Amongst the jurisdictions for which we have data the pattern is broadly similar, with between 47% and 56% of students resilient in all three subjects.

In most cases, students who are resilient in two subjects combine science and maths rather than science and reading, but this is not universally true since the reverse pattern applies in Ireland, Japan and South Korea.

The document summarises the outcomes thus:

‘This evidence indicates that the vast majority of students who are resilient with respect to science are also resilient in at least one if not both of the other domains…These results suggest that resilience in science is not a domain-specific characteristic but rather there is something about these students or the schools they attend that lead them to overcome their social disadvantage and excel at school in multiple subject domains.’

 

PISA 2009 Results

The results drawn from PISA 2009 focus on outcomes in reading, rather than science, and of course the definitional differences described above make them incompatible with those for 2006.

The first graph reproduced below shows the outcomes for the full set of participating jurisdictions, while the second – Chart 3 – provides the results for my sample.

[OECD chart: resilience in reading, all participating jurisdictions, PISA 2009]

 


Chart 3: PISA resilience in reading for selected jurisdictions by gender (PISA 2009 data)

 

The overall OECD average is pitched at 30.8% compared with 39% on the PISA 2006 science measure. Ten of our sample fall above the OECD average and Australia matches it, but the UK, Ireland and the US are below the average, the UK undershooting it by some seven percentage points.

The strongest performer is Shanghai at 75.6%, closely followed by Hong Kong at 72.4%. They and South Korea are the only jurisdictions in the sample which can count over half their disadvantaged readers as resilient. Singapore, Finland and Japan are also relatively strong performers.

There are pronounced gender differences in favour of girls. They have a 16.8 percentage point lead over boys in the OECD average figure and they outscore boys in every country in our sample. These differentials are most marked in Finland, Poland and New Zealand. In the UK there is a difference of 9.2 percentage points, smaller than in many other countries in the sample.

The comparison with the proportion of disadvantaged low achievers is illustrated by Chart 4. This reveals the huge variation in the performance of our sample.

 


Chart 4: Comparing percentage of resilient and low-achieving students in reading, PISA 2009

At one extreme, the proportion of disadvantaged low achievers (bottom quartile of the achievement distribution) is virtually negligible in Shanghai and Hong Kong, while around three-quarters of disadvantaged students are resilient (top quartile of the achievement distribution).

At the other, countries like the UK have broadly similar proportions of low achievers and resilient students. The chart reinforces just how far behind such countries are at both the top and the bottom of the attainment spectrum.

 

PISA 2012 Results

In 2012 the focus is maths rather than reading. The graph reproduced below compares resilience scores across the full set of participating jurisdictions, while Chart 5 covers only my smaller sample.

 

[OECD chart: resilience in maths, all participating jurisdictions, PISA 2012]

Chart 5: PISA resilience in maths for selected jurisdictions by gender (PISA 2012 data)

 

Despite the change in subject, the span of performance on this measure is broadly similar to that found in reading three years earlier. The OECD average is 25.6%, roughly five percentage points lower than the average in 2009 reading.

Nine of the sample lie above the OECD average, while Australia, Ireland, New Zealand, the UK and the US are below. The UK is closer to the OECD average in maths than it was in reading, however, and is a relatively stronger performer than the US and New Zealand.

Shanghai and Hong Kong are once again the top performers, at 76.8% and 72.4% respectively. Singapore is at just over 60% and South Korea at just over 50%. Taiwan and Japan are also notably strong performers.

Within the OECD average, boys have a four percentage point lead over girls, but boys’ relatively stronger performance is not universal – in Hong Kong, Poland, Singapore and South Korea, girls are in the ascendancy. This is most strongly seen in Poland. In the UK the difference is just two percentage points.

The comparison with disadvantaged low achievers is illustrated in Chart 6.

 


Chart 6: Comparing percentage of resilient and low-achieving students in maths, PISA 2012

 

Once again the familiar pattern emerges, with negligible proportions of low achievers in the countries with the largest shares of resilient students. At the other extreme, the US and New Zealand are the only two jurisdictions in this sample whose ‘tail’ of low achievers outnumbers their resilient students. The reverse is true in the UK, but only just.

 

Another OECD publication, ‘Strengthening Resilience through Education: PISA Results – background document’, contains a graph showing the variance in jurisdictions’ mathematical performance by deciles of socio-economic disadvantage. This is reproduced below.

 

[OECD chart: maths performance by decile of socio-economic disadvantage]

The text adds:

‘Further analysis indicates that the 10% socio-economically most disadvantaged children in Shanghai perform at the same level as the 10% most privileged children in the United States; and that the 20% most disadvantaged children in Finland, Japan, Estonia, Korea, Singapore, Hong Kong-China and Shanghai-China compare favourably to the OECD average.’

One can see that the UK is decidedly ‘mid-table’ at both extremes of the distribution. On the evidence of this measure, one cannot fully accept the oft-repeated saw that the UK is a much stronger performer with high attainers than with low attainers, certainly as far as disadvantaged learners are concerned.

 

The 2012 Report also compares maths-based resiliency records over the four cycles from PISA 2003 to PISA 2012 – as shown in the graph reproduced below – but few of the changes are statistically significant. There has also been some statistical sleight of hand to ensure comparability across the cycles.

 

[OECD chart: change in the share of resilient students in maths, PISA 2003 to PISA 2012]

Amongst the outcomes that are statistically significant, Australia experienced a fall of 1.9 percentage points, Canada 1.6 percentage points, Finland 3.3 percentage points and New Zealand 2.9 percentage points. The OECD average was relatively little changed.

The UK is not included in this analysis because of issues with its PISA 2003 results.

Resilience is not addressed in the main PISA 2012 report on problem-solving, but one can find online the graph below, which shows the relative performance of the participating countries.

It is no surprise that the Asian Tigers are at the top of the league (although Shanghai is no longer in the ascendancy). England (as opposed to the UK) is at just over 30%, a little above the OECD average, which appears to stand at around 27%.

The United States and Australia perform at a very similar level. Canada is ahead of them and Poland is the laggard.

 

[OECD chart: resilience in problem solving, PISA 2012]

 

Resilience in the home countries

Inserted for the purposes of reinforcement, the chart below compiles the UK outcomes from the PISA 2006, 2009 and 2012 studies above, as compared with the top performer in my sample for each cycle and the appropriate OECD average. Problem-solving is omitted.

Only in science (using the ‘top third attainment, bottom third disadvantage’ formula) does the UK exceed the OECD average figure, and then only slightly.

In both reading and maths, the gap between the UK and the top performer in my sample is eye-wateringly large: in each case there are more than three times as many resilient students in the top-performing jurisdiction.

It is abundantly clear from this data that disadvantaged high attainers in the UK do not perform strongly compared with their peers elsewhere.

 


Chart 7: Resilience measures from PISA 2006-2012 comparing UK with top performer in this sample and OECD average

 

Unfortunately, NFER does not pick up the concept of resilience in its analysis of England’s PISA 2012 results.

The only comparative analysis across the Home Countries that I can find is contained in a report prepared for the Northern Ireland Ministry of Education by NFER called ‘PISA 2009: Modelling achievement and resilience in Northern Ireland’ (March 2012).

This uses the old ‘highest third by attainment, lowest third by disadvantage’ methodology deployed in ‘Against the Odds’. Reading is the base.

The results show that 41% of English students are resilient, the same figure as for the UK as a whole. The figures for the other home countries appear to be: Northern Ireland 42%; Scotland 44%; and Wales 35%.

Whether the same relationship holds true in maths and science using the ‘top quartile, bottom quartile’ methodology is unknown. One suspects, though, that the UK-wide figures given above would also hold, more or less, for England.

 

The characteristics of resilient learners

‘Against the Odds’ outlines some evidence derived from comparisons using the national benchmark:

  • Resilient students are, on average, somewhat more advantaged than disadvantaged low achievers, but the difference is relatively small and mostly accounted for by home-related factors (e.g. number of books in the home, parental level of education) rather than parental occupation and income.
  • In most jurisdictions, resilient students achieve proficiency level 4 or higher in science. This is true of 56.8% across the OECD. In the UK the figure is 75.8%; in Hong Kong it is 88.4%. We do not know what proportions achieve the highest proficiency levels.
  • Students with an immigrant background – either born outside the country of residence or with parents who were born outside the country – tend to be under-represented amongst resilient students.
  • Resilient students tend to be more motivated, confident and engaged than disadvantaged low achievers. Students’ confidence in their academic abilities is a strong predictor of resilience, stronger than motivation.
  • Learning time – the amount of time spent in normal science lessons – is also a strong predictor of resilience, but there is relatively little evidence of an association with school factors such as school management, admissions policies and competition.

Volume III of the PISA 2012 Report: ‘Ready to Learn: Students’ engagement, drive and self-beliefs’ offers a further gloss on these characteristics from a mathematical perspective:

‘Resilient students and advantaged high-achievers have lower rates of absenteeism and lack of punctuality than disadvantaged and advantaged low-achievers…

….resilient and disadvantaged low-achievers tend to have lower sense of belonging than advantaged low-achievers and advantaged high-achievers: socio-economically disadvantaged students express a lower sense of belonging than socio-economically advantaged students irrespective of their performance in mathematics.

Resilient students tend to resemble advantaged high-achievers with respect to their level of drive, motivation and self-beliefs: resilient students and advantaged high-achievers have in fact much higher levels of perseverance, intrinsic and instrumental motivation to learn mathematics, mathematics self-efficacy, mathematics self-concept and lower levels of mathematics anxiety than students who perform at lower levels than would be expected of them given their socio-economic condition…

….In fact, one key characteristic that resilient students tend to share across participating countries and economies, is that they are generally physically and mentally present in class, are ready to persevere when faced with challenges and difficulties and believe in their abilities as mathematics learners.’

Several research studies can be found online that reinforce these findings, sometimes adding a few further details for good measure:

The aforementioned NFER study uses a multi-level logistic model to investigate the school and student background factors associated with resilience in Northern Ireland, drawing on PISA 2009 data.

It derives odds ratios as follows: grammar school 7.44; female pupils 2.00; possessions (classic literature) 1.69; wealth 0.76; percentage of pupils eligible for FSM 0.63; and 0–10 books in the home 0.35.
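For readers unfamiliar with odds ratios, here is a minimal sketch of how such figures are produced. The column names are illustrative rather than NFER’s, and a single-level logit is a simplification of the multi-level model the study actually fits.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical pupil-level data with a binary 'resilient' outcome and
# predictors echoing those reported by NFER (names are invented here).
df = pd.read_csv("ni_pisa2009_pupils.csv")

model = smf.logit(
    "resilient ~ grammar_school + female + owns_classic_literature"
    " + wealth + school_fsm_pct + few_books_home",
    data=df,
).fit()

# An odds ratio is the exponentiated coefficient: a value of 7.44 for
# grammar_school would mean the odds of being resilient are 7.44 times
# higher for grammar-school pupils, other variables held constant.
print(np.exp(model.params))
```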

On the positive impact of selection the report observes:

‘This is likely to be largely caused by the fact that to some extent grammar schools will be identifying the most resilient students as part of the selection process. As such, we cannot be certain about the effectiveness or otherwise of grammar schools in providing the best education for disadvantaged children.’

Another study – ‘Predicting academic resilience with mathematics learning and demographic variables’ (Cheung et al 2014) – concludes that, amongst East Asian jurisdictions such as Hong Kong, Japan and South Korea, resilience is associated with avoidance of ‘redoublement’ (grade repetition) and having attended kindergarten for more than a year.

Unsurprisingly, students who are more familiar with mathematical concepts and have greater mathematical self-efficacy are also more likely to be resilient.

Amongst other countries in the sample – including Canada and Finland – being male, native (as opposed to immigrant) and avoiding ‘redoublement’ produced stronger chances of resilience.

In addition to familiarity with maths concepts and self-efficacy, resilient students in these countries were less anxious about maths and had a higher degree of maths self-concept.

Work on ‘Resilience Patterns in Public Schools in Turkey’ (unattributed and undated) – based on PISA 2009 data and using the ‘top third, bottom third’ methodology – finds that 10% of a Turkish sample are resilient in reading, maths and science; 6% are resilient in two subjects and a further 8% in one only.

Resilience varies in different subjects according to year of education.

[Chart: resilience patterns in Turkey by subject and year of education]

There are also significant regional differences.

Odds ratios show a positive association with: more than one year of pre-primary education; selective provision, especially in maths; absence of ability grouping; additional learning time, especially for maths and science; a good disciplinary climate and strong teacher-student relations.

An Italian study – ‘A way to resilience: How can Italian disadvantaged students and schools close the achievement gap?’ (Agasisti and Longobardi, undated) – uses PISA 2009 data to examine the characteristics of resilient students attending schools with high levels of disadvantage.

This confirms some of the findings above in respect of student characteristics, finding a negative impact from immigrant status (and also from a high proportion of immigrants in a school). ‘Joy in reading’ and ‘positive attitude to computers’ are both positively associated with resilience, as is a positive relationship with teachers.

School type is found to influence the incidence of resilience – particularly enrolment in Licei as opposed to professional or technical schools – so reflecting one outcome of the Northern Irish study. Other significant school level factors include the quality of educational resources available and investment in extracurricular activities. Regional differences are once more pronounced.

A second Italian study – ‘Does public spending improve educational resilience? A longitudinal analysis of OECD PISA data’ (Agasisti et al 2014) – finds a positive correlation between the proportion of a country’s public expenditure devoted to education and the proportion of resilient students.

Finally, this commentary from Marc Tucker links the relatively low incidence of resilient students in the US to national views about the nature of ability:

‘In Asia, differences in student achievement are generally attributed to differences in the effort that students put into learning, whereas in the United States, these differences are attributed to natural ability.  This leads to much lower expectations for students who come from low-income families…

My experience of the Europeans is that they lie somewhere between the Asians and the Americans with respect to the question as to whether effort or genetic material is the most important explainer of achievement in school…

… My take is that American students still suffer relative to students in both Europe and Asia as a result of the propensity of the American education system to sort students out by ability and assign different students work at different challenge levels, based on their estimates of student’s inherited intelligence.’

 

Conclusion

What are we to make of all this?

It suggests to me that we have not pushed much beyond statements of the obvious and vague conjecture in our efforts to understand the resilient student population and how to increase its size in any given jurisdiction.

The comparative statistical evidence shows that England has a real problem with underachievement by disadvantaged students, as much at the top as the bottom of the attainment distribution.

We are not alone in facing this difficulty, although it is significantly more pronounced than in several of our most prominent PISA competitors.

We should be worrying as much about our ‘short head’ as our ‘long tail’.

 

GP

September 2014

 

 

 

 

 

 


PISA 2012 Creative Problem Solving: International Comparison of High Achievers’ Performance

This post compares the performance of high achievers from selected jurisdictions on the PISA 2012 creative problem solving test.

It draws principally on the material in the OECD Report ‘PISA 2012 Results: Creative Problem Solving’ published on 1 April 2014.

The sample of jurisdictions includes England, other English-speaking countries (Australia, Canada, Ireland and the USA) and those that typically top the PISA rankings (Finland, Hong Kong, South Korea, Shanghai, Singapore and Taiwan).

With the exception of New Zealand, which did not take part in the problem solving assessment, this is deliberately identical to the sample I selected for a parallel post reviewing comparable results in the PISA 2012 assessments of reading, mathematics and science: ‘PISA 2012: International Comparisons of High Achievers’ Performance’ (December 2013).

These eleven jurisdictions account for nine of the top twelve performers ranked by mean overall performance in the problem solving assessment. (The USA and Ireland lie outside the top twelve, while Japan, Macao and Estonia are the three jurisdictions that are in the top twelve but outside my sample.)

The post is divided into seven sections:

  • Background to the problem solving assessment: How PISA defines problem solving competence; how it defines performance at each of the six levels of proficiency; how it defines high achievement; the nature of the assessment and who undertook it.
  • Average performance, the performance of high achievers and the performance of low achievers (proficiency level 1) on the problem solving assessment. This comparison includes my own sample and all the other jurisdictions that score above the OECD average on the first of these measures.
  • Gender and socio-economic differences amongst high achievers on the problem solving assessment  in my sample of eleven jurisdictions.
  • The relative strengths and weaknesses of jurisdictions in this sample on different aspects of the problem solving assessment. (This treatment is generic rather than specific to high achievers.)
  • What proportion of high achievers on the problem-solving assessment in my sample of jurisdictions are also high achievers in reading, maths and science respectively.
  • What proportion of students in my sample of jurisdictions achieves highly in one or more of the four PISA 2012 assessments – and against the ‘all-rounder’ measure, which is based on high achievement in all of reading, maths and science (but not problem solving).
  • Implications for education policy makers seeking to improve problem solving performance in each of the sample jurisdictions.

Background to the Problem Solving Assessment


Definition of problem solving

PISA’s definition of problem-solving competence is:

‘…an individual’s capacity to engage in cognitive processing to understand and resolve problem situations where a method of solution is not immediately obvious. It includes the willingness to engage with such situations in order to achieve one’s potential as a constructive and reflective citizen.’

The commentary on this definition points out that:

  • Problem solving requires identification of the problem(s) to be solved, planning and applying a solution, and monitoring and evaluating progress.
  • A problem is ‘a situation in which the goal cannot be achieved by merely applying learned procedures’, so the problems encountered must be non-routine for 15 year-olds, although ‘knowledge of general strategies’ may be useful in solving them.
  • Motivational and affective factors are also in play.

The Report is rather coy about the role of creativity in problem solving, and hence the justification for the inclusion of this term in its title.

Perhaps the nearest it gets to an exposition is when commenting on the implications of its findings:

‘In some countries and economies, such as Finland, Shanghai-China and Sweden, students master the skills needed to solve static, analytical problems similar to those that textbooks and exam sheets typically contain as well or better than 15-year-olds, on average, across OECD countries. But the same 15-year-olds are less successful when not all information that is needed to solve the problem is disclosed, and the information provided must be completed by interacting with the problem situation. A specific difficulty with items that require students to be open to novelty, tolerate doubt and uncertainty, and dare to use intuitions (“hunches and feelings”) to initiate a solution suggests that opportunities to develop and exercise these traits, which are related to curiosity, perseverance and creativity, need to be prioritised.’


Assessment framework

PISA’s framework for assessing problem solving competence is set out in the following diagram.

 

[Diagram: PISA 2012 problem-solving assessment framework]

 

In solving a particular problem it may not be necessary to apply all these steps, or to apply them in this order.

Proficiency levels

The proficiency scale was designed to have a mean score across OECD countries of 500. The six levels of proficiency applied in the assessment each have their own profile.

The lowest, level 1 proficiency is described thus:

‘At Level 1, students can explore a problem scenario only in a limited way, but tend to do so only when they have encountered very similar situations before. Based on their observations of familiar scenarios, these students are able only to partially describe the behaviour of a simple, everyday device. In general, students at Level 1 can solve straightforward problems provided there is a simple condition to be satisfied and there are only one or two steps to be performed to reach the goal. Level 1 students tend not to be able to plan ahead or set sub-goals.’

This level equates to a range of scores from 358 to 423. Across the OECD sample, 91.8% of participants are able to perform tasks at this level.

By comparison, level 5 proficiency is described in this manner:

‘At Level 5, students can systematically explore a complex problem scenario to gain an understanding of how relevant information is structured. When faced with unfamiliar, moderately complex devices, such as vending machines or home appliances, they respond quickly to feedback in order to control the device. In order to reach a solution, Level 5 problem solvers think ahead to find the best strategy that addresses all the given constraints. They can immediately adjust their plans or backtrack when they detect unexpected difficulties or when they make mistakes that take them off course.’

The associated range of scores is from 618 to 683 and 11.4% of all OECD students achieve at this level.

Finally, level 6 proficiency is described in this way:

‘At Level 6, students can develop complete, coherent mental models of diverse problem scenarios, enabling them to solve complex problems efficiently. They can explore a scenario in a highly strategic manner to understand all information pertaining to the problem. The information may be presented in different formats, requiring interpretation and integration of related parts. When confronted with very complex devices, such as home appliances that work in an unusual or unexpected manner, they quickly learn how to control the devices to achieve a goal in an optimal way. Level 6 problem solvers can set up general hypotheses about a system and thoroughly test them. They can follow a premise through to a logical conclusion or recognise when there is not enough information available to reach one. In order to reach a solution, these highly proficient problem solvers can create complex, flexible, multi-step plans that they continually monitor during execution. Where necessary, they modify their strategies, taking all constraints into account, both explicit and implicit.’

The range of level 6 scores is from 683 points upwards and 2.5% of all OECD participants score at this level.

PISA defines high achieving students as those securing proficiency level 5 or higher, so proficiency levels 5 and 6 together. The bulk of the analysis it supplies relates to this cohort, while relatively little attention is paid to the more exclusive group achieving proficiency level 6, even though almost 10% of students in Singapore reach this standard in problem solving.
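Using the two cut-points quoted above, the published shares of high achievers can be recovered from a vector of scores. A sketch, simplified in that the official figures are estimated from plausible values with student weights rather than raw point estimates:

```python
import numpy as np

LEVEL_5_FLOOR, LEVEL_6_FLOOR = 618, 683  # cut-points quoted in the report

def high_achiever_shares(scores):
    """Percentages of students at Level 5, Level 6 and Levels 5+6 combined."""
    scores = np.asarray(scores)
    level6 = np.mean(scores >= LEVEL_6_FLOOR) * 100
    level5 = np.mean((scores >= LEVEL_5_FLOOR) & (scores < LEVEL_6_FLOOR)) * 100
    return {"Level 5": level5, "Level 6": level6, "Levels 5+6": level5 + level6}
```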


The sample

Sixty-five jurisdictions took part in PISA 2012, including all 34 OECD countries and 31 partners. But only 44 jurisdictions took part in the problem solving assessment, including 28 OECD countries and 16 partners. As noted above, that included all my original sample of twelve jurisdictions, with the exception of New Zealand.

I could find no stated reason why New Zealand chose not to take part. Press reports initially suggested that England would do likewise, but it was subsequently reported that this decision had been reversed.

The assessment was computer-based and comprised 16 units divided into 42 items. The units were organised into four clusters, each designed to take 20 minutes to complete. Participants completed one or two clusters, depending on whether they were also undertaking computer-based assessments of reading and maths.

In each jurisdiction a random sample of those who took part in the paper-based maths assessment was selected to undertake the problem solving assessment. About 85,000 students took part in all. The unweighted sample sizes in my selected jurisdictions are set out in Table 1 below, together with the total population of 15 year-olds in each jurisdiction.

 

Table 1: Sample sizes undertaking PISA 2012 problem solving assessment in selected jurisdictions

Country Unweighted Sample Total 15-year-olds
Australia 5,612 291,976
Canada 4,601 417,873
Finland 3,531 62,523
Hong Kong 1,325 84,200
Ireland 1,190 59,296
Shanghai 1,203 108,056
Singapore 1,394 53,637
South Korea 1,336 687,104
Taiwan 1,484 328,356
UK (England) 1,458 738,066
USA 1,273 3,985,714

Those taking the assessment were aged between 15 years and three months and 16 years and two months at the time of the assessment. All were enrolled at school and had completed at least six years of formal schooling.

Average performance compared with the performance of high and low achievers

The overall table of mean scores on the problem solving assessment is shown below.

[Table: mean scores on the problem-solving assessment, all participating jurisdictions]


There are some familiar names at the top of the table, especially Singapore and South Korea, the two countries that comfortably lead the rankings. Japan is some ten points behind in third place but it in turn has a lead of twelve points over a cluster of four other Asian competitors: Macao, Hong Kong, Shanghai and Taiwan.

A slightly different picture emerges if we compare average performance with the proportion of learners who achieve the bottom proficiency level and the top two proficiency levels. Table 2 below compares these groups.

This table includes all the jurisdictions that exceeded the OECD average score. I have marked in bold the countries in my sample of eleven, which includes Ireland, the only one of them that did not exceed the OECD average.

Table 2: PISA Problem Solving 2012: Comparing Average Performance with Performance at Key Proficiency Levels

 

Jurisdiction Mean score Level 1 (%) Level 5 (%) Level 6 (%) Levels 5+6 (%)
Singapore 562 6.0 19.7 9.6 29.3
South Korea 561 4.8 20.0 7.6 27.6
Japan 552 5.3 16.9 5.3 22.2
Macao 540 6.0 13.8 2.8 16.6
Hong Kong 540 7.1 14.2 5.1 19.3
Shanghai 536 7.5 14.1 4.1 18.2
Taiwan 534 8.2 14.6 3.8 18.4
Canada 526 9.6 12.4 5.1 17.5
Australia 523 10.5 12.3 4.4 16.7
Finland 523 9.9 11.4 3.6 15.0
England (UK) 517 10.8 10.9 3.3 14.2
Estonia 515 11.1 9.5 2.2 11.7
France 511 9.8 9.9 2.1 12.0
Netherlands 511 11.2 10.9 2.7 13.6
Italy 510 11.2 8.9 1.8 10.7
Czech Republic 509 11.9 9.5 2.4 11.9
Germany 509 11.8 10.1 2.7 12.8
USA 508 12.5 8.9 2.7 11.6
Belgium 508 11.6 11.4 3.0 14.4
Austria 506 11.9 9.0 2.0 11.0
Norway 503 13.2 9.7 3.4 13.1
Ireland 498 13.3 7.3 2.1 9.4
OECD Ave. 500 13.2 8.9 2.5 11.4


The jurisdictions at the top of the table also have a familiar profile, with a small ‘tail’ of low performance combined with high levels of performance at the top end.

Nine of the top ten have fewer than 10% of learners at proficiency level 1, though only South Korea pushes below 5%.

Five of the top ten have 5% or more of their learners at proficiency level 6, but only Singapore and South Korea have a higher percentage at level 6 than level 1 (with Japan managing the same percentage at both levels).

The top three performers – Singapore, South Korea and Japan – are the only three jurisdictions that have over 20% of their learners at proficiency levels 5 and 6 together.

South Korea slightly outscores Singapore at level 5 (20.0% against 19.7%). Japan is in third place, followed by Taiwan, Hong Kong and Shanghai.

But at level 6, Singapore has a clear lead, followed by South Korea, Japan, Hong Kong and Canada respectively.

England’s overall place in the table is relatively consistent on each of these measures, but the gaps between England and the top performers vary considerably.

The best have fewer than half England’s proportion of learners at proficiency level 1, almost twice as many learners at proficiency level 5 and more than twice as many at proficiency levels 5 and 6 together. But at proficiency level 6 they have almost three times as many learners as England.

Chart 1 below compares performance on these four measures across my sample of eleven jurisdictions.

All but Ireland are comfortably below the OECD average for the percentage of learners at proficiency level 1. The USA and Ireland are atypical in having a bigger tail (proficiency level 1) than their cadres of high achievers (levels 5 and 6 together).

At level 5 all but Ireland and the USA are above the OECD average, but the USA leapfrogs the OECD average at level 6.

There is a fairly strong correlation between the proportions of learners achieving the highest proficiency thresholds and average performance in each jurisdiction. However, Canada stands out by having an atypically high proportion of students at level 6.


Chart 1: PISA 2012 Problem-solving: Comparing performance at specified proficiency levels



PISA’s Report discusses the variation in problem-solving performance within different jurisdictions. However it does so without reference to the proficiency levels, so we do not know to what extent these findings apply equally to high achievers.

Amongst those above the OECD average, those with least variation are Macao, Japan, Estonia, Shanghai, Taiwan, Korea, Hong Kong, USA, Finland, Ireland, Austria, Singapore and the Czech Republic respectively.

Perhaps surprisingly, the degree of variation in Finland is identical to that in the USA and Ireland, while Estonia has less variation than many of the Asian jurisdictions. Singapore, while top of the performance table, is only just above the OECD average in terms of variation.

The countries below the OECD average on this measure – listed in order of increasing variation – include England, Australia and Canada, though all three are relatively close to the OECD average. So these three countries and Singapore are all relatively close together.

Gender and socio-economic differences amongst high achievers


Gender differences

On average across OECD jurisdictions, boys score seven points higher than girls on the problem solving assessment. There is also more variation amongst boys than girls.

Across the OECD participants, 3.1% of boys achieved proficiency level 6 but only 1.8% of girls did so. This imbalance was repeated at proficiency level 5, achieved by 10% of boys and 7.7% of girls.

The table and chart below show the variations within my sample of eleven countries. The performance of boys exceeds that of girls in all cases, except in Finland at proficiency level 5, and in that instance the gap in favour of girls is relatively small (0.4 percentage points).


Table 3: PISA Problem-solving: Gender variation at top proficiency levels

Jurisdiction Level 5 (%) Level 6 (%) Levels 5+6 (%)
  Boys Girls Diff Boys Girls Diff Boys Girls Diff
Singapore 20.4 19.0 +1.4 12.0 7.1 +4.9 32.4 26.1 +6.3
South Korea 21.5 18.3 +3.2 9.4 5.5 +3.9 30.9 23.8 +7.1
Hong Kong 15.7 12.4 +3.3 6.1 3.9 +2.2 21.8 16.3 +5.5
Shanghai 17.0 11.4 +5.6 5.7 2.6 +3.1 22.7 14.0 +8.7
Taiwan 17.3 12.0 +5.3 5.0 2.5 +2.5 22.3 14.5 +7.8
Canada 13.1 11.8 +1.3 5.9 4.3 +1.6 19.0 16.1 +2.9
Australia 12.6 12.0 +0.6 5.1 3.7 +1.4 17.7 15.7 +2.0
Finland 11.2 11.6 -0.4 4.1 3.0 +1.1 15.3 14.6 +0.7
England (UK) 12.1 9.9 +2.2 3.6 3.0 +0.6 15.7 12.9 +2.8
USA 9.8 7.9 +1.9 3.2 2.3 +0.9 13.0 10.2 +2.8
Ireland 8.0 6.6 +1.4 3.0 1.1 +1.9 11.0 7.7 +3.3
OECD Average 10.0 7.7 +2.3 3.1 1.8 +1.3 13.1 9.5 +3.6

There is no consistent pattern in whether boys are more heavily over-represented at proficiency level 5 than proficiency level 6, or vice versa.

There is a bigger difference at level 6 than at level 5 in Singapore, South Korea, Canada, Australia, Finland and Ireland, but the reverse is true in the five remaining jurisdictions.

At level 5, boys are in the greatest ascendancy in Shanghai and Taiwan while, at level 6, this is true of Singapore and South Korea.

When proficiency levels 5 and 6 are combined, all five of the Asian Tigers show a difference in favour of males of 5.5 percentage points or higher, significantly in advance of the six ‘Western’ countries in the sample and significantly ahead of the OECD average.

Amongst the six ‘Western’ representatives, boys have the biggest advantage at proficiency level 5 in England, while at level 6 boys in Ireland have the biggest advantage.

Within this group of jurisdictions, the gap between boys and girls at level 6 is comfortably the smallest in England. But, in terms of performance at proficiency levels 5 and 6 together, Finland is ahead.


Chart 2: PISA Problem-solving: Gender variation at top proficiency levels


The Report includes a generic analysis of gender differences in problem-solving performance for boys and girls with similar levels of performance in reading, maths and science.

It concludes that girls perform above their expected level in both England and Australia (though the difference is statistically significant only in the latter).

The Report comments:

‘It is not clear whether one should expect there to be a gender gap in problem solving. On the one hand, the questions posed in the PISA problem-solving assessment were not grounded in content knowledge, so boys’ or girls’ advantage in having mastered a particular subject area should not have influenced results. On the other hand… performance in problem solving is more closely related to performance in mathematics than to performance in reading. One could therefore expect the gender difference in performance to be closer to that observed in mathematics – a modest advantage for boys, in most countries – than to that observed in reading – a large advantage for girls.’


Socio-economic differences

The Report considers variations in performance against the PISA index of economic, social and cultural status (ESCS), finding them weaker overall than for reading, maths and science.

It calculates that the overall percentage variation in performance attributable to these factors is about 10.6% (compared with 14.9% in maths, 14.0% in science and 13.2% in reading).

Amongst the eleven jurisdictions in my sample, the weakest correlations were found in Canada (4%), followed by Hong Kong (4.9%), South Korea (5.4%), Finland (6.5%), England (7.8%), Australia (8.5%), Taiwan (9.4%), the USA (10.1%) and Ireland (10.2%) in that order. All those jurisdictions had correlations below the OECD average.

Perhaps surprisingly, there were above average correlations in Shanghai (14.1%) and, to a lesser extent (and less surprisingly) in Singapore (11.1%).
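These ‘percentage of variation attributable’ figures are, in essence, R-squared values from regressing scores on the index. A rough sketch (the published estimates use weighted, plausible-value methodology rather than this raw calculation):

```python
import numpy as np

def escs_variance_share(escs, scores):
    """Percentage of score variation associated with socio-economic status:
    the squared correlation between ESCS and performance."""
    r = np.corrcoef(escs, scores)[0, 1]
    return r ** 2 * 100

# e.g. a correlation of about 0.33 corresponds to the 10.6% reported
# for problem solving across the OECD.
```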

The report suggests that students with parents working in semi-skilled and elementary occupations tend to perform above their expected level in problem-solving in Taiwan, England, Canada, the USA, Finland and Australia (in that order – with Australia closest to the OECD average).

The jurisdictions where these students tend to underperform their expected level are – in order of severity – Ireland, Shanghai, Singapore, Hong Kong and South Korea.

A parallel presentation accompanying the Report provides some additional data about the performance in different countries of what the OECD calls ‘resilient’ students – those in the bottom quartile of the ESCS but in the top quartile by performance, after accounting for socio-economic status.

It supplies the graph below, which shows all the Asian countries in my sample clustered at the top, but also with significant gaps between them. Canada is the highest-performing of the remainder in my sample, followed by Finland, Australia, England and the USA respectively. Ireland is some way below the OECD average.


[OECD chart: resilient students in problem solving, PISA 2012]


Unfortunately, I can find no analysis of how performance varies according to socio-economic variables at each proficiency level. It would be useful to see which jurisdictions have the smallest ‘excellence gaps’ at levels 5 and 6 respectively.


How different jurisdictions perform on different aspects of problem-solving

The Report’s analysis of comparative strengths and weaknesses in different elements of problem-solving does not take account of variations at different proficiency levels.

It explains that aspects of the assessment were found easier by students in different jurisdictions, employing a four-part distinction between:

‘Exploring and understanding. The objective is to build mental representations of each of the pieces of information presented in the problem. This involves:

  • exploring the problem situation: observing it, interacting with it, searching for information and finding limitations or obstacles; and
  • understanding given information and, in interactive problems, information discovered while interacting with the problem situation; and demonstrating understanding of relevant concepts.

Representing and formulating. The objective is to build a coherent mental representation of the problem situation (i.e. a situation model or a problem model). To do this, relevant information must be selected, mentally organised and integrated with relevant prior knowledge. This may involve:

  • representing the problem by constructing tabular, graphic, symbolic or verbal representations, and shifting between representational formats; and
  • formulating hypotheses by identifying the relevant factors in the problem and their inter-relationships; and organising and critically evaluating information.

Planning and executing. The objective is to use one’s knowledge about the problem situation to devise a plan and execute it. Tasks where “planning and executing” is the main cognitive demand do not require any substantial prior understanding or representation of the problem situation, either because the situation is straightforward or because these aspects were previously solved. “Planning and executing” includes:

  • planning, which consists of goal setting, including clarifying the overall goal, and setting subgoals, where necessary; and devising a plan or strategy to reach the goal state, including the steps to be undertaken; and
  • executing, which consists of carrying out a plan.

Monitoring and reflecting. The objective is to regulate the distinct processes involved in problem solving, and to critically evaluate the solution, the information provided with the problem, or the strategy adopted. This includes:

  • monitoring progress towards the goal at each stage, including checking intermediate and final results, detecting unexpected events, and taking remedial action when required; and
  • reflecting on solutions from different perspectives, critically evaluating assumptions and alternative solutions, identifying the need for additional information or clarification and communicating progress in a suitable manner.’

Amongst my sample of eleven jurisdictions:

  • ‘Exploring and understanding’ items were found easier by students in Singapore, Hong Kong, South Korea, Australia, Taiwan and Finland. 
  • ‘Representing and formulating’ items were found easier in Taiwan, Shanghai, South Korea, Singapore, Hong Kong, Canada and Australia. 
  • ‘Planning and executing’ items were found easier in Finland only. 
  • ‘Monitoring and reflecting’ items were found easier in Ireland, Singapore, the USA and England.

The Report concludes:

‘This analysis shows that, in general, what differentiates high-performing systems, and particularly East Asian education systems, such as those in Hong Kong-China, Japan, Korea [South Korea], Macao-China, Shanghai -China, Singapore and Chinese Taipei [Taiwan], from lower-performing ones, is their students’ high level of proficiency on “exploring and understanding” and “representing and formulating” tasks.’

It also distinguishes those jurisdictions that perform best on interactive problems, requiring students to discover some of the information required to solve the problem, rather than being presented with all the necessary information. This seems to be the nearest equivalent to a measure of creativity in problem solving.

Comparative strengths and weaknesses in respect of interactive tasks are captured in the following diagram.


[Diagram: relative strengths and weaknesses on interactive and knowledge-acquisition tasks, by jurisdiction]


One can see that several of my sample – Ireland, the USA, Canada, Australia, South Korea and Singapore – are placed in the top right-hand quarter of the diagram, indicating stronger than expected performance on both interactive and knowledge acquisition tasks.

England is stronger than expected on the former but not on the latter.

Jurisdictions that are weaker than expected on interactive tasks only include Hong Kong, Taiwan and Shanghai, while Finland is weaker than expected on both.

We have no information about whether these distinctions were maintained at different proficiency levels.


Comparing jurisdictions’ performance at higher proficiency levels

Table 4 and Charts 3 and 4 below show variations in the performance of countries in my sample across the four different assessments at level 6, the highest proficiency level.

The charts in particular emphasise how far ahead the Asian Tigers are in maths at this level, compared with the cross-jurisdictional variation in the other three assessments.

In all five cases, each ‘Asian Tiger’s’ level 6 performance in maths also vastly exceeds its level 6 performance in the other three assessments. The proportion of students achieving level 6 proficiency in problem solving lags far behind, even though there is a fairly strong correlation between these two assessments (see below).

In contrast, all the ‘Western’ jurisdictions in the sample – with the sole exception of Ireland – achieve a higher percentage at proficiency level 6 in problem solving than they do in maths, although the difference is always less than a full percentage point. (Even in Ireland the difference is only 0.1 of a percentage point in favour of maths.)

Shanghai is the only jurisdiction in the sample which has more students achieving proficiency level 6 in science than in problem solving. It also has the narrowest gap between level 6 performance in problem solving and in reading.

Meanwhile, England, the USA, Finland and Australia all have broadly similar profiles across the four assessments, with the largest percentage of level 6 performers in problem solving, followed by maths, science and reading respectively.

The proximity of the lines marking level 6 performance in reading and science is also particularly evident in the second chart below.


Table 4: Percentage achieving proficiency Level 6 in each domain

  PS L6  Ma L6 Sci L6 Re L6
Singapore 9.6 19.0 5.8 5.0
South Korea 7.6 12.1 1.1 1.6
Hong Kong 5.1 12.3 1.8 1.9
Shanghai 4.1 30.8 4.2 3.8
Taiwan 3.8 18.0 0.6 1.4
Canada 5.1 4.3 1.8 2.1
Australia 4.4 4.3 2.6 1.9
Finland 3.6 3.5 3.2 2.2
England (UK) 3.3 3.1 1.9 1.3
USA 2.7 2.2 1.1 1.0
Ireland 2.1 2.2 1.5 1.3
OECD Average 2.5 3.3 1.2 1.1

 Charts 3 and 4: Percentage achieving proficiency level 6 in each domain


The pattern is materially different at proficiency levels 5 and above, as the table and chart below illustrate. These also include the proportion of all-rounders, who achieved proficiency level 5 or above in each of maths, science and reading (but not in problem-solving).

The lead enjoyed by the ‘Asian Tigers’ in maths is somewhat less pronounced. The gap between performance within these jurisdictions on the different assessments also tends to be less marked, although maths accounts for comfortably the largest proportion of level 5+ performance in all five cases.

Conversely, level 5+ performance on the different assessments is typically much closer in the ‘Western’ countries. Problem solving leads the way in Australia, Canada, England and the USA, but in Finland science is in the ascendant and reading is strongest in Ireland.

Some jurisdictions have a far ‘spikier’ profile than others. Ireland is closest to achieving equilibrium across all four assessments. Australia and England share very similar profiles, though Australia outscores England in each assessment.

The second chart in particular shows how Shanghai’s ‘spike’ applies in all the other three assessments but not in problem solving.

Table 5: Percentage achieving Proficiency level 5 and above in each domain

  PS L5+  Ma L5+ Sci L5+ Re L5+ Ma + Sci + Re L5+
Singapore 29.3 40.0 22.7 21.2 16.4
South Korea 27.6 30.9 11.7 14.2 8.1
Hong Kong 19.3 33.4 16.7 16.8 10.9
Shanghai 18.2 55.4 27.2 25.1 19.6
Taiwan 18.4 37.2 8.4 11.8 6.1
Canada 17.5 16.4 11.3 12.9 6.5
Australia 16.7 14.8 13.5 11.7 7.6
Finland 15.0 15.2 17.1 13.5 7.4
England (UK) 14.2 12.4 11.7 9.1 5.7 (all UK)
USA 11.6 9.0 7.4 7.9 4.7
Ireland 9.4 10.7 10.8 11.4 5.7
OECD Average 11.4 12.6 8.4 8.4 4.4


Charts 5 and 6: Percentage Achieving Proficiency Level 5 and above in each domain


How high-achieving problem solvers perform in other assessments


Correlations between performance in different assessments

The Report provides an analysis of the proportion of students achieving proficiency levels 5 and 6 on problem solving who also achieved that outcome on one of the other three assessments: reading, maths and science.

It argues that problem solving is a distinct and separate domain. However:

‘On average, about 68% of the problem-solving score reflects skills that are also measured in one of the three regular assessment domains. The remaining 32% reflects skills that are uniquely captured by the assessment of problem solving. Of the 68% of variation that problem-solving performance shares with other domains, the overwhelming part is shared with all three regular assessment domains (62% of the total variation); about 5% is uniquely shared between problem solving and mathematics only; and about 1% of the variation in problem solving performance hinges on skills that are specifically measured in the assessments of reading or science.’

It discusses the correlation between these different assessments:

‘A key distinction between the PISA 2012 assessment of problem solving and the regular assessments of mathematics, reading and science is that the problem-solving assessment does not measure domain-specific knowledge; rather, it focuses as much as possible on the cognitive processes fundamental to problem solving. However, these processes can also be used and taught in the other subjects assessed. For this reason, problem-solving tasks are also included among the test units for mathematics, reading and science, where their solution requires expert knowledge specific to these domains, in addition to general problem-solving skills.

It is therefore expected that student performance in problem solving is positively correlated with student performance in mathematics, reading and science. This correlation hinges mostly on generic skills, and should thus be about the same magnitude as between any two regular assessment subjects.’

These overall correlations are set out in the table below, which shows that maths has a higher correlation with problem solving than either science or reading, but that this correlation is lower than those between the three subject-related assessments.

The correlation between maths and science (0.90) is comfortably the strongest (despite the relationship between reading and science at the top end of the distribution noted above).

PISA problem solving correlations table
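Reproducing a matrix of this kind from student-level data is straightforward once the scores sit in a DataFrame. A hedged sketch follows – the values are invented, and the real PISA files store five plausible values per domain which strictly need combining rather than using singly:

```python
import pandas as pd

# Invented student-level scores, one row per student.
df = pd.DataFrame({
    "maths":           [512, 430, 601, 555, 478, 520],
    "reading":         [498, 441, 570, 540, 465, 505],
    "science":         [505, 435, 590, 548, 470, 515],
    "problem_solving": [490, 450, 580, 530, 460, 500],
})

# Pairwise Pearson correlations, analogous to the OECD table.
print(df.corr().round(2))
```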

Correlations are broadly similar across jurisdictions, but the Report notes that the association is comparatively weak in some of these, including Hong Kong. Students here are more likely to perform poorly on problem solving and well on other assessments, or vice versa.

There is also broad consistency at different performance levels, but the Report identifies those jurisdictions where students exceed expectations in problem solving, given their performance in the other domains. These include South Korea, the USA, England, Australia, Singapore and – to a lesser extent – Canada.

Those with lower than expected performance include Shanghai, Ireland, Hong Kong, Taiwan and Finland.

The Report notes:

‘In Shanghai-China, 86% of students perform below the expected level in problem solving, given their performance in mathematics, reading and science. Students in these countries/economies struggle to use all the skills that they demonstrate in the other domains when asked to perform problem-solving tasks.’

However, there is variation according to students’ maths proficiency:

  • Jurisdictions whose high scores on problem solving are mainly attributable to strong performers in maths include Australia, England and the USA. 
  • Jurisdictions whose high scores on problem solving are more attributable to weaker performers in maths include Ireland. 
  • Jurisdictions whose lower scores in problem solving are more attributable to weakness among strong performers in maths include Korea. 
  • Jurisdictions whose lower scores in problem solving are more attributable to weakness among weak performers in maths include Hong Kong and Taiwan. 
  • Jurisdictions whose weakness in problem solving is fairly consistent regardless of performance in maths include Shanghai and Singapore.

The Report adds:

‘In Italy, Japan and Korea, the good performance in problem solving is, to a large extent, due to the fact that lower performing students score beyond expectations in the problem-solving assessment….This may indicate that some of these students perform below their potential in mathematics; it may also indicate, more positively, that students at the bottom of the class who struggle with some subjects in school are remarkably resilient when it comes to confronting real-life challenges in non-curricular contexts…

In contrast, in Australia, England (United Kingdom) and the United States, the best students in mathematics also have excellent problem-solving skills. These countries’ good performance in problem solving is mainly due to strong performers in mathematics. This may suggest that in these countries, high performers in mathematics have access to – and take advantage of – the kinds of learning opportunities that are also useful for improving their problem-solving skills.’

What proportion of high performers in problem solving are also high performers in one of the other assessments?

The percentages of high achieving students (proficiency level 5 and above) in my sample of eleven jurisdictions who perform equally highly in each of the three domain-specific assessments are shown in Table 6 and Chart 7 below.

These show that Shanghai leads the way in each case, with 98.0% of all students who achieve proficiency level 5+ in problem solving also achieving the same outcome in maths. For science and reading the comparable figures are 75.1% and 71.7% respectively.

Taiwan is the nearest competitor in respect of problem solving plus maths, Finland in the case of problem solving plus science and Ireland in the case of problem solving plus reading.

South Korea, Taiwan and Canada are atypical of the rest in recording a higher proportion of problem solving plus reading at this level than problem solving plus science.

Singapore, Shanghai and Ireland are the only three jurisdictions that score above 50% on all three of these combinations. However, the only jurisdictions that exceed the OECD averages in all three cases are Singapore, Hong Kong, Shanghai and Finland.

Table 6: PISA problem-solving: Percentage of students achieving proficiency level 5+ in domain-specific assessments

  PS + Ma PS + Sci PS + Re
Singapore 84.1 57.0 50.2
South Korea 73.5 34.1 40.3
Hong Kong 79.8 49.4 48.9
Shanghai 98.0 75.1 71.7
Taiwan 93.0 35.3 43.7
Canada 57.7 43.9 44.5
Australia 61.3 54.9 47.1
Finland 66.1 65.4 49.5
England (UK) 59.0 52.8 41.7
USA 54.6 46.9 45.1
Ireland 59.0 57.2 52.0
OECD Average 63.5 45.7 41.0

Chart 7: PISA Problem-solving: Percentage of students achieving proficiency level 5+ in domain-specific assessments

Problem solving chart 7.

What proportion of students achieve highly in one or more assessments?

Table 7 and Chart 8 below show how many students in each of my sample achieved proficiency level 5 or higher in problem-solving only, in problem solving and one or more assessments, in one or more assessments but not problem solving and in at least one assessment (ie the total of the three preceding columns).

I have also repeated in the final column the percentage achieving this proficiency level in each of maths, science and reading. (PISA has not released information about the proportion of students who achieved this feat across all four assessments.)
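The combination measures in Table 6 above and Table 7 below reduce to simple set operations on per-student attainment flags. A minimal sketch with invented, independently drawn flags (a real analysis would derive correlated flags from the PISA score thresholds and apply student weights):

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented flags: True if a student reached Level 5+ in that
# assessment. Real flags would be correlated across domains;
# independent draws are used here purely to show the set logic.
n = 1_000
ps = rng.random(n) < 0.14   # problem solving
ma = rng.random(n) < 0.13   # maths
sc = rng.random(n) < 0.12   # science
rd = rng.random(n) < 0.09   # reading

other = ma | sc | rd                 # L5+ in any regular domain
ps_only = ps & ~other                # problem solving only
ps_plus = ps & other                 # PS and at least one other
not_ps  = other & ~ps                # one or more, but not PS
any_one = ps | other                 # L5+ in at least one

print(f"PS only:        {ps_only.mean():.1%}")
print(f"PS + 1 or more: {ps_plus.mean():.1%}")
print(f"1+ but not PS:  {not_ps.mean():.1%}")
print(f"at least one:   {any_one.mean():.1%}")

# Table 6's conditional measure: of students at L5+ in problem
# solving, the share also at L5+ in maths.
print(f"Ma, given PS:   {(ps & ma).sum() / ps.sum():.1%}")
```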

Table 7 and Chart 8 reveal that the percentages of students who achieve proficiency level 5+ only in problem solving are very small, ranging from 0.3% in Shanghai to 6.7% in South Korea.

Conversely, the percentages of students achieving proficiency level 5+ in any one of the other assessments but not in problem solving are typically significantly higher, ranging from 4.5% in the USA to 38.1% in Shanghai.

There is quite a bit of variation in whether jurisdictions score more highly on ‘problem solving and at least one other’ (second column) or on ‘at least one other excluding problem solving’ (third column).

More importantly, the fourth column shows that the jurisdiction with the most students achieving proficiency level 5 or higher in at least one assessment is clearly Shanghai, followed by Singapore, Hong Kong, South Korea and Taiwan in that order.

The proportion of students achieving this outcome in Shanghai is close to three times the OECD average, comfortably more than twice the rate achieved in any of the ‘Western’ countries and three and a half times the rate achieved in the USA.

The same is true of the proportion of students achieving this level in the three domain-specific assessments.

On this measure, South Korea and Taiwan fall significantly behind their Asian competitors, and the latter is overtaken by Australia, Finland and Canada.


Table 7: Percentage achieving proficiency level 5+ in different combinations of PISA assessments

  PS only % PS + 1 or more % 1+ but not PS % L5+ in at least one % L5+ in Ma + Sci + Re %
Singapore 4.3 25.0 16.5 45.8 16.4
South Korea 6.7 20.9 11.3 38.9 8.1
Hong Kong 3.4 15.9 20.5 39.8 10.9
Shanghai 0.3 17.9 38.1 56.3 19.6
Taiwan 1.2 17.1 20.4 38.7 6.1
Canada 5.5 12.0 9.9 27.4 6.5
Australia 4.7 12.0 7.7 24.4 7.6
Finland 3.0 12.0 11.9 26.9 7.4
England (UK) 4.4 9.8 6.8 21.0 5.7* all UK
USA 4.1 7.5 4.5 16.1 4.7
Ireland 2.6 6.8 10.1 19.5 5.7
OECD Average 3.1 8.2 8.5 19.8 4.4

Chart 8: Percentage achieving proficiency level 5+ in different combinations of PISA assessments

Problem solving chart 8

The Report comments:

‘The proportion of students who reach the highest levels of proficiency in at least one domain (problem solving, mathematics, reading or science) can be considered a measure of the breadth of a country’s/economy’s pool of top performers. By this measure, the largest pool of top performers is found in Shanghai-China, where more than half of all students (56%) perform at the highest levels in at least one domain, followed by Singapore (46%), Hong Kong-China (40%), Korea and Chinese Taipei (39%)…Only one OECD country, Korea, is found among the five countries/economies with the largest proportion of top performers. On average across OECD countries, 20% of students are top performers in at least one assessment domain.

The proportion of students performing at the top in problem solving and in either mathematics, reading or science, too can be considered a measure of the depth of this pool. These are top performers who combine the mastery of a specific domain of knowledge with the ability to apply their unique skills flexibly, in a variety of contexts. By this measure, the deepest pools of top performers can be found in Singapore (25% of students), Korea (21%), Shanghai-China (18%) and Chinese Taipei (17%). On average across OECD countries, only 8% of students are top performers in both a core subject and in problem solving.’

There is no explanation of why proficiency level 5 should be equated by PISA with the breadth of a jurisdiction’s ‘pool of top performers’. The distinction between proficiency levels 5 and 6 in this respect requires further discussion.

In addition to updated ‘all-rounder’ data showing what proportion of students achieved this outcome across all four assessments, it would be really interesting to see the proportion of students achieving at proficiency level 6 across different combinations of these four assessments – and to see what proportion of students achieving that outcome in different jurisdictions are direct beneficiaries of targeted support, such as a gifted education programme.

In the light of this analysis, what are jurisdictions’ priorities for improving problem solving performance?

Leaving aside strengths and weaknesses in different elements of problem solving discussed above, this analysis suggests that the eleven jurisdictions in my sample should address the following priorities:

Singapore has a clear lead at proficiency level 6, but falls behind South Korea at level 5 (though Singapore re-establishes its ascendancy when levels 5 and 6 are considered together). It also has more level 1 performers than South Korea. It should perhaps focus on reducing the size of this tail and pushing through more of its mid-range performers to level 5. There is a pronounced imbalance in favour of boys at level 6, so enabling more girls to achieve the highest level of performance is a clear priority. There may also be a case for prioritising the children of semi-skilled workers.

South Korea needs to focus on getting a larger proportion of its level 5 performers to level 6. This effort should be focused disproportionately on girls, who are significantly under-represented at both levels 5 and 6. South Korea has a very small tail to worry about – and may even be getting close to minimising this. It needs to concentrate on improving the problem solving skills of its stronger performers in maths.

Hong Kong has a slightly bigger tail than Singapore’s but is significantly behind at both proficiency levels 5 and 6. In the case of level 6 it is equalled by Canada. Hong Kong needs to focus simultaneously on reducing the tail and lifting performance across the top end, where girls and weaker performers in maths are a clear priority.

Shanghai has a similar profile to Hong Kong’s in all respects, though with somewhat fewer level 6 performers. It also needs to focus effort simultaneously at the top and the bottom of the distribution. Amongst this sample, Shanghai has the worst under-representation of girls at level 5 and levels 5 and 6 together, so addressing that imbalance is an obvious priority. It also demonstrated the largest variation in performance against PISA’s ESCS index, which suggests that it should target young people from disadvantaged backgrounds, as well as the children of semi-skilled workers.

Taiwan is rather similar to Hong Kong and Shanghai, but its tail is slightly bigger and its level 6 cadre slightly smaller, while it does somewhat better at level 5. It may need to focus more at the very bottom, but also at the very top. Taiwan also has a problem with high-performing girls, second only to Shanghai as far as level 5 and levels 5 and 6 together are concerned. However, like Shanghai, it does comparatively better than the other ‘Asian Tigers’ in terms of girls at level 6. It also needs to consider the problem solving performance of its weaker performers in maths.

Canada is the closest western competitor to the ‘Asian Tigers’ in terms of the proportions of students at levels 1 and 5 – and it already outscores Shanghai and Taiwan at level 6. It needs to continue cutting down the tail without compromising achievement at the top end. Canada also has small but significant gender imbalances in favour of boys at the top end.

Australia by comparison is significantly worse than Canada at level 1, broadly comparable at level 5 and somewhat worse at level 6. It too needs to improve scores at the very bottom and the very top. Australia’s gender imbalance is more pronounced at level 6 than level 5.

Finland has the same mean score as Australia but a smaller tail (though not quite as small as Canada’s). It needs to improve across the piece but might benefit from concentrating rather more heavily at the top end. Finland has a slight gender imbalance in favour of girls at level 5, but boys are more in the ascendancy at level 6 than in either England or the USA. As in Australia, this latter point needs addressing.

England has a profile similar to Australia’s, but is less effective at all three selected proficiency levels. It is further behind at the top than at the bottom of the distribution, and needs to work hard at both ends to catch up with the strongest Western performers and maintain its advantage over the USA and Ireland. Gender imbalances are small but nonetheless significant.

The USA has a comparatively long tail of low achievement at proficiency level 1 and, with the exception of Ireland, the fewest high achievers. This profile is very close to the OECD average. As in England, the relatively small size of the gender imbalances in favour of boys does not mean that they can be ignored.

Ireland has the longest tail of low achievement and the smallest proportion of students at proficiency level 5, at level 6 and at levels 5 and 6 combined. It needs to raise the bar at both ends of the achievement distribution. Ireland has a larger preponderance of boys at level 6 than its Western competitors, which needs addressing. The limited socio-economic evidence suggests that Ireland should also be targeting the offspring of parents with semi-skilled and elementary occupations.

So there is further scope for improvement in all eleven jurisdictions. Meanwhile the OECD could usefully provide a more in-depth analysis of high achievers on its assessments that features:

  • Proficiency level 6 performance across the board.
  • Socio-economic disparities in performance at proficiency levels 5 and 6.
  • ‘All-rounder’ achievement at these levels across all four assessments and
  • Correlations between success at these levels and specific educational provision for high achievers including gifted education programmes.


GP

April 2014

PISA 2012: International Comparison of High Achievers’ Performance


This post examines what PISA 2012 can tell us about the comparative performance of high achievers in England, other English-speaking countries and those that top the PISA rankings.

Introductory Brochure for PISA 2012 by Kristjan Paur

It draws on a similar range of evidence to that deployed in my post on the PISA 2009 results (December 2010).

A more recent piece, ‘The Performance of Gifted High Achievers in TIMSS, PIRLS and PISA’ (January 2013) is also relevant.

The post reviews:

  • How the PISA 2012 Assessment Framework defines reading, mathematical and scientific literacy, and how it defines high achievement in each of the three core domains.
  • How average (headline) performance on the three core measures has changed in each jurisdiction compared with PISA 2006 and PISA 2009.
  • By comparison, how high achievers’ performance – and the balance between high and low achievers’ performance – has changed in each jurisdiction over the same period.
  • How jurisdictions compare on the ‘all-rounder’ measure, derived from achievement of a high performance threshold on all three assessments.

The twelve jurisdictions included in the main analysis are: Australia, Canada, England, Finland, Hong Kong (China), Ireland, New Zealand, Shanghai (China), Singapore, South Korea, Taiwan and the USA.

The post also compares the performance of the five home countries against the high achievement thresholds. I have foregrounded this analysis: it appears immediately below, preceded only by the headline (but potentially misleading) ‘top 10’ high achiever rankings for 2012.


Headlines


World Leaders against PISA’s High Achievement Benchmarks

The top 10 performers in PISA 2012 against the high achievement benchmarks (Level 5 and above), in reading, maths and science respectively, are set out in Table 1 below.

The 2009 rankings are shown in brackets and the 2012 overall average rankings in bold, square brackets. I have also included England’s rankings.


Table 1

Rank Reading Maths Science
1 Shanghai (1) [1] Shanghai (1) [1] Shanghai (1) [1]
2 Singapore (3) [3] Singapore (2) [2] Singapore (2) [3]
3 Japan (5) [4] Taiwan (4) [4] Japan (5) [4]
4 Hong Kong (9) [2] Hong Kong (3) [3] Finland (3) [5]
5 S. Korea (6) [5] S Korea (5) [5] Hong Kong (6) [2]
6 N Zealand (2) [13] Liechtenstein (13) [8] Australia (7) [16]
7 Finland (4) [6] Macao (15) [6] N Zealand (4) [18]
8 Canada (7=) [8] Japan (8) [7] Estonia (17) [6]
9 France (13) [21] Switzerland (6) [9] Germany (8) [12]
10 Belgium (10) [16] Belgium (9) [15] Netherlands (9) [14]
England 19th (19) [23] England 24th (32) [25] England 11th (12) [18]


On the basis of these crude rankings alone, it is evident that Shanghai has maintained its ascendancy across all three domains.

Singapore has reinforced its runner-up position by overtaking New Zealand in reading. Hong Kong and Japan also make it into the top ten in all three domains.

Notable improvements in the rankings have been made by:

  • Japan, Hong Kong and France in reading
  • Liechtenstein and Macao in maths
  • Japan and Estonia in science


Jurisdictions falling down the rankings include:

  • Australia, New Zealand and Finland in reading
  • Finland and Switzerland in maths
  • Canada and New Zealand in science.

Those whose high achiever rankings significantly exceed their average rankings include:

  • New Zealand, France and Belgium in reading
  • Belgium in maths
  • Australia, New Zealand, Germany and the Netherlands in science

The only one of the top ten jurisdictions exhibiting the reverse pattern with any degree of significance is Hong Kong, in science.

On this evidence, England has maintained its relatively strong showing in science and a mid-table position in reading, but it has slipped several places in maths.

Comparing England’s rankings for high achievers with its rankings for average performance:

  • Reading 19th versus 23rd
  • Maths 24th versus 25th
  • Science 11th versus 18th

This suggests that England is substantively stronger at the top end of the achievement spectrum in science, slightly stronger in reading and almost identical in maths. (The analysis below explores whether this is borne out by the proportions of learners achieving the relevant PISA thresholds.)

Overall, these rankings suggest that England is a respectable performer at the top end, but nothing to write home about. It is not deteriorating, relatively speaking – with the possible exception of mathematics – but it is not improving significantly either. The imbalance is not atypical and it requires attention, but only as part of a determined effort to build performance at both ends.


Comparing the Home Countries’ Performance

Table 2 below shows how each home country has performed at Level 5 and above in each of the three core PISA assessments since 2006.


Table 2

  2012 Level 5+ 2009 Level 5+ 2006 Level 5+
  Read Maths Sci Read Maths Sci Read Maths Sci
England 9.1 12.4 11.7 8.1 9.9 11.6 9.2 11.2 14.0
N Ireland 8.3 10.3 10.3 9.3 10.3 11.8 10.4 12.2 13.9
Scotland 7.8 10.9 8.8 9.2 12.3 11.0 8.5 12.1 12.5
Wales 4.7 5.3 5.7 5.0 5.0 7.8 6.4 7.2 10.9
UK 8.8 11.9 11.1 8.0 9.9 11.4 9.0 11.2 13.8
OECD average 8.4 12.6 8.4 7.6 12.7 8.5 8.6 13.3 9.0


In 2012, England is ahead of the other home countries in all three domains. Northern Ireland is runner-up in reading and science, Scotland in maths. Wales is a long way behind the other four in all three assessments.

Only England tops the OECD average in reading. All the home countries fall below the OECD average in maths, though all but Wales are above it in science.

Compared with 2006, England’s performance has changed little in reading, increased somewhat in maths (having fallen back in between) and fallen quite significantly in science.

In comparison, Northern Ireland is on a downward trend in all three domains, as is Scotland (though it produced small improvements in maths and reading in 2009). Wales has fallen back significantly in science, though somewhat less so in reading and maths.

It seems that none of the home countries is particularly outstanding when it comes to the performance of their high achievers, but England is the strongest of the four, while Wales is clearly the weakest.

A slightly different perspective can be gained by comparing high and low performance in 2012.

Table 3 below shows that the proportion of low achievers is comfortably larger than the proportion of high achievers. This is true of all the home countries and all subjects, though the difference is less pronounced in science across the board and also in Scotland. Conversely, the imbalance is much more significant in Wales.


Table 3

2012 Reading Maths Science
  L5+6 L1+below L5+6 L1+below L5+6 L1+below
England 9.1 16.7 12.4 21.7 11.7 14.9
N Ireland 8.3 16.7 10.3 24.1 10.3 16.8
Scotland 7.8 12.5 10.9 18.2 8.8 12.1
Wales 4.7 20.6 5.3 29.0 5.7 19.4
UK 8.8 16.7 11.9 21.8 11.1 15.0
OECD average 8.4 8.4 12.6 23.0 8.4 17.8


The ‘tail’ in reading is significantly higher than the OECD average in all four countries but – with the exception of Wales – somewhat lower in science.

In maths, the ‘tail’ is higher than the OECD average in Wales and Northern Ireland, but below average in England and Scotland.

The average figures suggest that, across the OECD as a whole, the top and bottom are broadly balanced in reading, there is a small imbalance in science towards the bottom end and a more significant imbalance in maths, again towards the bottom end.

By comparison, the home countries have a major issue at the bottom in reading, but are less significantly out of line in maths and science.

Overall, there is some evidence here of a longish tail of low achievement, but with considerable variation according to country and domain.

The bottom line is that all of the home countries have significant issues to address at both the top and the bottom of the achievement distribution. Any suggestion that they need to concentrate exclusively on low achievers is not supported by this evidence.


Francois Peron National Park by Gifted Phoenix 2013


Background to PISA


What is PISA?

The Programme for International Student Assessment (PISA) is a triennial OECD survey of the performance of 15 year-old students which typically covers maths, science and reading. Science was the main focus in 2006, reading in 2009 and maths in 2012.

PISA 2012 also included a computer-based assessment of problem-solving and a financial literacy assessment. However, some jurisdictions did not participate in the problem-solving exercise owing to ‘technical issues’ and financial literacy was undertaken by some countries only, as an optional extra.

Fifty-eight jurisdictions took part in PISA 2006 and 74 in PISA 2009 (65 undertook the assessment in 2009 and a further nine did so in 2010).

To date, a total of 65 jurisdictions have taken part in PISA 2012.

According to the OECD’s own FAQ:

  • PISA tests reading, mathematical and scientific literacy ‘in terms of general competencies, that is, how well students can apply the knowledge and skills they have learned at school to real-life challenges. PISA does not test how well a student has mastered a school’s specific curriculum.’
  • Student performance in each field is comparable between assessments – one cannot reasonably argue therefore that a drop in performance is attributable to a more difficult assessment.
  • Each participating jurisdiction receives an overall score in each subject area – the average of all its students’ scores. The average score among OECD countries is set at 500 points (with a standard deviation of 100 points).
  • Participating jurisdictions are ranked in each subject area according to their mean scores, but:

‘[It] is not possible to assign a single exact rank in each subject to each country…because PISA tests only a sample of students from each country and this result is then adjusted to reflect the whole population of 15-year-old students in that country. The scores thus reflect a small measure of statistical uncertainty and it is therefore only possible to report the range of positions (upper rank and lower rank) within which a country can be placed.’

Outside the confines of reports by the OECD and its national contractors, this is honoured more in the breach than the observance.

  • Scores are derived from scales applied to each subject area. Each scale is divided into levels, Level 1 being the lowest and Level 6 typically the highest.
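As a rough illustration of that scaling (an illustration only: PISA scores are actually constructed with item-response-theory models and plausible values, and national distributions are not exactly normal), a score can be mapped to an approximate percentile of a notional OECD distribution with mean 500 and standard deviation 100:

```python
from statistics import NormalDist

# Notional OECD score distribution: mean 500, SD 100.
oecd = NormalDist(mu=500, sigma=100)

# A score one standard deviation above the mean sits near the
# 84th percentile; two SDs above, near the 98th.
for score in (400, 500, 600, 700):
    print(f"score {score}: ~{oecd.cdf(score):.0%} of students below")
```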

Further background detail on the 2012 assessments is set out in the ‘PISA 2012 Assessment and Analytical Framework’ (2013).

This explains that the framework for assessing maths was completely revised ahead of the 2012 cycle and ‘introduces three new mathematical processes that form the basis of developments in the reporting of PISA mathematics outcomes’, whereas those for science and reading were unchanged (the science framework was revised when it was the main focus in 2006 and ditto for reading in 2009).

The Framework clarifies the competency-based approach summarised in the FAQ:

‘PISA focuses on competencies that 15-year-old students will need in the future and seeks to assess what they can do with what they have learnt – reflecting the ability of students to continue learning throughout their lives by applying what they learn in school to non-school environments, evaluating their choices and making decisions. The assessment is informed, but not constrained, by the common denominator of national curricula. Thus, while it does assess students’ knowledge, PISA also examines their ability to reflect, and to apply their knowledge and experience to real-life issues in a reflective way. For example, in order to understand and evaluate scientific advice on food safety, an adult would need not only to know some basic facts about the composition of nutrients, but also to be able to apply that information.’

It explains that between 4,500 and 10,000 students drawn from 150 schools are typically tested in each jurisdiction.

Initial reports suggested that England would not take part in the 2012 assessments of problem-solving and financial literacy, but it subsequently emerged that this decision had been reversed in respect of problem-solving.


Setting PISA Outcomes in Context

There are plenty of reasons why one should not place excessive weight on PISA outcomes:

  • The headline rankings carry a significant health warning, which remains important, even though it is commonly ignored.

‘As the PISA 2000 and PISA 2003 samples for the United Kingdom did not meet the PISA response-rate standards, no trend comparisons are possible for these years.’ (p.1)

Hence, for the UK at least, reliable comparisons with pre-2006 results are off the table.

  • Some commentators argue that the caveats recorded by PISA’s technical experts rarely survive into its policy advice:

‘The pressure from policymakers for advice based on PISA interacts with this unhealthy mix of policy and technical people. The technical experts make sure that the appropriate caveats are noted, but the warnings are all too often ignored by the needs of the policy arm of PISA. As a result, PISA reports often list the known problems with the data, but then the policy advice flows as though those problems didn’t exist. Consequently, some have argued that PISA has become a vehicle for policy advocacy in which advice is built on flimsy data and flawed analysis.’

  • PISA is not the only game in town. TIMSS and PIRLS are equally significant, though relatively more focused on content knowledge, whereas PISA is primarily concerned with the application of skills in real-life scenarios.
  • There are big political risks associated with worshipping at the PISA altar for, if the next set of outcomes is disappointing, the only possible escape route is to blame the previous administration, a strategy that wears increasingly thin with the electorate the longer the current administration has been in power.


It would be quite wrong to dismiss PISA results out of hand, however. They are a significant indicator of the comparative performance of national (and regional) education systems. But they are solely an indicator, rather than a statement of fact.


What is assessed – and what constitutes high achievement – in each domain

The Assessment and Analytical Framework provides definitions of each domain and level descriptors for each level within the assessments.


Mathematical Literacy

The PISA 2012 mathematics framework defines mathematical literacy as:

‘An individual’s capacity to formulate, employ and interpret mathematics in a variety of contexts. It includes reasoning mathematically and using mathematical concepts, procedures, facts and tools to describe, explain and predict phenomena. It assists individuals to recognise the role that mathematics plays in the world and to make the well-founded judgments and decisions needed by constructive, engaged and reflective citizens.’

Three aspects of maths are identified:

  • Mathematical processes and the fundamental capabilities underlying them. Three processes are itemised: formulating situations mathematically; employing mathematical concepts, facts, procedures and reasoning; and interpreting, applying and evaluating mathematical outcomes. The capabilities are: communication; mathematizing (transforming a real life problem to a mathematical form); representation; reasoning and argument; devising problem-solving strategies; using symbolic, formal and technical language and operations; and using mathematical tools.
  • Content knowledge, comprising four elements: change and relationships; space and shape; quantity; and uncertainty and data.
  • The contexts in which mathematical challenges are presented: personal; occupational; societal and scientific.

Six levels are identified within the PISA 2012 mathematics scale. The top two are described thus:

  • ‘At Level 6 students can conceptualise, generalise and utilise information based on their investigations and modelling of complex problem situations. They can link different information sources and representations and flexibly translate among them. Students at this level are capable of advanced mathematical thinking and reasoning. These students can apply their insight and understandings along with a mastery of symbolic and formal mathematical operations and relationships to develop new approaches and strategies for attacking novel situations. Students at this level can formulate and precisely communicate their actions and reflections regarding their findings, interpretations, arguments and the appropriateness of these to the original situations.’
  • ‘At Level 5 students can develop and work with models for complex situations, identifying constraints and specifying assumptions. They can select, compare and evaluate appropriate problem-solving strategies for dealing with complex problems related to these models. Students at this level can work strategically using broad, well-developed thinking and reasoning skills, appropriate linked representations, symbolic and formal characterisations and insight pertaining to these situations. They can reflect on their actions and formulate and communicate their interpretations and reasoning.’


Reading literacy

Reading Literacy is defined as:

‘An individual’s capacity to understand, use, reflect on and engage with written texts, in order to achieve one’s goals, to develop one’s knowledge and potential, and to participate in society.’

The assessment ‘is built on three major task characteristics’:

  • Situation – the context or purpose for which reading takes place, which may be personal (practical and intellectual interests), public (activities and concerns of society), educational (for learning purposes) or occupational (accomplishment of a task).
  • Text – the range of material that is read, which may be print or digital. In the case of digital text, the environment may be authored (the reader is receptive), message-based, or mixed. In the case of both print and digital text, the format may be continuous (sentences and paragraphs), non-continuous (eg graphs, lists), mixed or multiple, while the text type may be description, narration, exposition, argumentation, instruction or transaction.
  • Aspect – how readers engage with the text, which includes accessing and retrieving; integrating and interpreting; and reflecting and evaluating.

Separate proficiency scales are provided for print and digital reading respectively. Both describe achievement in terms of the task rather than the student.

The print reading scale has six levels (Level 1 is subdivided into two). The top levels are described as follows:

  • Level 6: Tasks at this level typically require the reader to make multiple inferences, comparisons and contrasts that are both detailed and precise. They require demonstration of a full and detailed understanding of one or more texts and may involve integrating information from more than one text. Tasks may require the reader to deal with unfamiliar ideas, in the presence of prominent competing information, and to generate abstract categories for interpretations. Reflect and evaluate tasks may require the reader to hypothesise about or critically evaluate a complex text on an unfamiliar topic, taking into account multiple criteria or perspectives, and applying sophisticated understandings from beyond the text. A salient condition for access and retrieve tasks at this level is precision of analysis and fine attention to detail that is inconspicuous in the texts.
  • Level 5: Tasks at this level that involve retrieving information require the reader to locate and organise several pieces of deeply embedded information, inferring which information in the text is relevant. Reflective tasks require critical evaluation or hypothesis, drawing on specialised knowledge. Both interpretative and reflective tasks require a full and detailed understanding of a text whose content or form is unfamiliar. For all aspects of reading, tasks at this level typically involve dealing with concepts that are contrary to expectations.

For digital reading there are only four levels, categorised as 2-5. Level 5 is described thus:

‘Tasks at this level typically require the reader to locate, analyse and critically evaluate information, related to an unfamiliar context, in the presence of ambiguity. They require generating criteria to evaluate the text. Tasks may require navigation across multiple sites without explicit direction, and detailed interrogation of texts in a variety of formats.’


Scientific literacy

Scientific literacy is defined as:

‘An individual’s scientific knowledge and use of that knowledge to identify questions, to acquire new knowledge, to explain scientific phenomena, and to draw evidence-based conclusions about science-related issues, understanding of the characteristic features of science as a form of human knowledge and enquiry, awareness of how science and technology shape our material, intellectual, and cultural environments, and willingness to engage in science-related issues, and with the ideas of science, as a reflective citizen.’

The domain consists of four interrelated aspects:

  • Context – life situations involving science and technology. Contexts are personal, social or global and may relate to health, natural resources, environment, hazard or the frontiers of science and technology.
  • Knowledge – knowledge of the natural world (covering physical systems, living systems, earth and space systems and technology systems) and knowledge about science itself (scientific enquiry and scientific explanations).
  • Competencies, of which three are identified: identify scientific issues, explain phenomena scientifically and use scientific evidence.
  • Attitudes, including an interest in science, support for scientific enquiry and a motivation to act responsibly towards the natural world.

A 6-level proficiency scale is defined with the top levels explained as follows:

  • At Level 6, students can consistently identify, explain and apply scientific knowledge and knowledge about science in a variety of complex life situations. They can link different information sources and explanations and use evidence from those sources to justify decisions. They clearly and consistently demonstrate advanced scientific thinking and reasoning, and they use their scientific understanding in support of solutions to unfamiliar scientific and technological situations. Students at this level can use scientific knowledge and develop arguments in support of recommendations and decisions that centre on personal, social or global situations.
  • At Level 5, students can identify the scientific components of many complex life situations, apply both scientific concepts and knowledge about science to these situations, and can compare, select and evaluate appropriate scientific evidence for responding to life situations. Students at this level can use well-developed inquiry abilities, link knowledge appropriately and bring critical insights to situations. They can construct explanations based on evidence and arguments based on their critical analysis.


Denham Sunset by Gifted Phoenix


 Changes in Average Performance in Reading, Maths and Science

The OECD published PISA outcomes for maths, science and reading on 3 December 2013.

Similarly, the PISA National Report on England, published simultaneously, covers the three core assessments.

This section looks briefly at the headline average scores and rankings across the selected sample of twelve jurisdictions, principally to enable comparisons to be drawn with the subsequent analysis of high achievers’ performance.

I apologise in advance for any transcription errors. Please let me know if you spot any and I will correct the tables accordingly.


Reading

Table 4 below gives the headline average numerical scores and ranks in reading from PISA 2006, 2009 and 2012 respectively.


Table 4

Country 2012 2009 2006
score rank score rank score rank
Australia 512↓ 13↓ 515↑ 9↓ 513 7
Canada 523↓ 8↓ 524↓ 6↓ 527 4
Finland 524↓ 6↓ 536↓ 3↓ 547 2
Hong Kong 545↑ 2↑ 533↓ 4↓ 536 3
Ireland 523↑ 7↑ 496↓ 21↓ 517 6
S Korea 536↓ 5↓ 539↓ 2↓ 556 1
New Zealand 512↓ 13↓ 521 7↓ 521 5
Shanghai 570↑ 1= 556 1 N/A N/A
Singapore 542↑ 3↑ 526 5 N/A N/A
Taiwan 523↑ 8↑ 495↓ 23↓ 496 16
UK (England) 500↑ 23↑ 495↓ 25↓ 496 17
US 498↓ 24↓ 500 17 N/A N/A
OECD Average 496↑ 493↓ 495


Shanghai has retained the ascendancy it established in 2009, adding a further 14 points to its average 2009 score. Whereas it was only 17 points beyond its nearest competitor in 2009, that lead has now been extended to 25 points.

South Korea’s performance has fallen slightly and it has been leapfrogged in the rankings by Hong Kong (up 12 points), Singapore (up 16 points), and Japan (not included in the table).

Two countries making even more significant improvements are Taiwan (up 28 points) and Ireland (up 27 points). Conversely, the performance of Finland (down 12 points) and New Zealand (down 9 points) has noticeably declined. Finland’s performance has been declining since 2006.

Results remain broadly unchanged in Australia, Canada, England, South Korea and the USA. South Korea has been unable to make up the ground it lost in 2009.

Ireland’s huge improvement from a very similar starting point in 2009 throws England’s lack of progress into sharper relief, although Ireland is largely recovering the ground it lost in 2009, having performed relatively well in 2006.

England, like the US, continues to perform slightly above the OECD average, but has fallen further behind the Asian Tigers. The gap with the world leader is now 70 points (up from 60 in 2006).


Maths

Table 5 below sets out scores and rankings in maths since PISA 2006.


Table 5

Country 2012 2009 2006
  score rank score rank score rank
Australia 504↓ 19↓ 514↓ 15↓ 520 13
Canada 518↓ 13↓ 527= 10↓ 527 7
Finland 519↓ 12↓ 541↓ 6↓ 548 2
Hong Kong 561↑ 3= 555↑ 3 547 3
Ireland 501↑ 20↑ 487↓ 32↓ 501 22
S Korea 554↑ 5↓ 546↓ 4 547 4
New Zealand 500↓ 23↓ 519↓ 13↓ 522 11
Shanghai 613↑ 1= 600 1 N/A N/A
Singapore 573↑ 2= 562 2 N/A N/A
Taiwan 560↑ 4↑ 543↓ 5↓ 549 1
UK (England) 495↑ 25↑ 493↓ 27↓ 495 24
US 481↓ 36↓ 487↑ 31↑ 474 35
OECD Average 494↓   496↓   497  


The overall picture is rather similar to that for reading.

Shanghai (up 13 points) and Singapore (up 11 points) continue to stretch away at the head of the field. Taiwan (up 17 points) has also made significant improvement and is now close behind Hong Kong.

There has been relatively more modest improvement in Hong Kong and South Korea (which has been overtaken by Taiwan).

Elsewhere, Ireland has again made significant headway and is back to the level it achieved in 2006. But Finland’s score has plummeted 22 points. New Zealand is not far behind (down 19). There have also been significant falls in the performance of Australia (down 10), Canada (down 9) and the US (down 6).

The US is now trailing 13 points below the OECD average, having failed to sustain the substantial improvement it made in 2009.

In England meanwhile, results are largely unchanged, though now just above the OECD average rather than just below it.

The gap between England and world leader Shanghai has reached 118 points, compared with a gap in 2006 between England and world leader Taiwan of 54 points. The gap between England and its main Commonwealth competitors has narrowed, but only as a consequence of the significant declines in the latter.


Science

Table 6 below provides the same data in respect of science.


Table 6

Country 2012 2009 2006
  score rank score rank score rank
Australia 521↓ 16↓ 527= 10↓ 527 8
Canada 525↓ 10↓ 529↓ 8↓ 534 3
Finland 545↓ 5↓ 554↓ 2↓ 563 1
Hong Kong 555↑ 2↑ 549↑ 3↓ 542 2
Ireland 522↑ 15↑ 508 20 508 20
S Korea 538= 7↓ 538↑ 6↑ 522 11
New Zealand 516↓ 18↓ 532↑ 7 530 7
Shanghai 580↑ 1= 575 1 N/A N/A
Singapore 551↑ 3↑ 542 4 N/A N/A
Taiwan 523↑ 13↓ 520↓ 12↓ 532 4
UK (England) 516↑ 18↓ 515↓ 16↓ 516 14
US 497↓ 28↓ 502↑ 23↑ 489 29
OECD Average 501=   501↑   498  


Shanghai is again out in front, having repeated the clean sweep it achieved in 2009.

However, it has managed only a 5-point improvement, while Singapore has improved by 9 points and Hong Kong by 6 points. Taiwan has moved up by 3 points, but South Korea’s score is unchanged from 2009.

New Zealand has dropped by 16 points and Finland by 9 points compared with 2009. There have been comparatively smaller declines in Australia and Canada, while Ireland has once again improved dramatically, by 14 points, and – in this case – the improvement is not simply clawing back ground lost in 2009.

England remains comfortably above the OECD average, but has made negligible improvement since 2006. US performance has dropped back below the OECD average as it has lost some of the ground it made up in 2009.

The gap between England and the world leaders is comparable with that in reading and significantly lower than in maths. The gap is now 64 points, compared with just 47 points in 2006.


Overall

Overall, the Asian Tigers have consolidated their positions by maintaining improvement in all three domains, though South Korea appears to be struggling to maintain the success of earlier years.

Finland and New Zealand are in worrying decline while Ireland is making rapid progress in the opposite direction.


The US results are stagnant, remaining comparatively poor, particularly in maths.

England has broadly maintained its existing performance profile, neither improving nor declining significantly. But, it is conspicuously losing ground on the world leaders, especially in maths. Other than in science it is close to the OECD average.

There is nothing here to give comfort to either the previous Government or the present incumbents. There might be some limited relief – even a degree of schadenfreude – in the fact that several better-placed nations are falling back more severely. But of course one cannot win the ‘global race’ by simply standing still.


Floral by Gifted Phoenix


Changes in High Achievers’ Performance

So much for the average headline figures.

The remainder of this post is focused on high achievement data. The ensuing sections once more examine reading, maths and science in that order, followed by a section on all-rounders.


Reading

Table 7 shows how the percentage achieving the higher levels in reading has changed since PISA 2006, providing separate columns for Level 6 and for Levels 5 and 6 combined (there was no Level 6 in reading in 2006).


Table 7

Country 2012 2009 2006
Level 6 Levels 5 and 6 Level 6 Levels 5+6 Level 5
Australia 1.9 11.7 2.1 12.8 10.6
Canada 2.1 12.9 1.8 12.8 14.5
Finland 2.2 13.5 1.6 14.5 16.7
Hong Kong 1.9 16.8 1.2 12.4 12.8
Ireland 1.3 11.4 0.7 7.0 11.7
S Korea 1.6 14.2 1.0 12.9 21.7
New Zealand 3.0 13.9 2.9 15.8 15.9
Shanghai 3.8 25.1 2.4 19.4 N/A
Singapore 5.0 21.2 2.6 15.7 N/A
Taiwan 1.4 11.8 0.4 5.2 4.7
UK (England) 1.3 9.1 1.0 8.1 9.2
US 1.0 7.9 1.5 9.9 N/A
OECD Average 1.1 8.4 1.0 7.0 8.6

 

This reveals that:

  • In 2012, Singapore has a clear lead on its competitors at Level 6, but it is overtaken by Shanghai at Level 5 and above. New Zealand also remains comparatively strong at Level 6, but falls back significantly when Levels 5 and 6 are combined.
  • The other Asian Tigers do not perform outstandingly well at Level 6: Hong Kong, South Korea and Taiwan are all below 2.0%, behind Canada and Finland. However, all but Taiwan outscore their competitors when Levels 5 and 6 are combined.
  • Hong Kong, Shanghai, Singapore and Taiwan are all making fairly strong progress over time. Patterns are rather less discernible for other countries, though there is a downward trend in the US.
  • In Finland, New Zealand and Canada – countries that seem to be falling back overall – the percentage of Level 6 readers continues to improve. This might suggest that the proportion of the highest performers in reading is not significantly affected when national performance begins to slide.
  • When judged against these world leaders, England’s comparative performance is brought into much clearer perspective. At Level 6 it is not far behind Taiwan, South Korea and even Hong Kong. But, at Level 5 and above, the gap is somewhat more pronounced. England is improving, but very slowly.
  • The comparison with Taiwan is particularly stark. In 2006, England had roughly twice as many students performing at Level 5. By 2009 Taiwan had caught up some of this ground and, by 2012, it had overtaken.

Table 8 compares changes since PISA 2006 in national performance at Level 5 and above with changes at Level 1 and below.

This is intended to reveal the balance between top and bottom – and whether this sample of world-leading and other English-speaking jurisdictions is making consistent progress at either end of the spectrum.


 Table 8

Country Levels 5 (and 6 from 2009) Level 1 (or equivalent) and below
2006 2009 2012 2006 2009 2012
Australia 10.6 12.8 11.7 13.4 14.3 14.2
Canada 14.5 12.8 12.9 11.0 10.3 10.9
Finland 16.7 14.5 13.5 4.8 8.1 11.3
Hong Kong 12.8 12.4 16.8 7.2 8.3 6.8
Ireland 11.7 7.0 11.4 12.2 17.2 9.7
S Korea 21.7 12.9 14.2 5.7 5.8 7.6
New Zealand 15.9 15.8 13.9 14.6 14.3 16.3
Shanghai N/A 19.4 25.1 N/A 4.1 2.9
Singapore N/A 15.7 21.2 N/A 12.4 9.9
Taiwan 4.7 5.2 11.8 14.3 15.6 11.5
UK (England) 9.2 8.1 9.1 18.9 18.4 16.7
US N/A 9.9 7.9 N/A 17.7 16.7
OECD Average 8.6 7.0 8.4 20.1 18.8 18

 

We can see that:

  • The countries with the highest proportion of students at Level 5 and above tend to have the lowest proportion at Level 1 and below. In Shanghai in 2012, there is a 22 percentage point gap between these two populations and fewer than 3 in every hundred fall into the lower attaining group.
  • Singapore is much closer to Shanghai at the top end than it is at the bottom. But even Shanghai seems to be making faster progress at the top than at the bottom, which might suggest that it is approaching the point at which the proportion of low achievers cannot be further reduced.
  • Compared with Hong Kong and South Korea, Singapore has a higher proportion of both high achievers and low achievers.
  • Whereas Taiwan had three times as many low achievers as high achievers in 2006, by 2012 the proportions were broadly similar, but progress at the top end is much faster than at the bottom.
  • The decline in Finland has less to do with performance at the top end (which has fallen by three percentage points) than with performance at the bottom (which has increased by more than six percentage points).
  • Canada has consistently maintained a higher percentage of high achievers than low achievers, but the reverse is true in Australia. In New Zealand the percentage at the top is declining and the percentage at the bottom is increasing. The gap between the two has narrowed slightly in England, but not significantly so.
  • To catch up with Shanghai, England has to close a gap of some 16 percentage points at the top end, compared with one of around 14 percentage points at the bottom.

The PISA National Report on England offers some additional analysis, noting that 18 jurisdictions had a higher proportion of pupils than England at Level 5 or above in 2012, including all those that outperformed England overall (with the exception of Estonia and Macao), and also France and Norway.

The National Report relies more heavily on comparing the performance of learners at the 5th and 95th percentiles in each country, arguing that:

‘This is a better measure for comparing countries than using the lowest and highest scoring pupils, as such a comparison may be affected by a small number of pupils in a country with unusually high or low scores.’

This is true in the sense that a minimum sample of 4,500 PISA participants would result in fewer than 100 students at Level 6 in many jurisdictions: England’s 1.3% at Level 6 in reading, for example, equates to only around 60 students in a 4,500-strong sample.

On the other hand, the National Report fails to point out that analysis on this basis is not particularly informative about comparative achievement of the criterion-referenced standards denoted by the PISA thresholds.

It says rather more about the spread of performance in each country and rather less about direct international comparisons.
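The spread measure itself is easy to compute. A minimal sketch with an invented score distribution follows (a real replication would use the weighted student-level PISA data for each country):

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented national scores for a PISA-sized sample.
scores = rng.normal(500, 100, 4_500)

# The National Report's measure: the gap between the 5th and
# 95th percentiles of the national distribution.
p5, p95 = np.percentile(scores, [5, 95])
print(f"5th: {p5:.0f}  95th: {p95:.0f}  spread: {p95 - p5:.0f}")
```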

Key points include:

  • In England the score of learners at the 5th percentile was 328, compared with 652 at the 95th percentile. This difference of 324 points is slightly larger than the OECD average difference of 310 points. More than two-thirds of OECD countries had a smaller difference between these percentiles.
  • Compared with PISA 2009, the score of high achievers at the 95th percentile increased by six points to 652, while the score of low achievers at the 5th percentile fell by six points to 328. The resulting attainment gap of 324 points is wider than in 2009 (312 points) but narrower than in 2006 (337 points). Thirteen OECD countries reported a wider spread of attainment than England.
  • Of countries outperforming England, only Japan (325 points), Singapore (329 points), Belgium (339 points) and New Zealand (347 points) demonstrated a similar or wider spread of attainment. Shanghai had the lowest difference (259 points) followed by Estonia (263).
  • The strongest performing jurisdictions at the 95th percentile were Singapore (698), Shanghai (690) and Japan (689), compared with 652 for England.
  • Amongst jurisdictions ranked higher than England, only the Netherlands, Liechtenstein, Estonia and Macao secured a lower score at the 95th percentile. Only Belgium reported a lower score at the 5th percentile.


Maths

Turning to maths, Table 9 illustrates changes in the pattern of high achievement since 2006, again showing the percentages performing at Level 6 and above Level 5 respectively.


Table 9

Country 2012 2009 2006
  Level 6 Levels 5 + 6 Level 6 Levels 5+6 Level 6 Levels 5+6
Australia 4.3 14.8 4.5 16.4 4.3 16.4
Canada 4.3 16.4 4.4 18.3 4.4 18
Finland 3.5 15.2 4.9 21.6 6.3 24.4
Hong Kong 12.3 33.4 10.8 30.7 9 27.7
Ireland 2.2 10.7 0.9 6.7 1.6 10.2
S Korea 12.1 30.9 7.8 25.5 9.1 27.1
New Zealand 4.5 15.0 5.3 18.9 5.7 18.9
Shanghai 30.8 55.4 26.6 50.7 N/A N/A
Singapore 19.0 40.0 15.6 35.6 N/A N/A
Taiwan 18.0 37.2 11.3 28.5 11.8 31.9
UK (England) 3.1 12.4 1.7 9.9 2.5 11.2
US 2.2 9.0 1.9 9.9 1.3 7.7
OECD Average 3.3 12.6 3.1 12.7 3.3 13.4


The variations between countries tend to be far more pronounced than in reading:

  • There is a huge 28 percentage point spread in performance at Level 6 within this sample – from around 2% to over 30% – compared with a four percentage point spread in reading. The spread at Level 5 and above is also significantly larger – 46 percentage points compared with 17 percentage points in reading.
  • Shanghai has an 11 percentage point lead over its nearest competitor at Level 6 and an even larger 15 percentage point lead for Level 5 and above. Moreover it has improved significantly on both counts since 2009. Well over half its sample is now performing at Level 5 or above and almost a third are at Level 6.
  • Singapore and Taiwan are the next best performers, both relatively close together. Both are improving but, following a small dip in 2009, Taiwan is improving at a faster rate – faster even than Shanghai.
  • Hong Kong and South Korea also have similar 2012 profiles, as they did back in 2006. South Korea also lost ground in 2009, but is now improving at a faster rate than Hong Kong.
  • Finland appears to be experiencing quite significant decline: the proportion of Level 6 performers in 2012 is not far short of half what it was in 2006 and performance above Level 5 has fallen by more than nine percentage points. This is a somewhat different pattern to reading, in that the top performers are also suffering from the overall decline.


  • Australia, Canada and New Zealand have maintained broadly the same performance over time, though all are showing a slight falling off at Level 5 and above, and in New Zealand this also applies at Level 6.
  • After a serious slump in 2009, Ireland has now overtaken its 2006 position. Meanwhile, the US has been making some progress at Level 6 but is less convincing at Level 5 and above.
  • Once again, this comparison does not particularly flatter England. It is not too far behind the Commonwealth countries and declining Finland at Level 6 but the gap is slightly larger at Level 5 and above. That said, England has consistently performed below the OECD average and remains in that position.
  • There are, however, some grounds for domestic celebration, in that England has improved by 2.5 percentage points at Level 5 and above, and by 1.4 percentage points at Level 6. This rate of improvement bears comparison with Hong Kong, albeit from a much lower base. It suggests a narrowing gap between England and its Commonwealth counterparts.

Table 10 gives the comparison with achievement at the bottom end of the distribution, setting out the percentages performing at different levels.


Table 10

Country Levels 5 and 6 Level 1 and below
  2006 2009 2012 2006 2009 2012
Australia 16.4 16.4 14.8 13.0 15.9 18.6
Canada 18 18.3 16.4 10.8 11.4 13.8
Finland 24.4 21.6 15.2 5.9 7.8 12.2
Hong Kong 27.7 30.7 33.4 9.5 8.8 8.5
Ireland 10.2 6.7 10.7 16.4 20.9 16.9
S Korea 27.1 25.5 30.9 8.8 8.1 9.1
New Zealand 18.9 18.9 15.0 14.0 15.5 22.6
Shanghai N/A 50.7 55.4 N/A 4.8 3.7
Singapore N/A 35.6 40.0 N/A 9.8 8.3
Taiwan 31.9 28.5 37.2 11.9 12.8 12.8
UK (England) 11.2 9.9 12.4 19.9 19.8 21.7
US 7.7 9.9 9.0 28.1 23.4 25.9
Average 13.4 12.7 12.6 21.3 22.0 23.0


Key points include:

  • The same pattern is discernible amongst the strongest performers as was evident with reading: those with the highest percentages at the top end tend to have the lowest percentages at the bottom. If anything this distinction is even more pronounced. Shanghai records a 52 percentage point gap between its highest and lowest performers and the latter group is only slightly larger than the comparable group in the reading assessment.
  • Amongst the Asian Tigers, the ratio between top and bottom is roughly 3:1 or better in favour of the top. For most of the other countries in the sample, there is generally no more than an 8 percentage point gap between top and bottom, but this stretches to more than 9 in the case of England and almost 17 for the USA. Needless to say, the low achievers are in the majority in both cases.
  • Although the percentages for top and bottom in Australia are broadly comparable, it has shifted since 2006 from a position where the top end was in the majority by 3 percentage points to almost a mirror image of that pattern. In New Zealand, the lower achievers have increased by almost 9 percentage points, more than double the rate of decline at the top end, as their ‘long tail’ grows significantly longer.
  • Apart from Shanghai, only Singapore, Hong Kong and South Korea have fewer than 10% in the lower performing category. Despite its reputation as a meritocratic environment, Singapore gets much closer to Shanghai at the bottom of the distribution than it does at the top. The same is true of Hong Kong and South Korea.
  • It is also noticeable that none of the Tigers is making extraordinary progress at the bottom end. Hong Kong has reduced this population by 1 percentage point since 2006, Singapore by 1.5 points since 2009 and Shanghai by only 1.1 points since 2009. The percentage has increased in South Korea and Taiwan. Improvement has been significantly stronger at the top of the distribution. Again this might suggest that the Tigers are closing in on the point where they cannot improve further at the bottom end.
  • In Finland, the percentage achieving the higher levels has fallen by over 9 percentage points since 2006, while the increase at the lower levels is over 6 percentage points. This compares with a 3 point fall at the top and a 6 point rise at the bottom in reading. The slump amongst Finland’s high achievers is clearly more pronounced in maths.
  • England’s 9.3 percentage point gap between the top and bottom groups in 2012 is slightly larger than the 8.7 point gap in 2006. It has a whopping 43 percentage point gap to make up on Shanghai at the top end, and an 18 point gap at the bottom (both calculations are sketched in the code below). England is just on the right side of the OECD average at the bottom and just on the wrong side at the top.
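For anyone who wants to reproduce these gap calculations, here is a minimal sketch in Python. The figures are transcribed from Table 10 above; the dictionary and variable names are mine, purely for illustration:

```python
# Illustrative only: 2012 maths percentages transcribed from Table 10.
# Each entry: (% at Levels 5 and 6, % at Level 1 and below).
maths_2012 = {
    "Shanghai": (55.4, 3.7),
    "England": (12.4, 21.7),
    "US": (9.0, 25.9),
}

for country, (top, bottom) in maths_2012.items():
    # A negative gap means low achievers outnumber high achievers.
    print(f"{country}: top-bottom gap = {top - bottom:+.1f} percentage points")

# England's deficits against Shanghai at each end of the distribution.
print(f"Gap to Shanghai at the top: {55.4 - 12.4:.1f} points")    # 43.0
print(f"Gap to Shanghai at the bottom: {21.7 - 3.7:.1f} points")  # 18.0
```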


The National Report notes that all jurisdictions ahead of England in the rankings had a higher percentage of learners at Level 5 or above.

As for percentiles:

  • The difference between the 5th percentile (335 points) and the 95th percentile (652 points) was 316 points in England. The average difference for OECD countries was 301 points, somewhat narrower.
  • Ten countries had a greater difference than this, five of them amongst those with the highest overall mean scores. The others were Israel, Belgium, Slovakia, New Zealand and France.
  • Whereas the difference between the lowest and highest percentiles has increased very slightly across OECD countries as a whole, the increase is more pronounced in England: from 285 points in 2009 to 316 points in 2012. This is attributable to a falling score at the 5th percentile (350 in 2006, 349 in 2009 and 335 in 2012) combined with a recovery at the 95th percentile (643 in 2006, 634 in 2009 and 652 in 2012). A sketch of how such spreads are computed follows below.
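The sketch below illustrates how such percentile spreads are derived from a score distribution. The scores are randomly generated stand-ins on a PISA-like scale, not real PISA data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in scores on a PISA-like scale: mean 500, SD 100 (not real data).
scores = rng.normal(loc=500, scale=100, size=5000)

p5, p95 = np.percentile(scores, [5, 95])
print(f"5th percentile: {p5:.0f}, 95th percentile: {p95:.0f}")
# For a normal distribution the spread is about 329 points (2 x 1.645 x SD).
print(f"Spread: {p95 - p5:.0f} points")
```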


Science

Table 11 compares the performance of this sample of PISA participants at the higher levels in the science assessment on the last three occasions.


Table 11

Country 2012 2009 2006
  Level 6 Levels 5+6 Level 6 Levels 5+6 Level 6 Levels 5+6
Australia 2.6 13.5 3.1 14.6 2.8 14.6
Canada 1.8 11.3 1.6 12.1 2.4 14.4
Finland 3.2 17.1 3.3 18.7 3.9 20.9
Hong Kong 1.8 16.7 2 16.2 2.1 15.9
Ireland 1.5 10.8 1.2 8.7 1.1 9.4
S Korea 1.1 11.7 1.1 11.6 1.1 10.3
New Zealand 2.7 13.4 3.6 17.6 4 17.6
Shanghai 4.2 27.2 3.9 24.3 N/A N/A
Singapore 5.8 22.7 4.6 19.9 N/A N/A
Taiwan 0.6 8.4 0.8 8.8 1.7 14.6
UK (England) 1.9 11.7 1.9 11.6 3.0 14.0
US 1.1 7.4 1.3 9.2 1.5 9.1
Average 1.2 8.4 1.1 8.5 1.3 8.8


In science, the pattern of high achievement has more in common with reading than maths. It shows that:

  • There is again a relatively narrow spread of performance between this sample of jurisdictions – just over five percentage points at Level 6 and approaching 20 percentage points at Level 5 and above.
  • As in reading, Singapore outscores Shanghai at Level 6, but is outperformed by Shanghai at Level 5 and above. Both are showing steady improvement, but Singapore’s improvement at Level 6 is more pronounced than Shanghai’s.
  • Finland remains the third best performer, although the proportion of learners achieving at both Level 6 and Level 5 plus has been declining slightly since 2006.
  • Another similarity with reading is that Australia, Finland and New Zealand all perform significantly better at Level 6 than Hong Kong, South Korea and Taiwan. Hong Kong alone performs equally well at Level 5 and above. None of these three Asian Tigers has made significant progress since 2006.
  • In Australia, Canada, New Zealand and the US there has also been relatively little progress over time – indeed some evidence to suggest a slight decline. Conversely, Ireland seems to be moving forward again after a slight dip at Level 5 and above in 2009.


  • England was a strong performer in 2006, broadly comparable with many of its competitors. But it fell back significantly in 2009 and, unlike in maths and (to a lesser extent) reading, has made no substantive progress since then: the proportions are merely holding up. However, England continues to perform somewhat above the OECD average. There is an interesting parallel with Taiwan, although that country dipped even further than England in 2009.

Table 12 provides the comparison with the proportions achieving the lower thresholds.


Table 12

Country Levels 5 and 6 Levels 1 and Below
  2006 2009 2012 2006 2009 2012
Australia 14.6 14.6 13.5 12.8 12.6 13.6
Canada 14.4 12.1 11.3 10.0 9.5 10.4
Finland 20.9 18.7 17.1 4.1 6.0 7.7
Hong Kong 15.9 16.2 16.7 8.7 6.6 5.6
Ireland 9.4 8.7 10.8 15.5 15.1 11.1
S Korea 10.3 11.6 11.7 11.2 6.3 6.7
New Zealand 17.6 17.6 13.4 13.7 13.4 16.3
Shanghai N/A 24.3 27.2 N/A 3.2 2.7
Singapore N/A 19.9 22.7 N/A 11.5 9.6
Taiwan 14.6 8.8 8.4 11.6 11.1 9.8
UK (England) 14.0 11.6 11.7 16.7 14.8 14.9
US 9.1 9.2 7.4 24.4 18.1 18.2
Average 8.8 8.5 8.4 19.3 18.0 17.8


  • Amongst the top performers the familiar pattern reappears. In 2012 Shanghai has 27% in the top categories against 2.7% in the bottom categories. This is very similar to reading (25.1% against 2.9%). At the bottom end, Shanghai’s nearest competitors are Hong Kong and South Korea, while Singapore and Taiwan are each approaching 10% at these levels. This is another similarity with reading (whereas, in maths, Singapore is more competitive at the lower end).
  • Since 2009, Shanghai has managed only a comparatively modest 0.5 percentage point reduction in the proportion of its students at the bottom end, compared with an increase of almost 3 percentage points at the top end. This may lend further support to the hypothesis that it is approaching the point at which further bottom end improvement is impossible.
  • No country has made consistently strong progress at the bottom end, though Ireland has made a significant improvement since 2009. There has been steady if unspectacular improvement in Hong Kong, Taiwan and Singapore. South Korea, having achieved a major improvement in 2009, has found itself unable to continue this positive trend.
  • Finland’s negative trend is consistent since 2006 at both ends of the achievement spectrum, though the decline is not nearly as pronounced as in maths. In science Finland is maintaining a ratio of 2:1 in favour of the performers at the top end, while percentages at top and bottom are now much closer together in both reading and maths.
  • There are broadly similar negative trends at top and bottom alike in the Commonwealth countries of Australia, Canada and New Zealand, although they have fallen back in fits and starts. In New Zealand the balance between top and bottom has shifted from being 4 percentage points in favour of the top end in 2006, to 3 percentage points in favour of the bottom end by 2012.
  • A similar gap in favour of lower achievers also exists in England and is unchanged from 2009. By comparison with the US (which is a virtual mirror image of the top-bottom balance in Finland, Singapore or South Korea) it is in a reasonable position, rather similar to New Zealand, now that it has fallen back.
  • England has a 15.5 percentage point gap to make up on Shanghai at the top end of the distribution, compared with a 12.2 percentage point gap at the bottom.

The PISA 2012 National Study reports that only the handful of jurisdictions shown in Table 11 above has a larger percentage of learners achieving Level 6. Conversely, England has a relatively large number of low achievers compared with these jurisdictions.

Rather tenuously, it argues on this basis that:

‘Raising the attainment of lower achievers would be an important step towards improving England’s performance and narrowing the gap between highest and lowest performers.’

When it comes to comparison of the 5th and 95th percentiles:

  • The score at the 5th percentile (343) and at the 95th percentile (674) gives a difference of 331 points, larger than the OECD average of 304 points. Only eight jurisdictions had a wider distribution, among them Israel, New Zealand, Luxembourg, Slovakia, Belgium, Singapore and Bulgaria.
  • The OECD average difference between the 5th and 95th percentiles has reduced slightly (from 311 in 2006 to 304 in 2012) and there has also been relatively little change in England.


Top-Performing All-Rounders

Volume 1 of the OECD’s ‘PISA 2012 Results’ document provides additional data about all-round top performers achieving Level 5 or above in each of the three domains.


PISA 2012 top performers (diagram)

The diagram shows that 4.4% of learners across OECD countries achieve this feat.

This is up 0.3 percentage points on the PISA 2009 figure revealed in this PISA in Focus publication.

Performance on this measure in 2012, compared with 2009, amongst the sample of twelve jurisdictions is shown in the following Table 13. (NB that the UK figure is for the UK combined, not just England).


Table 13

Country 2012 2009
  %age rank %age rank
Australia 7.6 7 8.1 6
Canada 6.5 9 6.8 8
Finland 7.4 8 8.5 4
Hong Kong 10.9 4 8.4 5
Ireland 5.7 15 3.2 23
S Korea 8.1 5 7.2 7
New Zealand 8.0 6 9.9 3
Shanghai 19.6 1 14.6 1
Singapore 16.4 2 12.3 2
Taiwan 6.1 10 3.9 17
UK 5.7 15 4.6 14
US 4.7 18 5.2 11
Average 4.4 4.1


In terms of percentage increases, the fastest progress on this measure is being made by Hong Kong, Ireland, Shanghai, Singapore and Taiwan. Shanghai has improved a full five percentage points and one in five of its students now achieve this benchmark.

The UK is making decent progress, particularly compared with Australia, Canada, Finland, New Zealand and the US, which are moving in the opposite direction.

The Report notes:

‘Among countries with similar mean scores in PISA, there are remarkable differences in the percentage of top-performing students. For example, Denmark has a mean score of 500 points in mathematics in PISA 2012 and 10% of students perform at high proficiency levels in mathematics, which is less than the average of around 13%. New Zealand has a similar mean mathematics score of 500 points, but 15% of its students attain the highest levels of proficiency, which is above the average…these results could signal the absence of a highly educated talent pool for the future.

Having a large proportion of top performers in one subject is no guarantee of having a large proportion of top performers in the others. For example, Switzerland has one of the 10 largest shares of top performers in mathematics, but only a slightly-above-average share of top performers in reading and science.

Across the three subjects and across all countries, girls are as likely to be top performers as boys. On average across OECD countries, 4.6% of girls and 4.3% of boys are top performers in all three subjects…To increase the share of top-performing students, countries and economies need to look at the barriers posed by social background…the relationship between performance and students’… and schools’ organisation, resources and learning environment.’ (p65)


Denizen by Gifted Phoenix

Conclusions

Priorities for Different Countries

On the basis of this evidence, it is possible to draw up a profile of the performance of different countries across the three assessments at these higher levels, and so make a judgement about each one’s prospects of securing ‘a highly educated talent pool for the future’. The twelve jurisdictions in our sample might be advised as follows:

  • Shanghai should be focused on establishing ascendancy at Level 6 in reading and science, particularly if there is substance to the suspicion that scope for improvement at the bottom of the spectrum is now rather limited. Certainly it is likely to be easier to effect further improvement at the very top.
  • Singapore has some ground to catch up with Shanghai at Level 6 in maths. It has improved by more than three percentage points since 2009, but Shanghai has improved faster still, so there is still some way to go. Otherwise it should concentrate on strengthening its position at Level 5 and above, where Shanghai is also conspicuously stronger.
  • Hong Kong needs to focus on Level 6 in reading and science, but perhaps also in maths where it has been extensively outpaced by Taiwan since 2009. At levels 5 and above it faces strong pressure to maintain proximity with Shanghai and Singapore, as well as marking the charge made by Taiwan in reading and maths. Progress in science is relatively slow.
  • South Korea should also pay attention to Level 6 in reading and science. It is improving faster than Hong Kong at Level 6 in maths but is also losing ground on Taiwan. And although South Korea now seems back on track at Level 5 and above in maths, progress remains comparatively slow in reading and science, so both Levels 5 and 6 need attention.
  • Taiwan has shown strong improvement in reading and maths since 2009, but is deteriorating in science at both Levels 5 and 6. It still has much ground to pick up at Level 6 in reading. Its profile is not wildly out of kilter with Hong Kong and South Korea.
  • Finland is bucking a downward trend at Level 6 in reading and slipping only slightly in science, so the more noticeable decline is in maths. However, the ground lost is proportionately greater at Level 5 and above, once again more prominently in maths. As Finland fights to stem a decline at the lower achievement levels, it must take care not to neglect those at the top.
  • Australia seems to be slipping back at both Levels 5 and 6 across all three assessments, while also struggling at the bottom end. There are no particularly glaring weaknesses, but it needs to raise its game across the board.
  • Canada is just about holding its own at Level 6, but performance is sliding back at Level 5 and above across all three domains. This coincides with relatively little improvement and some falling back at the lower end of the achievement distribution. It faces a similar challenge to Finland’s, although not so pronounced.
  • New Zealand can point to few bright points in an otherwise gloomy picture, one of which is that Level 6 performance is holding up in reading. Elsewhere, there is little to celebrate in terms of high achievers’ performance. New Zealand is another country that, in tackling more serious problems with the ‘long tail’, should not take its eye off the ball at the top.


  • The US is also doing comparatively well in reading at Level 6, but is otherwise either treading water or slipping back a little. Both Level 6 and Level 5 and above need attention. The gap between it and the world’s leading countries continues to increase, suggesting that it faces future ‘talent pool’ issues unless it can turn round its performance.
  • Ireland is a good news story, at the top end as much as the bottom. It has caught up lost ground and is beginning to push beyond where it was in 2006. Given Ireland’s proximity, the home countries might want to understand more clearly why their nearest neighbour is improving at a significantly faster rate. That said, Ireland has significant room for improvement at both Level 6 and Level 5 and above.
  • England’s performance at Level 6 and Level 5 and above has held up surprisingly well compared with 2009, especially in maths. When the comparison is solely historical, there might appear to be no real issue. But many other countries are improving at a much stronger rate and so England (as well as the other home countries) risks being left behind in the ‘global race’ declared by its Prime Minister. The world leaders now manage three times as many Level 6 performers in science, four times as many in reading and ten times as many in maths. It must withstand the siren voices urging it to focus disproportionately at the bottom end.


Addressing These Priorities

It is far more straightforward to pinpoint these different profiles and priorities than to recommend convincingly how they should be addressed.

The present UK Government believes firmly that its existing policy direction will deliver the improvements that will significantly strengthen its international competitiveness, as judged by PISA outcomes. It argues that it has learned these lessons from careful study of the world’s leading performers and is applying them carefully and rigorously, with due attention to national needs and circumstances.


But – the argument continues – it is too soon to see the benefits of its reforms in PISA 2012, such is the extended lag time involved in improving the educational outcomes of 15 year-olds. According to this logic, the next Government will reap the significant benefits of the present Government’s reform programme, as revealed by PISA 2015.

Recent history suggests that this prediction must be grounded more in hope than expectation, not least because establishing causation between indirect policy interventions and improved test performance must surely be the weakest link in the PISA methodology.

But, playing devil’s advocate for a moment, we might reasonably conclude that any bright spots in England’s performance are attributable to interventions that the previous Government got right between five and ten years ago. It would not be unreasonable to suggest that the respectable progress made at the top PISA benchmarks is at least partly attributable to the national investment in gifted education during that period.

We might extend this argument by suggesting a similar relationship between progress in several of the Asian Tigers at these higher levels and their parallel investment in gifted education. Previous posts have drawn attention to the major programmes that continue to thrive in Hong Kong, Singapore, South Korea and Taiwan.

Shanghai might have reached the point where success in mainstream education renders investment in gifted education unnecessary. On the other hand, such a programme might help it to push forward at the top in reading and science – perhaps the only conspicuous chink in its armour. There are lessons to be learned from Singapore. (Gifted education is by no means dormant on the Chinese Mainland and there are influential voices pressing the national government to introduce more substantive reforms.)

Countries like Finland might also give serious consideration to more substantive investment in gifted education geared to strengthening high attainment in these core domains. There is increasing evidence that the Finns need to rethink their approach.


The relationship between international comparisons studies like PISA and national investment in gifted education remains poorly researched and poorly understood, particularly how national programmes can most effectively be aligned with and support such assessments.

The global gifted education community might derive some much-needed purpose and direction by establishing an international study group to investigate this issue, providing concrete advice and support to governments with an interest.


GP

December 2013

The Performance of Gifted High Achievers in TIMSS, PIRLS and PISA


This post examines the comparative performance of high achievers in recent international comparisons studies, principally the 2011 TIMSS and PIRLS assessments.

More specifically, it compares:

  • The proportion of learners in selected countries who achieve the highest ‘advanced’ benchmarks in TIMSS 2011 maths and science assessments at Grades 4 and 8 respectively and in the PIRLS 2011 reading assessment at Grade 4;
  • How selected countries have performed on each of these measures over the period in which TIMSS and PIRLS have been administered, identifying positive and negative trends and drawing inferences about current relative priorities in different countries;
  • Selected countries’ overall ranking on each of these TIMSS and PIRLS assessments (based on the average score achieved across all learners undertaking the appropriate assessment), contrasted with their ranking for the proportion of learners achieving the highest ‘advanced’ and the lowest ‘low’ benchmarks, considering the associated implications for their national education policies; and
  • The results from TIMSS and PIRLS 2011 with those from PISA 2009, exploring whether these different studies provide a consistent picture of countries’ relative strength in educating their highest achievers and, to the extent that there are inconsistencies, how those might be explained.

The post also reviews recent publications and speeches about England’s performance in TIMSS and PIRLS 2011, with a particular focus on the aspects set out above and the high achievers’ perspective. Finally, it draws together some significant recent contributions which ask interesting questions about the nature of these assessments and their outcomes.

This is therefore a companion piece to my December 2010 post ‘PISA 2009: International Comparisons of Gifted High Achievers’ Performance’.

There is limited reference within it to the relative strengths and weaknesses of international comparisons studies of this kind. Some time ago I published the first part of a separate post on that subject.

For the purposes of this publication my pragmatic assumption is that, while such studies have significant shortcomings and should on no account be used as the sole source of evidence for educational policy-making, they do provide useful steers which, when combined with other sources of quantitative and qualitative evidence, can offer a useful guide to current strengths and weaknesses and potential future priorities.

This is therefore a ‘health warning’: some of my conclusions below do need to be treated with a degree of caution. They are broad indicators rather than incontrovertible statements of fact.


Background

History and Development of TIMSS and PIRLS Assessments

The Trends in International Mathematics and Science Study (TIMSS) has provided assessments of national achievement in these subjects since 1995, focused principally on two cohorts: Grade 4 (age 9/10) and Grade 8 (age 13/14).

Its companion exercise, the Progress in International Reading Literacy Study (PIRLS), was introduced in 2001 to assess reading comprehension at Grade 4.

There is a parallel TIMSS Advanced assessment of maths and physics achievement in the final year of secondary school. This was undertaken in 1995 and 2008 and is scheduled for 2015. A less difficult PrePIRLS study, providing assessment for those not yet reading confidently, was introduced for the first time in 2011.

The main TIMSS assessment has been repeated on a four-year cycle and PIRLS on a five-year cycle, making 2011 the first year in which both studies were conducted together.

  • In 1995, TIMSS was undertaken for the first time, featuring assessment at five different Grades (3, 4, 7, 8 and the final year of secondary education through the Advanced study). Altogether there were forty-five participating countries.
  • In 1999 TIMSS was repeated at Grade 8 only, with thirty-eight countries participating, twenty-six of them participants in the original 1995 cycle.
  • In 2003 the number of TIMSS participants increased to forty-nine, all but one of which undertook the Grade 8 assessments (though only twenty-six completed the Grade 4 assessments).

TIMSS 2011 lists sixty-three and PIRLS 2011 lists forty-eight participating countries (I have excluded from these figures those countries and parts of countries participating solely for benchmarking purposes.) Altogether though, around 600,000 learners participated in TIMSS and about half as many in PIRLS.


Assessment Frameworks and Benchmarks

The separate assessment frameworks for Maths and Science within TIMSS are similarly constructed. There are:

  • A Grade-specific content dimension specifying the subject matter to be assessed eg algebra, physics, geometry, chemistry; and
  • A cognitive dimension, capturing the knowing, applying and reasoning processes that are deployed by the learner.

The assessment framework for reading within PIRLS is slightly different. The focus of the assessment is described as ‘reading literacy’, defined thus:

‘The ability to understand and use those written language forms required by society and/or valued by the individual. Young readers can construct meaning from a variety of texts. They read to learn, to participate in communities of readers in school and everyday life, and for enjoyment.’

Two principal aspects are assessed:

  • Two purposes for reading – for literary experience and to acquire and use information; and
  • Four comprehension processes – focus on and retrieve explicitly stated information; make straightforward inferences; interpret and integrate ideas and information; and examine and evaluate content, language and textual elements.

Both TIMSS and PIRLS use achievement scales with a range from 0 to 1,000, though most learners score between 300 and 700. The scale midpoint of 500 remains constant across different cycles, so trend-related data is relatively reliable.

Four points on this scale are specified as international benchmarks: Advanced at 625, High at 550, Intermediate at 475 and Low at 400. These benchmarks are defined differently for each subject and Grade.
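As a concrete illustration of how these cut-points partition the scale, here is a small helper function. The thresholds are those quoted above; the function itself is my own sketch, not part of any official TIMSS or PIRLS toolkit:

```python
# TIMSS/PIRLS international benchmark cut-points, as quoted above.
BENCHMARKS = [(625, "Advanced"), (550, "High"), (475, "Intermediate"), (400, "Low")]

def benchmark(score: float) -> str:
    """Return the highest international benchmark a scale score reaches."""
    for cut_point, label in BENCHMARKS:
        if score >= cut_point:
            return label
    return "Below Low"

for s in (660, 590, 430, 380):
    print(s, "->", benchmark(s))
# 660 -> Advanced, 590 -> High, 430 -> Low, 380 -> Below Low
```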

The Advanced benchmark definitions are as follows:

  • Maths Grade 4:

‘Students can apply their understanding and knowledge in a variety of relatively complex situations and explain their reasoning. They can solve a variety of multi-step word problems involving whole numbers, including proportions. Students at this level show an increasing understanding of fractions and decimals. Students can apply geometric knowledge of a range of two- and three-dimensional shapes in a variety of situations. They can draw a conclusion from data in a table and justify their conclusion.’

  • Science Grade 4:

‘Students apply knowledge and understanding of scientific processes and relationships and show some knowledge of the process of scientific inquiry. Students communicate their understanding of characteristics and life processes of organisms, reproduction and development, ecosystems and organisms’ interactions with the environment, and factors relating to human health. They demonstrate understanding of properties of light and relationships among physical properties of materials, apply and communicate their understanding of electricity and energy in practical contexts, and demonstrate an understanding of magnetic and gravitational forces and motion. Students communicate their understanding of the solar system and of Earth’s structure, physical characteristics, resources, processes, cycles, and history. They have a beginning ability to interpret results in the context of a simple experiment, reason and draw conclusions from descriptions and diagrams, and evaluate and support an argument.’

  • Reading Grade 4:

‘When reading Literary Texts, students can:

    • Integrate ideas and evidence across a text to appreciate overall themes
    • Interpret story events and character actions to provide reasons, motivations, feelings, and character traits with full text-based support

When reading Informational Texts, students can:

    • Distinguish and interpret complex information from different parts of text, and provide full text-based support
    • Integrate information across a text to provide explanations, interpret significance, and sequence activities
    • Evaluate visual and textual features to explain their function.’
  • Maths Grade 8:

‘Students can reason with information, draw conclusions, make generalizations, and solve linear equations. Students can solve a variety of fraction, proportion, and percent problems and justify their conclusions. Students can express generalisations algebraically and model situations. They can solve a variety of problems involving equations, formulas, and functions. Students can reason with geometric figures to solve problems. Students can reason with data from several sources or unfamiliar representations to solve multi-step problems.’

  • Science Grade 8:

‘Students communicate an understanding of complex and abstract concepts in biology, chemistry, physics, and earth science. Students demonstrate some conceptual knowledge about cells and the characteristics, classification, and life processes of organisms. They communicate an understanding of the complexity of ecosystems and adaptations of organisms, and apply an understanding of life cycles and heredity. Students also communicate an understanding of the structure of matter and physical and chemical properties and changes and apply knowledge of forces, pressure, motion, sound, and light. They reason about electrical circuits and properties of magnets. Students apply knowledge and communicate understanding of the solar system and Earth’s processes, structures, and physical features.  They understand basic features of scientific investigation. They also combine information from several sources to solve problems and draw conclusions, and they provide written explanations to communicate scientific knowledge.’

Taiwanese Learners Tackle TIMSS courtesy of Toujia Elementary School


PISA Frameworks and Benchmarks

PISA is a triennial study of 15 year-olds’ performance (so Grade 9) also in maths, science and reading. A different subject is the main focus in each cycle – in 2009 it was reading. Sixty-five countries took part in PISA 2009.

There is significant overlap with TIMSS/PIRLS participants – some 40 countries involved in TIMSS 2011 also undertook PISA 2009 – but a significant proportion of countries undertake one or the other.

My previous post sets out the definitions of Reading, Mathematical and Scientific Literacy used in the PISA 2009 study and I will not repeat them here.

PISA divides student performance into six proficiency levels. The highest (Level 6) is defined in terms of the tasks that learners successfully perform, or the skills and competences they must display.

It is interesting to compare the emphases in these descriptions with those in the parallel TIMSS/PIRLS definitions above.

  • In reading, Level 6 tasks:

 ‘typically require the reader to make multiple inferences, comparisons and contrasts that are both detailed and precise. They require demonstration of a full and detailed understanding of one or more texts and may involve integrating information from more than one text. Tasks may require the reader to deal with unfamiliar ideas, in the presence of prominent competing information, and to generate abstract categories for interpretations. Reflect and evaluate tasks may require the reader to hypothesise about or critically evaluate a complex text on an unfamiliar topic, taking into account multiple criteria or perspectives, and applying sophisticated understandings from beyond the text. A salient condition for access and retrieve tasks at this level is precision of analysis and fine attention to detail that is inconspicuous in the texts.’

  •  In maths Level 6 learners can:

 ‘conceptualise, generalise, and utilise information based on their investigations and modelling of complex problem situations. They can link different information sources and representations and flexibly translate among them. Students at this level are capable of advanced mathematical thinking and reasoning. These students can apply this insight and understandings along with a mastery of symbolic and formal mathematical operations and relationships to develop new approaches and strategies for attacking novel situations. Students at this level can formulate and precisely communicate their actions and reflections regarding their findings, interpretations, arguments, and the appropriateness of these to the original situations.’

  •  And in science, they can:

 ‘consistently identify, explain and apply scientific knowledge and knowledge about science in a variety of complex life situations. They can link different information sources and explanations and use evidence from those sources to justify decisions. They clearly and consistently demonstrate advanced scientific thinking and reasoning, and they use their scientific understanding in support of solutions to unfamiliar scientific and technological situations. Students at this level can use scientific knowledge and develop arguments in support of recommendations and decisions that centre on personal, social or global situations.’

 This 2011 IPPR Report on Benchmarking the English School System explains the somewhat different approaches of these two assessment suites:

‘PISA puts less emphasis on whether a student can reproduce content, and focuses more on their ability to apply knowledge to solve tasks…

…TIMSS…focuses on curriculum and as a result tends to test pupils’ content knowledge rather than their ability to apply it…

 …PIRLS…assesses…knowledge and content of the curriculum.’

In a recent paper on the PISA 2009 results, Jerrim marks the distinction between PISA and TIMSS in slightly different terms:

‘Whereas TIMSS focuses on children’s ability to meet an internationally agreed curriculum, PISA examines functional ability – how well young people can use the skills in “real life” situations. The format of the test items also varies, including the extent to which they rely on questions that are “multiple choice”. Yet despite these differences, the two surveys summarise children’s achievement in similar ways…

…This results in a measure of children’s achievement that (in both studies) has a mean of 500 and a standard deviation of 100. However, even though the two surveys appear (at face value) to share the same scale, figures are not directly comparable (eg a mean score of 500 in PISA is not the same as a mean score of 500 in TIMSS). This is because the two surveys contain a different pool of countries upon which these achievement scores are based…Hence one is not able to directly compare results in these two surveys (and change over time) by simply looking at the raw scores.’
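Jerrim’s point can be made concrete with a toy example. Everything below is hypothetical: two invented pools of scores and one student of fixed underlying ability, scored against each pool in turn:

```python
import numpy as np

rng = np.random.default_rng(1)
pool_a = rng.normal(50, 10, 10_000)  # invented pool of countries (weaker)
pool_b = rng.normal(60, 10, 10_000)  # invented pool of countries (stronger)

student = 65.0  # the same underlying ability, entered in both surveys

# Each survey standardises against its own pool: mean 500, SD 100.
for name, pool in (("pool A", pool_a), ("pool B", pool_b)):
    reported = 500 + 100 * (student - pool.mean()) / pool.std()
    print(f"Reported score against {name}: {reported:.0f}")
# ~650 against pool A but only ~550 against pool B.
```

The same raw ability maps to roughly 650 against the weaker pool but only around 550 against the stronger one, which is why raw PISA and TIMSS scores cannot be compared directly.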

With these similarities and distinctions in mind, let us turn to analysis of the data.


High Performance At Advanced Benchmarks in TIMSS and PIRLS 2011

Table One below shows the top ten countries in each of the five TIMSS and PIRLS assessments at the Advanced benchmark of 625: Maths Grade 4, Science Grade 4, Reading Grade 4, Maths Grade 8 and Science Grade 8. I have also included some countries of interest that fell outside one or more of the ‘top tens’.


Rank Maths 4 % Science 4 % Reading 4 % Maths 8 % Science 8 %
1 Singapore 43 Singapore 33 Singapore 24 Taiwan 49 Singapore 40
2 Korea 39 Korea 29 Russia 19 Singapore 48 Taiwan 24
3 Hong Kong 37 Finland 20 N Ireland 19 Korea 47 Korea 20
4 Taiwan 34 Russia 16 Finland 18 Hong Kong 34 Japan 18
5 Japan 30 Taiwan 15 England 18 Japan 27 Russia 14
6 N Ireland 24 US 15 Hong Kong 18 Russia 14 England 14
7 England 18 Japan 14 US 17 Israel 12 Slovenia 13
8 Russia 13 Hungary 13 Ireland 16 Australia 9 Finland 13
9 US 13 Romania 11 Israel 15 England 8 Israel 11
10 Finland 12 England 11 N Zealand 14 Hungary 8 Australia 11
Hong Kong 9 Hong Kong 9
Taiwan 13
Australia 10 Australia 7 Australia 10
Ireland 9 Ireland 7
US 7 US 10
N Zealand 4 N Zealand 5 N Zealand 5 N Zealand 9
Finland 4
N Ireland 5
Median   4   5   8   3   4

Table One: Top Ten Countries at Advanced Benchmarks, TIMSS and PIRLS 2011


Several important points can be drawn from this initial analysis.

  • Singapore is by some margin the most successful country in terms of the percentage of its pupils achieving the Advanced benchmark. It tops the rankings in all but Maths Grade 8, where it is a close second to Taiwan. In all the remaining assessments, it has a 4 or 5 percentage point lead over its nearest rival, and in Science Grade 8, an astonishing lead of 16 percentage points.
  • But the proportion of Singaporean learners achieving the Advanced benchmark varies significantly, from just under a quarter in Reading to just under half in Maths Grade 8. Singapore is much closer to the PIRLS median (+16%) in Reading so, arguably, that is a relative weakness at this level.
  • Other outstanding performers include: Korea and Japan (apart from Reading, which they did not undertake); Hong Kong (apart from Science at Grades 4 and 8, where it was outside the top 10); Taiwan (though it was outside the top 10 for Reading); Finland (though it fell outside the top 10 in Maths Grade 8); Russia; and England.
  • The top-ranked countries in TIMSS – Singapore, Korea, Hong Kong, Taiwan, Japan – typically secure a significantly higher proportion of Advanced level achievers in Maths than in Science. The reverse is broadly true in a second group of countries including Finland, the US, Russia and New Zealand. England and Australia are significantly atypical, in that Maths leads the way at Grade 4 while Science is in the ascendant at Grade 8.
  • When PIRLS is factored in, it is clear that a group of countries including Finland, Russia, the US, New Zealand and Israel secure larger proportions at the Advanced benchmark in Reading than in both Maths and Science. The same is almost true of England, though the percentages are equal for Reading and Maths at Grade 4. Unsurprisingly, the outstanding Asian TIMSS performers tend to achieve a significantly lower level in Reading. The relative reading difficulty of native languages is bound to have an impact here.
  • Interestingly, England outscored or equalled Finland on all but one assessment (Science Grade 4). It exceeded the median comfortably on all five assessments: Maths Grade 4 (+14%); Science Grade 4 (+6%), Reading (+10%); Maths Grade 8 (+5%); and Science Grade 8 (+10%). (It was however outscored by Northern Ireland on Maths Grade 4 and Reading.)
  • On the basis of these differentials, Science Grade 4 and Maths Grade 8 are England’s areas of relative weakness amongst high achievers though, if the analysis is undertaken on the basis of the gap between England and the world leader for each assessment, the incontrovertible priority is Maths Grade 8, where there is a 41 percentage point chasm between England and Taiwan.


Trends Over Time in Performance Against TIMSS and PIRLS Advanced Benchmarks

Tables 2A to 2E below show how the percentage achieving the Advanced benchmark has changed over time in each country within the top 10 in each assessment in 2011 (excluding those for which there is insufficient data).

Where the percentage has declined between cycles of the assessment, the figure is emboldened. Each table also shows for each country the percentage change between the first assessment and that undertaken in 2011.
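The mechanics of these trend tables are simple enough to sketch in a few lines of Python. The example row uses the Singapore figures from Table 2A below; declines between cycles are flagged in text here, standing in for the bold type used in the original tables:

```python
# Singapore, TIMSS Maths Grade 4: percent at the Advanced benchmark (Table 2A).
cycles = [("1995", 38), ("2003", 38), ("2007", 41), ("2011", 43)]

previous = None
for year, pct in cycles:
    # Flag any decline relative to the preceding cycle.
    note = " (decline)" if previous is not None and pct < previous else ""
    print(f"{year}: {pct}%{note}")
    previous = pct

# Overall change between the first assessment and 2011.
print(f"Improvement since {cycles[0][0]}: {cycles[-1][1] - cycles[0][1]:+d}")  # +5
```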


Country 1995 2003 2007 2011 Improvement Since 1995
Singapore 38 38 41 43 +5
Korea 25 39 +14
Hong Kong 17 22 40 37 +20
Taiwan 16 24 34 +18
Japan 22 21 23 30 +8
England 7 14 16 18 +11
Russia 11 16 13 +5
US 9 7 10 13 +4
Lithuania 10 10 10 0
Belgium (Flemish) 10 10 0

Table 2A: TIMSS Maths Grade 4 – Trend in Percentage Achieving Advanced Benchmark


Country 1995 2003 2007 2011 Improvement Since 1995
Singapore 14 25 36 33 +19
Korea 22 29 +7
Russia 11 16 16 +5
Taiwan 14 19 15 +1
US 19 13 15 15 -4
Japan 15 12 12 14 -1
Hungary 7 10 13 13 +6
England 15 15 14 11 -4
Sweden 8 10 +2
Czech Republic 12 7 10 -2

Table 2B: TIMSS Science Grade 4 – Trend in Percentage Achieving Advanced Benchmark


Country 2001 2006 2011 Improvement Since 2001
Singapore 12 19 24 +12
Russia 5 19 19 +14
England 20 15 18 -2
Hong Kong 5 15 18 +13
US 15 12 17 +2
New Zealand 14 13 14 0
Taiwan 7 13 +6
Denmark 11 12 +1
Hungary 10 14 12 +2
Bulgaria 17 16 11 -6

Table 2C: PIRLS Reading – Trend in Percentage Achieving Advanced Benchmark


Country 1995 1999 2003 2007 2011 Improvement Since 1995
Taiwan 37 38 45 49 +12
Singapore 40 42 44 40 48 +8
Korea 31 32 35 40 47 +16
Hong Kong 23 28 31 31 34 +11
Japan 29 29 24 26 27 -2
Russia 9 12 6 8 14 +5
Australia 7 7 6 9 +2
England 6 6 5 8 8 +2
Hungary 10 13 11 10 8 -2
US 4 7 7 6 7 +3

Table 2D: TIMSS Maths Grade 8 – Trend in Percentage Achieving Advanced Benchmark


Country 1995 1999 2003 2007 2011   Improvement Since 1995
Singapore 29 29 33 32 40 +11
Taiwan 27 26 25 24 -3
Korea 17 19 17 17 20 +3
Japan 18 16 15 17 18 0
Russia 11 15 6 11 14 +3
England 15 17 15 17 14 -1
Slovenia 8 6 11 13 +5
Australia 10 9 8 11 +1
US 11 12 11 10 10 -1
Hong Kong 7 7 13 10 9 +2

Table 2E: TIMSS Science Grade 8 – Trend in Percentage Achieving Advanced Benchmark


This trend-based data throws a different complexion on the performance of several leading countries.

  • Though Singapore has managed impressive double-digit improvements in four of the five assessments, its improvement in Grade 4 Maths is far less spectacular, at a mere 5 percentage points. Moreover, Singapore’s performance actually declined on both Grade 8 Maths and Science in 2007, though it has reversed that trend in 2011 (and quite spectacularly so in Science).
  • The rate of improvement in some other countries has exceeded that of Singapore. At Grade 4 in Maths, Hong Kong, Taiwan, Korea, England and Japan are all improving at a significantly faster rate. The same is true of Korea, Taiwan and Hong Kong at Grade 8. Singapore has comfortably the fastest rate of improvement in Grade 4 and Grade 8 Science. In Reading though, Russia and Hong Kong outscore Singapore on this metric.
  • There have also been some significant declines in performance over the period that these assessments have been conducted. Both England and the United States have suffered a decline of four percentage points in Grade 4 Science, while Taiwan’s Grade 8 Science result has fallen by three percentage points and Bulgaria’s Reading score by six percentage points.
  • Within TIMSS, most of the leading countries – including Korea, Hong Kong, Taiwan, England, Russia and the US – have improved significantly more on Maths than they have on Science. However, the reverse is true in Singapore (perhaps suggesting that Singapore science is a potentially stronger export than Singapore maths). Japan is also atypical in that there has been an improvement in Maths at Grade 4 but in all other assessments there has been no improvement or a slight decline.
  • Where countries have achieved improvements within TIMSS assessments, these are typically stronger at Grade 4 than Grade 8, though the reverse is true in Maths in Singapore and Korea, while both Russia and the US present a more balanced scorecard in this respect.
  • When PIRLS is factored in, one notices that improvements in Reading tend to be less strong than in each country’s fastest improving TIMSS subject but stronger than in its slower improving TIMSS subject. Russia is the obvious outlier, with outstanding improvement in Reading relative to both Maths and Science. In England the decline in Reading is similar to that in Science.
  • Considered from this perspective, Singapore should be prioritising Grade 4 Maths, while Korea and Hong Kong should concentrate on Grade 8 Science. The US must look at Grade 4 Science and, to a lesser extent, Grade 8 Science. England’s priorities would also be Grade 4 and Grade 8 Science plus Reading. Maths is strong at Grade 4, though relatively less so at Grade 8.


How Singapore Summarised the outcomes of TIMSS/PIRLS 2011


Overall Rankings Compared With Rankings for Achievement of Advanced and Low Benchmarks

The next set of Tables examines how countries’ rankings differ for the overall assessment (based on the average score of learners from that country), the percentage achieving the highest ‘Advanced’ benchmark and the percentage achieving the lowest ‘Low’ benchmark.

This provides an indicator of whether each country’s highest achievers are outperforming the average achievers in comparative terms – and to what extent (if at all) the lowest achievers are lagging behind.

To make this manageable I have again confined the analysis to the top ten countries in each assessment against the ‘Advanced’ Benchmark.


Country Advanced rank Overall rank Low rank
Singapore 1 1 2=
Korea 2 2 1
Hong Kong 3 3 2=
Taiwan 4 4 2=
Japan 5 5 2=
N Ireland 6 6 13=
England 7 9 19=
Russia 8 10 9=
US 9 11 13=
Finland 10 8 8

Table 3A: TIMSS Grade 4 Maths – Comparison of Rank for Achievement of Advanced Benchmark, Overall and for Achievement of Low Benchmark


Country Advanced rank Overall rank Low rank
Singapore 1 2 6=
Korea 2 1 1=
Finland 3 3 1=
Russia 4 5 5=
Taiwan 5 6 6=
US 6 7 9=
Japan 7 4 1=
Hungary 8 10 22=
Romania 9 28 34=
England 10 15 22=

Table 3B: TIMSS Grade 4 Science – Comparison of Rank for Achievement of Advanced Benchmark, Overall and for Achievement of Low Benchmark


Country Advanced rank Overall rank Low rank
Singapore 1 4 15=
Russia 2 2 2=
N Ireland 3 5 15=
Finland 4 3 2=
England 5 11 21=
Hong Kong 6 1 2=
US 7 6 7=
Ireland 8 10 15=
Israel 9 18 29=
N Zealand 10 23 32

Table 3C: PIRLS Reading – Comparison of Rank for Achievement of Advanced Benchmark, Overall and for Achievement of Low Benchmark


Country Advanced rank Overall rank Low rank
Taiwan 1 3 5=
Singapore 2 2 1=
Korea 3 1 1=
Hong Kong 4 4 3=
Japan 5 5 3=
Russia 6 6 7
Israel 7 7 15=
Australia 8 12 11=
England 9 10 13=
Hungary 10 11 13=

Table 3D: TIMSS Grade 8 Maths – Comparison of Rank for Achievement of Advanced Benchmark, Overall and for Achievement of Low Benchmark


Country Advanced rank Overall rank Low rank
Singapore 1 1 4=
Taiwan 2 2 4=
Korea 3 3 2=
Japan 4 4 2=
Russia 5 7 4=
England 6 9 9=
Slovenia 7 6 4=
Finland 8 5 1
Israel 9 12 19=
Australia 10 13 11=

Table 3E: TIMSS Grade 8 Science – Comparison of Rank for Achievement of Advanced Benchmark, Overall and for Achievement of Low Benchmark


These Tables show that, particularly at the top end of the distribution, there is a very close correlation between ranking on the basis of average score and on the basis of the proportion achieving the Advanced benchmark.

There is also a fairly close correlation with the proportion achieving the Low benchmark, but this is not quite so pronounced and there are some outliers with relatively ‘long tails’ of low achievement.

  • In Maths at Grade 4 the top five countries get very high percentages of pupils past the Low Benchmark, but the next five are relatively less successful and, of these, England is least successful. It has a relatively ‘long tail’, while its highest achievers do comparatively better than the overall measure. The latter is also true of Russia and the United States, but the reverse is the case in Finland. This is arguably evidence that England, Russia and the US should prioritise the lower end of the distribution while Finland should pay more attention to the top end.
  • In Maths at Grade 8 the pattern is broadly similar though, with the exception of Israel, the ‘long tail’ for the countries just below the top rank is not quite so pronounced. This might suggest that earlier efforts to bring younger low achievers up to a higher standard – and to narrow national achievement gaps – have been at least partly successful.
  • In Science at Grade 4 these variations are once again more substantial, while tending to narrow at Grade 8, so giving a similar pattern. Singapore’s rankings suggest relatively greater priority is required at the lower end of the achievement distribution. Romania is clearly the worst in this respect, though England and Hungary are not too far behind. The ‘ranking gap’ in England is broadly similar for Maths and Science at Grade 4 and at Grade 8 respectively.
  • In Science at Grade 8, Israel again has the longest tail, comparable with the situation in Maths Grade 8. Finland is again remarkable for bucking the general trend, suggesting perhaps that it is too much focused on lifting everyone up to a relatively high standard and too little focused on stretching those at the top.
  • In Reading there is relatively more volatility throughout the table, at the top as much as the bottom of the top ten. Russia, Finland and the United States have relatively ‘flat profiles’, while Hong Kong assumes the ‘reverse profile’ more typically associated with Finland in respect of Maths and Science. Several countries have a pronounced tail, including Singapore, Northern Ireland, England, Ireland, Israel and New Zealand. The latter two have the biggest issue in this respect. There is clearly an issue here for Singapore to address. (The rank comparison underlying these observations is sketched below.)
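Here is a minimal sketch of that rank comparison, using three rows from Table 3A (Grade 4 Maths); the ‘=’ ties in the tables are simplified to plain integers:

```python
# Selected Table 3A rows: (Advanced rank, Overall rank, Low rank); ties simplified.
grade4_maths = {
    "Singapore": (1, 1, 2),
    "England": (7, 9, 19),
    "Finland": (10, 8, 8),
}

for country, (advanced, _overall, low) in grade4_maths.items():
    # A Low rank well below the Advanced rank signals a 'long tail'.
    print(f"{country}: rank gap (Low - Advanced) = {low - advanced:+d}")
# England +12 (long tail); Finland -2 (relatively stronger at the bottom).
```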


Broad Comparisons Between TIMSS and PIRLS 2011 and PISA 2009

Finally in this data analysis section, it is worthwhile to compare the top-ranking countries in terms of the proportions achieving the most demanding benchmarks, to identify broad similarities and differences.

Of course the results are not strictly comparable because the assessments are substantively different, the assessed learners are older on PISA, and the cohort of countries competing with each other is not the same.

Nevertheless, the exercise is instructive.

For the purpose of the comparison I have used the Grade 8 Maths and Science assessments (because the learners taking them are closest in age to those undertaking PISA), but I have also included PIRLS, as the only comparison available for reading.

On this occasion, however, I have included the top 20 ranked jurisdictions in each assessment.

Rank TIMSS Maths G8 PISA Maths TIMSS Science G8 PISA Science PIRLS Reading PISA Reading
1 Taiwan Shanghai Singapore Singapore Singapore NZ
2 Singapore Singapore Taiwan Shanghai Russia Singapore
3 Korea Taiwan Korea NZ N Ireland Shanghai
4 HK HK Japan Finland Finland Australia
5 Japan Korea Russia Australia England Japan
6 Russia Switzerland England Japan HK Canada
7 Israel Japan Slovenia HK US Finland
8 Australia Belgium Finland UK Ireland US
9 England NZ Israel Germany Israel Sweden
10 Hungary Liechtenstein Australia Canada NZ HK
11 Turkey Finland US Netherlands Canada Belgium
12 US Germany HK Switzerland Taiwan France
13 Romania Australia NZ Estonia Denmark Korea
14 Lithuania Canada Hungary US Hungary Iceland
15 NZ Netherlands Turkey Czech Rep Bulgaria Israel
16 Ukraine Macao Sweden Ireland Croatia UK
17 Slovenia Slovenia Lithuania Belgium Australia Norway
18 Finland Slovakia Ukraine Korea Italy Ireland
19 Italy France Iran Austria Germany Poland
20 Armenia Czech Rep UAE Sweden Portugal Switzerland

 

Table 4: Top 20 Rankings for Highest Benchmark in TIMSS, PIRLS and PISA


In PISA, results are reported for the UK as a whole, but the figures for Level 6 achievement in England are almost identical (only in maths is there a noticeable difference, with England’s result 0.1 percentage points lower than that reported for the UK).

England is ranked 29th on PISA Maths, the only column in the table in which neither England nor the UK appears.

The rankings show that a handful of the ‘usual suspects’ are highly placed on both TIMSS/PIRLS and PISA. Singapore is ubiquitous.

Some countries perform relatively better on the PISA side of the equation – New Zealand is an obvious example – while England is a comparatively better performer on TIMSS/PIRLS, as is the United States.

It is interesting to hypothesise whether these differences reflect different strengths in national education systems. Other things being equal, do those countries performing best on PISA pay relatively more attention to their high achievers’ problem-solving and application of content knowledge? Do those performing better on TIMSS/PIRLS emphasise content knowledge above ‘real life’ problem-solving?

Perhaps high-achieving learners in countries more successful in PISA are simply more familiar with assessment instruments that feature such problem-solving. Or perhaps much of the difference is explainable by more mundane variations in the assessment process. There are likely to be several different factors in play.

The countries that appear most frequently on these lists are amongst the global leaders in educating high-achieving learners. Whether there is a significant correlation with the scope and efficacy of their gifted education programmes is less certain.

We know from previous posts on this Blog that Singapore, Korea and Hong Kong have some of the best developed gifted education programmes in the world. Israel also falls into this category, as did England in the period up to 2011.

It would be a reasonable hypothesis that their investment at the top end of the ability range is having a positive effect in terms of educational outcomes as measured by these assessments, but I am not aware of any research that attempts to establish such causality.

And it is important to note that the percentages achieving the highest benchmarks in PISA/TIMSS and PIRLS vastly exceed the proportions admitted into leading countries’ gifted education programmes whereas, in England, the proportion achieving the highest benchmarks is significantly lower than the percentage in the former national gifted education programme.

Assessment Leading country % at highest benchmark in leading country % at highest benchmark in England % at highest benchmark (average for assessment)*
TIMSS Maths G4 Singapore 43 18 4
TIMSS Maths G8 Taiwan 49 8 5
PISA Maths Shanghai 26.6 1.7 3.1
TIMSS Science G4 Singapore 33 11 3
TIMSS Science G8 Singapore 40 14 4
PISA Science Singapore 3.9 1.8 1.1
PIRLS Reading Singapore 24 18 8
PISA Reading NZ 2.9 1.0 0.8

 *Averages for PISA are OECD countries only

Table 5: Percentage Achieving Highest Benchmark in TIMSS, PIRLS and PISA – Comparison of Leading Country and England


Table 5 shows that the gaps between England and the leading country can be highly variable between assessments.

  • In TIMSS Grade 4 Maths, Singapore achieves more than twice as many as England at the highest benchmark but, at Grade 8, Taiwan manages over six times as many.
  • In PISA Maths the difference between Shanghai and England is enormous – over 15 times as many Shanghai learners achieve the benchmark.
  • In TIMSS Grade 4 Science, Singapore has exactly three times as many at the highest benchmark while, at Grade 8, it has slightly less than that.
  • In PISA Science, slightly more than twice as many Singaporean learners achieve the highest benchmark.
  • In PIRLS Reading the difference is much smaller, with England only 25% behind Singapore but
  • In PISA Reading, the gap between England and New Zealand is once again close to a multiple of three.

So, while the majority of assessments show the international leader having a two- or threefold greater proportion achieving the highest benchmark, there are three conspicuous outliers: TIMSS Grade 8 Maths and especially PISA Maths (where England performs significantly worse) and PIRLS Reading (where England performs comparatively well).

At the same time, though, England is significantly ahead of the average for each assessment, with the sole exception of PISA Maths.
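For readers who want to reproduce the multiples quoted above, here is a minimal Python sketch (the data structure and rounding are my own) using the Table 5 figures:

```python
# Ratio of the leading country's percentage to England's at the
# highest benchmark, using the figures from Table 5.
table5 = {
    "TIMSS Maths G4":   ("Singapore", 43.0, 18.0),
    "TIMSS Maths G8":   ("Taiwan",    49.0,  8.0),
    "PISA Maths":       ("Shanghai",  26.6,  1.7),
    "TIMSS Science G4": ("Singapore", 33.0, 11.0),
    "TIMSS Science G8": ("Singapore", 40.0, 14.0),
    "PISA Science":     ("Singapore",  3.9,  1.8),
    "PIRLS Reading":    ("Singapore", 24.0, 18.0),
    "PISA Reading":     ("NZ",         2.9,  1.0),
}

for assessment, (leader, leader_pct, england_pct) in table5.items():
    ratio = leader_pct / england_pct
    print(f"{assessment}: {leader} has {ratio:.1f} times England's proportion")
```

PISA Maths stands out immediately on this measure: the ratio of 15.6 dwarfs the two- to threefold gaps seen elsewhere.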

This is not quite the overwhelmingly negative picture painted in the Sutton Trust’s Educating the Highly Able (which I analysed at length in a previous post).

While there is a significant gap between England and the world’s leaders on all these assessments, its performance is comparatively respectable in all but PISA Maths/TIMSS Grade 8 Maths. This suggests a particular problem with secondary maths for the highest achievers in England.

.

Various PISA products courtesy of OECD

.

Domestic Analysis of England’s Performance in TIMSS and PIRLS 2011

NFER has published extensive analyses of England’s performance in TIMSS and PIRLS. These analyses show that:

  • In all four TIMSS assessments, the attainment difference between the highest and lowest performing learners was just short of 300 TIMSS scale points.
  • The best-performing countries typically have similar or smaller ranges of attainment, though there were exceptions (Taiwan for Grade 8 maths and Singapore for Grade 4 and Grade 8 Science). The variation tends to be greater for those below average than for those above.
  • Whereas at Grade 4 in Maths England’s performance can be seen as at the low end of the highest performing countries, at Grade 8 it ‘has more in common with the performance of the majority of countries than with the highest performing countries’.
  • At Grade 4 in Science ‘England is in a group of countries with relatively low proportions of pupils at the advanced benchmark’ and, despite the good showing in the rankings, the profile at Grade 8 ‘differs from those of the highest scoring countries’.
  • In the PIRLS Reading assessment ‘the most able readers [in England] were among the best readers in the survey’. They reached levels similar to Singapore’s high achievers and ‘higher than the most able readers in the three top performing countries (Hong Kong, the Russian Federation and Finland)’.

The TES ran a story in which Andreas Schleicher of the OECD – the man responsible for PISA – took an idiosyncratic position, arguing that good results in TIMSS and PIRLS would actually be bad news, because:

‘Pisa – which suggests a recent decline in England’s international standing – tests children at an older age than Timss and Pirls. Mr Schleicher claimed that a good performance from England in the latter two tests, after its fall from grace in Pisa, would therefore suggest that the performance of pupils is actually deteriorating as they progress through school.

“If you put the three surveys together – I don’t think you can strictly compare them, but if you sort of use them as approximations – in my view it makes the picture a lot more worrying,” he said. “Because the message you get is that the earlier the year in school that you test kids in the UK, the better the performance internationally.

“In other words, parents and society do a great job in children getting to school but then year after year the schools system adds less value than we see across (other) countries.”…

…”It is probably true that the UK system is actually quite good in primary education, in the early years, but then afterwards it peters out – you can see the high dropout, you can see the 14-18 problem and so on,” Mr Schleicher said. “If you look at the three surveys together you don’t get a very encouraging picture. It is a more worrying picture than if you look at them one by one.”’

This statement rather ignores the fact that only a single year separates PISA participants from those undertaking the TIMSS 8th Grade studies. Nor is it consistently borne out by the evidence above on performance at the highest benchmarks, especially in Science.

There are likely to be several different factors responsible for England’s relatively better performance on TIMSS/PIRLS (including in the 8th Grade assessments).

Many have been identified through research studies, the majority of them associated with technical differences in the nature of the assessments. There will also be factors associated with the systems being assessed, but I have seen no substantive evidence to back up Schleicher’s claim.

On 11 December 2012, Education Minister Elizabeth Truss gave a speech about the evidence from TIMSS and PIRLS. Towards the beginning, she advances the oft-repeated claim (not entirely borne out by the evidence above) that:

‘In the past, and still today, this country has excelled at educating a small minority of its children to the very highest level.’

In fact, the minority is relatively large compared with most other countries.

Strangely, although the speech concentrates on the raft of reforms being introduced to improve performance in reading, maths and science, there is no reference at all to those which specifically benefit the highest achievers: the introduction of Level 6 assessment at Key Stage 2 and the development of a cadre of selective specialist 16-19 maths and science free schools.

The timing of these assessments was problematic for a Government elected to power in 2010. This BBC story includes a grudging reaction to the mixed bag of results from a Government spokesman:

‘These tests reflect progress between 2006 and 2011 and were taken only a year after the election.

So to the limited extent the results reflect the effect of political leadership, Labour deserves the praise for the small improvement in reading and the blame for the stagnation in maths and the decline in science. The tests say nothing, good or bad, about what we have done.’

Meanwhile the Opposition spokesman says:

‘These results show schools in England are some of the best in Europe – thanks to the hard work of teachers and pupils. The Labour government’s reforms saw reading results improve thanks to better teaching, smaller class sizes and Labour’s National Literacy Strategy.

However, we need to understand why East Asian countries outperform us in key skills – particularly science and maths.’

.

Summing Up

This post aims to show how careful analysis of performance against the highest benchmarks in TIMSS, PIRLS and PISA assessments can offer broad indicators of the comparative strengths and weaknesses of education systems as far as their high achievers are concerned.

It acknowledges the significant weaknesses of an evidence base derived entirely from international benchmarking studies, although it does not directly address the problems associated with such studies, which tend to call their findings into question.

It does not draw out the implications for each country – readers can do that for themselves – but I hope it does reveal that even the most celebrated international examples cannot afford to rest on their laurels. To take just three national examples:

  • Singapore tops almost every assessment but it performs less well on PIRLS Reading than on the four TIMSS studies. Other countries are improving their Reading performance at the Advanced benchmark at a much faster rate, while there has also been limited improvement over time in Maths, especially at Grade 4. Perhaps Singapore is beginning to approach a maths ‘ceiling’, preventing the proportion of high achievers from rising much further. In both Reading and Science there is evidence to suggest that the lower end of the achievement distribution requires somewhat greater attention.
  • Despite its stellar performance in PISA 2009 and strong showing in the overall TIMSS/PIRLS rankings, Finland is not amongst the world leaders in maximising the proportion of high achievers in these studies. It outperformed England only on Science at Grade 4, probably England’s main area of weakness. While Finland may have made strong progress in eradicating ‘long tails of low achievement’, there is evidence here to suggest that it is falling behind at the top end.
  • England’s outperformance of Finland – so often held up as the model for us to emulate – deserves to be more widely known and celebrated. The situation is nowhere near as bad as the Sutton Trust’s recent report on the Highly Able might suggest. But there is no room for complacency. There are still big gaps to make up in Maths at Grade 4 and in Science at Grade 8. The trend over time is disappointing in Science at Grades 4 and 8 and also in Reading. While attention is clearly needed to shorten ‘long tails’ in Reading, Maths and Science (especially at Grade 4), this must not be at the expense of the high achievers, or England risks falling into the Finnish trap.

.

GP

January 2013

On International Benchmarking and its Potential Application to Gifted Education

International benchmarking is all the rage in education policy, so I have been examining the arguments for and against, as well as contemplating its potential application to gifted education.

Benchmarking means comparing one country’s educational performance against that of others, with the intention of applying policy solutions adapted from the countries that achieve the best educational outcomes.

It is highly attractive to policy-makers, but the education research community is not quite as readily convinced.

Comparative educators have long wrestled with the key question whether policies can be transferred across national borders or are inextricably linked with the social and cultural environment in which they originated.

And there are several other problems associated with the benchmarking process, especially when it involves the derivation of policy steers from international assessment studies.

We will take a closer look at benchmarking, its popularity and the criticism it attracts, before considering to what extent it has been applied to gifted education and the case for taking this further.

I have divided the post into three parts, in recognition of its length and complexity. I’m afraid there’s not much coverage of gifted education until Part Three, but I hope the wider treatment is of interest!


Definitions and Approaches to Benchmarking


PISA

The OECD’s Programme for International Student Assessment (PISA) has become the best-known international benchmarking tool. This triennial survey compares the changing performance of education systems, over time and relative to each other, by testing a sample of 15 year-olds in reading, maths and science.

The programme was launched in 1997 and the first assessment took place in 2000. Results from the fourth round, PISA 2009, were published in December 2010 and planning for PISA 2012 is already under way. The number of participating countries is steadily increasing: 74 took part in PISA 2009 compared with 57 three years earlier.

PISA treads a fine line between proclaiming itself the world’s foremost education benchmarking tool and making – or, more often, allowing others to make – somewhat over-ambitious claims about its value to national policy-makers.

For the OECD is typically careful to say that countries should use the results to inform their decisions, rather than slavishly transferring entire policies from the highest-scoring countries: PISA is just one source of evidence rather than a complete answer.

But sometimes the PISA ‘brand’ is advertised in such a way that national politicians and educationalists are beguiled into thinking that it offers them something more than a set of indicators to factor into their wider analysis.

The OECD press release for the 2009 Report comes a little too close to encouraging such a response:

‘PISA aims to help countries see how their school systems match up globally with regard to their quality, equity and efficiency. The best performing education systems show what others can aspire to, as well as inspire national efforts to help students to learn better, teachers to teach better, and school systems to become more effective.’

I have yet to find a PISA publication that explicitly sets out what other evidence, alongside PISA, it would be wise for national policy-makers to take into account, but we shall see later that the OECD will on occasion take care to acknowledge the limitations of PISA-based benchmarking.

McKinsey

One example is to be found in a September 2007 publication by McKinsey and Company, ‘How the World’s Best Performing School Systems Come Out on Top’.

This contains a foreword by Andreas Schleicher, Head of the Indicators and Analysis Division in the Education Directorate of OECD and so Director of PISA. He acknowledges the problem of causation:

‘measuring performance does not of itself lead to insights about what policy and practice can do to help students to learn better’.

before admitting that McKinsey is breaking new ground in linking together quantitative results from PISA and qualitative commentary, so enabling:

‘policy makers to learn about the features of effective systems without copying systems in their entirety’.

But he continues in such a way that we are inclined to transfer our unrealistic expectations away from PISA and onto McKinsey’s broader methodology:

‘By enabling policy-makers to examine their own education systems in the light of the best performing systems that set the standards of what can be achieved, the report provides policy-makers with a unique tool to bring about improvements in schooling….Comparative analyses of this kind will become ever more important as the best-performing education systems, not simply improvement by national standards, will increasingly become the yardstick for success.’

McKinsey examines 25 school systems around the world including 10 of the top PISA performers. The qualitative dimension of their study is derived from a literature survey and interviews with a range of experts and policy-makers.

Rather strangely, curriculum and pedagogy are set aside because ‘these subjects are well-debated in the literature’ and the focus is laid exclusively on the infrastructure of the school system.

This results in an incomplete picture of system effectiveness and, by failing to take into account key elements, inevitably distorts that picture. (We shall see later that proponents of curriculum benchmarking recognise the need for a holistic treatment and this argument surely applies both ways.)

In the absence of curriculum and pedagogy, McKinsey conclude that three policy areas have the most impact, regardless of the culture in which they are applied. These three ‘key drivers’ are:

  • getting the right people to become teachers
  • developing them into effective teachers and
  • equipping the system to provide the best possible teaching to every learner.

The third of these is most relevant to the focus of this post. But it takes the reader only so far and no further.

We learn that high-performing systems have high expectations of what every child can achieve and then monitor performance against those expectations, intervening when they are not being met.

We are told that ‘the best systems have produced approaches to ensure that the school can compensate for the disadvantages resulting from the students’ home environment’.

This is not strictly true. A more accurate statement can be found in the Executive Summary of the PISA 2009 Report, Volume 2:

‘Socio-economic disadvantage has many facets and cannot be ameliorated by education policy alone, much less in the short term….However, even if socio-economic background itself is hard to change, PISA shows that some countries succeed in reducing its impact on learning outcomes.’

Finally, these systems ensure that resources are targeted at those students who need them most, but this begs the question of which students need them most.

If the highest-performing systems have high expectations of all learners, then presumably all under-achievers are equally deserving, regardless of whether or not they are low achievers? We are never told.


PISA again

Three years on, PISA has become more explicit in its own claims for how it can help policy makers. The 2010 publication ‘Strong Performers and Successful Reformers in Education: Lessons from PISA for the United States’ suggests that PISA can:

  • show what achievements are possible in comparable education systems
  • be used to set national improvement targets and reform trajectories
  • allow countries to link their national assessments to PISA assessments
  • help countries to validate their progress, defending themselves against allegations that improvements are attributable to falling standards
  • assist countries to optimise their existing policies or consider alternative approaches.

It advances a ‘Framework for Analysis’ based on the principle that there is a continuum of approaches to education reform related to countries’ economic development.

It argues that developing countries with limited resources will be likely to have lower levels of literacy and so may choose to:

‘invest more heavily in educating well a small elite to lead the country’s industries and government operations while allocating remaining resources for teachers with little training. When teacher quality is so low, governments may also prescribe to teachers very precise job requirements, instructing teachers what to do and how to do it.’

Conversely, more advanced economies tend to focus on competing in the global economy by extending universally the education that was formerly provided only to the elite.

They need to recruit teachers from amongst the best graduates but, to do so, they must remove ‘bureaucratic command-and-control systems’ and replace them with ‘professional norms of control’ giving more professional discretion to teachers and ‘greater latitude in developing student creativity and critical thinking skills that are important to knowledge-based economies’.

All countries are said to lie somewhere along this continuum, which is presented as having the following elements:

  • Economic development: from impoverished, pre-industrial low-wage to high value-added, high wage
  • Teacher quality: from a few years more than lower secondary to high level professional knowledge workers
  • Curriculum, instruction and assessment: from basic literacy, rote learning to complex skills, creativity
  • Work organisation: from hierarchical, authoritarian to flat, collegial
  • Accountability: from primary accountability to authorities to primary accountability to peers and stakeholders
  • Student inclusion: from the best students must learn at high levels to all students must learn at high levels

Although countries can make progress on one or more of these elements independently of the others, this is likely to pose problems:

‘adjusting only one or two dimensions at a time without concern for a more co-ordinated adaptation of the system as a whole risks tampering with the equilibrium that pervades successful systems.’

This rather confirms the inadequacy of McKinsey’s 2007 publication. While the elements of the continuum seem quite reasonable, one is left wondering about their empirical basis. We shall encounter a somewhat different list later in the argument.


APEC

The Asia-Pacific Economic Co-operation (APEC) has a helpful wiki that takes a wider perspective on what benchmarking is and how it is undertaken. It defines benchmarking as:

‘a means for economies to analyse internal performance compared to that of other economies, identify processes and approaches of high performing education systems, and collaborate with other member economies… to learn about successful school improvement measures.’

The purpose is to:

‘improve the economic security and social well-being of member economies…to identify and replicate a set of promising policies to achieve a turnaround in the performance of consistently low-performing schools, school systems, and/or new schools.’

According to APEC, the benchmarking process involves six key steps:

  • Form an expert group to identify criteria for selecting benchmark sites, develop benchmarking protocols and take decisions.
  • Examine the international literature on effective school improvement strategies, including any evaluations conducted by the host countries.
  • Using specified criteria, identify high performing economies with promising education policies for addressing persistently low-performing schools.
  • Specify their protocols for describing and solving the identified problems.
  • Identify an expert in the host country to conduct a case study.
  • Bring experts together to discuss findings and extrapolate practices for adoption.

This offers us a welcome example of a more decentralised approach to benchmarking educational practice: it is not essential to base a benchmarking exercise on PISA and/or any other existing international comparisons studies.


World Bank

Not to be outdone by the OECD, the World Bank is now adopting educational benchmarking.

It is currently consulting on a new Education Strategy for 2020, called ‘Learning for All’, which continues its commitment to supporting countries to achieve the education Millennium Development Goals (MDGs), building on the progress made to date, but also

‘makes a significant shift toward the development of knowledge and skills that will drive people’s employability, productivity and well-being, and countries’ competitiveness and economic growth’.

The new strategy involves working to improve countries’ education systems, looking beyond inputs like schools, teachers and books, to focus on improving accountability and results.

The overview of the strategy includes a commitment to benchmarking:

‘At the country and global levels, the Bank will help develop a high-quality knowledge base to guide education systems reform. These efforts will include… new System Assessment and Benchmarking tools currently being developed. These system tools will provide detailed analysis of countries’ capacities throughout the education system, from ECD [early childhood development] and teacher policy to tertiary education and skills development.

In each of these dimensions, the system tools will assess the “missing middle” of intermediate outcomes, providing information about where the results chain is breaking down. By benchmarking progress against international best practices, the tool will both highlight areas of strength and weakness and identify successful reformers [sic] that can serve as models in specific areas of education’.

These new tools are incorporated in a programme called System Assessment and Benchmarking for Education Results (SABER), which the World Bank describes as:

‘a comprehensive toolkit of system diagnostics to examine education systems and their component policy domains against global standards, best practices, and in comparison with policies and practices of countries around the world’.

SABER will look across the education policy spectrum and will review systems for equity and inclusion, as well as opportunities for tracking and extending learning, both of which may conceivably encompass gifted education (though there is as yet no reference to them doing so).

For each policy domain, the Bank will produce:

  • A conceptual framework to identify the intended policy goals, the policy levers in place to secure those goals and the indicators that measure progress;
  • Diagnostic tools to help assess performance on policy goals, drawing on available evidence and outcomes in the highest performing education systems;
  • Country reports giving a snapshot of performance including a 1-page report card;
  • Case studies to illustrate how countries have improved their performance;
  • A website to share the information; and
  • ‘A global education benchmarking tool drawing on a database of education leading indicators across all of the critical policy domains to give a comprehensive picture of what’s going well and what can be reformed for any country to get better results from their education system.’

In Part Two we shall look at how benchmarking is becoming influential in the United States and in England, and at the reservations that have been expressed about it.

GP

March 2011

PISA 2009: International Comparisons of Gifted High Achievers’ Performance


This post is an initial review of what PISA 2009 tells us about the performance of gifted high achievers in England and other English-speaking countries compared with the countries at the top of the PISA 2009 rankings.

It concentrates on what we can deduce from the figures rather than causation: that will be addressed in subsequent posts. It examines:

  • average performance by country, including changes between PISA 2006 and PISA 2009
  • the performance of high achievers, comparing the relative results of different countries in 2009 and how those have changed since PISA 2006
  • relative differences between the performance of high achievers and average performance by country – expressed in terms of rankings – and how those have altered between 2006 and 2009.

The twelve countries and regions included in the analysis are the highest performers – Hong Kong (China), Korea, Taiwan, Finland and, for 2009 only, Shanghai (China) and Singapore – plus Australia, Canada, Ireland, New Zealand, the UK and the USA.

I should state at the outset that I am not a statistician: this is a lay analysis and I apologise in advance for any transcription errors. Nevertheless, I hope it reveals some significant findings, including points which have received scant attention in the wider media coverage of the PISA results.


Background to PISA

The Programme for International Student Assessment (PISA) is a triennial OECD survey of the performance of 15 year-old students in science, mathematics and reading. Science was the main focus in 2006; reading is the main focus in 2009.

Fifty-seven countries took part in PISA 2006; a total of sixty-seven countries have taken part in PISA 2009. The effect of this increase in numbers on rankings should be borne in mind, especially the inclusion of very high-performing areas, notably Shanghai and Singapore.

It is also worth noting at the outset that PISA rankings do not reflect the overall numbers of students achieving specific levels: a small country that has a high percentage of its students achieving a high achievement level outscores a bigger country with a lower percentage of high achievers, even though the overall number of high achievers in the bigger country is greater.
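A quick worked example makes the point; the student numbers below are invented purely for illustration:

```python
# Invented figures: PISA ranks countries on percentages, not head counts.
countries = {
    "small country": {"students": 50_000,    "pct_high_achievers": 10.0},
    "big country":   {"students": 1_000_000, "pct_high_achievers": 5.0},
}

for name, c in countries.items():
    head_count = c["students"] * c["pct_high_achievers"] / 100
    print(f"{name}: {c['pct_high_achievers']}% -> {head_count:,.0f} high achievers")

# The small country outranks the big one (10% vs 5%), even though the big
# country has ten times as many high achievers in absolute terms.
```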

PISA assesses reading, scientific and mathematical literacy. It is important to have a clear understanding of exactly what is being assessed, not least so we can understand to what extent this differs from the nature of our own national assessments.

If a country’s national assessments are congruent with PISA then it is likely to perform much better in PISA than a similar country whose domestic assessments focus on quite different priorities.

According to the PISA 2009 Assessment Framework:

‘Reading literacy…is defined in terms of students’ ability to understand, use and reflect on written text to achieve their purposes…the capacity not just to understand a text but to reflect on it, drawing on one’s own thoughts and experiences. In PISA, reading literacy is assessed in relation to the:

Text format…continuous texts or prose organised in sentences and paragraphs…non-continuous texts that present information in other ways, such as in lists, forms, graphs, or diagrams… a range of prose forms, such as narration, exposition and argumentation…both print and electronic texts…these distinctions are based on the principle that individuals will encounter a range of written material in their civic and work-related adult life (e.g. application, forms, advertisements) and that it is not sufficient to be able to read a limited number of types of text typically encountered in school.

Reading processes (aspects): Students are not assessed on the most basic reading skills, as it is assumed that most 15-year-old students will have acquired these. Rather, they are expected to demonstrate their proficiency in accessing and retrieving information, forming a broad general understanding of the text, interpreting it, reflecting on its contents and reflecting on its form and features.

Situations: These are defined by the use for which the text was constructed. For example, a novel, personal letter or biography is written for people’s personal use; official documents or announcements for public use; a manual or report for occupational use; and a textbook or worksheet for educational use. Since some groups may perform better in one reading situation than in another, it is desirable to include a range of types of reading in the assessment items.

Mathematical literacy… is concerned with the ability of students to analyse, reason, and communicate ideas effectively as they pose, formulate, solve, and interpret solutions to mathematical problems in a variety of situations. The PISA mathematics assessment has, so far, been designed in relation to the:

Mathematical content: This is defined mainly in terms of four overarching ideas (quantity, space and shape, change and relationships, and uncertainty) and only secondarily in relation to curricular strands (such as numbers, algebra and geometry).

Mathematical processes: These are defined by individual mathematical competencies. These include the use of mathematical language, modelling and problem-solving skills…

Situations: These are defined in terms of the ones in which mathematics is used, based on their distance from the students. The framework identifies five situations: personal, educational, occupational, public and scientific.

However, a major revision of the PISA mathematics framework is currently underway in preparation for the PISA 2012 assessment.

Scientific literacy… is defined as the ability to use scientific knowledge and processes not only to understand the natural world but to participate in decisions that affect it. The PISA science assessment is designed in relation to:

Scientific knowledge or concepts: These constitute the links that aid understanding of related phenomena. In PISA, while the concepts are the familiar ones relating to physics, chemistry, biological sciences and earth and space sciences, they are applied to the content of the items and not just recalled.

Scientific processes: These are centred on the ability to acquire, interpret and act upon evidence. Three such processes present in PISA relate to: 1) describing, explaining and predicting scientific phenomena, 2) understanding scientific investigation, and 3) interpreting scientific evidence and conclusions.

Situations or contexts: These concern the application of scientific knowledge and the use of scientific processes applied. The framework identifies three main areas: science in life and health, science in Earth and environment, and science in technology.’

Defining high achievers in PISA

PISA performance scales are designed so that the average student score in OECD countries is 500 or thereabouts. Student performance is divided into 6 proficiency levels (only 5 for reading in PISA 2006), defined in terms of the competences demonstrated by students achieving that level.

Reading

In PISA 2006 reading, the highest proficiency level 5 was achieved by 8.6% of OECD students, with a lower score limit of 625.6. In PISA 2009 a level 6 was introduced (lower score limit of 698.3), which was achieved by 0.8% of OECD students. Levels 5 and 6 combined (lower score limit of 625.6) were achieved by 7.6% of OECD students. This analysis therefore assumes that levels 5 and 6 together in 2009 can be compared with level 5 in 2006.

We can conclude that overall higher-level performance in OECD countries fell by 1.0 percentage points between 2006 and 2009. This may well be attributable to changes in the level of demand in the assessment framework rather than an overall dip in performance.

According to the PISA 2009 Assessment Framework (or the PISA Results book Volume I in the case of reading), tasks at level 6:

‘typically require the reader to make multiple inferences, comparisons and contrasts that are both detailed and precise. They require demonstration of a full and detailed understanding of one or more texts and may involve integrating information from more than one text. Tasks may require the reader to deal with unfamiliar ideas, in the presence of prominent competing information, and to generate abstract categories for interpretations. Reflect and evaluate tasks may require the reader to hypothesise about or critically evaluate a complex text on an unfamiliar topic, taking into account multiple criteria or perspectives, and applying sophisticated understandings from beyond the text. A salient condition for access and retrieve tasks at this level is precision of analysis and fine attention to detail that is inconspicuous in the texts.’

And tasks at level 5:

‘that involve retrieving information require the reader to locate and organise several pieces of deeply embedded information, inferring which information in the text is relevant. Reflective tasks require critical evaluation or hypothesis, drawing on specialised knowledge. Both interpretative and reflective tasks require a full and detailed understanding of a text whose content or form is unfamiliar. For all aspects of reading, tasks at this level typically involve dealing with concepts that are contrary to expectations.’

Science

In PISA 2006, science level 6 was achieved by 1.3% of OECD students and required a lower score limit of 707.9. Level 5 and above was achieved by 9.0% requiring a lower score of 633.3.

In 2009, these figures were: level 6 achieved by 1.1% of OECD students with a lower score limit of 707.9; level 5 and above achieved by 8.5% of OECD students with a lower score limit of 633.3.

The science framework does not seem to have changed significantly between the two assessments, so we can provisionally identify a small overall dip in higher level performance between 2006 and 2009.

Level 6 students can:

‘consistently identify, explain and apply scientific knowledge and knowledge about science in a variety of complex life situations. They can link different information sources and explanations and use evidence from those sources to justify decisions. They clearly and consistently demonstrate advanced scientific thinking and reasoning, and they use their scientific understanding in support of solutions to unfamiliar scientific and technological situations. Students at this level can use scientific knowledge and develop arguments in support of recommendations and decisions that centre on personal, social or global situations.’

At Level 5, students can:

‘identify the scientific components of many complex life situations, apply both scientific concepts and knowledge about science to these situations, and can compare, select and evaluate appropriate scientific evidence for responding to life situations. Students at this level can use well-developed inquiry abilities, link knowledge appropriately and bring critical insights to situations. They can construct explanations based on evidence and arguments based on their critical analysis.’

Mathematics

In PISA 2006 mathematics, level 6 was achieved by 3.3% of OECD students with a lower score limit of 669.3 and level 5 and above by 13.3% of OECD students with a lower score of 607.

In PISA 2009, level 6 was achieved by 3.1% of OECD students with a lower score limit of 669.3 and level 5 and above by 12.7% of OECD students with a lower score of 607.

As with science, the framework does not appear significantly changed and so we can provisionally identify a small drop overall in the proportion of OECD students achieving these higher levels.

The PISA 2009 rubric says:

‘At Level 6 students can conceptualise, generalise, and utilise information based on their investigations and modelling of complex problem situations. They can link different information sources and representations and flexibly translate among them. Students at this level are capable of advanced mathematical thinking and reasoning. These students can apply this insight and understandings along with a mastery of symbolic and formal mathematical operations and relationships to develop new approaches and strategies for attacking novel situations. Students at this level can formulate and precisely communicate their actions and reflections regarding their findings, interpretations, arguments, and the appropriateness of these to the original situations.

‘At Level 5 students can develop and work with models for complex situations, identifying constraints and specifying assumptions. They can select, compare, and evaluate appropriate problem solving strategies for dealing with complex problems related to these models. Students at this level can work strategically using broad, well-developed thinking and reasoning skills, appropriate linked representations, symbolic and formal characterisations, and insight pertaining to these situations. They can reflect on their actions and formulate and communicate their interpretations and reasoning.’
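Pulling together the OECD-average figures quoted above, the following sketch (variable names are my own) computes the 2006 to 2009 changes at the higher levels; for reading, the 2006 level 5 share is compared with the 2009 levels 5 and 6 combined, as explained earlier:

```python
# OECD-average shares (%) at the higher proficiency levels, as quoted above.
higher_level_share = {
    "reading": {2006: 8.6,  2009: 7.6},   # 2006 level 5 vs 2009 levels 5+6
    "maths":   {2006: 13.3, 2009: 12.7},  # levels 5 and 6 combined
    "science": {2006: 9.0,  2009: 8.5},   # levels 5 and 6 combined
}

for domain, share in higher_level_share.items():
    change = share[2009] - share[2006]
    print(f"{domain}: {share[2006]}% -> {share[2009]}% "
          f"({change:+.1f} percentage points)")
```

All three domains show a small dip, consistent with the provisional conclusions drawn above.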

Comparing PISA 2006 and 2009 results by country for all participants

Table 1 below compares average scores by country in PISA 2006 and PISA 2009. These are essentially the headline figures which attract most media attention and they are included here primarily for the purposes of comparison.


Country | Reading 2009 score (rank) | Reading 2006 score (rank) | Maths 2009 score (rank) | Maths 2006 score (rank) | Science 2009 score (rank) | Science 2006 score (rank)
Aus | 515 (9) | 513 (7) | 514 (15) | 520 (13) | 527 (10) | 527 (8)
Can | 524 (6) | 527 (4) | 527 (10) | 527 (7) | 529 (8) | 534 (3)
Fin | 536 (3) | 547 (2) | 541 (6) | 548 (2) | 554 (2) | 563 (1)
HK | 533 (4) | 536 (3) | 555 (3) | 547 (3) | 549 (3) | 542 (2)
Ire | 496 (21) | 517 (6) | 487 (32) | 501 (22) | 508 (20) | 508 (20)
Korea | 539 (2) | 556 (1) | 546 (4) | 547 (4) | 538 (6) | 522 (11)
NZ | 521 (7) | 521 (5) | 519 (13) | 522 (11) | 532 (7) | 530 (7)
Shang | 556 (1) | N/A | 600 (1) | N/A | 575 (1) | N/A
Sing | 526 (5) | N/A | 562 (2) | N/A | 542 (4) | N/A
Taiwan | 495 (23) | 496 (16) | 543 (5) | 549 (1) | 520 (12) | 532 (4)
UK | 494 (25) | 495 (17) | 492 (27) | 495 (24) | 514 (16) | 515 (14)
US | 500 (17) | N/A | 487 (31) | 474 (35) | 502 (23) | 489 (29)
Average | 493 | 495 | 496 | 497 | 501 | 498

However, it is worth drawing attention to some key points arising from the table:

  • As indicated above, there have been small falls in overall OECD performance in reading and maths between 2006 and 2009 and a corresponding small increase in science performance. The change in reading in particular may be more attributable to a tightening of the assessment framework
  • In reading, the average score has increased slightly in Australia, remained unchanged in New Zealand, and fallen slightly in Canada, Hong Kong, Taiwan and the UK. Given the relatively tougher assessment framework and the associated overall dip in cross-OECD performance, these countries have arguably done well to maintain their scores
  • However, there have been more significant falls in reading performance in Finland, Ireland and Korea – all three strong performers in PISA 2006. Only Ireland has experienced a significant drop in ranking as a consequence, but these results should be a matter of concern in all three countries, perhaps suggesting they may need to focus more on aspects of reading newly introduced into the 2009 assessment framework
  • In maths, the average score has increased significantly in Hong Kong and the US, remained largely unchanged in Canada, New Zealand and the UK and fallen significantly in Australia, Finland, Ireland and Taiwan. Only Ireland has experienced a significant drop in its ranking
  • Nevertheless, Australia, Finland and Taiwan should be concerned about the dip in their maths performance of 6-7 points in each case. This cannot be attributable to other countries leapfrogging them in the table
  • In science, Hong Kong, Korea and the US have all made significant improvements since 2006, while performance is largely unchanged in Australia, Ireland, New Zealand and the UK and has declined significantly in Canada, Finland and Taiwan. The latter should be concerned.
  • In all three areas, loss of rank combined with fairly static performance is attributable to other countries improving at a faster rate and is a matter of relative competition. It is not possible to depress the performance of a competitor so these countries must concentrate on improving their own performance. That said, they should take some comfort from their capacity to sustain their 2006 performance when their competitors are clearly not doing so
  • On the basis of this evidence, the countries with the biggest overall headaches are Canada, Finland and especially Taiwan, all three lauded to some degree as PISA world leaders.

Comparing Percentages of High Achievers in PISA 2009 and PISA 2006

Table 2 compares the percentages of students in each of our 12 countries who achieved the higher levels in reading, maths and science in 2006 and 2009 respectively.


Country | Reading 2009: L6 | Reading 2009: L5+6 | Reading 2006: L5 | Maths 2009: L6 | Maths 2009: L5+6 | Maths 2006: L6 | Maths 2006: L5+6 | Science 2009: L6 | Science 2009: L5+6 | Science 2006: L6 | Science 2006: L5+6
Aus | 2.1 | 12.8 | 10.6 | 4.5 | 16.4 | 4.3 | 16.4 | 3.1 | 14.6 | 2.8 | 14.6
Can | 1.8 | 12.8 | 14.5 | 4.4 | 18.3 | 4.4 | 18 | 1.6 | 12.1 | 2.4 | 14.4
Fin | 1.6 | 14.5 | 16.7 | 4.9 | 21.6 | 6.3 | 24.4 | 3.3 | 18.7 | 3.9 | 20.9
HK | 1.2 | 12.4 | 12.8 | 10.8 | 30.7 | 9 | 27.7 | 2 | 16.2 | 2.1 | 15.9
Ire | 0.7 | 7 | 11.7 | 0.9 | 6.7 | 1.6 | 10.2 | 1.2 | 8.7 | 1.1 | 9.4
Korea | 1 | 12.9 | 21.7 | 7.8 | 25.5 | 9.1 | 27.1 | 1.1 | 11.6 | 1.1 | 10.3
NZ | 2.9 | 15.8 | 15.9 | 5.3 | 18.9 | 5.7 | 18.9 | 3.6 | 17.6 | 4 | 17.6
Shang | 2.4 | 19.4 | N/A | 26.6 | 50.7 | N/A | N/A | 3.9 | 24.3 | N/A | N/A
Sing | 2.6 | 15.7 | N/A | 15.6 | 35.6 | N/A | N/A | 4.6 | 19.9 | N/A | N/A
Taiwan | 0.4 | 5.2 | 4.7 | 11.3 | 28.5 | 11.8 | 31.9 | 0.8 | 8.8 | 1.7 | 14.6
UK | 1 | 8 | 9 | 1.8 | 9.9 | 2.5 | 11.2 | 1.9 | 11.4 | 2.9 | 13.7
US | 1.5 | 9.9 | N/A | 1.9 | 9.9 | 1.3 | 7.7 | 1.3 | 9.2 | 1.5 | 9.1
Average | 1 | 7 | 8.6 | 3.1 | 12.7 | 3.3 | 13.4 | 1.1 | 8.5 | 1.3 | 8.8
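By way of illustration of how these comparisons were made, a short sketch (the data structure is my own) ranks the change between 2006 and 2009 in the share achieving maths levels 5 and 6, using the Table 2 figures:

```python
# Maths levels 5+6 shares (%) from Table 2: (2006, 2009) per country.
maths_levels_5_6 = {
    "Aus": (16.4, 16.4), "Can": (18.0, 18.3), "Fin": (24.4, 21.6),
    "HK": (27.7, 30.7), "Ire": (10.2, 6.7), "Korea": (27.1, 25.5),
    "NZ": (18.9, 18.9), "Taiwan": (31.9, 28.5), "UK": (11.2, 9.9),
    "US": (7.7, 9.9),
}

# Sort from biggest decline to biggest improvement.
for country, (y2006, y2009) in sorted(maths_levels_5_6.items(),
                                      key=lambda kv: kv[1][1] - kv[1][0]):
    print(f"{country}: {y2009 - y2006:+.1f} percentage points")
```

Ireland, Taiwan and Finland show the biggest declines, while Hong Kong and the US show the clearest gains, matching the commentary in the Maths bullets below.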


Reading

  • The 2006 leaders amongst our subset of countries were Korea, Finland and New Zealand respectively whereas, in 2009, the leaders were Shanghai, New Zealand and Singapore (Shanghai and Singapore did not take part in the 2006 assessment).
  • All except Taiwan and Ireland exceeded the OECD average, although the percentage of the highest level 6 achievers in 2009 was lower than the OECD average in Taiwan and Ireland and equivalent to it in the UK and Korea. These four countries arguably need to concentrate more on the very top of their achievement range.
  • The percentage achieving levels 5/6 has increased over the 3-year period in Australia and Taiwan, remained largely unchanged in Hong Kong and New Zealand, fallen slightly in the UK and fallen substantially in Canada, Finland, Ireland and Korea. The decline in Korea is particularly startling.

Maths

  • The 2006 leaders in our subset were Taiwan, Korea and Hong Kong respectively at level 6 and Taiwan, Hong Kong and Korea respectively at levels 5/6. In 2009, the leaders are Shanghai, Singapore and Taiwan respectively at level 6 and Shanghai, Singapore and Hong Kong respectively at levels 5/6.
  • In 2006, the UK, US and Ireland were below the OECD average for level 6 performance and the other nine countries were above it. This continued to be the case in 2009 though, whereas the US was moving in the right direction, level 6 performance declined in the UK and Ireland, identifying this as an aspect potentially requiring attention in both countries;
  • In 2006, the same three countries were below the OECD average for level 5/6 performance and this continued to be the case in 2009. As with level 6, the US has improved its performance, drawing level with the UK, but the UK’s performance has declined somewhat and Ireland’s has declined significantly. This suggests that higher achievers also need more attention in both countries
  • Between 2006 and 2009, other countries improving their performance included Australia and Hong Kong (level 6) and Canada and Hong Kong (levels 5 and 6) though only Hong Kong managed significant improvement. Performance was relatively unchanged in Canada (level 6) and New Zealand (levels 5 and 6). There was a decline in Finland, Korea, New Zealand and Taiwan at level 6, most noticeably in Finland and Korea, and in Finland, Korea and Taiwan at levels 5 and 6 together.
  • If we compare rates of change for level 6 and levels 5/6 respectively, we see that countries doing relatively better with their highest achievers (level 6) include Australia, New Zealand, Taiwan, the UK and the US, while countries doing relatively better with their higher achievers (levels 5 and 6) include Canada, Finland and Korea.

Science

  • The 2006 leaders in terms of level 6 performance were New Zealand, Finland and the UK. Finland, New Zealand and Hong Kong led the field for level 5 and 6 performance. In 2009 Singapore, Shanghai and New Zealand respectively were leaders in level 6 performance and Shanghai, Singapore and Finland respectively for levels 5 and 6 together
  • In 2006, Ireland and Korea were below the OECD average for level 6 performance but all countries were above the average for levels 5 and 6 combined. In 2009, Taiwan had fallen below the OECD average for level 6, Korea matched it and Ireland had exceeded it; all countries were still above the OECD average for levels 5 and 6 together. This suggests that Ireland as well as Korea deserve credit for the progress made with their highest achievers in science.
  • Australia was the only other country to improve its level 6 performance in science during this period while Hong Kong, Korea and the US (very slightly) improved their performance for levels 5 and 6 together.
  • There were declines in performance at level 6 for Canada, Finland, Hong Kong, New Zealand, Taiwan, the UK and the US, while Korea flatlined. The worst declines were in Canada, Taiwan and the UK.
  • In terms of levels 5 and 6 combined, improvement was made in the period by Hong Kong, Korea and the US (very slightly in the case of the US). There were declines in performance in Canada, Finland, Ireland, Taiwan and the UK, the fall in Taiwan being particularly marked.
  • Examining the rate of change for level 6 compared with levels 5 and 6, it is hard to detect a clear pattern, but Australia and Ireland seem to be doing relatively better with level 6 while, conversely, Canada, Korea, the UK and the US seem to be doing relatively worse

As we have noted above, performance across OECD countries fell slightly across the board between 2006 and 2009. Insofar as this is not attributable to changes to the assessment frameworks, we might reasonably note that the OECD’s effort in producing PISA has not of itself resulted in improved performance for high achievers across OECD countries over this 3-year period.


Comparing ranks for high achievers versus all achievers, 2006 and 2009

The third and final table compares rank positions for high achievers and all achievers in 2006 and 2009 respectively.

This comparison could also be undertaken on the basis of percentages achieving the different levels and/or the average scores achieved, but the rankings are more readily available and are a reasonable guide to changes in the relative performance of countries, if not absolute changes.


Country | Reading 2009: L6 | Reading 2009: L5+6 | Reading 2009: All | Reading 2006: L5 | Reading 2006: All | Maths 2009: L6 | Maths 2009: L5+6 | Maths 2009: All | Maths 2006: L6 | Maths 2006: L5+6 | Maths 2006: All | Science 2009: L6 | Science 2009: L5+6 | Science 2009: All | Science 2006: L6 | Science 2006: L5+6 | Science 2006: All
Aust | 4 | 7 | 9 | 9 | 7 | 13 | 16 | 15 | 14 | 14 | 13 | 5 | 7 | 10 | 4 | 5 | 8
Can | 6 | 7 | 6 | 4 | 4 | 14 | 12 | 10 | 13 | 12 | 7 | 10 | 9 | 8 | 6 | 7 | 3
Fin | 7 | 4 | 3 | 2 | 2 | 11 | 7 | 6 | 6 | 4 | 2 | 4 | 3 | 2 | 2 | 1 | 1
HK | 10 | 9 | 4 | 5 | 3 | 4 | 3 | 3 | 3 | 2 | 3 | 7 | 6 | 3 | 9 | 4 | 2
Ire | 17 | 22 | 21 | 6 | 6 | 40 | 36 | 32 | 32 | 28 | 22 | 15 | 19 | 20 | 18 | 19 | 20
Korea | 12 | 6 | 2 | 1 | 1 | 5 | 5 | 4 | 2 | 3 | 4 | 18 | 18 | 10 | 18 | 16 | 11
NZ | 1 | 2 | 7 | 3 | 5 | 9 | 11 | 13 | 9 | 8 | 11 | 3 | 4 | 7 | 1 | 2 | 7
Shang | 3 | 1 | 1 | N/A | N/A | 1 | 1 | 1 | N/A | N/A | N/A | 2 | 1 | 1 | N/A | N/A | N/A
Sing | 2 | 3 | 5 | N/A | N/A | 2 | 2 | 2 | N/A | N/A | N/A | 1 | 2 | 4 | N/A | N/A | N/A
Taiw | 27 | 29 | 23 | 29 | 16 | 3 | 4 | 5 | 1 | 1 | 1 | 23 | 18 | 12 | 12 | 5 | 4
UK | 12 | 18 | 25 | 9 | 27 | 30 | 30 | 27 | 24 | 23 | 22 | 8 | 11 | 16 | 3 | 8 | 14
US | 8 | 11 | 17 | N/A | N/A | 29 | 30 | 31 | 33 | 29 | 35 | 14 | 17 | 23 | 14 | 9 | 29
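As a crude summary of the table, the sketch below (structure and naming my own) computes, for 2009 reading, the gap between each country’s overall rank and its levels 5+6 rank; a positive gap means the country ranks better for its high achievers than for all its students:

```python
# 2009 reading ranks from Table 3: (levels 5+6 rank, all-achievers rank).
reading_2009_ranks = {
    "Aust": (7, 9),  "Can": (7, 6),    "Fin": (4, 3),  "HK": (9, 4),
    "Ire": (22, 21), "Korea": (6, 2),  "NZ": (2, 7),   "Shang": (1, 1),
    "Sing": (3, 5),  "Taiw": (29, 23), "UK": (18, 25), "US": (11, 17),
}

for country, (high_rank, overall_rank) in reading_2009_ranks.items():
    gap = overall_rank - high_rank
    print(f"{country}: {gap:+d}")
```

On this measure the UK (+7), the US (+6) and New Zealand (+5) do relatively better by their high achievers in reading, while Taiwan (-6) and Hong Kong (-5) do relatively worse, as the bullets below observe.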


  • For reading, Taiwan and the UK were relatively unusual in 2006 because of the dissonance between their rankings – the UK because it did so much better for its higher achievers; Taiwan because it did so much worse. By 2009, Hong Kong and Korea are beginning to follow the same trend as Taiwan, while Australia, New Zealand and the US have joined the UK. These latter four countries might therefore be expected to concentrate disproportionately on their lower achievers in future
  • For maths, there are some clear disparities between relative national ranks in 2006. In the case of Canada and Ireland, the rank is significantly lower for level 5 and above than it is for all students. By 2009, however, these gaps have almost invariably narrowed, perhaps suggesting a degree of responsiveness to PISA 2006, although in some cases it appears that slippage down the overall rankings has influenced matters. Certainly there is no real evidence here that high achievers in maths are particularly neglected compared with their peers, or vice versa.
  • For science, it is clear that in 2006, Australia, New Zealand, the UK and the US were stronger performers, relatively speaking, with their higher achievers – and this is particularly pronounced in the last three of these countries. By 2009, these distinctions continue but are becoming relatively less clear-cut, following a similar pattern to maths.

Conclusions

The analysis above provides some detailed pointers for future support for high achievers, but what overall assessment can we offer for each of our English-speaking countries?


Australia

In PISA 2006, Australia achieved high overall rankings in reading (7) and science (8) and a relatively good ranking in maths (13). It fell two places in the rankings in all three areas in PISA 2009, although its average score increased slightly in reading, was unchanged in science and fell significantly in maths.

In 2006, its ranking for high achievers (levels 5 and 6) was slightly higher than its overall ranking in science, but not in reading or maths. By 2009, this remained true of science and had become true of reading as well.

The percentage of higher achievers (levels 5 and 6) in reading has increased significantly between 2006 and 2009, but the equivalent percentages in maths and science remain largely unchanged, except for small improvements for the highest achievers (level 6).

Moving forward, the priorities for Australia are likely to be improvement in maths across the board and probably for relatively low achievers in reading and science.


Canada

PISA 2006 showed Canada achieving very highly overall in reading (4) and science (3) and highly in maths (7). In 2009 it fell two places in reading (6), three places in maths (10) and five places in science (8), although average scores remained unchanged in maths and fell somewhat in science and reading.

Its 2006 rankings for high achievers were significantly lower than its overall ranking in maths and science but identical in reading. In 2009, there was still little difference in the relative rankings for science and reading and now little difference in maths either, although the change in maths is attributable to a fall in overall ranking rather than an improvement for high achievers.

The percentage of higher achievers has declined in reading and in science between 2006 and 2009 but has increased slightly in maths.

Canada has a relatively ‘balanced scorecard’ and will likely continue to focus on improving its results in all three areas and all achievement levels, though maths may be a relatively higher priority.


Ireland

Ireland’s overall rankings from PISA 2006 were high for reading (6) and mid-table for maths (22) and science (20). In PISA 2009 its ranking for science remained unchanged (20) but fell very significantly in maths (32) and especially reading (21). Average scores also fell significantly in maths and reading and were unchanged in science.

The 2006 rankings for higher achievers showed very little difference to overall rankings in science and reading, but somewhat lower relative rankings for high achievers in maths. The position is similar in 2009, and there is marked slippage down the rankings in reading – and to a lesser extent maths – for higher achievers as well as for all achievers.

The percentage of higher achievers has fallen significantly in maths and reading and slightly in science.

For the future, Ireland will need to reverse its downward trend in maths and reading while not neglecting improvements in science. It needs to focus on all levels of achievement, including its higher achievers.


New Zealand

In PISA 2006, New Zealand achieved a very high overall ranking in reading (5), a high ranking in science (7) and a relatively high ranking in maths (11). In PISA 2009, it slipped 2 places in reading and maths but retained its position in science. Average scores were unchanged in reading, fell slightly in maths and increased slightly in science.

Rankings for higher achievers in 2006 were significantly higher than overall rankings in science and slightly higher in reading and maths. By 2009 the difference between science rankings had closed somewhat, but this is attributable to slippage in the higher achieving rankings. In maths the position is broadly unchanged, but in reading the relatively higher ranking of the higher achievers is now more pronounced.

In terms of the percentages achieving higher levels, there has been relatively little change in reading, maths or science.

New Zealand is another country with a relatively ‘balanced scorecard’ but its higher achievers seem to be doing relatively well and it may wish to concentrate more on the lower end of the achievement spectrum.


UK

The UK achieved good to mid-table rankings in PISA 2006 for science (14), reading (17) and maths (24). In PISA 2009 it fell slightly in science (16) and maths (27) and significantly in reading (25). Average scores fell slightly in all three areas.

In 2006, rankings for higher achievers were significantly higher than overall rankings in science and reading, but very similar in maths. This continues to be the case in 2009 with the decline shared across achievement levels.

The percentage achieving higher levels has fallen significantly between 2006 and 2009 in science and maths, and fallen slightly in reading.

The UK has to improve in all three areas, but particularly maths and reading. High achievers must be a priority in maths especially, but effort is required across all levels of achievement to ensure that lower achievers do not improve at the expense of their higher-achieving peers.


US

The PISA 2006 overall rankings for the US were low to mid-table in science (29) and maths (35). No result was declared for reading because of problems with the administration of the assessment. The PISA 2009 outcomes show that the US has improved its ranking by six places in science (23) and four places in maths (31) while it achieved a ranking of 17 in reading. Average scores increased significantly in both maths and science.

2006 rankings for higher achievers were much higher than the overall ranking in science and slightly higher in maths. By 2009, the gap had narrowed in science and maths. In reading higher achievers are ranked significantly higher than the overall ranking.

The percentage achieving higher levels is little changed in science between 2006 and 2009 but there is a significant improvement in maths.

The US is moving in broadly the right direction but has to continue to improve in all three areas, especially maths. This evidence suggests that the focus should be predominantly on lower achievers – except in maths where there is a problem across the board – but, as with the UK, care is needed to ensure that higher achievers are not neglected as a result.

The UK and the US are therefore in very similar positions, but whereas the UK needs to arrest a downward trajectory, the US is already moving in the right direction.


There is an agenda for improvement in all these countries, should they choose – as the UK has done – to align their priorities firmly with those assessed by PISA and other international comparison studies.

And this analysis has also shown that there is clear room for improvement in the performance of other world leaders, such as Finland, Hong Kong and Korea: we should take with a big pinch of salt the news headlines that say we need only emulate them to be successful.

GP

December 2010