High Attaining Students in the 2012 Secondary School Performance Tables

.

This post collects and analyses data about the performance of high attaining students at Key Stages 4 and 5 in the 2012 Secondary Performance Tables and Key Stage 5 Tables respectively. It also draws on evidence from the Statistical First Releases (SFRs) published alongside the tables.

The KS4 analysis compares 2012 outcomes with those for 2011, when a high-attaining pupil measure was first introduced into the Secondary Tables.

This is a companion piece to a parallel analysis of High Attaining Pupils in the 2012 Primary School Performance Tables published in December 2012.

The commentary below highlights results – either extraordinarily good or particularly bad – from specific schools identified in the tables. There may of course be extenuating circumstances, not allowed for in the Tables, that explain outcomes which, at first sight, seem unacceptably poor.

I cannot always reconcile the figures in the Performance Tables with those in the SFRs, although the differences are typically small. I apologise in advance for any transcription errors and cordially invite you to correct any that you find using the comments facility below.

For those who prefer not to read the full post, I have summarised some of the key findings in the next section, following a brief reprise of the generic points highlighted in media analysis to date.

.

Headlines

Highlights of Media Analysis

The commentary since publication of the Performance Tables has focused predominantly on the following points (the figures quoted do not always reflect those actually in the SFRs):

  • 59.4% of pupils in all state-funded mainstream schools achieved the benchmark of 5+ A*-C Grades at GCSE including English and maths.
  • This percentage increased by 3.1% in sponsored academies, while the corresponding increase in state-funded schools as a whole was 0.6%. The Department for Education’s initial press notice claimed this as evidence that: ‘standards are rising in sponsored academies… more than five times as quickly than in all state-funded schools’.
  • 36.3% of pupils eligible for free school meals achieved this measure compared with 62.6% of other pupils, giving an attainment gap of 26.3% – an improvement of 1.1% compared with 2011.
  • 18.3% of pupils achieved the English Baccalaureate (EBacc), an increase of 2.7% compared with 2011. In mainstream state-funded schools, however, only 16.4% achieved the EBacc, an improvement of less than 1% compared with 2011.
  • 23.4% of schools and colleges produced no students with AAB+ A level grades in the facilitating subjects;
  • 195 secondary schools fell short of the current floor target, about 60 of them academies. There were 14 ‘converter academies’ in this position. The number of schools below the floor has almost doubled (from 107) as the threshold has been increased but, had the threshold been the same as in 2011, the number below the target would have fallen by 56.

One paper ran a headline claiming ‘Brightest Pupils “Going Backwards”’, supporting this primarily with evidence that fewer than half of high attainers were entered for the English Baccalaureate and over 60% failed to achieve it.

What truth is there in this statement and what additional evidence can be adduced to confirm or counter it? The section below highlights some key findings.

.

Highlights from This Analysis

This analysis suggests that:

  • The proportion of ‘high attainers’ varies significantly by sector: 33.6% of KS4 students in all mainstream state-funded schools are deemed ‘high attainers’. However, 36.8% of students in academies and free schools are high-attaining, including 42.5% in converter academies alone and 42.9% in free schools/UTCs/studio schools alone. Such distinctions might go some way at least towards explaining why these categories of school perform relatively well. Rather strangely, only 89.8% of students in selective schools count as high attainers, but the comparative figure for comprehensive schools is much lower at 31.7%.
  • Achievement of the 5+ GCSEs A*-C including English and maths measure is deteriorating, and relatively poor in free schools/UTCs/studio schools: 6.0% of high attainers in mainstream state-funded schools fail to achieve this measure, but that is far better than the corresponding percentages for middle attainers (44.9%) and low attainers (92.9%). While some 480 schools achieve 100% amongst their high attainers on this measure, about 20 schools are at or below 67% and 66 are at or below 75%. This is worrying evidence of underachievement. Though converter academies score well on this measure (only 4.5% of high attainers fail to achieve the benchmark), the situation is much worse in free schools/UTCs/studio schools, where 12.5% fail to do so. This is particularly surprising given the relatively high incidence of high attainers in such schools. An additional concern is that the overall percentage of high attainers achieving this measure has fallen 1.2% since last year. The fall in converter academies has been much larger at 6.9%.
  • There are big disparities in achievement of (and entry for) the EBacc: 38.5% of high attainers in mainstream state-funded schools managed the EBacc, which means of course that over 60% did not do so. The percentage was much higher in converter academies where 49.1% of high attainers achieved the EBacc. Conversely, in free schools/UTCs/studio schools just 23.6% achieve the EBacc. There are 235 schools at which no high-attaining students whatsoever managed to secure the EBacc. No high-attaining students were entered at 186 schools. The overall percentage of high attainers with the EBacc at mainstream state funded schools increased by 1.3% compared with 2011. Although there was an increase of 3.4% in sponsored academies, there was an even larger fall of 6.3% in converter academies.
  • Too many high attainers fail to make three levels of progress in English and in maths: Roughly 1 in every 6 high attainers fails to make the expected three levels of progress from KS2 to KS4 in English, and roughly 1 in 7 fails to do so in maths. This is further evidence of underachievement. There is some underachievement even amongst high attainers in selective schools. In 93 schools every single high attainer achieved the required progress in English – and the same is true of 100 schools in respect of maths. Twenty-six schools achieved this feat in both English and maths. However, there are 75 schools where 50% or fewer made the expected progress in English, and 43 schools where the same applied in maths. Middle attainers significantly outperformed high attainers at the majority of these schools. There were even a few schools where low attainers managed to outperform high attainers. The overall percentage of high attainers in mainstream state-funded schools making the requisite progress in English fell by 3.8% compared with 2011, though it improved by 0.6% in maths. This may suggest that high attainers were adversely affected by the problems over GCSE English marking.
  • There were huge variations between schools in the percentage achieving AAB A level grades in the facilitating subjects, but there has been good progress against the associated social mobility indicator: Overall 7.4% of A level students in state-funded schools and colleges managed AAB+ Grades at A level in the facilitating subjects. This percentage reached 65% in the highest performing state-funded schools (which are predominantly selective). However, there were 574 schools and colleges where zero A level students achieved this measure. In every sector relatively more students achieve at least three A*/A grades at A level (regardless of subject) than achieve AAB in facilitating subjects. The gap between independent schools and mainstream state-funded schools on the AAB facilitating subjects measure is 15.1%, so there has been a 1% improvement on this social mobility indicator since 2011.

The full analysis is set out below, prefaced by some essential background information about the key measures and how they are defined.

.

The Key Measures and How They Are Defined

Secondary

The 2012 Secondary Performance Tables provide breakdowns of performance against some measures for ‘high attainers’, ‘middle attainers’ and ‘low attainers’. These three groups are defined on the basis of prior attainment at the end of Key Stage 2.

  • High attainers are those who achieved above Level 4 in KS2 tests – ie their average point score in English, maths and science tests was 30 or higher;
  • Middle attainers are those who achieved at the expected Level 4 in KS2 tests – ie their average point score in English, maths and science tests was between 24 and 29.99; and
  • Low attainers are those who achieved below Level 4 in KS2 tests – ie their average point score in English, maths and science tests was under 24.

Because these calculations are made on the basis of average points scores across three subjects, it follows that ‘high attainers’ may have a relatively spiky achievement profile, compensating for middling performance in one area through high attainment in another. Conversely, learners who are exceptional in one subject but relatively low achievers in the other two are unlikely to pass the APS 30 threshold.
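The banding rule described above can be sketched as a small function. The APS thresholds (30 and 24) come from the Performance Tables definitions; the pupil scores in the examples, and the assumption that a KS2 Level 5 corresponds to 33 points and a Level 3 to 21, are illustrative only.

```python
# Hypothetical sketch of the prior-attainment banding used in the Tables.
# Thresholds (APS >= 30 high, 24-29.99 middle, < 24 low) are from the
# definitions above; example point values are illustrative assumptions.

def attainment_band(english_ps: float, maths_ps: float, science_ps: float) -> str:
    """Band a pupil by average point score across the three KS2 tests."""
    aps = (english_ps + maths_ps + science_ps) / 3
    if aps >= 30:
        return "high"
    elif aps >= 24:
        return "middle"
    return "low"

# A 'spiky' profile - exceptional in maths but middling elsewhere -
# can still clear the high-attainer threshold:
print(attainment_band(29, 33, 29))  # high (APS = 30.33)

# ...but one outstanding subject alone usually cannot compensate
# for two relatively low scores:
print(attainment_band(21, 33, 21))  # middle (APS = 25.0)
```

This illustrates the compensatory effect noted above: averaging across three subjects means the "high attainer" label depends on the overall profile, not excellence in any single subject.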

The ‘high attainer’ threshold is not overly demanding. The Tables attached to SFR 02/2013: GCSE and Equivalent Results in England 2011/12 (revised) show that 33.6% of all pupils in state-funded mainstream schools were within this category (33.4% of boys and 33.8% of girls). More information about the distribution of this population is provided later in this post.

The 2012 Primary Performance Tables also show that 27% of pupils achieved Level 5 or above in both English and maths, while the average points score of all pupils nationally was 28.3.

The definition of these sub-groups makes it possible to compare performance of these three groups against each other and against the national average. This enables us to reach broad conclusions about whether schools are successfully improving the performance of all pupils across the distribution, or whether they are focused disproportionately on one group or the other, perhaps in an effort to minimise the percentage of pupils not achieving the threshold performance measures.

This is particularly critical for those schools at risk of dipping below the ‘floor targets’, which determine whether or not they are vulnerable to Government intervention. The secondary floor target is for 40% of pupils in a school to achieve 5+ GCSEs at Grades C or above including English and maths, plus 70% of pupils to make the expected three levels of progress between the end of KS2 and the end of KS4 in each of English and maths.

The guidance on the Performance Tables provides a useful diagram showing the expected levels of progress for high, middle and low attaining pupils respectively. High attainers are expected to achieve Grade B or above in GCSE English and maths.

By comparing 2012 results with those from 2011, we can judge whether or not schools seem to be adjusting their behaviour, although we cannot of course establish to what extent any adjustment is attributable to the Performance Tables.

.

Key Stage 5

The Statement of Intent for the 2012 Performance Tables issued in July 2012 confirmed plans to introduce for the first time:

‘Percentages of students achieving three A levels at grades AAB or higher in facilitating subjects, reflecting the subjects and grades most commonly required by Russell Group and other top universities.’

The data provided gives the percentages for KS5 students and A level students respectively, and the measure more precisely covers three or more A levels, excluding equivalent qualifications.

The subjects covered by the term ‘facilitating subjects’ are listed as ‘Mathematics and Further Mathematics, English (Literature), Physics, Biology, Chemistry, Geography, History, Languages (Classical and Modern)’. A full list of the examinations that count as facilitating subjects is published amongst the technical papers supporting the tables.

Following publication of the Tables, significant reservations have been expressed about the design of this measure. The Russell Group – the organisation largely responsible for promulgating it – pointed out that:

‘It would be wrong to use this simple indicator as a measure of the number of pupils in a school who are qualified to apply successfully to a Russell Group university.

The Russell Group has published a guide called Informed Choices which lists ‘facilitating subjects’ which are those most commonly required for entry to our leading universities.

However, it’s important that students make decisions based on their individual circumstances. We encourage all prospective students to check the entry requirements for their chosen course before applying to a particular university.’

The AAB+ in facilitating subjects measure supports one of the Government’s preferred Social Mobility Indicators which compares the percentage of students attending state and independent schools respectively who achieve this measure. (In 2011 the gap was 16.1%, with 7.0% of state school students and 23.1% of independent school students achieving this measure.)

In addition, several other measures are included in the 2012 KS5 Performance Tables for the first time:

  • Average point score per student (A level, IB, pre-U and AQA Bacc) – the total number of points achieved by a student divided by the total number of students taking the relevant qualifications
  • Average point score per entry (A level, IB, pre-U and AQA Bacc) – the total number of points achieved by students divided by the total number of entries to the relevant qualifications.
  • Value added score for each Level 3 qualification type

A technical note provides details of the point scores allocated to different grades in different qualifications.
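The difference between the two APS measures can be made concrete with a minimal sketch. The three-student cohort and all point values below are invented for illustration; the actual points allocated to each grade are set out in the technical note.

```python
# Illustrative sketch of the two KS5 average point score measures defined
# above, using an invented three-student cohort (all point values are
# hypothetical, not taken from the actual tariff).

students = {
    "A": [270, 240, 210],        # points for each A level entry
    "B": [300, 270],
    "C": [240, 240, 210, 180],
}

total_points = sum(sum(entries) for entries in students.values())
total_entries = sum(len(entries) for entries in students.values())

aps_per_student = total_points / len(students)  # total points / number of students
aps_per_entry = total_points / total_entries    # total points / number of entries

print(aps_per_student)  # 720.0
print(aps_per_entry)    # 240.0
```

The per-student figure rewards students who take more entries, while the per-entry figure reflects the average quality of each result regardless of how many qualifications a student sits.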

.

2012 Results

Secondary Performance Tables

The Tables show that:

  • 59.8% (or is it 59.4%?) of students in mainstream state-funded schools achieved five or more GCSEs (or equivalent) at grades A*-C including English and maths. This was achieved by 94.0% of high attainers, compared with 55.1% of middle attainers and just 7.1% of low attainers.
  • One school (Pate’s Grammar School, Cheltenham) is registered as having no high attaining students achieving this benchmark, but this is because it adopted an IGCSE English qualification which is not accepted for inclusion in the Performance Tables. The next lowest was 3% (The Rushden Community College specialising in Mathematics and Computing), which includes 35 pupils in its high attainer cohort. Rushden is also reported to have made an idiosyncratic choice of English syllabus. Two more schools – Culverhay and Raincliffe – are below 50%. A further 15 schools are at 67% or below and, altogether, 66 schools are at 75% or below, including several academies. Some 480 schools achieved 100% on this measure.
  • In English state-funded schools, 94.3% of high attainers achieved Grades A*-C in GCSE English and maths. This compared with 55.8% of middle attainers and 7.3% of low attainers. The average figure for all pupils was 59.3%. Slightly more schools – almost 530 – managed 100% on this measure but, excluding Pate’s, Rushden was again the worst performing on 3%, followed by Culverhay on 40%. Altogether (excluding Pate’s) 17 schools were at or below 67% and almost 60 below 75%, once again including some prominent academies;
  • The average point score per pupil for the best eight subjects entered (all qualifications) was 343.3 for all pupils in state-funded schools. The average for high attainers was 398.4, compared with 338.9 for middle attainers and 263.7 for low attainers. Several schools, almost all selective, achieved an APS of over 450. At the other end of the table, Culverhay came in at 273.4 and a further 40 schools were at 350 or lower. The results were similar when GCSEs only were counted, but with lower APS at the bottom of the scale. The APS for high attainers across all state-funded schools was 375.4, but 16 schools recorded an APS of 200 or lower, around half of them academies.
  • The average grade per qualification was A+ in 24 mostly selective schools, yet it was below C at 10 schools. It was D at City Academy Norwich and Culverhay and D+ at the Milton Keynes Academy. If only GCSEs are counted, the average grade was E+ at Culverhay and D in three more schools. Thirty-two selective schools recorded A+.
  • The average number of entries per pupil for high attainers across all state-funded schools was 12.4 for all qualifications and 9.7 for GCSEs. Colyton Grammar School entered high attaining pupils for an average 14.5 GCSEs, significantly higher than any other school, while 15 schools entered their high attainers for fewer than 5 GCSEs. When all qualifications are counted, The James Hornsby High School managed an astonishing 22.0 average entries amongst its 17 high attainers.
  • Turning to the English Baccalaureate, 16.2% of all KS4 pupils in state-funded schools achieved it, but 38.5% of high attainers did so, compared with 7.1% of middle attainers and fewer than 1% of low attainers. All twenty high attainers at Tauheedul Islam Girls High School achieved the EBacc. On the other hand, no high attainers did so at 235 schools. On average across English state-funded schools, 46.3% of high attaining students were entered for all the EBacc subjects. In 186 schools no high attaining pupils were entered.
  • In English state-funded schools, 68% of pupils made the expected three levels of progress in English while 68.7% did so in maths. High attainers outperformed middle and low attainers in English – 83.4% made such progress – and the same was true of maths, where the comparable percentage was 85.8%. However, this means that 1 in every 6 high attainers failed to make the expected progress in English and 1 in 7 failed to do so in maths. There were 93 schools where 100% of high attainers made the requisite progress in English and 100 schools where the same was achieved in maths. Twenty-six schools achieved this in both English and maths.
  • At Pate’s (see above) and Rushden, no high attaining pupils made the expected progress in English and 50% or fewer managed this at 75 schools. At Parklands High School just 8% of high attainers made three levels of progress in maths and there were 43 schools where 50% or fewer managed this. In both English and maths, the vast majority of schools where less than 50% of high achievers managed three levels of progress realised significantly higher percentages for their middle attainers. A few did so for their low attainers as well. At Milton Keynes Academy, the percentages for low attainers and middle attainers in English were 76% and 79% respectively – for high attainers it was 20%. The disparities were relatively less stark in maths.
  • When it came to the value added (best eight) measure, Beis Yaakov High School scored 1065.0 for its high attainers and six more schools were over 1050 including Tauheedul and Mossbourne Academy. At Culverhay the figure was 853.7 and in 35 schools it was 950.0 or below.

.

Key Stage 5 Performance Tables

The KS5 Tables reveal that:

  • In all English state-funded schools and colleges, 4.8% of KS5 students achieved at least AAB grades in A level facilitating subjects, and 7.4% of A level students did so. The percentage reached 65% in the top-performing state-funded schools against each of these measures. Queen Elizabeth’s Barnet was the best-performing state-funded school in this respect. At the other end of the spectrum, 610 schools and colleges – both state-funded and independent – managed zero KS5 students on this measure. The comparable number where no A level students achieved this measure was 574. Both totals are surprisingly high.
  • The average point score per A level student in state-funded schools and colleges was 736.2 and the average point score per entry was 210.2. Colchester Royal Grammar School managed an APS of 1393.0 per student while The Henrietta Barnett School managed 275.2 per entry. Six institutions managed an APS per student of less than 200, four of them academies. The APS per A level entry was below 100 at two institutions, one of them an academy.

.

Statistical First Releases

Alongside the Performance Tables, several associated statistical publications were released. These provide some further detail about the achievement of high-attaining students.

.

Key Stage 4

SFR02/13 (Table 1D) shows the proportion of students in state-funded schools making the expected 3 levels of progress towards GCSE English and Maths having achieved a Level 5 in their KS2 tests in those subjects (so this is different to the ‘high attainers’ progression measure in the Secondary Performance Tables).

Although 76.9% of pupils made the expected progress in English and 79.7% did so in maths, that means 1 in 4 students did not make the expected progress in English and 1 in 5 did not do so in maths. Although progression rates are higher for those with Level 5 than for those with lower levels, this outcome still leaves something to be desired (especially since a minority of these students did not even manage a C grade at GCSE).

Tables 6A and 6B show performance by types of school and the admissions basis of schools for different ‘attainment bands’ which do coincide with the definitions of high, middle and low attainers employed in the Performance Tables.

  • Whereas 33.6% of students in mainstream state-funded schools (31.1% in all state-funded schools) meet the ‘high attainer’ criterion, fewer students (32.0%) in local authority maintained schools are high attainers, while the percentage in academies and free schools is 36.8%, quite a significant difference in favour of the latter. The figure for sponsored academies is just 20.9%; conversely, it is 42.5% for converter academies and 42.9% for free schools, UTCs and studio schools. These advantages in favour of free schools and converter academies might be expected to have a significant impact on the overall performance of those schools.
  • The high attaining band (described here as ‘above level 4’) registers 94% achieving 5+ GCSEs at grades A*-C including English and maths, whereas the comparable figures by type of school are: local authority maintained – 93.5%; sponsored academy – 91.5%; converter academy – 95.5%; and free schools/UTCs/studio schools – 87.5%. This would suggest that, while converter academies are reaping the benefits of a larger cohort of high attainers, the same cannot be said of free schools, not yet at least. The differences are broadly similar on the GCSE Grades A*-C in English and maths measure.
  • The overall percentage of high attainers achieving the EBacc in state-funded mainstream schools is 38.5%. This drops to 35% in local authority schools, but is as low as 21.1% in sponsored academies and as high as 49.1% in converter academies. In free schools, UTCs and studio schools it is only 23.6%. So, whereas converter academies have a sizeable lead over schools remaining with the local authority, sponsored academies and free schools are a long way behind.
  • Turning to the progress measures in English and maths, we find the figures for ‘above Level 4’ in mainstream state-funded schools are 83.4% and 85.8% respectively. This drops to 82.5% and 84.4% in local authority maintained schools. Converter academies are at 86.7% and 90.2% respectively, but sponsored academies manage only 76.0% and 77.8% respectively. Free schools, UTCs and studio schools are way behind on English, at 68.1%, but much more competitive in maths at 87.5%. This might suggest that UTCs are over-focused on maths, or that their intakes are heavily skewed towards those with spiky prior attainment profiles that favour maths and science over English. It would be helpful to see disaggregated figures for free schools.
  • These tables also provide breakdowns for all these measures for comprehensive and selective schools. Not surprisingly (albeit rather oddly), 89.8% of students in selective schools are classified as ‘above Level 4’, whereas the percentage for comprehensive schools is 31.7%. Selective schools do substantially better on all the measures, especially the EBacc, where the percentage of ‘above Level 4’ students achieving this benchmark is double the comprehensive school figure (70.7% against 35.0%). More worryingly, 6.6% of these high-attaining pupils in selective schools are not making the expected progress in English and 4.1% are not doing so in maths. In comprehensive schools there is even more cause for concern, with 17.7% falling short of three levels of progress in English and 15.3% doing so in maths.

We can compare some of the key statistics above with the comparable figures for the previous year, 2010/11:

  • The percentage of students within the ‘high attainer’ category in all maintained mainstream schools was 33.6% in 2011/12 and almost identical at 33.5% the previous year.
  • In 2010/11, the percentage of high attainers achieving the 5+ A*-C GCSE including English and maths measure in maintained mainstream schools was 95.2%, so has fallen by 1.2% in 2011/12. The figures for sponsored academies and converter academies in 2010-11 were 93.1% and 98.4% respectively. Assuming these are strictly comparable, it means that, whereas the percentage in sponsored academies has increased by 0.4%, the percentage in converter academies has fallen by 6.9%, no doubt as a consequence of the large increase in the number of converter academies.
  • The overall percentage of high attainers achieving the E Bacc in state-funded mainstream schools was 37.2% in 2010/11, so in 2011/12 there was an increase of 1.3%. In sponsored academies the percentage of high attainers achieving the E Bacc increased by 3.4%. In converter academies, the percentage fell by 6.3%, again presumably attributable to the significant increase in the number of such schools.
  • For the progress measures in English and maths we find that, in 2010/11, 87.2% of high attainers in maintained mainstream schools made the expected progress in English, and 85.2% did so in maths. This means that, in 2011/12, the percentage of high attainers making the expected progress in English fell by a worrying 3.8%, but in maths it increased by 0.6%. The problems with GCSE English marking are once again the most likely explanation.
  • Interestingly, in 2010/11, the percentage of high attainers in selective schools sat at 90.3% while in comprehensive schools it was 31.6%. So, in 2011/12, the percentage in selective schools has fallen by 0.5% while the percentage in comprehensive schools has risen by 0.1%. On the EBacc measure, the percentage of successful selective school high attainers has fallen by 0.2%, whereas the percentage of successful comprehensive school high attainers has increased by 1.4%, so comprehensive schools are just beginning to close the gap. The percentage of high-attaining students in selective schools failing to make the requisite progress in English has increased dramatically from 3.5% in 2010/11 to 6.6% in 2011/12. In maths the proportion of those failing to make such progress has also increased, from 3.4% to 4.1%. In comprehensive schools the percentage failing to make the required progress in English has also increased, by 3.9%, while in maths there has been a fall of 0.8% in the percentage failing to make such progress. This would again suggest a problem with English in 2011/12.

Unfortunately SFR04/2013, containing information about GCSE and equivalent attainment by pupil characteristics, provides no breakdowns whatsoever to help establish how the performance of high attainers varies according to gender, ethnic and socio-economic background.

This remains a significant lacuna in the Performance Tables as well. It would be particularly helpful to see data for high attainers eligible for free school meals and/or the Pupil Premium, so that we can establish whether attainment gaps are being narrowed regardless of prior attainment, or whether improvements disproportionately favour one group over the others.

.

Key Stage 5

The remaining SFR05/2013 on A level results in England contains information about the proportion of students achieving 3 or more A levels with A*/A Grades as well as AAB+ in the facilitating subjects. Key points include:

  • Across all schools and FE sector colleges, the percentage of students achieving 3 or more A*/A Grades at A Level (including Applied and Double Award A levels) is 12.8%. This is a fall of 0.3% compared with 2010/11. Slightly more male students than female achieved this (13.1% against 12.6%).
  • Across all schools and FE sector colleges, the percentage of students achieving AAB grades or better is 20.5% (so 1 in 5 of all students). Slightly more female students than male achieved this (20.9% against 20.0%).
  • The comparable percentage of all students achieving at least AAB in the facilitating subjects is much lower at 9.5% and male students were again in the ascendancy: 11.5% compared with 8.2% of females. Hence more students achieved the 3+ A*/A measure than achieved the AAB+ in facilitating subjects measure.
  • 31.6% of students in independent schools achieve 3 or more A*/A Grades – so almost one-third of all students – whereas the comparable figure in state-funded schools is much lower at 10.9%. This is a gap of 20.7%.
  • Students in academies and free schools perform significantly better on this measure (13.5%) than students at local authority maintained mainstream schools (9.1%), students in sixth form colleges (9.7%) and students in all FE sector colleges (8.2%).
  • Turning to the AAB+ measure, 45.3% of students in independent schools achieve this, whereas in state-funded schools the percentage is 17.9%, giving a gap of 27.4%.
  • Students in academies and free schools manage 21.5% on this measure, while the figure for local authority maintained mainstream schools is 15.4%, for sixth form colleges 16.8% and for all FE colleges 14.5%.
  • Thirdly, with respect to the AAB+ measure in facilitating subjects, independent schools achieve 23.7% compared with 8.6% in state-funded mainstream schools. This is a gap of 15.1%. As we have noted above, this is one of the Government’s preferred social mobility indicators. In 2011, the gap was 16.1% – independent schools scoring 23.1% and state schools 7.0%. Compared with 2011, independent schools have improved by 0.6% and state schools by 1.6%, so narrowing the gap by a full percentage point.
  • The percentage achieving the facilitating subjects measure in local authority maintained mainstream schools is 7.2%, whereas in academies and free schools it is 10.5%, in sixth form colleges 5.8% and in all FE colleges 5.0%. The tendency for fewer students to achieve the facilitating subjects measure than the 3+ A*/A measure is consistent across all sectors.
  • Within the state-funded school sector, the gaps between selective and comprehensive schools on all three measures are large: 27.7% versus 8.3% on the 3+ A*/A measure, 40.6% versus 14.5% on the AAB measure and 21.5% versus 6.6% on the AAB in facilitating subjects measure.
  • Altogether, 8.1% of A level entries were awarded an A* grade and 27.5% an A*/A grade. (The 2011 figures were 8.4% and 27.2% respectively.)
  • For independent schools, the percentages were 17.2% and 49% respectively, while for state-funded schools they were 7.2% and 23.5%. In academies and free schools 8.4% of entries were awarded A* grades and 27.6% were awarded A*/A. In local authority maintained schools the comparable percentages were 6.2% and 20.8% respectively; in sixth form colleges they were 5.7% and 20.4% respectively and in all FE sector colleges 5.2% and 19.1% respectively.
  • At state-funded selective schools, 13.4% of entries received an A* grade and 41.2% received an A* or A grade. At state-funded comprehensive schools the figures were 5.9% and 20.9% respectively.
  • The highest percentages of A* grades were awarded in further maths, at 28.5%, and maths at 18%. The fewest A* grades were awarded in home economics (1.3%) and film/media/television studies (1.4%). The percentages for other facilitating subjects included: English – 6.8%; physics 10.4%; chemistry – 9.0%; biological sciences – 8.1%; geography – 6.8%; history – 7.3%; French, German, Spanish – from 7.0-7.7%.
  • The highest percentages of A*/A grades were again awarded in further maths (58.3%) and maths (44.6%). The fewest A*/A grades were awarded in ICT (10.4%) and film/media/television studies (10.6%). The percentages for other facilitating subjects included: English – 21.4%; physics – 32.8%; chemistry – 34.8%; biological sciences – 29.0%; geography – 29.9%; history – 27.1%; French, German, Spanish – 37.7% to 41.1%.
  • 16.9% of all AS level entries were awarded an A grade. The highest percentages were in ‘other modern languages’ (50.9%) and further maths (41.2%). The lowest percentages were in ICT (6.3%) and accounting and finance (6.9%).

.

Closing Remarks

The overall assessment of high attainers’ performance can best be described as a mixed picture. There are huge variations between schools, some performing outstandingly well and others outstandingly badly. There are significant issues to address in the academies and free schools sectors – they cannot be regarded as exemplary performers across the board.

There is continuing evidence of underachievement at national level. We do not know anything about the proportion of high attainers from disadvantaged backgrounds, so it is not as straightforward as it might be to establish whether such underachievement is disproportionately concentrated in that group. In the absence of published data to the contrary, one inevitably fears the worst.

.

Postscript

Perhaps it was coincidence but, just a few hours after I published this post, HMCI Sir Michael Wilshaw let it be known that Ofsted would be undertaking a Rapid Response Survey on Gifted and Talented Education.

Indeed this would be:

‘The most extensive investigation of gifted and talented provision undertaken by the watchdog’.

The rapid response methodology is typically deployed when ministers raise an urgent issue that they want Ofsted to investigate which is not addressed by the planned inspection programme. The story says that HMCI himself has ordered the survey: this may or may not have been at the instigation of ministers.

Taken together, this and another article about sport inform us that:

  • A Report will be published ‘in the spring’ (so most likely April or May).
  • A representative sample of over 50 schools will be visited and inspectors will also analyse existing inspection data.
  • Issues to be investigated include: progression between KS2 and KS4; whether mixed ability classes provide sufficient stretch and challenge; early examination entry; progression to competitive universities; and support for disadvantaged gifted learners.
  • This report – or possibly the one due in February on PE and School Sport – will also examine whether talented young sportspeople are able to access opportunities and enrichment comparable to those available to students attending independent schools.

This example shows the typical format of a rapid response survey. There is a set of Key Findings presented as bullet points, followed by a series of recommendations, typically aimed at central Government, the ‘middle tier’ and schools respectively. The main text is brief and to the point.

But HMCI has said that this survey will be the most extensive on the topic that HMI have ever undertaken. On the face of it, this is not easy to reconcile with the rapid response methodology.

Ofsted last considered gifted and talented education in December 2009, also deploying the rapid response approach.

But back in 2001 they published a more substantive document ‘Providing for Gifted and Talented Pupils: An Evaluation of Excellence in Cities and Other Grant-Funded Programmes’. It will be interesting to compare the 2013 report with this.

An even earlier publication from 1992: ‘The Education of Very Able Children in Maintained Schools’ does not seem to be available online (though one can still access the research review conducted for Ofsted by Joan Freeman in 1998).

A further concern is the limited availability of gifted education expertise within Ofsted. Though there are at least two current HMI with such expertise, my understanding is that there is no longer a designated specialist lead for the topic, and so no guarantee that the individuals with the expertise can and will be released at short notice to undertake this task.

That said, if the Report helps to set a contemporary improvement agenda for gifted and talented education, that will be a huge fillip to those who work in the field.

.

Postscript 2

Post-publication of Ofsted’s Report, ASCL has referred in its press release to the KS2-4 Transition Matrices published on RAISEonline.

I thought it might be useful to reproduce those here.

[Images: KS2 to KS4 Transition Matrices for English and for Maths]

These show that:

  • 98% of KS2 learners achieving 5A in English achieved three levels of progress from KS2 to KS4, compared with 92% of those achieving 5B and 70% of those achieving 5C
  • 87% of KS2 learners achieving 5A in English achieved four levels of progress from KS2 to KS4, compared with 64% of those achieving 5B and 29% of those achieving 5C
  • The percentages of learners achieving 4A in English at KS2 who went on to achieve three levels of progress and four levels of progress – 85% and 41% respectively – were significantly higher than the comparable percentages for learners achieving 5C
  • 47% of those achieving 5A in English at KS2 went on to achieve A* at GCSE, compared with 20% of those achieving 5B and 4% of those achieving 5C
  • 87% of those achieving 5A in English at KS2 went on to achieve A* or A at GCSE, compared with 64% of those achieving 5B and 29% of those achieving 5C
  • 96% of KS2 learners achieving 5A in Maths achieved three levels of progress from KS2 to KS4, compared with 86% of those achieving 5B and 67% of those achieving 5C
  • 84% of KS2 learners achieving 5A in Maths achieved four levels of progress from KS2 to KS4, compared with 57% of those achieving 5B and 30% of those achieving 5C
  • The percentages of learners achieving 4A in Maths at KS2 who went on to achieve three levels of progress and four levels of progress – 89% and 39% respectively – were significantly higher than the comparable percentages for learners achieving 5C. The percentage achieving three levels of progress even exceeded the percentage of those with 5B who managed this.
  • 50% of those achieving 5A in Maths at KS2 went on to achieve A* at GCSE, compared with 20% of those achieving 5B and 6% of those achieving 5C
  • 84% of those achieving 5A in Maths at KS2 went on to achieve A* or A at GCSE, compared with 57% of those achieving 5B and 30% of those achieving 5C

I have highlighted in bold the statistics that make most uncomfortable reading. It is especially concerning that half or fewer of those achieving 5A in either maths or English were able to translate that into an A* grade at the end of KS4. That rather undermines the suggestion that limited progression is entirely attributable to the lower end of the distribution of those achieving L5 at KS2.
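The ‘levels of progress’ arithmetic behind these matrices can be made concrete. Under the then DfE convention, three levels of progress from KS2 level 4 corresponds to a GCSE grade C, from level 5 a grade B and from level 6 a grade A; the sub-levels 5A, 5B and 5C all share the same level 5 baseline. A minimal sketch of that convention (the function name and structure are illustrative, not an official calculation):

```python
# Sketch of the KS2-KS4 'levels of progress' convention: three levels
# of progress from KS2 level 4 is a GCSE grade C, from level 5 a B,
# and from level 6 an A. Sub-levels (5A/5B/5C) share the level-5 baseline.
# Illustrative only - not an official DfE implementation.

GRADES = ["G", "F", "E", "D", "C", "B", "A", "A*"]  # ascending order

def levels_of_progress(ks2_level: int, gcse_grade: str) -> int:
    """Levels of progress implied by a KS2 level and a GCSE grade."""
    baseline = {4: "C", 5: "B", 6: "A"}  # grade equating to 3 levels
    return 3 + GRADES.index(gcse_grade) - GRADES.index(baseline[ks2_level])

print(levels_of_progress(5, "B"))  # 3
print(levels_of_progress(5, "A"))  # 4
```

This also makes clear why, in the matrices above, the percentage making four or more levels of progress from a level 5 start coincides with the percentage achieving A* or A.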

.


GP

January 2013


The Performance of Gifted High Achievers in TIMSS, PIRLS and PISA

.

This post examines the comparative performance of high achievers in recent international comparison studies, principally the 2011 TIMSS and PIRLS assessments.

More specifically, it compares:

  • The proportion of learners in selected countries who achieve the highest ‘advanced’ benchmarks in TIMSS 2011 maths and science assessments at Grades 4 and 8 respectively and in the PIRLS 2011 reading assessment at Grade 4;
  • How selected countries have performed on each of these measures over the period in which TIMSS and PIRLS have been administered, identifying positive and negative trends and drawing inferences about current relative priorities in different countries;
  • Selected countries’ overall ranking on each of these TIMSS and PIRLS assessments (based on the average score achieved across all learners undertaking the appropriate assessment), contrasted with their ranking for the proportion of learners achieving the highest ‘advanced’ and the lowest ‘low’ benchmarks, considering the associated implications for their national education policies; and
  • The results from TIMSS and PIRLS 2011 with those from PISA 2009, exploring whether these different studies provide a consistent picture of countries’ relative strength in educating their highest achievers and, to the extent that there are inconsistencies, how those might be explained.

The post also reviews recent publications and speeches about England’s performance in TIMSS and PIRLS 2011, with a particular focus on the aspects set out above and the high achievers’ perspective. Finally, it draws together some significant recent contributions which ask interesting questions about the nature of these assessments and their outcomes.

This is therefore a companion piece to my December 2010 post ‘PISA 2009: International Comparisons of Gifted High Achievers’ Performance’.

There is limited reference within it to the relative strengths and weaknesses of international comparison studies of this kind. Some time ago I published the first part of a separate post on that subject.

For the purposes of this publication my pragmatic assumption is that, while such studies have significant shortcomings and should on no account be used as the sole source of evidence for educational policy-making, they do provide useful steers which, when combined with other sources of quantitative and qualitative evidence, can offer a useful guide to current strengths and weaknesses and potential future priorities.

This is therefore a ‘health warning’: some of my conclusions below do need to be treated with a degree of caution. They are broad indicators rather than incontrovertible statements of fact.

.

Background

History and Development of TIMSS and PIRLS Assessments

The Trends in International Mathematics and Science Study (TIMSS) has provided assessments of national achievement in these subjects since 1995, focused principally on two cohorts: Grade 4 (age 9/10) and Grade 8 (age 13/14).

Its companion exercise, the Progress in International Reading Literacy Study (PIRLS), was introduced in 2001 to assess reading comprehension at Grade 4.

There is a parallel TIMSS Advanced assessment of maths and physics achievement in the final year of secondary school. This was undertaken in 1995 and 2008 and is scheduled for 2015. A less difficult PrePIRLS study, providing assessment for those not yet reading confidently, was introduced for the first time in 2011.

The main TIMSS assessment has been repeated on a four-year cycle and PIRLS on a five-year cycle, making 2011 the first year in which both studies were conducted together.

  • In 1995, TIMSS was undertaken for the first time, featuring assessment at five different Grades (3, 4, 7, 8 and the final year of secondary education through the Advanced study). Altogether there were forty-five participating countries.
  • In 1999 TIMSS was repeated at Grade 8 only, with thirty-eight countries participating, twenty-six of them participants in the original 1995 cycle.
  • In 2003 the number of TIMSS participants increased to forty-nine, all but one of which undertook the Grade 8 assessments (though only twenty-six completed the Grade 4 assessments).

TIMSS 2011 lists sixty-three and PIRLS 2011 lists forty-eight participating countries (I have excluded from these figures those countries and parts of countries participating solely for benchmarking purposes.) Altogether though, around 600,000 learners participated in TIMSS and about half as many in PIRLS.

.

Assessment Frameworks and Benchmarks

The separate assessment frameworks for Maths and Science within TIMSS are similarly constructed. Each comprises:

  • A Grade-specific content dimension specifying the subject matter to be assessed eg algebra, physics, geometry, chemistry; and
  • A cognitive dimension, capturing the knowing, applying and reasoning processes that are deployed by the learner.

The assessment framework for reading within PIRLS is slightly different. The focus of the assessment is described as ‘reading literacy’, defined thus:

‘The ability to understand and use those written language forms required by society and/or valued by the individual. Young readers can construct meaning from a variety of texts. They read to learn, to participate in communities of readers in school and everyday life, and for enjoyment.’

Two principal aspects are assessed:

  • Two purposes for reading – for literary experience and to acquire and use information; and
  • Four comprehension processes – focus on and retrieve explicitly stated information; make straightforward inferences; interpret and integrate ideas and information; and examine and evaluate content, language and textual elements.

Both TIMSS and PIRLS use achievement scales ranging from 0 to 1000, though most learners score between 300 and 700. The midpoint of 500 remains constant across different cycles, so trend-related data is relatively reliable.

Four points on this scale are specified as international benchmarks: Advanced at 625, High at 550, Intermediate at 475 and Low at 400. These benchmarks are defined differently for each subject and Grade.
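Because these cut-points are fixed, a scale score can be banded mechanically. A minimal sketch (function name illustrative) mapping a score to the highest benchmark it reaches:

```python
# Sketch: map a TIMSS/PIRLS scale score (0-1000) to its international
# benchmark band, using the cut-points given above (Advanced 625,
# High 550, Intermediate 475, Low 400). The function name and the
# 'Below Low' label are illustrative, not from the official reports.

def benchmark(score: float) -> str:
    """Return the highest international benchmark a scale score reaches."""
    if score >= 625:
        return "Advanced"
    if score >= 550:
        return "High"
    if score >= 475:
        return "Intermediate"
    if score >= 400:
        return "Low"
    return "Below Low"

print(benchmark(640))  # Advanced
print(benchmark(500))  # Intermediate - the midpoint of the scale
```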

The Advanced benchmark definitions are as follows:

  • Maths Grade 4:

‘Students can apply their understanding and knowledge in a variety of relatively complex situations and explain their reasoning. They can solve a variety of multi-step word problems involving whole numbers, including proportions. Students at this level show an increasing understanding of fractions and decimals. Students can apply geometric knowledge of a range of two- and three-dimensional shapes in a variety of situations. They can draw a conclusion from data in a table and justify their conclusion.’

  • Science Grade 4:

‘Students apply knowledge and understanding of scientific processes and relationships and show some knowledge of the process of scientific inquiry. Students communicate their understanding of characteristics and life processes of organisms, reproduction and development, ecosystems and organisms’ interactions with the environment, and factors relating to human health. They demonstrate understanding of properties of light and relationships among physical properties of materials, apply and communicate their understanding of electricity and energy in practical contexts, and demonstrate an understanding of magnetic and gravitational forces and motion. Students communicate their understanding of the solar system and of Earth’s structure, physical characteristics, resources, processes, cycles, and history. They have a beginning ability to interpret results in the context of a simple experiment, reason and draw conclusions from descriptions and diagrams, and evaluate and support an argument.’

  • Reading Grade 4:

‘When reading Literary Texts, students can:

    • Integrate ideas and evidence across a text to appreciate overall themes
    • Interpret story events and character actions to provide reasons, motivations, feelings, and character traits with full text-based support

When reading Informational Texts, students can:

    • Distinguish and interpret complex information from different parts of text, and provide full text-based support
    • Integrate information across a text to provide explanations, interpret significance, and sequence activities
    • Evaluate visual and textual features to explain their function.’
  • Maths Grade 8:

‘Students can reason with information, draw conclusions, make generalizations, and solve linear equations. Students can solve a variety of fraction, proportion, and percent problems and justify their conclusions. Students can express generalisations algebraically and model situations. They can solve a variety of problems involving equations, formulas, and functions. Students can reason with geometric figures to solve problems. Students can reason with data from several sources or unfamiliar representations to solve multi-step problems.’

  • Science Grade 8:

‘Students communicate an understanding of complex and abstract concepts in biology, chemistry, physics, and earth science. Students demonstrate some conceptual knowledge about cells and the characteristics, classification, and life processes of organisms. They communicate an understanding of the complexity of ecosystems and adaptations of organisms, and apply an understanding of life cycles and heredity. Students also communicate an understanding of the structure of matter and physical and chemical properties and changes and apply knowledge of forces, pressure, motion, sound, and light. They reason about electrical circuits and properties of magnets. Students apply knowledge and communicate understanding of the solar system and Earth’s processes, structures, and physical features.  They understand basic features of scientific investigation. They also combine information from several sources to solve problems and draw conclusions, and they provide written explanations to communicate scientific knowledge.’

Taiwanese Learners Tackle TIMSS, courtesy of Toujia Elementary School

.

PISA Frameworks and Benchmarks

PISA is a triennial study of 15-year-olds’ performance (so Grade 9), also in maths, science and reading. A different subject is the main focus in each cycle – in 2009 it was reading. Sixty-five countries took part in PISA 2009.

There is significant overlap with TIMSS/PIRLS participants – some 40 countries involved in TIMSS 2011 also undertook PISA 2009 – but a significant proportion of countries undertake one or the other.

My previous post sets out the definitions of Reading, Mathematical and Scientific Literacy used in the PISA 2009 study and I will not repeat them here.

PISA divides student performance into six proficiency levels. The highest (Level 6) is defined in terms of the tasks which learners successfully perform, or the skills and competences they must display.

It is interesting to compare the emphases in these descriptions with those in the parallel TIMSS/PIRLS definitions above.

  • In reading, Level 6 tasks:

 ‘typically require the reader to make multiple inferences, comparisons and contrasts that are both detailed and precise. They require demonstration of a full and detailed understanding of one or more texts and may involve integrating information from more than one text. Tasks may require the reader to deal with unfamiliar ideas, in the presence of prominent competing information, and to generate abstract categories for interpretations. Reflect and evaluate tasks may require the reader to hypothesise about or critically evaluate a complex text on an unfamiliar topic, taking into account multiple criteria or perspectives, and applying sophisticated understandings from beyond the text. A salient condition for access and retrieve tasks at this level is precision of analysis and fine attention to detail that is inconspicuous in the texts.’

  •  In maths Level 6 learners can:

 ‘conceptualise, generalise, and utilise information based on their investigations and modelling of complex problem situations. They can link different information sources and representations and flexibly translate among them. Students at this level are capable of advanced mathematical thinking and reasoning. These students can apply this insight and understandings along with a mastery of symbolic and formal mathematical operations and relationships to develop new approaches and strategies for attacking novel situations. Students at this level can formulate and precisely communicate their actions and reflections regarding their findings, interpretations, arguments, and the appropriateness of these to the original situations.’

  •  And in science, they can:

 ‘consistently identify, explain and apply scientific knowledge and knowledge about science in a variety of complex life situations. They can link different information sources and explanations and use evidence from those sources to justify decisions. They clearly and consistently demonstrate advanced scientific thinking and reasoning, and they use their scientific understanding in support of solutions to unfamiliar scientific and technological situations. Students at this level can use scientific knowledge and develop arguments in support of recommendations and decisions that centre on personal, social or global situations.’

 This 2011 IPPR Report on Benchmarking the English School System explains the somewhat different approaches of these two assessment suites:

‘PISA puts less emphasis on whether a student can reproduce content, and focuses more on their ability to apply knowledge to solve tasks…

…TIMSS…focuses on curriculum and as a result tends to test pupil’s content knowledge rather than their ability to apply it…

 …PIRLS…assesses…knowledge and content of the curriculum.’

In a recent paper on the PISA 2009 results, Jerrim marks the distinction between PISA and TIMSS in slightly different terms:

‘Whereas TIMSS focuses on children’s ability to meet an internationally agreed curriculum, PISA examines functional ability – how well young people can use the skills in “real life” situations. The format of the test items also varies, including the extent to which they rely on questions that are “multiple choice”. Yet despite these differences, the two surveys summarise children’s achievement in similar ways…

…This results in a measure of children’s achievement that (in both studies) has a mean of 500 and a standard deviation of 100. However, even though the two surveys appear (at face value) to share the same scale, figures are not directly comparable (eg a mean score of 500 in PISA is not the same as a mean score of 500 in TIMSS). This is because the two surveys contain a different pool of countries upon which these achievement scores are based…Hence one is not able to directly compare results in these two surveys (and change over time) by simply looking at the raw scores.’
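Jerrim’s caveat can be illustrated with a toy calculation. The pool means and standard deviations below are hypothetical, not real PISA or TIMSS statistics; the point is only that the same raw score locates a country differently relative to each study’s own pool of participants:

```python
# Illustrative only: hypothetical figures show why a raw score of 530
# in PISA and a raw score of 530 in TIMSS are not the same achievement.
# Each study's scale is normed on its own pool of countries, so a score
# only locates a country relative to that particular pool.

def standardised_position(score: float, pool_mean: float, pool_sd: float) -> float:
    """Express a country's score in SD units relative to its study's pool."""
    return (score - pool_mean) / pool_sd

# Hypothetical observed pool statistics for the two (different) pools.
pisa_position = standardised_position(530, pool_mean=465, pool_sd=55)
timss_position = standardised_position(530, pool_mean=490, pool_sd=70)

print(round(pisa_position, 2))   # 1.18 - well above this pool's mean
print(round(timss_position, 2))  # 0.57 - more modestly above this one
```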

With these similarities and distinctions in mind, let us turn to analysis of the data.

.

High Performance At Advanced Benchmarks in TIMSS and PIRLS 2011

Table One below shows the top ten countries in each of the five TIMSS and PIRLS assessments at the Advanced benchmark of 625: Maths Grade 4, Science Grade 4, Reading Grade 4, Maths Grade 8 and Science Grade 8. I have also included some countries of interest that fell outside one or more of the ‘top tens’.

 .

| Rank | Maths 4 | % | Science 4 | % | Reading 4 | % | Maths 8 | % | Science 8 | % |
|------|---------|---|-----------|---|-----------|---|---------|---|-----------|---|
| 1 | Singapore | 43 | Singapore | 33 | Singapore | 24 | Taiwan | 49 | Singapore | 40 |
| 2 | Korea | 39 | Korea | 29 | Russia | 19 | Singapore | 48 | Taiwan | 24 |
| 3 | Hong Kong | 37 | Finland | 20 | N Ireland | 19 | Korea | 47 | Korea | 20 |
| 4 | Taiwan | 34 | Russia | 16 | Finland | 18 | Hong Kong | 34 | Japan | 18 |
| 5 | Japan | 30 | Taiwan | 15 | England | 18 | Japan | 27 | Russia | 14 |
| 6 | N Ireland | 24 | US | 15 | Hong Kong | 18 | Russia | 14 | England | 14 |
| 7 | England | 18 | Japan | 14 | US | 17 | Israel | 12 | Slovenia | 13 |
| 8 | Russia | 13 | Hungary | 13 | Ireland | 16 | Australia | 9 | Finland | 13 |
| 9 | US | 13 | Romania | 11 | Israel | 15 | England | 8 | Israel | 11 |
| 10 | Finland | 12 | England | 11 | N Zealand | 14 | Hungary | 8 | Australia | 11 |
| – | | | Hong Kong | 9 | | | | | Hong Kong | 9 |
| – | | | | | Taiwan | 13 | | | | |
| – | Australia | 10 | Australia | 7 | Australia | 10 | | | | |
| – | Ireland | 9 | Ireland | 7 | | | | | | |
| – | | | | | | | US | 7 | US | 10 |
| – | N Zealand | 4 | N Zealand | 5 | | | N Zealand | 5 | N Zealand | 9 |
| – | | | | | | | Finland | 4 | | |
| – | | | N Ireland | 5 | | | | | | |
| Median | | 4 | | 5 | | 8 | | 3 | | 4 |

Table One: Top Ten Countries at Advanced Benchmarks, TIMSS and PIRLS 2011

 .

Several important points can be drawn from this initial analysis.

  • Singapore is by some margin the most successful country in terms of the percentage of its pupils achieving the Advanced benchmark. It tops the rankings in all but Maths Grade 8, where it is a close second to Taiwan. In the remaining assessments it has a 4 or 5 percentage point lead over its nearest rival, except in Science Grade 8, where it has an astonishing lead of 16 percentage points.
  • But the proportion of Singaporean learners achieving the Advanced benchmark varies significantly, from just under a quarter in Reading to just under half in Maths Grade 8. Singapore is much closer to the PIRLS median (+16%) in Reading so, arguably, that is a relative weakness at this level.
  • Other outstanding performers include: Korea and Japan (apart from Reading, which they did not undertake); Hong Kong (apart from Science at Grades 4 and 8, where it was outside the top 10); Taiwan (though it was outside the top 10 for Reading); Finland (though its Maths Grade 8 performance was comparatively weak), Russia and England.
  • The top-ranked countries in TIMSS – Singapore, Korea, Hong Kong, Taiwan, Japan – typically secure a significantly higher proportion of Advanced level achievers in Maths than in Science. The reverse is broadly true in a second group of countries including Finland, the US, Russia and New Zealand. England and Australia are significantly atypical, in that Maths leads the way at Grade 4 while Science is in the ascendant at Grade 8.
  • When PIRLS is factored in, it is clear that a group of countries including Finland, Russia, the US, New Zealand and Israel secure larger proportions at the Advanced benchmark in Reading than in both Maths and Science. The same is almost true of England, though the percentages are equal for Reading and Maths at Grade 4. Unsurprisingly, the outstanding Asian TIMSS performers tend to achieve a significantly lower level in Reading. The relative reading difficulty of native languages is bound to have an impact here.
  • Interestingly, England outscored or equalled Finland on all but one assessment (Science Grade 4). It exceeded the median comfortably on all five assessments: Maths Grade 4 (+14%); Science Grade 4 (+6%), Reading (+10%); Maths Grade 8 (+5%); and Science Grade 8 (+10%). (It was however outscored by Northern Ireland on Maths Grade 4 and Reading.)
  • On the basis of these differentials, Science Grade 4 and Maths Grade 8 are England’s areas of relative weakness amongst high achievers though, if the analysis is undertaken on the basis of the gap between England and the world leader for each assessment, the incontrovertible priority is Maths Grade 8, where there is a 41 percentage point chasm between England and Taiwan.

. .

Trends Over Time in Performance Against TIMSS and PIRLS Advanced Benchmarks

Tables 2A to 2E below show how the percentage achieving the Advanced benchmark has changed over time in each country within the top 10 in each assessment in 2011 (excluding those for which there is insufficient data).

Where the percentage has declined between cycles of the assessment, the figure is emboldened. Each table also shows for each country the percentage change between the first assessment and that undertaken in 2011.

 .

Country 1995 2003 2007 2011 Improvement Since 1995
Singapore 38 38 41 43 +5
Korea 25 39 +14
Hong Kong 17 22 40 37 +20
Taiwan 16 24 34 +18
Japan 22 21 23 30 +8
England 7 14 16 18 +11
Russia 11 16 13 +5
US 9 7 10 13 +4
Lithuania 10 10 10 0
Belgium (Flemish) 10 10 0

Table 2A: TIMSS Maths Grade 4 – Trend in Percentage Achieving Advanced Benchmark

.

Country 1995 2003 2007 2011 Improvement Since 1995
Singapore 14 25 36 33 +19
Korea 22 29 +7
Russia 11 16 16 +5
Taiwan 14 19 15 +1
US 19 13 15 15 -4
Japan 15 12 12 14 -1
Hungary 7 10 13 13 +6
England 15 15 14 11 -4
Sweden 8 10 +2
Czech Republic 12 7 10 -2

Table 2B: TIMSS Science Grade 4 – Trend in Percentage Achieving Advanced Benchmark

.

Country 2001 2006 2011 Improvement Since 2001
Singapore 12 19 24 +12
Russia 5 19 19 +14
England 20 15 18 -2
Hong Kong 5 15 18 +13
US 15 12 17 +2
New Zealand 14 13 14 0
Taiwan 7 13 +6
Denmark 11 12 +1
Hungary 10 14 12 +2
Bulgaria 17 16 11 -6

Table 2C: PIRLS Reading – Trend in Percentage Achieving Advanced Benchmark

.

Country 1995 1999 2003 2007 2011 Improvement Since 1995
Taiwan 37 38 45 49 +12
Singapore 40 42 44 40 48 +8
Korea 31 32 35 40 47 +16
Hong Kong 23 28 31 31 34 +11
Japan 29 29 24 26 27 -2
Russia 9 12 6 8 14 +5
Australia 7 7 6 9 +2
England 6 6 5 8 8 +2
Hungary 10 13 11 10 8 -2
US 4 7 7 6 7 +3

Table 2D: TIMSS Maths Grade 8 – Trend in Percentage Achieving Advanced Benchmark

 .

Country 1995 1999 2003 2007 2011 Improvement Since 1995
Singapore 29 29 33 32 40 +11
Taiwan 27 26 25 24 -3
Korea 17 19 17 17 20 +3
Japan 18 16 15 17 18 0
Russia 11 15 6 11 14 +3
England 15 17 15 17 14 -1
Slovenia 8 6 11 13 +5
Australia 10 9 8 11 +1
US 11 12 11 10 10 -1
Hong Kong 7 7 13 10 9 +2

Table 2E: TIMSS Science Grade 8 – Trend in Percentage Achieving Advanced Benchmark

.
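The ‘Improvement’ columns in Tables 2A to 2E are simply the 2011 percentage minus the percentage in the first cycle for which a country has data. A quick sketch, using a few rows transcribed from Table 2A (Maths Grade 4), recomputes those figures and flags mid-series declines of the kind discussed below:

```python
# Sketch: recompute the 'Improvement' column of Table 2A as last minus
# first available percentage, and flag any decline between cycles.
# Data transcribed from Table 2A above; missing cycles are omitted.

series = {
    "Singapore": [38, 38, 41, 43],
    "Korea": [25, 39],
    "Hong Kong": [17, 22, 40, 37],
    "England": [7, 14, 16, 18],
}

for country, pcts in series.items():
    change = pcts[-1] - pcts[0]
    declined = any(b < a for a, b in zip(pcts, pcts[1:]))
    flag = "  (declined between cycles)" if declined else ""
    print(f"{country}: {change:+d}{flag}")
```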

This trend-based data casts the performance of several leading countries in a different light.

  • Though Singapore has managed impressive double-digit improvements in four of the five assessments, its improvement in Grade 4 Maths is far less spectacular, at just 5 percentage points. Moreover, Singapore’s performance actually declined in both Grade 8 Maths and Science in 2007, though it reversed that trend in 2011 (and quite spectacularly so in Science).
  • The rate of improvement in some other countries has exceeded that of Singapore. At Grade 4 in Maths, Hong Kong, Taiwan, Korea, England and Japan are all improving at a significantly faster rate. The same is true of Korea, Taiwan and Hong Kong at Grade 8. Singapore has comfortably the fastest rate of improvement in Grade 4 and Grade 8 Science. In Reading though, Russia and Hong Kong outscore Singapore on this metric.
  • There have also been some significant declines in performance over the period that these assessments have been conducted. Both England and the United States have suffered a decline of four percentage points in Grade 4 Science, while Taiwan’s Grade 8 Science result has fallen by three percentage points and Bulgaria’s Reading score by six percentage points.
  • Within TIMSS, most of the leading countries – including Korea, Hong Kong, Taiwan, England, Russia and the US – have improved significantly more on Maths than they have on Science. However, the reverse is true in Singapore (perhaps suggesting that Singapore science is a potentially stronger export than Singapore maths). Japan is also atypical in that there has been an improvement in Maths at Grade 4 but in all other assessments there has been no improvement or a slight decline.
  • Where countries have achieved improvements within TIMSS assessments, these are typically stronger at Grade 4 than Grade 8, though the reverse is true in Maths in Singapore and Korea, while both Russia and the US present a more balanced scorecard in this respect.
  • When PIRLS is factored in, one notices that improvements in Reading tend to be less strong than in each country’s fastest improving TIMSS subject but stronger than in its slower improving TIMSS subject. Russia is the obvious outlier, with outstanding improvement in Reading relative to both Maths and Science. In England the decline in Reading is similar to that in Science.
  • Considered from this perspective, Singapore should be prioritising Grade 4 Maths, while Korea and Hong Kong should concentrate on Grade 8 Science. The US must look at Grade 4 Science and, to a lesser extent, Grade 8 Science. England’s priorities would also be Grade 4 and Grade 8 Science plus Reading. Maths is strong at Grade 4, though relatively less so at Grade 8.

.

How Singapore Summarised the outcomes of TIMSS/PIRLS 2011

 .

Overall Rankings Compared With Rankings for Achievement of Advanced and Low Benchmarks

The next set of Tables examines how countries’ rankings differ for the overall assessment (based on the average score achieved by learners from that country), the percentage achieving the highest ‘Advanced’ benchmark and the percentage achieving the lowest ‘Low’ benchmark.

This provides an indicator of whether each country’s highest achievers are outperforming the average achievers in comparative terms – and to what extent (if at all) the lowest achievers are lagging behind.

To make this manageable I have again confined the analysis to the top ten countries in each assessment against the ‘Advanced’ Benchmark.

 .

Country  Advanced rank  Overall rank  Low rank
Singapore 1 1 2=
Korea 2 2 1
Hong Kong 3 3 2=
Taiwan 4 4 2=
Japan 5 5 2=
N Ireland 6 6 13=
England 7 9 19=
Russia 8 10 9=
US 9 11 13=
Finland 10 8 8

Table 3A: TIMSS Grade 4 Maths – Comparison of Rank for Achievement of Advanced Benchmark, Overall and for Achievement of Low Benchmark
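The ‘ranking gaps’ discussed in the commentary can be computed mechanically from tables like 3A. As a rough sketch (figures transcribed from Table 3A above; tied rankings such as ‘2=’ are simplified to their numeric value, which is an approximation of mine):

```python
# Table 3A (TIMSS Grade 4 Maths): rank at the Advanced benchmark,
# overall rank, and rank at the Low benchmark.
# Tied ranks ("2=") are represented by their bare numeric value.
ranks = {
    # country: (advanced, overall, low)
    "Singapore": (1, 1, 2),
    "Korea": (2, 2, 1),
    "Hong Kong": (3, 3, 2),
    "Taiwan": (4, 4, 2),
    "Japan": (5, 5, 2),
    "N Ireland": (6, 6, 13),
    "England": (7, 9, 19),
    "Russia": (8, 10, 9),
    "US": (9, 11, 13),
    "Finland": (10, 8, 8),
}

for country, (adv, overall, low) in ranks.items():
    # A positive tail gap means the Low-benchmark rank lags well behind the
    # Advanced-benchmark rank: a relatively 'long tail' of low achievement.
    tail_gap = low - adv
    # A positive top-end gap means the highest achievers rank better than
    # the country does overall; negative suggests the reverse (Finland).
    top_gap = overall - adv
    print(f"{country:10s} tail gap {tail_gap:+d}, top-end gap {top_gap:+d}")
```

On these figures England shows the longest tail of the top ten (gap of +12), while Finland is the only country whose Advanced-benchmark rank is worse than its overall rank.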

.

Country  Advanced rank  Overall rank  Low rank
Singapore 1 2 6=
Korea 2 1 1=
Finland 3 3 1=
Russia 4 5 5=
Taiwan 5 6 6=
US 6 7 9=
Japan 7 4 1=
Hungary 8 10 22=
Romania 9 28 34=
England 10 15 22=

Table 3B: TIMSS Grade 4 Science – Comparison of Rank for Achievement of Advanced Benchmark, Overall and for Achievement of Low Benchmark

.

Country  Advanced rank  Overall rank  Low rank
Singapore 1 4 15=
Russia 2 2 2=
N Ireland 3 5 15=
Finland 4 3 2=
England 5 11 21=
Hong Kong 6 1 2=
US 7 6 7=
Ireland 8 10 15=
Israel 9 18 29=
N Zealand 10 23 32

Table 3C: PIRLS Reading – Comparison of Rank for Achievement of Advanced Benchmark, Overall and for Achievement of Low Benchmark

.

Country  Advanced rank  Overall rank  Low rank
Taiwan 1 3 5=
Singapore 2 2 1=
Korea 3 1 1=
Hong Kong 4 4 3=
Japan 5 5 3=
Russia 6 6 7
Israel 7 7 15=
Australia 8 12 11=
England 9 10 13=
Hungary 10 11 13=

Table 3D: TIMSS Grade 8 Maths – Comparison of Rank for Achievement of Advanced Benchmark, Overall and for Achievement of Low Benchmark

.

Country  Advanced rank  Overall rank  Low rank
Singapore 1 1 4=
Taiwan 2 2 4=
Korea 3 3 2=
Japan 4 4 2=
Russia 5 7 4=
England 6 9 9=
Slovenia 7 6 4=
Finland 8 5 1
Israel 9 12 19=
Australia 10 13 11=

Table 3E: TIMSS Grade 8 Science – Comparison of Rank for Achievement of Advanced Benchmark, Overall and for Achievement of Low Benchmark

.

These Tables show that, particularly at the top end of the distribution, there is a very close correlation between ranking on the basis of average score and on the basis of the proportion achieving the Advanced benchmark.

There is also a fairly close correlation with the proportion achieving the Low benchmark, but this is not quite so pronounced and there are some outliers with relatively ‘long tails’ of low achievement.

  • In Maths at Grade 4 the top five countries get very high percentages of pupils past the Low Benchmark, but the next five are relatively less successful and, of these, England is least successful. It has a relatively ‘long tail’, while its highest achievers do comparatively better than the overall measure. The latter is also true of Russia and the United States, but the reverse is the case in Finland. This is arguably evidence that England, Russia and the US should prioritise the lower end of the distribution while Finland should pay more attention to the top end.
  • In Maths at Grade 8 the pattern is broadly similar though, with the exception of Israel, the ‘long tail’ for the countries just below the top rank is not quite so pronounced. This might suggest that earlier efforts to bring younger low achievers up to a higher standard – and to narrow national achievement gaps – have been at least partly successful.
  • In Science at Grade 4 these variations are once again more substantial, while tending to narrow at Grade 8, so giving a similar pattern. Singapore’s rankings suggest relatively greater priority is required at the lower end of the achievement distribution. Romania is clearly the worst in this respect, though England and Hungary are not too far behind. The ‘ranking gap’ in England is broadly similar for Maths and Science at Grade 4 and at Grade 8 respectively.
  • In Science at Grade 8, Israel again has the longest tail, comparable with the situation in Maths Grade 8. Finland is again remarkable for bucking the general trend, suggesting perhaps that it is too much focused on lifting everyone up to a relatively high standard and too little focused on stretching those at the top.
  • In Reading there is relatively more volatility throughout the table, at the top as much as the bottom of the top ten. Russia, Finland and the United States have relatively ‘flat profiles’, while Hong Kong assumes the ‘reverse profile’ more typically associated with Finland in respect of Maths and Science. Several countries have a pronounced tail, including Singapore, Northern Ireland, England, Ireland, Israel and New Zealand. The latter two have the biggest issue in this respect. There is clearly an issue here for Singapore to address.

 .

Broad Comparisons Between TIMSS and PIRLS 2011 and PISA 2009

Finally in this data analysis section, it is worthwhile to compare the top-ranking countries in terms of the proportions achieving the most demanding benchmarks, to identify broad similarities and differences.

Of course the results are not strictly comparable because the assessments are substantively different, the assessed learners are older on PISA, and the cohort of countries competing with each other is not the same.

Nevertheless, the exercise is instructive.

For the purpose of the comparison I have used the Grade 8 Maths and Science assessments (because the learners taking them are almost the same age as those undertaking PISA), but I have also included PIRLS, as the only comparison available for reading.

On this occasion, however, I have included the top 20 ranked countries in each assessment.

Rank  TIMSS Maths G8  PISA Maths  TIMSS Science G8  PISA Science  PIRLS Reading  PISA Reading
1 Taiwan Shanghai Singapore Singapore Singapore NZ
2 Singapore Singapore Taiwan Shanghai Russia Singapore
3 Korea Taiwan Korea NZ N Ireland Shanghai
4 HK HK Japan Finland Finland Australia
5 Japan Korea Russia Australia England Japan
6 Russia Switzerland England Japan HK Canada
7 Israel Japan Slovenia HK US Finland
8 Australia Belgium Finland UK Ireland US
9 England NZ Israel Germany Israel Sweden
10 Hungary Liechtenstein Australia Canada NZ HK
11 Turkey Finland US Netherlands Canada Belgium
12 US Germany HK Switzerland Taiwan France
13 Romania Australia NZ Estonia Denmark Korea
14 Lithuania Canada Hungary US Hungary Iceland
15 NZ Netherlands Turkey Czech Rep Bulgaria Israel
16 Ukraine Macao Sweden Ireland Croatia UK
17 Slovenia Slovenia Lithuania Belgium Australia Norway
18 Finland Slovakia Ukraine Korea Italy Ireland
19 Italy France Iran Austria Germany Poland
20 Armenia Czech Rep UAE Sweden Portugal Switzerland

 

Table 4: Top 20 Rankings for Highest Benchmark in TIMSS, PIRLS and PISA

.

In PISA results are reported for the UK as a whole, but the figures for Level 6 achievement in England are almost identical (only in maths is there a noticeable difference, with England’s result 0.1% lower than that reported for the UK).

England is ranked 29th on PISA Maths, the only column in the table in which neither England nor the UK appears.

The rankings show that a handful of the ‘usual suspects’ are highly placed on both TIMSS/PIRLS and PISA. Singapore is ubiquitous.

Some countries perform relatively better on the PISA side of the equation – New Zealand is an obvious example – while England is a comparatively better performer on TIMSS/PIRLS, as is the United States.

It is interesting to hypothesise whether these differences reflect different strengths in national education systems. Other things being equal, do those countries performing best on PISA pay relatively more attention to their high achievers’ problem-solving and application of content knowledge? Do those performing better on TIMSS/PIRLS emphasise content knowledge above ‘real life’ problem-solving?

Perhaps high-achieving learners in countries more successful in PISA are simply more familiar with assessment instruments that feature such problem-solving. Or perhaps much of the difference is explainable by more mundane variations in the assessment process. There are likely to be several different factors in play.

The countries that appear most frequently on these lists are amongst the global leaders in educating high-achieving learners. Whether there is a significant correlation with the scope and efficacy of their gifted education programmes is less certain.

We know from previous posts on this Blog that Singapore, Korea and Hong Kong have some of the best developed gifted education programmes in the world. Israel also falls into this category, as did England in the period up to 2011.

It would be a reasonable hypothesis that their investment at the top end of the ability range is having a positive effect in terms of educational outcomes as measured by these assessments, but I am not aware of any research that attempts to establish such causality.

And it is important to note that the percentages achieving the highest benchmarks in PISA/TIMSS and PIRLS vastly exceed the proportions admitted into leading countries’ gifted education programmes whereas, in England, the proportion achieving the highest benchmarks is significantly lower than the percentage in the former national gifted education programme.

Assessment  Leading country  % at highest benchmark in leading country  % at highest benchmark in England  % at highest benchmark, average for assessment*
TIMSS Maths G4 Singapore 43 18 4
TIMSS Maths G8 Taiwan 49 8 5
PISA Maths Shanghai 26.6 1.7 3.1
TIMSS Science G4 Singapore 33 11 3
TIMSS Science G8 Singapore 40 14 4
PISA Science Singapore 3.9 1.8 1.1
PIRLS Reading Singapore 24 18 8
PISA Reading NZ 2.9 1.0 0.8

 *Averages for PISA are OECD countries only

Table 5: Percentage Achieving Highest Benchmark in TIMSS, PIRLS and PISA – Comparison of Leading Country and England

.

Table 5 shows that the gaps between England and the leading country can be highly variable between assessments.

  • In TIMSS Grade 4 Maths, Singapore achieves more than twice as many as England at the highest benchmark but, at Grade 8, Taiwan manages over six times as many.
  • In PISA Maths the difference between Shanghai and England is enormous – over 15 times as many Shanghai learners achieve the benchmark.
  • In TIMSS Grade 4 Science, Singapore has exactly three times as many at the highest benchmark while, at Grade 8, it has slightly less than that.
  • In PISA Science, slightly more than twice as many Singaporean learners achieve the highest benchmark.
  • In PIRLS Reading the difference is much smaller, with Singapore only a third ahead (24% against 18%), but
  • In PISA Reading, the gap between England and New Zealand is once again close to a multiple of three.
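These multiples follow directly from Table 5. A minimal sketch (figures transcribed from the table; the ratios are my own arithmetic, rounded to one decimal place):

```python
# Percentages achieving the highest benchmark, from Table 5:
# assessment: (leading country's %, England's %)
benchmarks = {
    "TIMSS Maths G4": (43, 18),
    "TIMSS Maths G8": (49, 8),
    "PISA Maths": (26.6, 1.7),
    "TIMSS Science G4": (33, 11),
    "TIMSS Science G8": (40, 14),
    "PISA Science": (3.9, 1.8),
    "PIRLS Reading": (24, 18),
    "PISA Reading": (2.9, 1.0),
}

for assessment, (leader, england) in benchmarks.items():
    # How many times England's proportion the leading country achieves.
    multiple = leader / england
    print(f"{assessment:18s} leader is {multiple:.1f}x England")
```

The outliers stand out immediately: PISA Maths (a multiple above 15) and TIMSS Grade 8 Maths (above 6) against a typical multiple of two to three elsewhere, with PIRLS Reading the narrowest gap of all.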

So, while the majority of assessments show the international leader having a two- or threefold greater proportion achieving the highest benchmark, there are three conspicuous outliers: TIMSS Grade 8 Maths and especially PISA Maths (where England performs significantly worse); and PIRLS Reading (where England scores significantly better).

At the same time though, England is significantly ahead of the average for each assessment, with the sole exception of PISA maths.

This is not quite the overwhelmingly negative picture painted in the Sutton Trust’s Educating the Highly Able (which I analysed at length in a previous post).

While there is a significant gap between England and the world’s leaders on all these assessments, its performance is comparatively respectable in all but PISA Maths/TIMSS Grade 8 Maths. This suggests a particular problem with secondary maths for the highest achievers in England.

.

Various PISA products courtesy of OECD

Various PISA products courtesy of OECD

 .

Domestic Analysis of England’s Performance in TIMSS and PIRLS 2011

NFER has published extensive analyses of England’s performance in TIMSS and PIRLS respectively. The analysis shows that:

  • In all four TIMSS assessments, the attainment difference between the highest and lowest performing learners was just short of 300 TIMSS scale points.
  • The best-performing countries typically have similar or smaller ranges of attainment, though there were exceptions (Taiwan for Grade 8 maths and Singapore for Grade 4 and Grade 8 Science). The variation tends to be greater for those below average than for those above.
  • Whereas at Grade 4 in Maths England’s performance can be seen as at the low end of the highest performing countries, at Grade 8 it ‘has more in common with the performance of the majority of countries than with the highest performing countries’.
  • At Grade 4 in Science ‘England is in a group of countries with relatively low proportions of pupils at the advanced benchmark’ and, despite the good showing in the rankings, the profile at Grade 8 ‘differs from those of the highest scoring countries’.
  • In the PIRLS Reading assessment ‘the most able readers [in England] were among the best readers in the survey’. They reached levels similar to Singapore’s high achievers and ‘higher than the most able readers in the three top performing countries (Hong Kong, the Russian Federation and Finland)’.

The TES ran a story in which Andreas Schleicher of the OECD – the man responsible for PISA – took an idiosyncratic position, arguing that good results in TIMSS and PIRLS would actually be bad news, because:

‘Pisa – which suggests a recent decline in England’s international standing – tests children at an older age than Timss and Pirls. Mr Schleicher claimed that a good performance from England in the latter two tests, after its fall from grace in Pisa, would therefore suggest that the performance of pupils is actually deteriorating as they progress through school.

“If you put the three surveys together – I don’t think you can strictly compare them, but if you sort of use them as approximations – in my view it makes the picture a lot more worrying,” he said. “Because the message you get is that the earlier the year in school that you test kids in the UK, the better the performance internationally.

“In other words, parents and society do a great job in children getting to school but then year after year the schools system adds less value than we see across (other) countries.”…

…”It is probably true that the UK system is actually quite good in primary education, in the early years, but then afterwards it peters out – you can see the high dropout, you can see the 14-18 problem and so on,” Mr Schleicher said. “If you look at the three surveys together you don’t get a very encouraging picture. It is a more worrying picture than if you look at them one by one.”’

This statement rather ignores the fact that only a single year separates PISA participants from those undertaking the TIMSS 8th Grade studies. From the evidence above, it is not consistently borne out by performance at the highest benchmarks, especially in Science.

There are likely to be several different factors responsible for England’s relatively better performance on TIMSS/PIRLS (including in the 8th Grade assessments).

Many have been identified through research studies, the majority of them associated with technical differences in the nature of the assessments. There will also be factors associated with the systems being assessed, but I have seen no substantive evidence to back up Schleicher’s claim.

On 11 December 2012, Education Minister Elizabeth Truss gave a speech about the evidence from TIMSS and PIRLS. Towards the beginning, she advances the oft-repeated truism (not entirely borne out by the evidence above) that:

‘In the past, and still today, this country has excelled at educating a small minority of its children to the very highest level.’

In fact, the minority is relatively large compared with most other countries.

Strangely, although the speech concentrates on the raft of reforms being introduced to improve performance in reading, maths and science, there is no reference at all to those which specifically benefit the highest achievers: the introduction of Level 6 assessment at Key Stage 2 and the development of a cadre of selective specialist 16-19 maths and science free schools.

The timing of these assessments was problematic for a Government elected to power in 2010. This BBC story includes a grudging reaction to the mixed bag of results from a Government spokesman:

‘These tests reflect progress between 2006 and 2011 and were taken only a year after the election.

So to the limited extent the results reflect the effect of political leadership, Labour deserves the praise for the small improvement in reading and the blame for the stagnation in maths and the decline in science. The tests say nothing, good or bad, about what we have done.’

Meanwhile the Opposition spokesman says:

‘These results show schools in England are some of the best in Europe – thanks to the hard work of teachers and pupils. The Labour government’s reforms saw reading results improve thanks to better teaching, smaller class sizes and Labour’s National Literacy Strategy.

However, we need to understand why East Asian countries outperform us in key skills – particularly science and maths.’

.

Summing Up

This post aims to exemplify how careful analysis of performance against the highest benchmarks in TIMSS, PIRLS and PISA can offer broad indicators of the comparative strengths and weaknesses of education systems as far as their high achievers are concerned.

It acknowledges the significant weaknesses of an evidence base derived entirely from international benchmarking studies, although it does not directly address the problems associated with such studies, which tend to call the findings into question.

It does not draw out the implications for each country – readers can do that for themselves – but I hope it does reveal that even the most celebrated international examples cannot afford to rest on their laurels. To take just three national examples:

  • Singapore tops almost every assessment but it performs less well on PIRLS Reading than on the four TIMSS studies. Other countries are improving their Reading performance at the Advanced benchmark at a much faster rate, while there has also been limited improvement over time in Maths, especially at Grade 4. Perhaps Singapore is beginning to approach a maths ‘ceiling’, preventing the proportion of high achievers from being much further improved. In both Reading and Science there is evidence to suggest that the lower end of the achievement distribution requires somewhat greater attention.
  • Despite its stellar performance in PISA 2009 and strong showing in the overall TIMSS/PIRLS rankings, Finland is not amongst the world leaders in maximising the proportion of high achievers in these studies. It outperformed England only on Science at Grade 4, probably England’s main area of weakness. While Finland may have made strong progress in eradicating ‘long tails of low achievement’, there is evidence here to suggest that it is falling behind at the top end.
  • England’s outperformance of Finland – so often held up as the model for us to emulate – deserves to be more widely known and celebrated. The situation is nowhere near as bad as the Sutton Trust’s recent report on the Highly Able might suggest. But there is no room for complacency. There are still big gaps to make up in Maths at Grade 4 and in Science at Grade 8. The trend over time is disappointing in Science at Grades 4 and 8 and also in Reading. While attention is clearly needed to shorten ‘long tails’ in Reading, Maths and Science (especially at Grade 4), this must not be at the expense of the high achievers, or England risks falling into the Finnish trap.

.

GP

January 2013