PISA 2012 Creative Problem Solving: International Comparison of High Achievers’ Performance

This post compares the performance of high achievers from selected jurisdictions on the PISA 2012 creative problem solving test.

It draws principally on the material in the OECD Report ‘PISA 2012 Results: Creative Problem Solving’ published on 1 April 2014.

Pisa ps cover Capture

The sample of jurisdictions includes England, other English-speaking countries (Australia, Canada, Ireland and the USA) and those that typically top the PISA rankings (Finland, Hong Kong, South Korea, Shanghai, Singapore and Taiwan).

With the exception of New Zealand, which did not take part in the problem solving assessment, this is deliberately identical to the sample I selected for a parallel post reviewing comparable results in the PISA 2012 assessments of reading, mathematics and science: ‘PISA 2012: International Comparisons of High Achievers’ Performance’ (December 2013).

These eleven jurisdictions account for nine of the top twelve performers ranked by mean overall performance in the problem solving assessment. (The USA and Ireland lie outside the top twelve, while Japan, Macao and Estonia are the three jurisdictions that are in the top twelve but outside my sample.)

The post is divided into seven sections:

  • Background to the problem solving assessment: How PISA defines problem solving competence; how it defines performance at each of the six levels of proficiency; how it defines high achievement; the nature of the assessment and who undertook it.
  • Average performance, the performance of high achievers and the performance of low achievers (proficiency level 1) on the problem solving assessment. This comparison includes my own sample and all the other jurisdictions that score above the OECD average on the first of these measures.
  • Gender and socio-economic differences amongst high achievers on the problem solving assessment in my sample of eleven jurisdictions.
  • The relative strengths and weaknesses of jurisdictions in this sample on different aspects of the problem solving assessment. (This treatment is generic rather than specific to high achievers.)
  • What proportion of high achievers on the problem-solving assessment in my sample of jurisdictions are also high achievers in reading, maths and science respectively.
  • What proportion of students in my sample of jurisdictions achieves highly in one or more of the four PISA 2012 assessments – and against the ‘all-rounder’ measure, which is based on high achievement in all of reading, maths and science (but not problem solving).
  • Implications for education policy makers seeking to improve problem solving performance in each of the sample jurisdictions.

Background to the Problem Solving Assessment


Definition of problem solving

PISA’s definition of problem-solving competence is:

‘…an individual’s capacity to engage in cognitive processing to understand and resolve problem situations where a method of solution is not immediately obvious. It includes the willingness to engage with such situations in order to achieve one’s potential as a constructive and reflective citizen.’

The commentary on this definition points out that:

  • Problem solving requires identification of the problem(s) to be solved, planning and applying a solution, and monitoring and evaluating progress.
  • A problem is ‘a situation in which the goal cannot be achieved by merely applying learned procedures’, so the problems encountered must be non-routine for 15 year-olds, although ‘knowledge of general strategies’ may be useful in solving them.
  • Motivational and affective factors are also in play.

The Report is rather coy about the role of creativity in problem solving, and hence the justification for the inclusion of this term in its title.

Perhaps the nearest it gets to an exposition is when commenting on the implications of its findings:

‘In some countries and economies, such as Finland, Shanghai-China and Sweden, students master the skills needed to solve static, analytical problems similar to those that textbooks and exam sheets typically contain as well or better than 15-year-olds, on average, across OECD countries. But the same 15-year-olds are less successful when not all information that is needed to solve the problem is disclosed, and the information provided must be completed by interacting with the problem situation. A specific difficulty with items that require students to be open to novelty, tolerate doubt and uncertainty, and dare to use intuitions (“hunches and feelings”) to initiate a solution suggests that opportunities to develop and exercise these traits, which are related to curiosity, perseverance and creativity, need to be prioritised.’


Assessment framework

PISA’s framework for assessing problem solving competence is set out in the following diagram.

 

PISA problem solving framework Capture

 

In solving a particular problem it may not be necessary to apply all these steps, or to apply them in this order.

Proficiency levels

The proficiency scale was designed to have a mean score across OECD countries of 500. The six levels of proficiency applied in the assessment each have their own profile.

The lowest, level 1 proficiency is described thus:

‘At Level 1, students can explore a problem scenario only in a limited way, but tend to do so only when they have encountered very similar situations before. Based on their observations of familiar scenarios, these students are able only to partially describe the behaviour of a simple, everyday device. In general, students at Level 1 can solve straightforward problems provided there is a simple condition to be satisfied and there are only one or two steps to be performed to reach the goal. Level 1 students tend not to be able to plan ahead or set sub-goals.’

This level equates to a range of scores from 358 to 423. Across the OECD sample, 91.8% of participants are able to perform tasks at this level.

By comparison, level 5 proficiency is described in this manner:

‘At Level 5, students can systematically explore a complex problem scenario to gain an understanding of how relevant information is structured. When faced with unfamiliar, moderately complex devices, such as vending machines or home appliances, they respond quickly to feedback in order to control the device. In order to reach a solution, Level 5 problem solvers think ahead to find the best strategy that addresses all the given constraints. They can immediately adjust their plans or backtrack when they detect unexpected difficulties or when they make mistakes that take them off course.’

The associated range of scores is from 618 to 683 and 11.4% of all OECD students achieve at this level.

Finally, level 6 proficiency is described in this way:

‘At Level 6, students can develop complete, coherent mental models of diverse problem scenarios, enabling them to solve complex problems efficiently. They can explore a scenario in a highly strategic manner to understand all information pertaining to the problem. The information may be presented in different formats, requiring interpretation and integration of related parts. When confronted with very complex devices, such as home appliances that work in an unusual or unexpected manner, they quickly learn how to control the devices to achieve a goal in an optimal way. Level 6 problem solvers can set up general hypotheses about a system and thoroughly test them. They can follow a premise through to a logical conclusion or recognise when there is not enough information available to reach one. In order to reach a solution, these highly proficient problem solvers can create complex, flexible, multi-step plans that they continually monitor during execution. Where necessary, they modify their strategies, taking all constraints into account, both explicit and implicit.’

The range of level 6 scores is from 683 points upwards and 2.5% of all OECD participants score at this level.

PISA defines high achieving students as those securing proficiency level 5 or higher, so proficiency levels 5 and 6 together. The bulk of the analysis it supplies relates to this cohort, while relatively little attention is paid to the more exclusive group achieving proficiency level 6, even though almost 10% of students in Singapore reach this standard in problem solving.


The sample

Sixty-five jurisdictions took part in PISA 2012, including all 34 OECD countries and 31 partners. But only 44 jurisdictions took part in the problem solving assessment, including 28 OECD countries and 16 partners. As noted above, that included all my original sample of twelve jurisdictions, with the exception of New Zealand.

I could find no stated reason why New Zealand chose not to take part. Press reports initially suggested that England would do likewise, but it was subsequently reported that this decision had been reversed.

The assessment was computer-based and comprised 16 units divided into 42 items. The units were organised into four clusters, each designed to take 20 minutes to complete. Participants completed one or two clusters, depending on whether they were also undertaking computer-based assessments of reading and maths.

In each jurisdiction a random sample of those who took part in the paper-based maths assessment was selected to undertake the problem solving assessment. About 85,000 students took part in all. The unweighted sample sizes in my selected jurisdictions are set out in Table 1 below, together with the total population of 15 year-olds in each jurisdiction.

 

Table 1: Sample sizes undertaking PISA 2012 problem solving assessment in selected jurisdictions

Country Unweighted Sample Total 15 year-olds
Australia 5,612 291,976
Canada 4,601 417,873
Finland 3,531 62,523
Hong Kong 1,325 84,200
Ireland 1,190 59,296
Shanghai 1,203 108,056
Singapore 1,394 53,637
South Korea 1,336 687,104
Taiwan 1,484 328,356
UK (England) 1,458 738,066
USA 1,273 3,985,714

Those taking the assessment were aged between 15 years and three months and 16 years and two months at the time of the assessment. All were enrolled at school and had completed at least six years of formal schooling.

Average performance compared with the performance of high and low achievers

The overall table of mean scores on the problem solving assessment is shown below.

PISA problem solving raw scores Capture


There are some familiar names at the top of the table, especially Singapore and South Korea, the two countries that comfortably lead the rankings. Japan is some ten points behind in third place but it in turn has a lead of twelve points over a cluster of four other Asian competitors: Macao, Hong Kong, Shanghai and Taiwan.

A slightly different picture emerges if we compare average performance with the proportion of learners who achieve the bottom proficiency level and the top two proficiency levels. Table 2 below compares these groups.

This table includes all the jurisdictions that exceeded the OECD average score. I have marked in bold the countries in my sample of eleven, which includes Ireland, the only one of them that did not exceed the OECD average.

Table 2: PISA Problem Solving 2012: Comparing Average Performance with Performance at Key Proficiency Levels

 

Jurisdiction Mean score Level 1 (%) Level 5 (%) Level 6 (%) Levels 5+6 (%)
Singapore 562 6.0 19.7 9.6 29.3
South Korea 561 4.8 20.0 7.6 27.6
Japan 552 5.3 16.9 5.3 22.2
Macao 540 6.0 13.8 2.8 16.6
Hong Kong 540 7.1 14.2 5.1 19.3
Shanghai 536 7.5 14.1 4.1 18.2
Taiwan 534 8.2 14.6 3.8 18.4
Canada 526 9.6 12.4 5.1 17.5
Australia 523 10.5 12.3 4.4 16.7
Finland 523 9.9 11.4 3.6 15.0
England (UK) 517 10.8 10.9 3.3 14.2
Estonia 515 11.1 9.5 2.2 11.7
France 511 9.8 9.9 2.1 12.0
Netherlands 511 11.2 10.9 2.7 13.6
Italy 510 11.2 8.9 1.8 10.7
Czech Republic 509 11.9 9.5 2.4 11.9
Germany 509 11.8 10.1 2.7 12.8
USA 508 12.5 8.9 2.7 11.6
Belgium 508 11.6 11.4 3.0 14.4
Austria 506 11.9 9.0 2.0 11.0
Norway 503 13.2 9.7 3.4 13.1
Ireland 498 13.3 7.3 2.1 9.4
OECD Ave. 500 13.2 8.9 2.5 11.4


The jurisdictions at the top of the table also have a familiar profile, with a small ‘tail’ of low performance combined with high levels of performance at the top end.

Nine of the top ten have fewer than 10% of learners at proficiency level 1, though only South Korea pushes below 5%.

Five of the top ten have 5% or more of their learners at proficiency level 6, but only Singapore and South Korea have a higher percentage at level 6 than level 1 (with Japan managing the same percentage at both levels).

The top three performers – Singapore, South Korea and Japan – are the only three jurisdictions that have over 20% of their learners at proficiency levels 5 and 6 together.

South Korea slightly outscores Singapore at level 5 (20.0% against 19.7%). Japan is in third place, followed by Taiwan, Hong Kong and Shanghai.

But at level 6, Singapore has a clear lead, followed by South Korea, Japan, Hong Kong and Canada respectively.

England’s overall place in the table is relatively consistent on each of these measures, but the gaps between England and the top performers vary considerably.

The best have fewer than half England’s proportion of learners at proficiency level 1, almost twice as many learners at proficiency level 5 and more than twice as many at proficiency levels 5 and 6 together. But at proficiency level 6 they have almost three times as many learners as England.
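These multiples can be checked directly against the Table 2 figures. A minimal Python sketch (the dictionary layout and names are mine; the percentages are taken from the table above):

```python
# Shares of learners (%) at key proficiency levels, from Table 2 above.
table2 = {
    "Singapore":    {"L1": 6.0,  "L5": 19.7, "L6": 9.6, "L5+6": 29.3},
    "South Korea":  {"L1": 4.8,  "L5": 20.0, "L6": 7.6, "L5+6": 27.6},
    "England (UK)": {"L1": 10.8, "L5": 10.9, "L6": 3.3, "L5+6": 14.2},
}

england = table2["England (UK)"]
for name in ("Singapore", "South Korea"):
    top = table2[name]
    print(name,
          round(england["L1"] / top["L1"], 2),  # England's level 1 tail relative to the leader's
          round(top["L5"] / england["L5"], 2),  # leader's level 5 share as a multiple of England's
          round(top["L6"] / england["L6"], 2))  # leader's level 6 share as a multiple of England's
```

South Korea’s level 1 share (4.8%) is indeed under half of England’s (10.8%), and Singapore’s level 6 share works out at roughly 2.9 times England’s.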

Chart 1 below compares performance on these four measures across my sample of eleven jurisdictions.

All but Ireland are below the OECD average for the percentage of learners at proficiency level 1, though the USA only narrowly so. The USA and Ireland are atypical in having a bigger tail (proficiency level 1) than their cadres of high achievers (levels 5 and 6 together).

At level 5 all but Ireland and the USA are above the OECD average, but the USA leapfrogs the OECD average at level 6.

There is a fairly strong correlation between the proportions of learners achieving the highest proficiency thresholds and average performance in each jurisdiction. However, Canada stands out by having an atypically high proportion of students at level 6.


Chart 1: PISA 2012 Problem-solving: Comparing performance at specified proficiency levels

Problem solving chart 1


PISA’s Report discusses the variation in problem-solving performance within different jurisdictions. However it does so without reference to the proficiency levels, so we do not know to what extent these findings apply equally to high achievers.

Amongst those above the OECD average, those with least variation are Macao, Japan, Estonia, Shanghai, Taiwan, Korea, Hong Kong, USA, Finland, Ireland, Austria, Singapore and the Czech Republic, in that order.

Perhaps surprisingly, the degree of variation in Finland is identical to that in the USA and Ireland, while Estonia has less variation than many of the Asian jurisdictions. Singapore, while top of the performance table, is only just above the OECD average in terms of variation.

The countries with more variation than the OECD average – listed in order of increasing variation – include England, Australia and Canada, though all three are relatively close to the OECD average. On this measure, therefore, these three countries and Singapore are all relatively close together.

Gender and socio-economic differences amongst high achievers


Gender differences

On average across OECD jurisdictions, boys score seven points higher than girls on the problem solving assessment. There is also more variation amongst boys than girls.

Across the OECD participants, 3.1% of boys achieved proficiency level 6 but only 1.8% of girls did so. This imbalance was repeated at proficiency level 5, achieved by 10% of boys and 7.7% of girls.

The table and chart below show the variations within my sample of eleven countries. The performance of boys exceeds that of girls in all cases, except in Finland at proficiency level 5, and in that instance the gap in favour of girls is relatively small (0.4%).


Table 3: PISA Problem-solving: Gender variation at top proficiency levels

Jurisdiction Level 5 (%) Level 6 (%) Levels 5+6 (%)
  Boys Girls Diff Boys Girls Diff Boys Girls Diff
Singapore 20.4 19.0 +1.4 12.0 7.1 +4.9 32.4 26.1 +6.3
South Korea 21.5 18.3 +3.2 9.4 5.5 +3.9 30.9 23.8 +7.1
Hong Kong 15.7 12.4 +3.3 6.1 3.9 +2.2 21.8 16.3 +5.5
Shanghai 17.0 11.4 +5.6 5.7 2.6 +3.1 22.7 14.0 +8.7
Taiwan 17.3 12.0 +5.3 5.0 2.5 +2.5 22.3 14.5 +7.8
Canada 13.1 11.8 +1.3 5.9 4.3 +1.6 19.0 16.1 +2.9
Australia 12.6 12.0 +0.6 5.1 3.7 +1.4 17.7 15.7 +2.0
Finland 11.2 11.6 -0.4 4.1 3.0 +1.1 15.3 14.6 +0.7
England (UK) 12.1 9.9 +2.2 3.6 3.0 +0.6 15.7 12.9 +2.8
USA 9.8 7.9 +1.9 3.2 2.3 +0.9 13.0 10.2 +2.8
Ireland 8.0 6.6 +1.4 3.0 1.1 +1.9 11.0 7.7 +3.3
OECD Average 10.0 7.7 +2.3 3.1 1.8 +1.3 13.1 9.5 +3.6

There is no consistent pattern in whether boys are more heavily over-represented at proficiency level 5 than proficiency level 6, or vice versa.

There is a bigger difference at level 6 than at level 5 in Singapore, South Korea, Canada, Australia, Finland and Ireland, but the reverse is true in the five remaining jurisdictions.

At level 5, the gap in favour of boys is widest in Shanghai and Taiwan while, at level 6, this is true of Singapore and South Korea.

When proficiency levels 5 and 6 are combined, all five of the Asian tigers show a difference in favour of males of 5.5% or higher, significantly in advance of the six ‘Western’ countries in the sample and significantly ahead of the OECD average.

Amongst the six ‘Western’ representatives, boys have the biggest advantage at proficiency level 5 in England, while at level 6 boys in Ireland have the biggest advantage.

Within this group of jurisdictions, the gap between boys and girls at level 6 is comfortably the smallest in England. But, in terms of performance at proficiency levels 5 and 6 together, Finland is ahead.
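The East–West contrast can be made explicit by recomputing the combined levels 5+6 gender gaps from Table 3. A short sketch (the regional grouping labels are my own shorthand):

```python
# Levels 5+6 combined (%), boys vs girls, from Table 3 above.
levels_5_6 = {
    "Singapore": (32.4, 26.1), "South Korea": (30.9, 23.8), "Hong Kong": (21.8, 16.3),
    "Shanghai": (22.7, 14.0), "Taiwan": (22.3, 14.5), "Canada": (19.0, 16.1),
    "Australia": (17.7, 15.7), "Finland": (15.3, 14.6), "England (UK)": (15.7, 12.9),
    "USA": (13.0, 10.2), "Ireland": (11.0, 7.7),
}
ASIAN = {"Singapore", "South Korea", "Hong Kong", "Shanghai", "Taiwan"}

for name, (boys, girls) in levels_5_6.items():
    gap = round(boys - girls, 1)
    group = "Asian" if name in ASIAN else "Western"
    print(f"{name} ({group}): boys ahead by {gap} percentage points")
```

Every Asian jurisdiction’s gap is 5.5 percentage points or more; every Western gap is 3.3 points or less.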


Chart 2: PISA Problem-solving: Gender variation at top proficiency levels

Problem solving chart 2

The Report includes a generic analysis of gender differences in performance for boys and girls with similar levels of performance in reading, maths and science.

It concludes that girls perform above their expected level in both England and Australia (though the difference is statistically significant only in the latter).

The Report comments:

‘It is not clear whether one should expect there to be a gender gap in problem solving. On the one hand, the questions posed in the PISA problem-solving assessment were not grounded in content knowledge, so boys’ or girls’ advantage in having mastered a particular subject area should not have influenced results. On the other hand… performance in problem solving is more closely related to performance in mathematics than to performance in reading. One could therefore expect the gender difference in performance to be closer to that observed in mathematics – a modest advantage for boys, in most countries – than to that observed in reading – a large advantage for girls.’


Socio-economic differences

The Report considers variations in performance against PISA’s Index of Economic, Social and Cultural status (IESC), finding them weaker overall than for reading, maths and science.

It calculates that the overall percentage variation in performance attributable to these factors is about 10.6% (compared with 14.9% in maths, 14.0% in science and 13.2% in reading).

Amongst the eleven jurisdictions in my sample, the weakest correlations were found in Canada (4%), followed by Hong Kong (4.9%), South Korea (5.4%), Finland (6.5%), England (7.8%), Australia (8.5%), Taiwan (9.4%), the USA (10.1%) and Ireland (10.2%) in that order. All those jurisdictions had correlations below the OECD average.

Perhaps surprisingly, there were above average correlations in Shanghai (14.1%) and, to a lesser extent (and less surprisingly) in Singapore (11.1%).

The Report suggests that students with parents working in semi-skilled and elementary occupations tend to perform above their expected level in problem-solving in Taiwan, England, Canada, the USA, Finland and Australia (in that order – with Australia closest to the OECD average).

The jurisdictions where these students tend to underperform their expected level are – in order of severity – Ireland, Shanghai, Singapore, Hong Kong and South Korea.

A presentation accompanying the Report provides some additional data about the performance in different countries of what the OECD calls ‘resilient’ students – those in the bottom quartile of the IESC but in the top quartile by performance, after accounting for socio-economic status.

It supplies the graph below, which shows all the Asian countries in my sample clustered at the top, but also with significant gaps between them. Canada is the highest-performing of the remainder in my sample, followed by Finland, Australia, England and the USA respectively. Ireland is some way below the OECD average.


PISA problem resolving resilience Capture


Unfortunately, I can find no analysis of how performance varies according to socio-economic variables at each proficiency level. It would be useful to see which jurisdictions have the smallest ‘excellence gaps’ at levels 5 and 6 respectively.


How different jurisdictions perform on different aspects of problem-solving

The Report’s analysis of comparative strengths and weaknesses in different elements of problem-solving does not take account of variations at different proficiency levels.

It explains that different aspects of the assessment were found easier by students in different jurisdictions, employing a four-part distinction:

‘Exploring and understanding. The objective is to build mental representations of each of the pieces of information presented in the problem. This involves:

  • exploring the problem situation: observing it, interacting with it, searching for information and finding limitations or obstacles; and
  • understanding given information and, in interactive problems, information discovered while interacting with the problem situation; and demonstrating understanding of relevant concepts.

Representing and formulating. The objective is to build a coherent mental representation of the problem situation (i.e. a situation model or a problem model). To do this, relevant information must be selected, mentally organised and integrated with relevant prior knowledge. This may involve:

  • representing the problem by constructing tabular, graphic, symbolic or verbal representations, and shifting between representational formats; and
  • formulating hypotheses by identifying the relevant factors in the problem and their inter-relationships; and organising and critically evaluating information.

Planning and executing. The objective is to use one’s knowledge about the problem situation to devise a plan and execute it. Tasks where “planning and executing” is the main cognitive demand do not require any substantial prior understanding or representation of the problem situation, either because the situation is straightforward or because these aspects were previously solved. “Planning and executing” includes:

  • planning, which consists of goal setting, including clarifying the overall goal, and setting subgoals, where necessary; and devising a plan or strategy to reach the goal state, including the steps to be undertaken; and
  • executing, which consists of carrying out a plan.

Monitoring and reflecting. The objective is to regulate the distinct processes involved in problem solving, and to critically evaluate the solution, the information provided with the problem, or the strategy adopted. This includes:

  • monitoring progress towards the goal at each stage, including checking intermediate and final results, detecting unexpected events, and taking remedial action when required; and
  • reflecting on solutions from different perspectives, critically evaluating assumptions and alternative solutions, identifying the need for additional information or clarification and communicating progress in a suitable manner.’

Amongst my sample of eleven jurisdictions:

  • ‘Exploring and understanding’ items were found easier by students in Singapore, Hong Kong, South Korea, Australia, Taiwan and Finland. 
  • ‘Representing and formulating’ items were found easier in Taiwan, Shanghai, South Korea, Singapore, Hong Kong, Canada and Australia. 
  • ‘Planning and executing’ items were found easier in Finland only. 
  • ‘Monitoring and reflecting’ items were found easier in Ireland, Singapore, the USA and England.

The Report concludes:

‘This analysis shows that, in general, what differentiates high-performing systems, and particularly East Asian education systems, such as those in Hong Kong-China, Japan, Korea [South Korea], Macao-China, Shanghai -China, Singapore and Chinese Taipei [Taiwan], from lower-performing ones, is their students’ high level of proficiency on “exploring and understanding” and “representing and formulating” tasks.’

It also distinguishes those jurisdictions that perform best on interactive problems, requiring students to discover some of the information required to solve the problem, rather than being presented with all the necessary information. This seems to be the nearest equivalent to a measure of creativity in problem solving.

Comparative strengths and weaknesses in respect of interactive tasks are captured in the following diagram.


PISA problem solving strengths in different countries


One can see that several of my sample – Ireland, the USA, Canada, Australia, South Korea and Singapore – are placed in the top right-hand quarter of the diagram, indicating stronger than expected performance on both interactive and knowledge acquisition tasks.

England is stronger than expected on the former but not on the latter.

Jurisdictions that are weaker than expected on interactive tasks only include Hong Kong, Taiwan and Shanghai, while Finland is weaker than expected on both.

We have no information about whether these distinctions were maintained at different proficiency levels.


Comparing jurisdictions’ performance at higher proficiency levels

Table 4 and Charts 3 and 4 below show variations in the performance of countries in my sample across the four different assessments at level 6, the highest proficiency level.

The charts in particular emphasise how far ahead the Asian Tigers are in maths at this level, compared with the cross-jurisdictional variation in the other three assessments.

Each of the five ‘Asian Tigers’ also performs vastly better at level 6 in maths than in any of the other three assessments. The proportion of students achieving level 6 proficiency in problem solving lags far behind, even though there is a fairly strong correlation between these two assessments (see below).

In contrast, all the ‘Western’ jurisdictions in the sample – with the sole exception of Ireland – achieve a higher percentage at proficiency level 6 in problem solving than they do in maths, although the difference is always less than a full percentage point. (Even in Ireland the difference is only 0.1 of a percentage point in favour of maths.)

Shanghai is the only jurisdiction in the sample which has more students achieving proficiency level 6 in science than in problem solving. It also has the narrowest gap between level 6 performance in problem solving and in reading.

Meanwhile, England, the USA, Finland and Australia all have broadly similar profiles across the four assessments, with the largest percentage of level 6 performers in problem solving, followed by maths, science and reading respectively.

The proximity of the lines marking level 6 performance in reading and science is also particularly evident in the second chart below.


Table 4: Percentage achieving proficiency Level 6 in each domain

  PS L6  Ma L6 Sci L6 Re L6
Singapore 9.6 19.0 5.8 5.0
South Korea 7.6 12.1 1.1 1.6
Hong Kong 5.1 12.3 1.8 1.9
Shanghai 4.1 30.8 4.2 3.8
Taiwan 3.8 18.0 0.6 1.4
Canada 5.1 4.3 1.8 2.1
Australia 4.4 4.3 2.6 1.9
Finland 3.6 3.5 3.2 2.2
England (UK) 3.3 3.1 1.9 1.3
USA 2.7 2.2 1.1 1.0
Ireland 2.1 2.2 1.5 1.3
OECD Average 2.5 3.3 1.2 1.1
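The domain profiles described above can be read off Table 4 programmatically. A brief sketch using a subset of the rows (the data structure is mine) finds each jurisdiction’s strongest domain at level 6:

```python
# Percentage achieving proficiency level 6 in each domain, from Table 4 above.
level6 = {
    "Singapore":    {"PS": 9.6, "Ma": 19.0, "Sci": 5.8, "Re": 5.0},
    "Shanghai":     {"PS": 4.1, "Ma": 30.8, "Sci": 4.2, "Re": 3.8},
    "Canada":       {"PS": 5.1, "Ma": 4.3,  "Sci": 1.8, "Re": 2.1},
    "England (UK)": {"PS": 3.3, "Ma": 3.1,  "Sci": 1.9, "Re": 1.3},
    "Ireland":      {"PS": 2.1, "Ma": 2.2,  "Sci": 1.5, "Re": 1.3},
}

for name, shares in level6.items():
    best = max(shares, key=shares.get)  # domain with the largest level 6 share
    print(f"{name}: strongest at level 6 in {best}")
```

Maths dominates for the Asian jurisdictions (and, narrowly, for Ireland), while problem solving leads for Canada and England.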

 Charts 3 and 4: Percentage achieving proficiency level 6 in each domain

Problem solving chart 3

Problem solving chart 4

The pattern is materially different at proficiency levels 5 and above, as the table and chart below illustrate. These also include the proportion of all-rounders, who achieved proficiency level 5 or above in each of maths, science and reading (but not in problem-solving).

The lead enjoyed by the ‘Asian Tigers’ in maths is somewhat less pronounced. The gap between performance within these jurisdictions on the different assessments also tends to be less marked, although maths accounts for comfortably the largest proportion of level 5+ performance in all five cases.

Conversely, level 5+ performance on the different assessments is typically much closer in the ‘Western’ countries. Problem solving leads the way in Australia, Canada, England and the USA, but in Finland science is in the ascendant and reading is strongest in Ireland.

Some jurisdictions have a far ‘spikier’ profile than others. Ireland is closest to achieving equilibrium across all four assessments. Australia and England share very similar profiles, though Australia outscores England in each assessment.

The second chart in particular shows how Shanghai’s ‘spike’ applies in all the other three assessments but not in problem solving.

Table 5: Percentage achieving Proficiency level 5 and above in each domain

  PS L5+  Ma L5+ Sci L5+ Re L5+ Ma + Sci + Re L5+
Singapore 29.3 40.0 22.7 21.2 16.4
South Korea 27.6 30.9 11.7 14.2 8.1
Hong Kong 19.3 33.4 16.7 16.8 10.9
Shanghai 18.2 55.4 27.2 25.1 19.6
Taiwan 18.4 37.2 8.4 11.8 6.1
Canada 17.5 16.4 11.3 12.9 6.5
Australia 16.7 14.8 13.5 11.7 7.6
Finland 15.0 15.2 17.1 13.5 7.4
England (UK) 14.2 12.4 11.7 9.1 5.7 (all UK)
USA 11.6 9.0 7.4 7.9 4.7
Ireland 9.4 10.7 10.8 11.4 5.7
OECD Average 11.4 12.6 8.4 8.4 4.4


Charts 5 and 6: Percentage Achieving Proficiency Level 5 and above in each domain

Problem solving chart 5

Problem solving chart 6

How high-achieving problem solvers perform in other assessments


Correlations between performance in different assessments

The Report provides an analysis of the proportion of students achieving proficiency levels 5 and 6 on problem solving who also achieved that outcome on one of the other three assessments: reading, maths and science.

It argues that problem solving is a distinct and separate domain. However:

‘On average, about 68% of the problem-solving score reflects skills that are also measured in one of the three regular assessment domains. The remaining 32% reflects skills that are uniquely captured by the assessment of problem solving. Of the 68% of variation that problem-solving performance shares with other domains, the overwhelming part is shared with all three regular assessment domains (62% of the total variation); about 5% is uniquely shared between problem solving and mathematics only; and about 1% of the variation in problem solving performance hinges on skills that are specifically measured in the assessments of reading or science.’
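The arithmetic of this decomposition is worth making explicit; a trivial sketch using the Report’s rounded percentages (the variable names are mine):

```python
# Decomposition of variation in problem-solving scores (%), as quoted in the Report.
shared_all_three = 62  # shared with maths, reading and science jointly
maths_only       = 5   # shared uniquely with mathematics (approximate)
reading_science  = 1   # shared uniquely with reading or science (approximate)
unique_to_ps     = 32  # captured only by the problem-solving assessment

shared_total = shared_all_three + maths_only + reading_science
print(shared_total)                 # the Report's headline 68%
print(shared_total + unique_to_ps)  # accounts for all of the variation
```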

It discusses the correlation between these different assessments:

‘A key distinction between the PISA 2012 assessment of problem solving and the regular assessments of mathematics, reading and science is that the problem-solving assessment does not measure domain-specific knowledge; rather, it focuses as much as possible on the cognitive processes fundamental to problem solving. However, these processes can also be used and taught in the other subjects assessed. For this reason, problem-solving tasks are also included among the test units for mathematics, reading and science, where their solution requires expert knowledge specific to these domains, in addition to general problem-solving skills.

It is therefore expected that student performance in problem solving is positively correlated with student performance in mathematics, reading and science. This correlation hinges mostly on generic skills, and should thus be about the same magnitude as between any two regular assessment subjects.’

These overall correlations are set out in the table below, which shows that maths has a higher correlation with problem solving than either science or reading, but that this correlation is lower than those between the three subject-related assessments.

The correlation between maths and science (0.90) is comfortably the strongest (despite the relationship between reading and science at the top end of the distribution noted above).

[Table: correlations between problem solving, mathematics, reading and science]

Correlations are broadly similar across jurisdictions, but the Report notes that the association is comparatively weak in some of these, including Hong Kong: students there are more likely to perform poorly on problem solving yet well on the other assessments, or vice versa.

There is also broad consistency at different performance levels, but the Report identifies those jurisdictions where students exceed expectations in problem solving, given their performance in the other domains. These include South Korea, the USA, England, Australia, Singapore and – to a lesser extent – Canada.

Those with lower than expected performance include Shanghai, Ireland, Hong Kong, Taiwan and Finland.

The Report notes:

‘In Shanghai-China, 86% of students perform below the expected level in problem solving, given their performance in mathematics, reading and science. Students in these countries/economies struggle to use all the skills that they demonstrate in the other domains when asked to perform problem-solving tasks.’

However, there is variation according to students’ maths proficiency:

  • Jurisdictions whose high scores on problem solving are mainly attributable to strong performers in maths include Australia, England and the USA. 
  • Jurisdictions whose high scores on problem solving are more attributable to weaker performers in maths include Ireland. 
  • Jurisdictions whose lower scores in problem solving are more attributable to weakness among strong performers in maths include Korea. 
  • Jurisdictions whose lower scores in problem solving are more attributable to weakness among weak performers in maths include Hong Kong and Taiwan. 
  • Jurisdictions whose weakness in problem solving is fairly consistent regardless of performance in maths include Shanghai and Singapore.

The Report adds:

‘In Italy, Japan and Korea, the good performance in problem solving is, to a large extent, due to the fact that lower performing students score beyond expectations in the problem-solving assessment….This may indicate that some of these students perform below their potential in mathematics; it may also indicate, more positively, that students at the bottom of the class who struggle with some subjects in school are remarkably resilient when it comes to confronting real-life challenges in non-curricular contexts…

In contrast, in Australia, England (United Kingdom) and the United States, the best students in mathematics also have excellent problem-solving skills. These countries’ good performance in problem solving is mainly due to strong performers in mathematics. This may suggest that in these countries, high performers in mathematics have access to – and take advantage of – the kinds of learning opportunities that are also useful for improving their problem-solving skills.’

What proportion of high performers in problem solving are also high performers in one of the other assessments?

The percentages of high achieving students (proficiency level 5 and above) in my sample of eleven jurisdictions who perform equally highly in each of the three domain-specific assessments are shown in Table 6 and Chart 7 below.

These show that Shanghai leads the way in each case, with 98.0% of all students who achieve proficiency level 5+ in problem solving also achieving the same outcome in maths. For science and reading the comparable figures are 75.1% and 71.7% respectively.

Taiwan is the nearest competitor in respect of problem solving plus maths, Finland in the case of problem solving plus science and Ireland in the case of problem solving plus reading.

South Korea, Taiwan and Canada are atypical of the rest in recording a higher proportion of problem solving plus reading at this level than problem solving plus science.

Singapore, Shanghai and Ireland are the only three jurisdictions that score above 50% on all three of these combinations. However, the only jurisdictions that exceed the OECD averages in all three cases are Singapore, Hong Kong, Shanghai and Finland.

Table 6: PISA problem-solving: Percentage of students achieving proficiency level 5+ in domain-specific assessments

  PS + Ma PS + Sci PS + Re
Singapore 84.1 57.0 50.2
South Korea 73.5 34.1 40.3
Hong Kong 79.8 49.4 48.9
Shanghai 98.0 75.1 71.7
Taiwan 93.0 35.3 43.7
Canada 57.7 43.9 44.5
Australia 61.3 54.9 47.1
Finland 66.1 65.4 49.5
England (UK) 59.0 52.8 41.7
USA 54.6 46.9 45.1
Ireland 59.0 57.2 52.0
OECD Average 63.5 45.7 41.0

Chart 7: PISA Problem-solving: Percentage of students achieving proficiency level 5+ in domain-specific assessments

[Problem solving chart 7]

What proportion of students achieve highly in one or more assessments?

Table 7 and Chart 8 below show how many students in each jurisdiction in my sample achieved proficiency level 5 or higher: in problem solving only; in problem solving plus one or more other assessments; in one or more other assessments but not problem solving; and in at least one assessment (i.e. the total of the three preceding columns).

I have also repeated in the final column the percentage achieving this proficiency level in each of maths, science and reading. (PISA has not released information about the proportion of students who achieved this feat across all four assessments.)

These reveal that the percentages of students who achieve proficiency level 5+ only in problem solving are very small, ranging from 0.3% in Shanghai to 6.7% in South Korea.

Conversely, the percentages of students achieving proficiency level 5+ in any one of the other assessments but not in problem solving are typically significantly higher, ranging from 4.5% in the USA to 38.1% in Shanghai.

There is quite a bit of variation in whether jurisdictions score more highly on ‘problem solving and at least one other’ (second column) or on ‘at least one other excluding problem solving’ (third column).

More importantly, the fourth column shows that the jurisdiction with the most students achieving proficiency level 5 or higher in at least one assessment is clearly Shanghai, followed by Singapore, Hong Kong, South Korea and Taiwan in that order.

The proportion of students achieving this outcome in Shanghai is close to three times the OECD average, comfortably more than twice the rate achieved in any of the ‘Western’ countries and three and a half times the rate achieved in the USA.

The same is true of the proportion of students achieving this level in the three domain-specific assessments.

On this measure, South Korea and Taiwan fall significantly behind their Asian competitors, and the latter is overtaken by Australia, Finland and Canada.


Table 7: Percentage achieving proficiency level 5+ in different combinations of PISA assessments

  PS only % PS + 1 or more % 1+ but not PS % L5+ in at least one % L5+ in Ma + Sci + Re %
Singapore 4.3 25.0 16.5 45.8 16.4
South Korea 6.7 20.9 11.3 38.9 8.1
Hong Kong 3.4 15.9 20.5 39.8 10.9
Shanghai 0.3 17.9 38.1 56.3 19.6
Taiwan 1.2 17.1 20.4 38.7 6.1
Canada 5.5 12.0 9.9 27.4 6.5
Australia 4.7 12.0 7.7 24.4 7.6
Finland 3.0 12.0 11.9 26.9 7.4
England (UK) 4.4 9.8 6.8 21.0 5.7* all UK
USA 4.1 7.5 4.5 16.1 4.7
Ireland 2.6 6.8 10.1 19.5 5.7
OECD Average 3.1 8.2 8.5 19.8 4.4
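Since the fourth column is defined as the total of the three preceding ones, Table 7 can be checked mechanically. A minimal sketch using a handful of rows, with figures copied from the table above:

```python
# Each tuple: (PS only, PS + 1 or more, 1+ but not PS, L5+ in at least one),
# all percentages taken from Table 7.
rows = {
    "Singapore":    (4.3, 25.0, 16.5, 45.8),
    "South Korea":  (6.7, 20.9, 11.3, 38.9),
    "Shanghai":     (0.3, 17.9, 38.1, 56.3),
    "OECD Average": (3.1, 8.2, 8.5, 19.8),
}
for name, (ps_only, ps_plus_other, other_not_ps, at_least_one) in rows.items():
    # Round to one decimal place to avoid floating-point noise.
    total = round(ps_only + ps_plus_other + other_not_ps, 1)
    assert total == at_least_one, f"{name}: {total} != {at_least_one}"
```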

Chart 8: Percentage achieving proficiency level 5+ in different combinations of PISA assessments

[Problem solving chart 8]

The Report comments:

‘The proportion of students who reach the highest levels of proficiency in at least one domain (problem solving, mathematics, reading or science) can be considered a measure of the breadth of a country’s/economy’s pool of top performers. By this measure, the largest pool of top performers is found in Shanghai-China, where more than half of all students (56%) perform at the highest levels in at least one domain, followed by Singapore (46%), Hong Kong-China (40%), Korea and Chinese Taipei (39%)…Only one OECD country, Korea, is found among the five countries/economies with the largest proportion of top performers. On average across OECD countries, 20% of students are top performers in at least one assessment domain.

The proportion of students performing at the top in problem solving and in either mathematics, reading or science, too can be considered a measure of the depth of this pool. These are top performers who combine the mastery of a specific domain of knowledge with the ability to apply their unique skills flexibly, in a variety of contexts. By this measure, the deepest pools of top performers can be found in Singapore (25% of students), Korea (21%), Shanghai-China (18%) and Chinese Taipei (17%). On average across OECD countries, only 8% of students are top performers in both a core subject and in problem solving.’

There is no explanation of why proficiency level 5 should be equated by PISA with the breadth of a jurisdiction’s ‘pool of top performers’. The distinction between proficiency levels 5 and 6 in this respect requires further discussion.

In addition to updated ‘all-rounder’ data showing what proportion of students achieved this outcome across all four assessments, it would be really interesting to see the proportion of students achieving at proficiency level 6 across different combinations of these four assessments – and to see what proportion of students achieving that outcome in different jurisdictions are direct beneficiaries of targeted support, such as a gifted education programme.

In the light of this analysis, what are jurisdictions’ priorities for improving problem solving performance?

Leaving aside strengths and weaknesses in different elements of problem solving discussed above, this analysis suggests that the eleven jurisdictions in my sample should address the following priorities:

Singapore has a clear lead at proficiency level 6, but falls behind South Korea at level 5 (though Singapore re-establishes its ascendancy when levels 5 and 6 are considered together). It also has more level 1 performers than South Korea. It should perhaps focus on reducing the size of this tail and pushing through more of its mid-range performers to level 5. There is a pronounced imbalance in favour of boys at level 6, so enabling more girls to achieve the highest level of performance is a clear priority. There may also be a case for prioritising the children of semi-skilled workers.

South Korea needs to focus on getting a larger proportion of its level 5 performers to level 6. This effort should be focused disproportionately on girls, who are significantly under-represented at both levels 5 and 6. South Korea has a very small tail to worry about – and may even be getting close to minimising this. It needs to concentrate on improving the problem solving skills of its stronger performers in maths.

Hong Kong has a slightly bigger tail than Singapore’s but is significantly behind at both proficiency levels 5 and 6. In the case of level 6 it is equalled by Canada. Hong Kong needs to focus simultaneously on reducing the tail and lifting performance across the top end, where girls and weaker performers in maths are a clear priority.

Shanghai has a similar profile to Hong Kong’s in all respects, though with somewhat fewer level 6 performers. It also needs to focus effort simultaneously at the top and the bottom of the distribution. Amongst this sample, Shanghai has the worst under-representation of girls at level 5 and levels 5 and 6 together, so addressing that imbalance is an obvious priority. It also demonstrated the largest variation in performance against PISA’s ESCS index (of economic, social and cultural status), which suggests that it should target young people from disadvantaged backgrounds, as well as the children of semi-skilled workers.

Taiwan is rather similar to Hong Kong and Shanghai, but its tail is slightly bigger and its level 6 cadre slightly smaller, while it does somewhat better at level 5. It may need to focus more at the very bottom, but also at the very top. Taiwan also has a problem with high-performing girls, second only to Shanghai as far as level 5 and levels 5 and 6 together are concerned. However, like Shanghai, it does comparatively better than the other ‘Asian Tigers’ in terms of girls at level 6. It also needs to consider the problem solving performance of its weaker performers in maths.

Canada is the closest western competitor to the ‘Asian Tigers’ in terms of the proportions of students at levels 1 and 5 – and it already outscores Shanghai and Taiwan at level 6. It needs to continue cutting down the tail without compromising achievement at the top end. Canada also has small but significant gender imbalances in favour of boys at the top end.

Australia by comparison is significantly worse than Canada at level 1, broadly comparable at level 5 and somewhat worse at level 6. It too needs to improve scores at the very bottom and the very top. Australia’s gender imbalance is more pronounced at level 6 than level 5.

Finland has the same mean score as Australia but a smaller tail (though not quite as small as Canada’s). It needs to improve across the piece but might benefit from concentrating rather more heavily at the top end. Finland has a slight gender imbalance in favour of girls at level 5, but boys are more in the ascendancy at level 6 than in either England or the USA. As in Australia, this latter point needs addressing.

England has a profile similar to Australia’s, but is less effective at all three selected proficiency levels. It is further behind at the top than at the bottom of the distribution, but needs to work hard at both ends to catch up with the strongest western performers and maintain its advantage over the USA and Ireland. Gender imbalances are small but nonetheless significant.

The USA has a comparatively long tail of low achievement at proficiency level 1 and, with the exception of Ireland, the fewest high achievers. This profile is very close to the OECD average. As in England, the relatively small size of the gender imbalances in favour of boys does not mean that they can be ignored.

Ireland has the longest tail of low achievement and the smallest proportion of students at proficiency levels 5, 6 and 5 and 6 combined. It needs to raise the bar at both ends of the achievement distribution. Ireland has a larger preponderance of boys at level 6 than its Western competitors and this needs addressing. The limited socio-economic evidence suggests that Ireland should also be targeting the offspring of parents with semi-skilled and elementary occupations.

So there is further scope for improvement in all eleven jurisdictions. Meanwhile the OECD could usefully provide a more in-depth analysis of high achievers on its assessments that features:

  • Proficiency level 6 performance across the board.
  • Socio-economic disparities in performance at proficiency levels 5 and 6.
  • ‘All-rounder’ achievement at these levels across all four assessments.
  • Correlations between success at these levels and specific educational provision for high achievers, including gifted education programmes.


GP

April 2014

What Has Become of the European Talent Network? Part One

This post discusses recent progress by the European Talent Centre towards a European Talent Network.

It is a curtain-raiser for an imminent conference on this topic and poses the critical questions I would like to see addressed at that event.

It should serve as a briefing document for prospective delegates and other interested parties, especially those who want to dig beneath the invariably positive publicity surrounding the initiative.

It continues the narrative strand of posts I have devoted to the Network, concentrating principally on developments since my last contribution in December 2012.

 

The post is organised part thematically and part chronologically and covers the following ground:

  • An updated description of the Hungarian model for talent support and its increasingly complex infrastructure.
  • The origins of the European Talent project and how its scope and objectives have changed since its inception.
  • The project’s advocacy effort within the European Commission and its impact to date.
  • Progress on the European Talent Map and promised annual European Talent Days and conferences.
  • The current scope and effectiveness of the network, its support structures and funding.
  • Key issues and obstacles that need to be addressed.

To improve readability I have divided the text into two sections of broadly equivalent length. Part One is dedicated largely to bullets one to three above, while Part Two deals with bullets three to six.

Previous posts in this series

If I am to do justice to this complex narrative, I must necessarily draw to some extent on material I have already published in earlier posts. I apologise for the repetition, which I have tried to keep to a minimum.

On re-reading those earlier posts and comparing them with this, it is clear that my overall assessment of the EU talent project has shifted markedly since 2010, becoming progressively more troubled and pessimistic.

This seems to me justified by an objective assessment of progress, based exclusively on evidence in the public domain – evidence that I have tried to draw together in these posts.

However, I feel obliged to disclose the influence of personal frustration at this slow progress, as well as an increasing sense of personal exclusion from proceedings – which seems completely at odds with the networking principles on which the project is founded.

I have done my best to control this subjective influence in the assessment below, confining myself as far as possible to an objective interpretation of the facts.

However I refer you to my earlier posts if you wish to understand how I reached this point.

  • In April 2011 I attended the inaugural conference in Budapest, publishing a report on the proceedings and an analysis of the Declaration produced, plus an assessment of the Hungarian approach to talent support as it then was and its potential scalability to Europe as a whole.
  • In December 2012 I described the initial stages of EU lobbying, an ill-fated 2012 conference in Poland, the earliest activities of the European Talent Centre and the evolving relationship between the project and ECHA, the European Council for High Ability.

I will not otherwise comment on my personal involvement, other than to say that I do not expect to attend the upcoming Conference, judging that the benefits of attending will not exceed the costs of doing so.

This post conveys more thoroughly and more accurately the points I would have wanted to make during the proceedings, were suitable opportunities provided to do so.

A brief demographic aside

It is important to provide some elementary information about Hungary’s demographics, to set in context the discussion below of its talent support model and the prospects for Europe-wide scalability.

Hungary is a medium-sized central European country with an area roughly one-third of the UK’s and broadly similar to South Korea or Portugal.

It has a population of around 9.88 million (2013), about a sixth of the UK population and similar in size to Portugal’s or Sweden’s.

Hungary is the 16th most populous European country, accounting for about 1.4% of the total European population and about 2% of the total population of the European Union (EU).

It is divided into 7 regions and 19 counties, plus the capital, Budapest, which has a population of 1.7 million in its own right.

[Map: regions of Hungary]

Almost 84% of the population are ethnic Hungarians but there is a Roma minority estimated (some say underestimated) at 3.1% of the population.

Approximately 4 million Hungarians are aged below 35 and approximately 3.5m are aged 5-34.

GDP per capita (purchasing power parity) is $19,497 (source: IMF), slightly over half the comparable UK figure.

The Hungarian Talent Support Model

The Hungarian model has grown bewilderingly complex and there is an array of material describing it, often in slightly different terms.

Some of the English language material is not well translated and there are gaps that can be filled only with recourse to documents in Hungarian (which I can only access through online translation tools).

Much of this documentation is devoted to publicising the model as an example of best practice, so it can be somewhat economical with the truth.

The basic framework is helpfully illustrated by this diagram, which appeared in a presentation dating from October 2012.

[Diagram: NTP structure and funding]


It shows how the overall Hungarian National Talent Programme (NTP) comprises a series of time-limited projects paid for by the European Social Fund, alongside a parallel set of activities supported by a National Talent Fund fed mainly by the Hungarian taxpayer.

The following sections begin by outlining the NTP, as described in a Parliamentary Resolution dating from 2008.

Secondly, they describe the supporting infrastructure for the NTP as it exists today.

Thirdly, they outline the key features of the time-limited projects: The Hungarian Genius Programme (HGP) (2009-13) and the Talent Bridges Programme (TBP) (2012-14).

Finally, they try to make sense of the incomplete and sometimes conflicting information about the funding allocated to different elements of the NTP.

Throughout this treatment my principal purpose is to show how the European Talent project fits into the overall Hungarian plan, as precursor to a closer analysis of the former in the second half of the post.

I also want to show how the direction of the NTP has shifted since its inception.


The National Talent Programme (NTP) (2008-2028)

The subsections below describe the NTP as envisaged in the original 2008 Parliamentary Resolution. This remains the most thorough exposition of the broader direction of travel that I could find.

Governing principles

The framework set out in the Resolution is built on ten general principles that I can best summarise as follows:

  • Talent support covers the period from early childhood to age 35, so extends well beyond compulsory education.
  • The NTP must preserve the traditions of existing successful talent support initiatives.
  • Talent is complex and so requires a diversity of provision – standardised support is a false economy.
  • There must be equality of access to talent support by geographical area, ethnic and socio-economic background.
  • Continuity is necessary to support individual talents as they change and develop over time; special attention is required at key transition points.
  • In early childhood, one must provide opportunities for talent to emerge; thereafter, selection on the basis of commitment and motivation becomes increasingly significant, and older participants increasingly self-select.
  • Differentiated provision is needed for different levels of talent; there must be opportunities to progress and to step off the programme without loss of esteem.
  • In return for talent support, the talented individual has a social responsibility to support talent development in others.
  • Those engaged in talent support – here called talent coaches – need time and support.
  • Wider social support for talent development is essential to success and sustainability.

Hence the Hungarians are focused on a system-wide effort to promote talent development that extends well beyond compulsory education, but only up to the age of 35. As noted above, if 0-4 year-olds are excluded, this represents an eligible population of about 3.5 million people.

The choice of this age 35 cut-off seems rather arbitrary. Having decided to push beyond compulsory education into adult provision, it is not clear why the principle of lifelong learning is then set aside – or exactly what happens when participants reach their 36th birthdays.

Otherwise the principles above seem laudable and broadly reflect one tradition of effective practice in the field.

Goals

The NTP’s goals are illustrated by this diagram:

[Diagram: NTP goals]


The elements in the lower half of the diagram can be expanded thus:

  • Talent support traditions: support for existing provision; development of new provision to fill gaps; minimum standards and professional development for providers; applying models of best practice; co-operation with ethnic Hungarian programmes outside Hungary (‘cross border programmes’); and ‘systematic exploration and processing of the talent support experiences’ of EU and other countries which excel in this field. 
  • Integrated programmes: compiling and updating a map of the talent support opportunities available in Hungary as well as ‘cross border programmes’; action to support access to the talent map; a ‘detailed survey of the international talent support practice’; networking between providers with cooperation and collaboration managed through a set of talent support councils; monitoring of engagement to secure continuity and minimise drop-out. 
  • Social responsibility: promoting the self-organisation of talented youth;  developing their innovation and management skills; securing counselling; piloting  a ‘Talent Bonus – Talent Coin’ scheme to record in virtual units the monetary value of support received and provided, leading to consideration of a LETS-type scheme; support for ‘exceptionally talented youth’; improved social integration of talented youth and development of a talent-friendly society. 
  • Equal opportunities: providing targeted information about talent support opportunities; targeted programming for disadvantaged, Roma and disabled people and wider emphasis on integration; supporting the development of Roma talent coaches; and action to secure ‘the desirable gender distribution’. 
  • Enhanced recognition: improving financial support for talent coaches; reducing workload and providing counselling for coaches; improving recognition and celebrating the success of coaches and others engaged in talent support. 
  • Talent-friendly society: awareness-raising activity for parents, family and friends of talented youth; periodic talent days to mobilise support and ‘promote the local utilisation of talent’; promoting talent in the media, as well as international communication about the programme and ‘introduction in both the EU and other countries by exploiting the opportunities provided by Hungary’s EU Presidency in 2011’; ‘preparation for the foreign adaptation of the successful talent support initiatives’ and organisation of EU talent days. 

Hence the goals incorporate a process of learning from European and other international experience, but also one of feeding back to the international community information about the Hungarian talent support effort and extending the model into other European countries.

There is an obvious tension in these goals between preserving the traditions of existing successful initiatives and imposing a framework with minimum standards and built-in quality criteria. This applies equally to the European project discussed below.

The reference to a LETS-type scheme is intriguing but I could trace nothing about its subsequent development.


Planned Infrastructure

In 2008 the infrastructure proposed to undertake the NTP comprised:

  • A National Talent Co-ordination Board, chaired at Ministerial level, to oversee the programme and to allocate a National Talent Fund (see below).
  • A National Talent Support Circle [I’m not sure whether this should be ‘Council’] consisting of individuals from Hungary and abroad who would promote talent support through professional opportunities, financial contribution or ‘social capital opportunities’.
  • A National Talent Fund comprising a Government contribution and voluntary contributions from elsewhere. The former would include the proceeds of a 1% voluntary income tax levy (being one of the good causes towards which Hungarian taxpayers could direct this contribution). Additional financial support would come from ‘the talent support-related programmes of the New Hungary Development Plan’.
  • A system of Talent Support Councils to co-ordinate activity at regional and local level.
  • A national network of Talent Points – providers of talent support activity.
  • A biennial review of the programme presented to Parliament, the first being in 2011.

Presumably there have been two of these biennial reviews to date. They would make interesting reading, but I could find no material in English that describes the outcomes.

The NTP Infrastructure Today

The supporting infrastructure as described today has grown considerably more complex and bureaucratic than the basic model above.

  • The National Talent Co-ordination Board continues to oversee the programme as a whole. Its membership is set out here.
  • The National Talent Support Council was established in 2006 and devised the NTP as set out above. Its functions are more substantial than originally described (assuming this is the ‘Circle’ mentioned in the Resolution), although it now seems to be devolving some of these. Until recently at least, the Council: oversaw the national database of talent support initiatives and monitored coverage, matching demand – via an electronic mailing list – with the supply of opportunities; initiated and promoted regional talent days; supported the network of talent points and promoted the development of new ones; invited tenders for niche programmes of various kinds; collected and analysed evidence of best practice and the research literature; and promoted international links paying ‘special attention to the reinforcement of the EU contacts’. The Council has a Chair and six Vice Presidents as well as a Secretary and Secretariat. It operates nine committees: Higher Education, Support for Socially Disadvantaged Gifted People, Innovations, Public Education, Foreign Relations, Public and Media Relations, Theory of Giftedness, Training and Education and Giftedness Network.
  • The National Talent Point has only recently been identified as an entity in its own right, distinct from the National Council. Its role is to maintain the Talent Map and manage the underpinning database. Essentially it seems to have acquired the Council’s responsibilities for delivery, leaving the Council to concentrate on policy. It recently acquired a new website.
  • The Association of Hungarian Talent Support Organizations (MATEHETZ) is also a new addition. Described as ‘a non-profit umbrella organization that legally represents its members and the National Talent Support Council’, it is funded by the National Council and through membership fees. The Articles of Association date from February 2010 and list 10 founding organisations. The Association provides ‘representation’ for the National Council (which I take to mean the membership). It manages the time-limited programmes (see below) as well as the National Talent Point and the European Talent Centre.
  • Talent Support Councils: Different numbers of these are reported. One source says 76; another 65, of which some 25% were newly-established through the programme. Their role seems broadly unchanged, involving local and regional co-ordination, support for professionals, assistance to develop new activities, helping match supply with demand and supporting the tracking of those with talent.
  • Talent Point Network: there were over 1,000 talent points by the end of 2013. (Assuming 3.5 million potential participants, that is a talent point for every 3,500 people.) Talent points are providers of talent support services – whether identification, provision or counselling. They are operated by education providers, the church and a range of other organisations and may have a local, regional or national reach. They join the network voluntarily but are accredited. In 2011 there were reportedly 400 talent points and 200 related initiatives, so there has been strong growth over the past two years.
  • Ambassadors of Talent: Another new addition, introduced by the National Talent Support Council in 2011. There is a separate Ambassador Electing Council which appoints three new ambassadors per year. The current list has thirteen entries and is markedly eclectic.
  • Friends of Talent Club: described in 2011 as ‘a voluntary organisation that holds together those, who are able and willing to support talents voluntarily and serve the issue of talent support…Among them, there are mentors, counsellors and educators, who voluntarily help talented people develop in their professional life. The members of the club can be patrons and/or supporters. “Patrons” are those, who voluntarily support talents with a considerable amount of service. “Supporters” are those, who voluntarily support the movement of talent support with a lesser amount of voluntary work, by mobilizing their contacts or in any other way.’ This sounds similar to the originally envisioned ‘National Talent Support Circle’ [sic]. I could find little more about the activities of this branch of the structure.
  • The European Talent Centre: The National Talent Point says that this:

‘…supports and coordinates European actions in the field of talent support in order to find gifted people and develop their talent in the interest of Europe as a whole and the member states.’

Altogether this is a substantial endeavour requiring large numbers of staff and volunteers and demanding a significant budgetary topslice.

I could find no reliable estimate of the ratio of the running cost to the direct investment in talent support, but there must be cause to question the overall efficiency of the system.

My hunch is that this level of bureaucracy must consume a significant proportion of the overall budget.

Clearly the Hungarian talent support network is a long, long way from being financially self-sustaining, if indeed it ever could be.

 .

Hungarian Parliament Building

.

The Hungarian Genius Programme (HGP) (2009-13)

Launched in June 2009, the HGP had two principal phases lasting from 2009 to 2011 and from 2011 to 2013. The fundamental purpose was to establish the framework and infrastructure set out in the National Talent Plan.

This English language brochure was published in 2011. It explains that the initial focus is on adults who support talents, establishing a professional network and training experts, as well as creating the network and map of providers.

It mentions that training courses lasting 10 to 30 hours have been developed and accredited in over 80 subjects to:

‘…bring concepts and methods of gifted and talented education into the mainstream and reinforce the professional talent support work… These involve the exchange of experience and knowledge expansion training, as well as programs for those who deal with talented people in developing communities, and awareness-raising courses aimed at the families and environment of young pupils, on the educational, emotional and social needs of children showing special interest and aptitude in one or more subject(s). The aims of the courses are not only the exchange of information but to produce and develop the professional methodology required for teaching talents.’

The brochure also describes an extensive talent survey undertaken in 2010, the publication of several good practice studies and the development of a Talent Loan modelled on the Hungarian student loan scheme.

It lists a seven-strong strategic management group including an expert adviser, project manager, programme co-ordinator and a finance manager. There are also five operational teams, each led by a named manager, one of which focused on ‘international relations: collecting and disseminating international best practices; international networking’.

A subsequent list of programme outputs says:

  • 24,000 new talents were identified
  • The Talent Map was drawn and the Talent Network created (including 867 talent points and 76 talent councils).
  • 23,500 young people took part in ‘subsidised talent support programmes’
  • 118 new ‘local educational talent programmes’ were established
  • 25 professional development publications were written and made freely available
  • 13,987 teachers (about 10% of the total in Hungary) took part in professional development.

Evidence in English of rigorous independent evaluation is, however, limited:

‘The efficiency of the Programme has been confirmed by public opinion polls (increased social acceptance of talent support) and impact assessments (training events: expansion of specialised knowledge and of the methodological tool kit).’

 .

The Talent Bridges Project (TBP) (2012-2014)

TBP began in November 2012 and is scheduled to last until ‘mid-2014’.

The TBP, which initially ran in parallel with the HGP, is mentioned in the 2011 brochure referenced above:

‘In the strategic plan of the Talent Bridges Program to begin in 2012, we have identified three key areas for action: bridging the gaps in the Talent Point network, encouraging talents in taking part in social responsibility issues and increasing media reach. In order to become sustainable, much attention should be payed [sic] to maintaining and expanding the support structure of this system, but the focus will significantly shift towards direct talent care work with the youth.’

Later on it says:

‘Within the framework of the Talent Bridges Program the main objectives are: to further improve the contact system between the different levels of talent support organisations; to develop talent peer communities based on the initiatives coming from young people themselves; to engage talents in taking an active role in social responsibility; to increase media reach in order to enhance the recognition and social support for both high achievers and talent support; and last, but not least, to arrange the preliminary steps of setting up an EU Institute of Talent Support in Budapest.’

A list of objectives published subsequently contains the following items:

  • Creating a national talent registration and tracking system
  • Developing programmes for 3,000 talented young people from disadvantaged backgrounds and with special educational needs
  • Supporting the development of ‘outstanding talents’ in 500 young people
  • Supporting 500 enrichment programmes
  • Supporting ‘the peer age groups of talented young people’
  • Introducing programmes to strengthen interaction between parents, teachers and talented youth benefiting 5,000 young people
  • Introducing ‘a Talent Marketplace’ to support ‘the direct social utilisation of talent’ involving ‘150 controlled co-operations’
  • Engaging 2,000 mentors in supporting talented young people and training 5,000 talent support facilitators and mentors
  • Launching a communication campaign to reach 100,000 young people and
  • Realising European-Union-wide communication (involving, in addition to the current 10, 10 more EU Member States in the Hungarian initiatives, in co-operation with the European Talent Centre in Budapest, established in the summer of 2012).

Various sources describe how the TBP is carved up into a series of sub-projects. The 2013 Brochure ‘Towards a European Talent Support Network’ lists 14 of these, but none mention the European work.

However, what appears to be the bid for TBP (in Hungarian) calls the final sub-project ‘an EU Communications Programme’ (p29), which appears to involve:

  • Raising international awareness of Hungary’s talent support activities
  • Strengthening Hungary’s position in the EU talent network
  • Providing a foreign exchange experience for talented young Hungarians
  • Influencing policy makers.

Later on (p52) this document refers to an international campaign, undertaken with support from the European Talent Centre, targeting international organisations and the majority of EU states.

Work to be covered includes the preparation of promotional publications in foreign languages, the operation of a ‘multilingual online platform’, participation in international conferences (such as those of ECHA, the World Council, IRATDE and ICIE); and ‘establishing new professional collaborations with at least 10 new EU countries or international organisations’.

Funding

It is not a straightforward matter to reconcile the diverse and sometimes conflicting sources of information about the budgets allocated to the National Talent Fund, HGP and the TBP, but this is my best effort, with all figures converted into pounds sterling.

 .

|                   | 2009 | 2010 | 2011 | 2012 | 2013 | 2014 | Total |
|-------------------|------|------|------|------|------|------|-------|
| NTF               | x | £2.34m or £4.1m | £2.34m or £4.1m | £8.27m | tbc | tbc | tbc |
| Of which ETC      | x | x | x | £80,000 | £37,500 | £21,350 | £138,850 |
| HGP               | £8.0m (phase 1, 2009–11) | | £4.6m (phase 2, 2011–13) | | | x | £12.6m |
| TBP               | x | x | x | £5.3m (2012–14) | | | £5.3m |
| Of which EU comms | x | x | x | £182,000 | | | £182,000 |

Several sources say that the Talent Fund is set to increase in size over the period.

‘This fund has an annual 5 million EUR support from the national budget and an additional amount from tax donations of the citizens of a total sum of 1.5 million EUR in the first year doubled to 3 million EUR and 6 million EUR in the second and third years respectively.’ (Csermely 2012)

That would translate into a budget of £5.4m/£6.7m/£9.2m over the three years in question, but it is not quite clear which three years are included.

Even if we assume that the NTF budget remains the same in 2013 and 2014 as in 2012, the total investment over the period 2009-2014 amounts to approximately £60m.

That works out at about £17 per eligible Hungarian. Unfortunately I could find no reliable estimate of the total number of Hungarians that have benefited directly from the initiative to date.

On the basis of the figures I have seen, my guesstimate is that the total will be below 10% of the total eligible population – so under 350,000. But I must stress that there is no evidence to support this.

Whether the intention is to reach 100% of the population, or whether there is an in-built assumption that only a proportion of the population are amenable to talent development, is a moot point. I found occasional references to a 25% assumption, but it was never clear whether this was official policy.

Even if this applies, there is clearly a significant scalability challenge even within Hungary’s national programme.

It is also evident that the Hungarians have received some £18m from the European Social Fund over the past five years and have invested at least twice as much of their own money. That is a very significant budget indeed for a country of this size.

Hungary’s heavy reliance on EU funding is such that they will find it very difficult to sustain the current effort if that largesse disappears.

One imagines that they will be seeking continued support from EU sources over the period 2014-2020. But, equally, one would expect the EU to demand robust evidence that continued heavy dependency on EU funding will not be required.

And of course a budget of this size also raises questions about scalability to Europe, given the conspicuous absence of any commensurate figure. There is zero prospect of equivalent funding being available to extend the model across Europe. The total bill would run into billions of pounds!

A ‘Hungarian-lite’ model would not be as expensive, but it would require a considerable budget.

However, it is clear from the table that the present level of expenditure on the European network has been tiny by comparison with the domestic investment – probably not much more than £100,000 per year.

Initially this came from the National Talent Fund budget but it seems as though the bulk is now provided through the ESF, until mid-2014 at least.

This shift seems to have removed the necessity for the European Talent Centre to receive its funding in biannual tranches through a perpetual retendering process, for the sums expended from the NTF budget are apparently tied to periods of six months or less.

The European Talent Centre website currently bears the legend:

‘Operation of the European Talent Centre – Budapest between 15th December 2012 and 30th June 2013 is realised with the support of Grant Scheme No. NTP-EUT-M-12 announced by the Institute for Educational Research and Development and the Human Resources Support Manager on commission of the Ministry of Human Resources “To support international experience exchange serving the objectives of the National Talent Programme, and to promote the operation and strategic further development of the European Talent Centre – Budapest”.’

But when I wrote my 2012 review it said:

‘The operation of the European Talent Centre — Budapest is supported from 1 July 2012 through 30 November 2012 by the grant of the National Talent Fund. The grant is realised under Grant Scheme No. NTP-EU-M-12 announced by the Hungarian Institute for Educational Research and Development and the Sándor Wekerle Fund Manager of the Ministry of Administration and Justice on commission of the Ministry of Human Resources, from the Training Fund Segment of the Labour Market Fund.’

A press release confirmed the funding for this period as HUF 30m.

Presumably it will now need to be amended to reflect the arrival of £21.3K under Grant Scheme No. NTP-EU-M-13 – and possibly to reflect income from the ESF-supported TBP too.

A comparison between the Hungarian http://tehetseg.hu/ website and the European Talent Centre website is illustrative of the huge funding imbalance in favour of the former.

Danube Bend at Visegrad courtesy of Phillipp Weigell

.

Origins of the European Talent Project: Evolution to December 2012

Initial plans

Hungary identified talent support as a focus during its EU Presidency, in the first half of 2011, citing four objectives:

  • A talent support conference scheduled for April 2011
  • A first European Talent Day to coincide with the conference, initially ‘a Hungarian state initiative…expanding it into a public initiative by 2014’.
  • Talent support to feature in EU strategies and documents, as well as a Non-Legislative Act (NLA). It is not specified whether this should be a regulation, decision, recommendation or opinion. (Under EU legislation the latter two categories have no binding force.)
  • An OMC expert group on talent support – ie an international group run under the aegis of the Commission.

The Budapest Declaration

The Conference duly took place, producing a Budapest Declaration on Talent Support in which conference participants:

  • ‘Call the European Commission and the European Parliament to make every effort to officially declare the 25th of March the European Day of the Talented and Gifted.’
  • ‘Stress the importance of…benefits and best practices appearing in documents of the European Commission, the European Council and the European Parliament.’
  • ‘Propose to establish a European Talent Resource and Support Centre in Budapest’ to ‘coordinate joint European actions in the field’.
  • ‘Agree to invite stakeholders from every country of the European Union to convene annually to discuss the developments and current questions in talent support. Upon the invitation of the Government of Poland the next conference will take place in Warsaw.’

The possibility of siting a European Centre anywhere other than Budapest was not seriously debated.

 .

Evolution of a Written Declaration to the EU

Following the Conference an outline Draft Resolution of the European Parliament was circulated for comment.

This proposed that:

 ‘A Europe-wide talent support network should be formed and supported with an on-line and physical presence to support information-sharing, partnership and collaborations. This network should be open for co-operation with all European talent support efforts, use the expertise and networking experiences of existing multinational bodies such as the European Council of High Ability and support both national and multinational efforts to help talents not duplicating existing efforts but providing an added European value.’

Moreover, ‘A European Talent Support Centre should be established…in Budapest’. This:

‘…should have an Advisory Board having the representatives of interested EU member states, all-European talent support-related institutions as well as key figures of European talent support.’

The Centre’s functions are five-fold:

‘Using the minimum bureaucracy and maximising its use of online solutions the European Talent Support Centre should:

  • facilitate the development and dissemination of best curricular and extra-curricular talent support practices;
  • coordinate the trans-national cooperation of Talent Points forming an EU Talent Point network;
  • help the spread of the know-how of successful organization of Talent Days;
  • organize annual EU talent support conferences in different EU member states overseeing the progress of cooperation in European talent support;
  • provide a continuously updated easy Internet access for all the above information.’

Note the references on the one hand to an inclusive approach, a substantial advisory group (though without the status of an EU-hosted OMC expert group) and a facilitating/co-ordinating role, but also – on the other hand – the direct organisation of annual EU-wide conferences and provision of a sophisticated supporting online environment.

MEPs were lined up to submit the Resolution in Autumn 2011 but, for whatever reason, this did not happen.

Instead a new draft Written Declaration was circulated in January 2012. This called on:

  • Member States to consider measures helping curricular and extracurricular forms of talent support including the training of educational professionals to recognize and help talent;
  • The Commission to consider talent support as a priority of future European strategies, such as the European Research Area and the European Social Fund;
  • Member States and the Commission to support the development of a Europe-wide talent support network, formed by talent support communities, Talent Points and European Talent Centres facilitating cooperation, development and dissemination of best talent support practices;
  • Member States and the Commission to celebrate the European Day of the Talented and Gifted.

The focus has shifted from the Budapest-centric network to EU-led activity amongst member states collectively. Indeed, no specific role for Hungary is mentioned.

There is a new emphasis on professional development and – critically – a reference to ‘European talent centres’. All mention of NLAs and OMC expert groups has disappeared.

There followed an unexplained 11-month delay before a Final Written Declaration was submitted by four MEPs in November 2012.

 .

The 2012 Written Declaration 

There are some subtle adjustments in the final version of WD 0034/2012. The second bullet point has become:

  • ‘The Commission to consider talent support as part of ‘non-formal learning’ and a priority in future European strategies, such as the strategies guiding the European Research Area and the European Social Fund’.

While the third now says:

  • ‘Member States and the Commission to support the development of a Europe-wide talent support network bringing together talent support communities, Talent Points and European Talent Centres in order to facilitate cooperation and the development and dissemination of the best talent support practices.’

And the fourth is revised to:

  • ‘Member States and the Commission to celebrate the European Day of Highly Able People.’

The introduction of a phrase that distinguishes between education and talent support is curious.

CEDEFOP – which operates a European Inventory on Validation of Non-formal and Informal Learning – defines the latter as:

‘…learning resulting from daily work-related, family or leisure activities. It is not organised or structured (in terms of objectives, time or learning support). Informal learning is in most cases unintentional from the learner’s perspective. It typically does not lead to certification.’

One assumes that a distinction is being attempted between learning organised by a school or other formal education setting and that which takes place elsewhere – presumably because EU member states are so fiercely protective of their independence when it comes to compulsory education.

But surely talent support encompasses formal and informal learning alike?

Moreover, the adoption of this terminology appears to rule out any provision that is ‘organised or structured’, excluding huge swathes of activity (including much of that featured in the Hungarian programme). Surely this cannot have been intentional.

Such a distinction is increasingly anachronistic, especially in the case of gifted learners, who might be expected to access their learning from a far richer blend of sources than simply in-school classroom teaching.

Their schools are no longer the sole providers of gifted education, but facilitators and co-ordinators of diverse learning streams.

The ‘gifted and talented’ terminology has also disappeared, presumably on the grounds that it would risk frightening the EU horses.

Both of these adjustments seem to have been a temporary aberration. One wonders who exactly they were designed to accommodate and whether they were really necessary.

 .

Establishment and early activity of the EU Talent Centre in Budapest

The Budapest centre was initially scheduled to launch in February 2012, but funding issues delayed this, first until May and then the end of June.

The press release marking the launch described the long-term goal of the Centre as:

‘…to contribute on the basis of the success of the Hungarian co-operation model to organising the European talent support actors into an open and flexible network overarching the countries of Europe.’

Its mission is to:

‘…offer the organisations and individuals active in an isolated, latent form or in a minor network a framework structure and an opportunity to work together to achieve the following:

  • to provide talent support an emphasis commensurate with its importance in every European country
  • to reduce talent loss to the minimum in Europe,
  • to give talent support a priority role in the transformation of the sector of education; to provide talented young persons access to the most adequate forms of education in every Member State,
  • to make Europe attractive for the talented youth,
  • to create talent-friendly societies in every European country.’

The text continues:

‘It is particularly important that network hubs setting targets similar to those of the European Talent Centre in Budapest should proliferate in the longer term.

The first six months represent the first phase of the work: we shall lay the bases [sic] for establishing the European Talent Support Network. The expected key result is to set up a team of voluntary experts from all over Europe who will contribute to that work and help draw the European talent map.’

But what exactly are these so-called network hubs? We had to wait some time for an explanation.

There was relatively little material on the website at this stage and this was also slow to change.

My December 2012 post summarised progress thus:

‘The Talent Map includes only a handful of links, none in the UK.

The page of useful links is extensive but basically just a very long list, hard to navigate and not very user-friendly. Conversely, ‘best practices’ contains only three resources, all of them produced in house.

The whole design is rather complex and cluttered, several of the pages are too text-heavy and occasionally the English leaves something to be desired.’

 

Here ends the first part of this post. Part Two explains the subsequent development of the ‘network hubs’ concept, charts the continuation of the advocacy effort and reviews progress in delivering the services for which the Budapest Centre is responsible.

It concludes with an overall assessment of the initiative highlighting some of its key weaknesses.

GP

March 2014

How Well Does Gifted Education Use Social Media?

.

This post reviews the scope and quality of gifted education coverage across selected social media.

It uses this evidence base to reflect on progress in the 18 months since I last visited this topic and to establish a benchmark against which to judge future progress.

More specifically, it:

  • Proposes two sets of quality criteria – one for blogs and other websites, the other for effective use of social media;
  • Reviews gifted education-related social media activity:

By a sample of six key players – the World Council (WCGTC) and the European Council for High Ability (ECHA), NAGC and SENG in the United States and NACE and Potential Plus UK over here

Across the Blogosphere and five of the most influential English-language social media platforms – Facebook, Google+, LinkedIn, Twitter and YouTube and

Utilising four content curation tools particularly favoured by gifted educators, namely PaperLi, Pinterest, ScoopIt and Storify.

  • Considers the gap between current practice and the proposed quality criteria – and whether there has been an improvement in the application of social media across the five dimensions of gifted education identified in my previous post.

I should declare at the outset that I am a Trustee of Potential Plus UK and have been working with them to improve their online and social media presence. This post lies outside that project, but some of the underlying research is the same.

.

I have been this way before

This is my second excursion into this territory.

In September 2012 I published a two-part response to the question ‘Can Social Media Help Overcome the Problems We Face in Gifted Education?’

  • Part One outlined an analytical framework based on five dimensions of gifted education. Each dimension is stereotypically associated with a particular stakeholder group though, in reality, each group operates across more than one area. The dimensions (with their associated stakeholder groups in brackets) are: advocacy (parents); learning (learners); policy-making (policy makers); professional development (educators); and research (academics).
  • Part Two used this framework to review the challenges faced by gifted education, to what extent these were being addressed through social media and how social media could be applied more effectively to tackle them. It also outlined the limitations of a social media-driven approach and highlighted some barriers to progress.

The conclusions I reached might be summarised as follows:

  • Many of the problems associated with gifted education are longstanding and significant, but not insurmountable. Social media will not eradicate these problems but can make a valuable contribution towards that end by virtue of their unrivalled capacity to ‘only connect’.
  • Gifted education needs to adapt if it is to thrive in a globalised environment with an increasingly significant online dimension driven by a proliferation of social media. The transition from early adoption to mainstream practice has not yet been effected, but rapid acceleration is necessary otherwise gifted education will be left behind.
  • Gifted education is potentially well-placed to pioneer new developments in social media but there is limited awareness of this opportunity, or the benefits it could bring.

The post was intended to inform discussion at a Symposium at the ECHA Conference in Münster, Germany in September 2012. I published the participants’ presentations and a report on proceedings (which is embedded within a review of the Conference as a whole).

.

Defining quality

I have not previously attempted to pin down what constitutes a high quality website or blog and effective social media usage, not least because so many have gone before me.

But, on reviewing their efforts, I could find none that embodied every dimension I considered important, while several appeared unduly restrictive.

It seems virtually impossible to reconcile these two conflicting pressures, defining quality with brevity but without compromising flexibility. Any effort to pin down quality risks reductionism while also fettering innovation and wilfully obstructing the pioneering spirit.

I am a strong advocate of quality standards in gifted education but, in this context, it seemed beyond my capacity to find or generate the ideal ‘flexible framework’, offering clear guidance without compromising innovation and capacity to respond to widely varying needs and circumstances.

But the project for Potential Plus UK required us to consult stakeholders on their understanding of quality provision, so that we could reconcile any difference between their perceptions and our own.

And, in order to consult effectively, we needed to make a decent stab at the task ourselves.

So I prepared some draft success criteria, drawing on previous efforts I could find online as well as my own experience over the last four years.

I have reproduced the draft criteria below, with slight amendment to make them more universally applicable. The first set – for a blog or website – are generic, while those relating to wider online and social media presence are made specific to gifted education.

.

Draft Quality Criteria for a Blog or Website

1. The site is inviting to regular and new readers alike; its purpose is up front and explicit; as much content as possible is accessible to all.

2. Readers are encouraged to interact with the content through a variety of routes – and to contribute their own (moderated) content.

3. The structure is logical and as simple as possible, supported by clear signposting and search.

4. The design is contemporary, visually attractive but not obtrusive, incorporating consistent branding and a complementary colour scheme. There is no external advertising.

5. The layout makes generous and judicious use of space and images – and employs other media where appropriate.

6. Text is presented in small blocks and large fonts to ensure readability on both tablet and PC.

7. Content is substantial, diverse and includes material relevant to all the site’s key audiences.

8. New content is added weekly; older material is frequently archived (but remains accessible).

9. The site links consistently to – and is linked to consistently by – all other online and social media outlets maintained by the authors.

10. Readers can access site content by multiple routes, including other social media, RSS and email.

.

Draft quality criteria for wider online/social media activity

1. A body’s online and social media presence should be integral to its wider communications strategy which should, in turn, support its purpose, objectives and priorities.

2. It should:

a. Support existing users – whether they are learners, parents/carers, educators, policy-makers or academics – and help to attract new users;

b. Raise the entity’s profile and build its reputation – both nationally and internationally – as a first-rate provider in one or more of the five areas of gifted education;

c. Raise the profile of gifted education as an issue and support campaigning for stronger provision;

d. Help to generate income to support the pursuit of these objectives and the body’s continued existence.

3. It should aim to:

a. Provide a consistently higher quality and more compelling service than its main competitors, generating maximum benefit for minimum cost.

b. Use social media to strengthen interaction with and between users and provide more effective ‘bottom-up’ collaborative support.

c. Balance diversity and reach against manageability and effectiveness, prioritising media favoured by users but resisting pressure to diversify without justification and resource.

d. Keep the body’s online presence coherent and uncomplicated, with clear and consistent signposting so users can navigate quickly and easily between different online locations.

e. Integrate all elements of the body’s online presence, ensuring they are mutually supportive.

4. It should monitor carefully the preferences of users, as well as the development of online and social media services, adjusting the approach only when there is a proven business case for doing so.

.

Perth Pelicans by Gifted Phoenix

.

Applying the Criteria

These draft criteria reflect the compromise I outlined above. They are not the final word: I hope that you will help to refine them as part of the consultation process now underway. And I cannot emphasise too much that they are intended as guidelines, to be applied with some discretion.

I continue to maintain my inalienable right – as well as yours – to break any rules imposed by self-appointed arbiters of quality.

To give an example, readers will know that I am particularly exercised by any suggestion that good blog posts are, by definition, brief!

I also maintain your inalienable right to impose your own personal tastes and preferences alongside (or in place of) these criteria. But you might prefer to do so having reflected on the criteria – and having dismissed them for logical reasons.

There are also some fairly obvious limitations to these criteria.

For example, bloggers like me who use hosted platforms are constrained to some extent by the restrictions imposed by the host, as well as by our preparedness to pay for premium features.

Moreover, the elements of effective online and social media practice have been developed with a not-for-profit charity in mind and some in particular may not apply – or may not apply so rigorously – to other kinds of organisations, or to individuals engaged in similar activity.

In short, these are not templates to be followed slavishly, but rather a basis for reviewing existing provision and prompting discussion about how it might be further improved.

It would be forward of me to attempt a rigorous scrutiny of the six key players mentioned above against each of the criteria – or of any of the host of smaller players, including the 36 active gifted education blogs now listed on my blogroll.

I will confine myself instead to reporting factually all that I can find in the public domain about the activity of the six bodies, comparing and contrasting their approaches with broad reference to the criteria and arriving at an overall impressionistic judgement.

As for the blogs, I will be even more tactful, pointing out that my own quick and dirty self-review of this one – allocating a score out of ten for each of the ten items in the first set of criteria – generated a not very impressive 62%.
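For transparency, the arithmetic behind that figure is straightforward: ten criteria, each scored out of ten, summed to give a mark out of 100. A minimal Python sketch – the individual scores below are hypothetical, since only the 62% total appears in this post:

```python
# Hypothetical scores (each out of 10) for the ten items in the first set of
# draft criteria; only the 62/100 total is taken from the post itself.
scores = [7, 6, 7, 5, 6, 6, 7, 6, 5, 7]
assert len(scores) == 10 and all(0 <= s <= 10 for s in scores)

# Overall mark as a percentage of the maximum possible (10 points x 10 items).
percentage = 100 * sum(scores) / (10 * len(scores))
print(f"Self-review score: {percentage:.0f}%")  # prints "Self-review score: 62%"
```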

Of course I am biased. I still think my blog is better than yours, but now I have some useful pointers to how I might make it even better!

.

Comparing six major players

I wanted to compare the social media profile of the most prominent international organisations, the most active national organisations based in the US (which remains the dominant country in gifted education and in supporting gifted education online) and the two major national organisations in the UK.

I could have widened my reach to include many similar organisations around the world, but that would have made this post still less accessible. It also struck me that I could evidence my key messages through analysis of this small sample alone – and that my conclusions would apply equally to others in the field, wherever they are located.

My analysis focuses on these organisations’:

  • Principal websites, including any information they contain about their wider online and social media activity;
  • Profile across the five selected social media platforms and use of blogs plus the four featured curational tools.

I have confined myself to universally accessible material, since several of these organisations have additional material available only to their memberships.

I have included only what I understand to be official channels, tied explicitly to the main organisation. I have included accounts that are linked to franchised operations – typically conferences – but have excluded personal accounts that belong to individual employees or trustees of the organisations in question.

Table 1 below shows which of the six organisations are using which social media. The table includes hyperlinks to the principal accounts and I have also repeated these in the commentary that follows.

.

Table 1: The social media used by the sample of six organisations

            WCGTC   ECHA   SENG    NAGC   PPUK   NACE
Blog        No      No     [Yes]   No     No     No
Facebook    Yes     Yes    Yes     Yes    Yes    No
Google+     Yes     No     Yes     No     Yes    Yes
LinkedIn    Yes     No     Yes     No     Yes    No
Twitter     Yes     No     Yes     Yes    Yes    Yes
YouTube     Yes     No     Yes     Yes    No     Yes
Paper.li    Yes     No     No      No     No     No
Pinterest   No      No     No      Yes    Yes    No
Scoop.it    No      No     No      No     No     No
Storify     No      No     No      Yes    No     No

.

The table gives no information about the level or quality of activity on each account – that will be addressed in the commentary below – but it gives a broadly reliable indication of which organisations are comparatively active in social media and which are less so.

The analysis shows that Facebook and Twitter are somewhat more popular platforms than Google+, LinkedIn and You Tube, while Pinterest leads the way amongst the curational tools. This distribution of activity is broadly representative of the wider gifted education community.
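To make that claim checkable, the tallies implicit in Table 1 can be computed directly. A minimal Python sketch – the dictionary simply transcribes the table, with SENG’s bracketed [Yes] for its ‘Library of Articles’ counted as a blog:

```python
# Adoption flags transcribed from Table 1.
# Column order: WCGTC, ECHA, SENG, NAGC, PPUK, NACE (1 = Yes, 0 = No).
adoption = {
    "Blog":      [0, 0, 1, 0, 0, 0],  # SENG's Library of Articles counted as [Yes]
    "Facebook":  [1, 1, 1, 1, 1, 0],
    "Google+":   [1, 0, 1, 0, 1, 1],
    "LinkedIn":  [1, 0, 1, 0, 1, 0],
    "Twitter":   [1, 0, 1, 1, 1, 1],
    "YouTube":   [1, 0, 1, 1, 0, 1],
    "Paper.li":  [1, 0, 0, 0, 0, 0],
    "Pinterest": [0, 0, 0, 1, 1, 0],
    "Scoop.it":  [0, 0, 0, 0, 0, 0],
    "Storify":   [0, 0, 0, 1, 0, 0],
}

# Rank platforms by number of adopting organisations, most-used first.
by_platform = sorted(adoption.items(), key=lambda kv: -sum(kv[1]))
for platform, flags in by_platform:
    print(f"{platform:<10} {sum(flags)}/6")
```

Facebook and Twitter come out with five adopters each, Google+ and YouTube with four, LinkedIn with three, and Pinterest (two) ahead of the other curational tools – matching the reading above.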

The next section takes a closer look at this wider activity on each of the ten platforms and tools.

.

Comparing gifted-related activity on the ten selected platforms and tools

 .

Blogs

As far as I can establish, none of the six organisations currently maintains a blog. SENG does have what it describes as a Library of Articles, which is a blog to all intents and purposes – and Potential Plus UK is currently planning a blog.

Earlier this year I noticed that my blogroll was extremely out of date and that several of the blogs it contained were no longer active. I reviewed all the blogs I could find in the field and sought recommendations from others.

I imposed a rule to distinguish live blogs from those that are dead or dormant – they had to have published three or more relevant posts in the previous six months.

I also applied a slightly more subjective rule, in an effort to sift out those that had little relevance to anyone beyond the author (being cathartic diaries of sorts) and those that are entirely devoted to servicing a small local advocacy group.

I ended up with a long shortlist of 36 blogs, which now constitutes the revised blogroll in the right-hand column. Most are written in English but I have also included a couple of particularly active blogs in other languages.

The overall number of active blogs is broadly comparable with what I remember in 2010 when I first began, but the number of posts has probably fallen.

I don’t know to what extent this reflects changes in the overall number of active blogs and posts, either generically or in the field of education. In England there has been a marked renaissance in edublogging over the last twelve months, yet only three bloggers venture regularly into the territory of gifted education.

.

Facebook

Alongside Twitter, Facebook has the most active gifted education community.

There are dozens of Facebook Groups focused on giftedness and high ability. At the time of writing, the largest and most active are:

The Facebook Pages with the most ‘likes’ have been established by bodies located in the United States. The most favoured include:

There is a Gifted Phoenix page, which is rigged up to my Twitter account so all my tweets are relayed there. Only those with a relevant hashtag – #gtchat or #gtvoice – will be relevant to gifted education.

.

Google+

To date there is comparatively little activity on Google+, though many have established an initial foothold there.

Part of the problem is lack of familiarity with the platform, but another obstacle is the limited capacity to connect other parts of one’s social media footprint with one’s Google+ presence.

There is only one Google+ Community to speak of: ‘Gifted and Talented’ currently with 134 members.

A search reveals a large number of people and pages ostensibly relevant to gifted education, but few are useful and many are dormant.

Amongst the early adopters are:

My own Google+ page is dormant. It should now be possible to have WordPress.com blog posts appear automatically on a Google+ page, but the service seems unreliable, and there is no equivalent capacity to link Twitter and Google+. I am waiting for Google to improve the connectivity of its service.

.

LinkedIn

LinkedIn is also comparatively little used by the gifted education community. There are several groups:

But none is particularly active, despite the rather impressive numbers above. Similarly, a handful of organisations have company pages on LinkedIn, but only one or two are active.

The search purports to include a staggering 98,360 people who mention ‘gifted’ in their profiles, but basic account holders can only see 100 results at a time.

My own LinkedIn page is registered under my real name rather than my social media pseudonym and is focused principally on my consultancy activity. I often forget it exists.

 .

Twitter

By comparison, Twitter is much more lively.

My brief January post mentioned my Twitter list containing every user I could find who mentions gifted education (or a similar term, whether in English or a selection of other languages) in their profile.

The list currently contains 1,263 feeds. You are welcome to subscribe to it. If you want to see it in action first, it is embedded in the right-hand column of this Blog, just beneath the blogroll.

The majority of the gifted-related activity on Twitter takes place under the #gtchat hashtag, which tends to be busier than even the most popular Facebook pages.

This hashtag also accommodates an hour-long real-time chat every Friday (at around midnight UK time) and at least once a month on Sundays, at a time more conducive to European participants.

Other hashtags carrying information about gifted education include: #gtvoice (UK-relevant), #gtie (Ireland-relevant), #hoogbegaafd (Dutch-speaking), #altascapacidades (Spanish-speaking), #nagc and #gifteded.

Chats also take place on the #gtie and #nagc hashtags, though the latter may now be discontinued.

Several feeds provide gifted-relevant news and updates from around the world. Amongst the most followed are:

  • NAGC (4,240 followers)
  • SENG (2,709 followers)

Not forgetting Gifted Phoenix (5,008 followers) who publishes gifted-relevant material under the #gtchat (globally relevant material) and #gtvoice (UK-relevant material) hashtags.

.

Map of Gifted Phoenix’s Twitter Followers March 2014

.

YouTube

YouTube is of course primarily an audio-visual channel, so it tends to be used to store public presentations and commercials.

A search on ‘gifted education’ generates some 318,000 results including 167,000 videos and 123,000 channels, but it is hard to see the wood for the trees.

The most viewed videos and the most used channels are an eclectic mix and vary tremendously in quality.

Honourable mention should be made of:

The most viewed video is called ‘Top 10 Myths in Gifted Education’, a dramatised presentation which was uploaded in March 2010 by the Gifted and Talented Association of Montgomery County. This has had almost 70,000 views.

Gifted Phoenix does not have a YouTube presence.

.

Paper.li

Paper.li describes itself as ‘a content curation service’ which ‘enables people to publish newspapers based on topics they like and treat their readers to fresh news, daily.’

It enables curators to draw on material from Facebook, Twitter, Google+, embeddable YouTube videos and websites via RSS feeds.

In September 2013 it reported 3.7m users each month.

I found six gifted-relevant ‘papers’ with over 1,000 subscriptions:

There is, as yet, no Gifted Phoenix presence on paper.li, though I have been minded for some months to give it a try.

.

Pinterest

Pinterest is built around a pinboard concept. Pins are illustrated bookmarks designating something found online or already on Pinterest, while Boards are used to organise a collection of pins. Users can follow each other and others’ boards.

Pinterest is said to have 70 million users, of whom 80% are female.

A search on ‘gifted education’ reveals hundreds of boards dedicated to the topic, but unfortunately there is no obvious way to rank them by number of followers or number of pins.

Since advanced search capability is conspicuous by its absence, the user apparently has little choice but to sift laboriously through each board. I have not undertaken this task so I can bring you no useful information about the most used and most popular boards.

Judging by the names attached to these boards, they are owned almost exclusively by women. It is interesting to hypothesise about what causes this gender imbalance – and whether Pinterest is actively pursuing female users at the expense of males.

There are, however, some organisations in the field making active use of Pinterest. A search of ‘pinners’ suggests that amongst the most popular are:

  • IAGC Gifted which has 26 boards, 734 pins and 400 followers.

Gifted Phoenix is male and does not have a presence on Pinterest…yet!

 .

Scoop.it

Scoop.it stores material on a page somewhere between a paper.li-style newspaper and a Pinterest-style board. It is reported to have almost seven million unique visitors each month.

‘Scoopable’ material is drawn together via URLs, a programmable ‘suggestions engine’ and other social media, including all the ‘big four’. However, the free version permits a user to link only two social media accounts, which places significant restrictions on Scoop.it’s curational capacity.

Scoop.it also has limited search engine capability. It is straightforward to conduct an elementary search like this one on ‘gifted’ which reveals 107 users.

There is no quick way of finding those pages that are most used or most followed, but one can hover over the search results for topics to find out which have most views:

Gifted Phoenix has a Scoop.it topic which is still very much a work in progress.

.

Storify

Storify is a slightly different animal to the other three tools. It describes itself as:

‘the leading social storytelling platform, enabling users to easily collect tweets, photos, videos and media from across the web to create stories that can be embedded on any website.  With Storify, anyone can curate stories from the social web to embed on their own site and share on the Storify platform.’

Estimates of user numbers vary, but typically range from 850,000 to 1m.

Storify is a flexible tool whose free service permits one to collect material already located on the platform and from a range of other sources including Twitter, Facebook, YouTube, Flickr, Instagram, Google search and Tumblr – or via RSS or URL.

The downside is that there is no way to search within Storify for stories or users, so one cannot provide information about the level of activity or users that it might be helpful to follow.

However, a Google search reveals that users of Storify include:

  • IGGY with 9 followers

These tiny numbers show that Storify has not really taken off as a curational platform in its own right, though it is an excellent supporting tool, particularly for recording transcripts of Twitter chats.

Gifted Phoenix has a Storify profile and uses the service occasionally.

 .

The Cold Shoulder in Perth Zoo by Gifted Phoenix


.

Comparing the six organisations

So, having reviewed wider gifted education-related activity on these ten social media platforms and tools, it is time to revisit the online and social media profile of the six selected organisations.

.

World Council

The WCGTC website was revised in 2012 and has a clear and contemporary design.

The Council’s Mission Statement has a strong networking feel to it and elsewhere the website emphasises the networking benefits associated with membership:

‘…But while we’re known for our biennial conference the spirit of sharing actually goes on year round among our membership.

By joining the World Council you can become part of this vital network and have access to hundreds of other peers while learning about the latest developments in the field of gifted children.’

The home page includes direct links to the organisation’s Facebook Page and Twitter feed. There is also an RSS feed symbol but it is not active.

Both Twitter and Facebook are of course available to members and non-members alike.

At the time of writing, the Facebook page has 1,616 ‘likes’ and is relatively current, with five posts in the last month, though these attract little comment.

The Twitter feed typically manages a daily Tweet. Hashtags are infrequently if ever employed. At the time of writing the feed has 1,076 followers.

Almost all the Tweets are links to a daily paper.li production ‘WCGTC Daily’ which was first published in late July 2013, just before the last biennial conference. This has 376 subscribers at the present time, although the gifted education coverage is selective and limited.

However, the Council’s most recent biennial conference was unusual in making extensive use of social media. It placed photographs on Flickr, videos of keynotes on YouTube and podcasts of keynotes on Mixlr.

There was also a Blog – International Year of Giftedness and Creativity – which was busy in the weeks immediately preceding the Conference, but has not been active since.

There are early signs that the 2015 Conference will also make strong use of social media. In addition to its own website, it already has its own presence on Twitter and Facebook.

One of the strands of the 2015 Conference is:

‘Online collaboration

  • Setting the stage for future sharing of information
  • E-networking
  • E-learning options’

And one of the sponsors is a social media company.

As noted above, the World Council website provides links to two of its six strands of social media activity, but not the remaining four. It is not yet serving as an effective hub for the full range of this activity.

Some of the strands link together well – eg Twitter to paper.li – but there is considerable scope to improve the incidence and frequency of cross-referencing.

.

ECHA

Of the six organisations in this sample, ECHA is comfortably the least active in social media with only a Facebook page available to supplement its website.

The site itself is rather old-fashioned and could do with a refresh. It includes a section ‘Introducing ECHA’ which emphasises the organisation’s networking role:

‘The major goal of ECHA is to act as a communications network to promote the exchange of information among people interested in high ability – educators, researchers, psychologists, parents and the highly able themselves. As the ECHA network grows, provision for highly able people improves and these improvements are beneficial to all members of society.’

This is reinforced in a parallel Message from the President.

There is no reference on the website to the Facebook group, which is closed, though membership is not confined solely to ECHA members. There are currently 191 members. The group is fairly active, but does not rival the far larger groups listed above.

There’s not much evidence of cross-reference between the Facebook group and the website, but that may be because the website is infrequently updated.

As with the World Council, ECHA conferences have their own social media profile.

At the 2012 Conference in Münster this was left largely to the delegates. Several of us live-tweeted the event.

I blogged about the Conference and my part in it, providing links to transcripts of the Twitter record. The post concluded with a series of learning points for this year’s ECHA Conference in Slovenia.

The Conference website explains that the theme of the 2014 event is ‘Rethinking Giftedness: Giftedness in the Digital Age’.

Six months ahead of the event, there is a Twitter feed with 29 followers that has been dormant for three months at the time of writing and a LinkedIn group with 47 members that has been quiet for five months.

A Forum was also established which has not been used for over a year. There is no information on the website about how the event will be supported by social media.

I sincerely hope that my low expectations will not be fulfilled!

.

SENG

SENG is far more active across social media. Its website carries a 2012 copyright notice and has a more contemporary feel than many of the others in this sample.

The bottom of the home page extends an invitation to ‘connect with the SENG community’ and carries links to Facebook, Twitter and LinkedIn (though not to Google+ or YouTube).

In addition, each page carries a set of buttons to support the sharing of this information across a wide range of social media.

The organisation’s Strategic Plan 2012-2017 makes only fleeting reference to social media, in relation to creating a ‘SENG Liaison Facebook page’ to facilitate inter-state and international support.

It does, however, devote one of its nine goals to the further development of its webinar programme (each webinar costs $40 to access; non-participants can purchase a recording for $40).

SENG offers online parent support groups but does not state which platform is used to host these. It has a Technology/Social Media Committee but its proceedings are not openly available.

Reference has already been made above to the principal Facebook Page which is popular, featuring posts on most days and a fair amount of interaction from readers.

The parallel group for SENG Liaisons is also in place, but is closed to outsiders, which rather seems to defeat the object.

The SENG Twitter feed is relatively well followed and active on most days. The LinkedIn page is somewhat less active but can boast 142 followers while Google+ is clearly a new addition to the fold.

The YouTube channel, however, has 257 subscribers and carries 16 videos, most of them featuring presentations by James Webb. Rather strangely, these do not seem to feature in the media library carried on the website.

SENG is largely a voluntary organisation with little staff resource, but it is successfully using social media to extend its footprint and global influence. There is, however, scope to improve coherence and co-ordination.

.

National Association for Gifted Children

The NAGC’s website is also in some need of refreshment. Its copyright notice dates from 2008, which was probably when it was designed.

There are no links to social media on the home page but ‘NAGC at a glance’ carries a direct link to the Facebook group and a Twitter logo without a link, while the page listing NAGC staff has working links to both Facebook and Twitter.

In the past, NAGC has been more active in this field.

There was for a time a Parenting High Potential Blog but the site is now marked private.

NAGC’s Storify account contains the transcripts of six Twitter chats conducted under the hashtag #nagcchat between June and August 2012. These were hosted by NAGC’s Parent Outreach Specialist.

But, by November 2012 I was tweeting:

.

.

And in February 2013:

.

.

This post was filled by July 2013. The postholder seems to have been concentrating primarily on editing the magazine edition of Parenting High Potential, which is confined to members only (but also has a Facebook presence – see below).

NAGC’s website carries a document called ‘NAGC leadership initiatives 2013-14’ which suggests further developments in the next few months.

The initiatives include:

‘Leverage content to intentionally connect NAGC resources, products and programs to targeted audiences through an organization-wide social media strategy.’

and

‘Implement a new website and membership database that integrates with social media and provides a state-of-the-art user interface.’

One might expect NAGC to build on its current social media profile which features:

  • A Facebook Group which currently has 2,420 members and is reasonably active, though not markedly so. Relatively few posts generate significant comments.
  • A Twitter feed boasting an impressive 4,287 followers. Tweets are published on a fairly regular basis.

There is additional activity associated with the Annual NAGC Convention. There was extensive live tweeting from the 2013 Convention under the rival hashtags #NAGC2013 and #NAGC13. #NAGC14 looks the favourite for this year’s Convention, which has also established a Facebook presence.

NAGC also has its own networks. The website lists 15 of these but hardly any of their pages give details of their social media activity. A cursory review reveals that:

Overall, NAGC has a fairly impressive array of social media activity but demonstrates relatively little evidence of strategic coherence and co-ordination. This may be expected to improve in the next six months, however.

.

NACE

NACE is not quite the poorest performer in our sample but, like ECHA, it has so far made relatively little progress towards effective engagement with social media.

Its website dates from 2010 but looks older. Prominent links to Twitter and Facebook appear on the front page as well as – joy of joys – an RSS feed.

However, the Facebook link is not to a NACE-specific page or group and the RSS feed doesn’t work.

There are references on the website to the networking benefits of NACE membership, but not to any role for the organisation in wider networking activity via social media. Current efforts seem focused primarily on advertising NACE and its services to prospective members and purchasers.

The Twitter feed has a respectable 1,426 followers but Tweets tend to appear in blocks of three or four spaced a few days apart. Quality and relevance are variable.

The Google+ page and YouTube channel contain the same two resources, posted last November.

There is much room for improvement.

.

Potential Plus UK

All of which brings us back to Potential Plus and the work I have been supporting to strengthen its online and social media presence.

.

Current Profile

Potential Plus’s current social media profile is respectably diverse but somewhat lacking in coherence.

The website is old-fashioned. There is a working link to Facebook on the home page, but this takes readers to the old NAGC Britain page which is no longer used, rather than directing them to the new Potential Plus UK page.

Whereas the old Facebook page had reached 1,344 likes, the new one is currently at roughly half that level – 683 – but the level of activity is reasonably impressive.

There is a third Facebook page dedicated to the organisation’s ‘It’s Alright to Be Bright’ campaign, which is not quite dormant.

All website pages carry buttons supporting information-sharing via a wide range of social media outlets. But there is little reference in the website content to its wider social media activity.

The Twitter feed is fairly lively, boasting 1,093 followers. It currently has some 330 fewer followers than NACE but has published about 700 more Tweets. Both are publishing at about the same rate. Quality and relevance are similarly variable.

The LinkedIn page is little more than a marker and does not list the products offered.

The Google+ presence uses the former NAGC Britain name and is also no more than a marker.

But the level of activity on Pinterest is more significant. There are 14 boards, together containing 271 pins and attracting 26 followers. This material has been uploaded during 2014.

There is at present no substantive blog activity, although the stub of an old wordpress.com site still exists and there is also a parallel stub of an old wordpress.com children’s area.

There are no links to any of these services from the website – nor do these services link clearly and prominently with each other.

.

Future Strategy

The new wordpress.com test site sets out our plans for Potential Plus UK, which have been shaped in accordance with the two sets of draft success criteria above.

The purpose of the project is to help the organisation to:

  • improve how it communicates and engages with its different audiences clearly and effectively
  • improve support for members and benefit all its stakeholder groups
  • provide a consistently higher quality and more compelling service than its main competitors, generating maximum benefit for minimum cost

Subject to consultation and if all goes well, the outcome will be:

  • A children’s website on wordpress.org
  • A members’ and stakeholders’ website on wordpress.com (which may transfer to wordpress.org in due course)
  • A new forum and a new ‘bottom-up’ approach to support that marries curation and collaboration and
  • A coherent social media strategy that integrates these elements and meets audiences’ needs while remaining manageable for PPUK staff.

You can help us to develop this strategy by responding to the consultation here by Friday 18 April.

.

La Palma Panorama by Gifted Phoenix


.

Conclusion

.

Gifted Phoenix

I shall begin by reflecting on Gifted Phoenix’s profile across the ten elements included in this analysis:

  • He has what he believes is a reasonable Blog.
  • He is one of the leading authorities on gifted education on Twitter (if not the leading authority).
  • His Facebook profile consists almost exclusively of ‘repeats’ from his Twitter feed.
  • His LinkedIn page reflects a different identity and is not connected properly to the rest of his profile.
  • His Google+ presence is embryonic.
  • He has used Scoop.it and Storify to some extent, but not Paper.li or Pinterest.

GP currently has a rather small social media footprint, since he is concentrating on doing only two things – blogging and microblogging – effectively.

He might be advised to extend his sphere of influence by distributing the limited available human resource more equitably across the range of available media.

On the other hand he is an individual with no organisational objectives to satisfy. Fundamentally he can follow his own preferences and inclinations.

Maybe he should experiment with this post, publishing it as widely as possible and monitoring the impact via his blog analytics…

.

The Six Organisations

There is a strong correlation between the size of each organisation’s social media footprint and the effectiveness with which it uses social media.

There are no obvious examples – in this sample at least – of organisations that have a small footprint because of a deliberate choice to specialise in a narrow range of media.

If we were to rank the six in order of effectiveness, the World Council, NAGC and SENG would be vying for top place, while ECHA and NACE would be competing for bottom place and Potential Plus UK would be somewhere in the middle.

But none of the six organisations would achieve more than a moderate assessment against the two sets of quality criteria. All of them have huge scope for improvement.

Their priorities will vary, according to what is set out in their underlying social media strategies. (If they have no social media strategy, the obvious priority is to develop one, or to revise it if it is outdated.)

.

The Overall Picture across the Five Aspects of Gifted Education

This analysis has been based on the activities of a small sample of six generalist organisations in the gifted education field, as well as wider activity involving a cross-section of tools and platforms.

It has not considered providers who specialise in one of the five aspects – advocacy, learning, professional development, policy-making and research – or the use being made of specialist social media, such as MOOCs and research tools.

So the judgements that follow are necessarily approximate. But nothing I have seen across the wider spectrum of social media over the past 18 months would seriously call into question the conclusions reached below.

  • Advocacy via social media is slightly stronger than it was in 2012 but there is still much insularity and too little progress has been made towards a joined up global movement. The international organisations remain fundamentally inward-looking and have been unable to offer the leadership and sense of direction required.  The grip of the old guard has been loosened and some of the cliquey atmosphere has dissipated, but academic research remains the dominant culture.
  • Learning via social media remains limited. There are still several niche providers but none has broken through in a global sense. The scope for fruitful partnership between gifted education interests and one or more of the emerging MOOC powerhouses remains unfulfilled. The potential for social media to support coherent and targeted blended learning solutions – and to support collaborative learning amongst gifted learners worldwide – is still largely unexploited.
  • Professional development via social media has been developed at a comparatively modest level by several providers, but the prevailing tendency seems to be to regard this as a ‘cash cow’ generating income to support other activities. There has been negligible progress towards securing the benefits that would accrue from systematic international collaboration.
  • Policy-making via social media is still the poor relation. The significance of policy-making (and of policy makers) within gifted education is little appreciated and little understood. What engagement there is seems focused disproportionately on lobbying politicians, rather than on developing, at working level, practical solutions to the policy problems that so many countries face in common.
  • Research via social media is negligible. The vast majority of academic researchers in the field are still caught in a 20th Century paradigm built around publication in paywalled journals and a perpetual round of face-to-face conferences. I have not seen any significant examples of collaboration between researchers. A few make a real effort to convey key research findings through social media but most do not. Some of NAGC’s networks are beginning to make progress and the 2013 World Conference went further than any of its predecessors in sharing proceedings with those who could not attend. Now the pressure is on the EU Talent Conference in Budapest and ECHA 2014 in Slovenia to push beyond this new standard.

Overall progress has been limited and rather disappointing. The three conclusions I drew in 2012 remain valid.

In September 2012 I concluded that ‘rapid acceleration is necessary otherwise gifted education will be left behind’. Eighteen months on, there are some indications of slowly gathering speed, but the gap between practice in gifted education and leading practice has widened meanwhile – and the chances of closing it seem increasingly remote.

Back in 2010 and 2011 several of my posts had an optimistic ring. It seemed then that there was an opportunity to ‘only connect’ globally, but also at European level via the EU Talent Centre and in the UK via GT Voice. But both those initiatives are faltering.

My 2012 post also finished on an optimistic note:

‘Moreover, social media can make a substantial and lasting contribution to the scope, value and quality of gifted education, to the benefit of all stakeholders, but ultimately for the collective good of gifted learners.

No, ‘can’ is too cautious, non-assertive, unambitious. Let’s go for WILL instead!’

Now in 2014 I am resigned to the fact that there will be no great leap forward. The very best we can hope for is disjointed incremental improvement achieved through competition rather than collaboration.

I will be doing my best for Potential Plus UK. Now what about you?

.

GP

March 2014

A Brief Discussion about Gifted Labelling and its Permanency

.

Some of my readership may be interested in this Twitter exchange with Ellen Spencer, a researcher at the Centre for Real-World Learning, the Claxton-Lucas vehicle based at the University of Winchester.

The sequence of Tweets is embedded below (scroll down to the bottom for the start).

.

We discussed the issue of labelling gifted learners and the idea that such labels may not be permanent sifting devices, but temporary markers attached to such learners only while they need additional challenge and support.

This is not to deny that some gifted learners may warrant a permanent marker, but it does imply that many – probably most – will move in and out of scope as they develop in non-linear fashion and at different rates from their peers.

Of course much depends on one’s understanding of giftedness and gifted education, a topic I have addressed frequently, starting with my inaugural post in May 2010.

Three-and-a-half years on, it seems to me that the default position has shifted somewhat further towards the Nurture, Equity and Personalisation polarities.

But the notion of giftedness as dynamic in both directions – with learners shifting in and out of scope as they develop – may be an exception to that broader direction of travel.

Of course there’s been heavy emphasis on movement into scope (the broader notion of giftedness as learned behaviour and achievable through effort) but very little attention given to progress in the opposite direction.

It is easy to understand how this would be a red rag to several bulls in the gifted education field, while outward movement raises difficult questions for everybody – whether or not advocates for gifted education – about communication and management of self-esteem.

But reform and provocation are often stalwart bedfellows. Feel free to vent your spleen in the comments section below.

.

GP

February 2014

Gifted Education Activity in the Blogosphere and on Twitter

.

I have been doing some groundwork for an impending analysis of the coverage of gifted education (and related issues) in social media – and reflecting on how that has changed in the four years I have been involved.

As a first step I revised my Blogroll (normally found in the right hand margin, immediately below the Archives).

I decided to include only Blogs that have published three or more relevant posts in the last six months – and came up with the following list of 23, which I have placed in alphabetical order.

.

Begabungs

Belin-Blank Center

Distilling G and T Ideas

Dona Matthews

Gifted and Talented Ireland

Gifted Challenges

Gifted Education Perspectives

Gifted Exchange

Gifted Parenting Support

Global #gtchat powered by TAGT

headguruteacher  (posts tagged #gtvoice)

Irish Gifted Education Blog

Krummelurebloggen

Laughing at Chaos

Living the Life Fantastic

Ramblings of a Gifted Teacher

smarte barn

Talent Igniter

Talent Talk

Talento y Educacion

The Deep End

The Prufrock Press Blog

Unwrapping the Gifted

WeAreGifted2

.

This is rather a short list, which might suggest a significant falling off of blogging activity since 2010. I had to delete the majority of the entries in the previous version of the Blogroll because they were dormant or dead.

But I might have missed some deserving blogs, particularly in other languages. Most on this list are written in English.

If you have other candidates for inclusion do please suggest them through the comments facility below, or pass them on via Twitter.

You may have views about the quantity and quality of blogging activity – and whether there is an issue here that needs to be addressed. Certainly the apparent decline in gifted education blogging comes at a time when edublogging in England has never been more popular. Perhaps you have ideas for stimulating more posts.

On the other hand, you might take the view that blogging is increasingly irrelevant, given the inexorable rise of microblogging – aka Twitter – and the continued popularity of Facebook, let alone the long list of alternatives.

Speaking of Twitter, I thought it might be an interesting exercise to compile a public list of every feed I could find that references gifted education (or an equivalent term, whether in English or another language) in its profile.

The full list – which you can find at https://twitter.com/GiftedPhoenix/lists/gifted-education – contains 1,245 members at present.

I have embedded the timeline below, and you can also find it in the right hand margin, immediately below the Blogroll.

.

The list includes some leading academic authorities on the subject, but is dominated by gifted education teachers and the parents of gifted learners, probably in roughly equal measure.

The clear majority is based in the United States, but there is a particularly strong community in the Netherlands and reasonable representation in Australia, Canada, Spain and the UK. Several other countries are more sparsely represented.

(One authority – who shall remain nameless – has unaccountably blocked me, which prevents his inclusion in the list. But he has only produced eight tweets, the most recent over a year old, so I suppose he is no great loss.)

I cannot compare this with earlier lists, but it feels as though there has been a significant expansion of the gifted Twittersphere since I began in 2010.

That said, I have no information yet about how many of the feeds are active – and just how active they are.

If I have inadvertently omitted you from the list, please Tweet to let me know. Please feel free to make use of the list as you wish, or to offer suggestions for how I might use it.

There will be further segmented lists in due course.

 

Postscript 13 January:

Many thanks for your really positive response. The blogroll now has 34 entries…and there’s always room for more.

If you’d like to subscribe to the Twitter list but are not sure how, here’s Twitter’s guide (see bottom of page).

If you’re not on the list but would like to be, please either follow me (making sure there’s a reference to gifted or similar in your profile) or send me a tweet requesting to be added.

You can follow or tweet me direct from this blog by going to the ‘Gifted Phoenix on Twitter’ embed in the right hand column.

 

.

GP

January 2014

Gifted Phoenix’s 2013 Review and Retrospective

.

This final post of 2013 takes a reflective look back at this year’s activity.


One purpose is straightforward self-congratulation – a self-administered pat on the back for all my hard work!

This is also an opportunity to review the bigger picture, to reflect on the achievements and disappointments of the year now ending and to consider the prospects for 2014 and beyond.

Perhaps I can also get one or two things off my chest…

…So, by way of an aside, let me mention here that I provide this information to you entirely free of charge, partly because I believe that global progress in (gifted) education is obstructed by the rationing of knowledge, partly to encourage those who construct and shelter behind paywalls to reflect on the negative consequences of their behaviour.

I try my best to offer you a factual, balanced and objective assessment, to flag up weaknesses as well as strengths. In short, I tell it like it is. I have no interest in self-aggrandisement, in reputation or the trappings of academia. You will search in vain for those trappings in my CV, but I speak and write with commensurate authority, based on extended experience as a national policy maker and student of the field …

Another purpose is to provide an annotated list of my posts, so that readers can catch up with anything they missed.

I make this my 35th post of 2013, five fewer than I managed in 2012. I took an extended break during August and September this year, half of it spent on tour in Western Australia and the remainder engaged on other projects.

During the course of the year I’ve made a conscious effort simultaneously to narrow and diversify my focus.

I’ve devoted around two-thirds of my posts to educational reform here in England, while the remainder continued to address global issues.

Some of the Anglocentric posts were intended to draw out the wider implications of these reforms, rather than confining themselves exclusively to gifted education and the impact on gifted learners.

I wanted to paint on a broader canvas. It is all too easy to exist in a gifted education ghetto, forgetting that it must be integral to our national educational systems as well as a global endeavour in its own right.

 .

Global Gifted Education

During 2013 I published two feature-length posts about the performance of high achievers in international comparison studies:

Like it or not, these international tests are becoming increasingly influential in most countries around the world. Those involved in gifted education ignore them at their peril.

Many of the countries that top the rankings already invest significantly in gifted education – and some of those that do not (invest significantly and/or top the rankings) ought seriously to consider this as a potential route to further improvement.

Other posts with a global gifted focus include:

My best effort at a personal credo, derived from the experience of writing this Blog. Colleagues were very flattering.

.

I supplemented the post with a vision for delivery, primarily to inform UK-based discussion within GT Voice, but also relevant to Europe (the EU Talent Centre) and globally (the World Council).

I took a second look at this nascent field, exploring developments since I first blogged about it in 2010. I like to flatter myself that I invented the term.

The post tells of the passing interest exhibited by IRATDE and notes the reference in the July 2012 World Council Newsletter to a special issue of Gifted and Talented International (GTI) that will be devoted to the topic.

I heard in May that an unnamed specialist had been invited to prepare a ‘target paper’, but nothing has materialised to date. The wheels of academic publishing turn parlous slow.

I concluded the post with a tongue-in-cheek contribution of my own – the Gifted Phoenix Equation!

Minimising the Excellence Gap and Optimising the Smart Fraction maximises impact on Economic Growth (Min EG + Optimal SF = Max EG)

This post opened with a self-confessed rant about the ‘closed shop’ operated by academics in the field, defended by research paywalls and conference keynote monopolies.

But I set aside my prejudices to review the nine leading academic journals in gifted education, examine the rights the publishers offer their authors and offer a constructive set of proposals for improving the accessibility of research.

There were also a handful of new national studies:

the last of which is strictly a transatlantic study of support for low income high ability students, developed from analysis of the US NAGC publication of the same name.

.

Gifted Education in England

Two posts examined material within England’s national school performance tables relating to high attainment and high attainers.

The latter is the second such analysis I have provided, following one on the 2012 Tables published last December. The former will be supplanted by a new version when the Secondary Tables are published in January.

I also offered a detailed treatment of the underlying accountability issues in:

These posts explored the rather haphazard treatment now afforded ‘the most able students’ in documents supporting the School Inspection Framework, as well as the different definitions deployed in the Performance Tables and how these might change as a consequence of the trio of accountability consultations launched this year.

.

During the Spring I wrote:

Despite the Government’s reported intention to establish a national network of up to twelve of these, still only two have been announced – sponsored by King’s College London and Exeter University respectively.

I might devote a 2014 post to updating my progress report.

There was also a special mini-series, corralled under the speculatively optimistic title: ‘A Summer of Love for Gifted Education?’

This is fundamentally a trilogy:

The original conceit had been to build each episode around a key publication expected during the year. Episodes One and Two fitted this description but the third, an ‘Investigation of school- and college- level strategies to raise the Aspirations of High-Achieving Disadvantaged Pupils to pursue higher education’ was (is still) overdue, so I had to adjust the focus.

Episode Two was a particularly rigorous examination of the Ofsted report that led to the changes to the inspection documentation.

.

In Episode Three, I took the opportunity to expose some questionable use of statistics on the part of selective universities and their representative bodies, setting out a 10-point plan to strengthen the representation of disadvantaged students at Oxford and Cambridge. This was accompanied by a flying pig.

.

There were also some supplementary posts associated with the Summer of Love:

And some material I produced at the time that Ofsted published ‘The Most Able Students’:

Did it turn out to be a ‘Summer of Love’? Looking back now, I have mixed feelings. Significant attention was paid to meeting the needs of high attaining learners, and those needs are likely to be better recognised and responded to as a consequence.

But the response, such as it is, relies almost exclusively on the accountability system. There is still a desperate need for authoritative updated national framework guidance. Ideally this should be developed by the national gifted education community, working collaboratively with government seed funding.

But the community shows little sign of readiness to take on that responsibility. Collaboration is virtually non-existent:  GT Voice has failed thus far to make any impact (justifying my decision to stand down from the board in protest at frustratingly slow progress).

Meanwhile, several players are pursuing their own diverse agendas. Most are prioritising income generation, either to survive or simply for commercial gain. Everyone is protecting their corner. Too many scores are being settled. Quality suffers.

For completeness, I should also mention a couple of shorter posts:

a piece I wrote for another publisher about how free schools might be rolled into this national collaborative effort, and

which was my best effort to summarise the ‘current state’ on the other side of Ofsted’s Report, as well as an alternative future vision, avoiding the Scylla of top-down centralised prescription and the Charybdis of bottom-up diffused autonomy.

 

Wider English Educational Reform

Almost all the posts I have written within this category are associated with emerging national policy on curriculum and assessment:

.

There was even

which I still expect to see in a manifesto come 2015!

As things stand, there are still many unanswered questions, not least where Labour stands on these issues.

Only one of three accountability consultations has so far received a Government response. The response to the primary consultation – comfortably the least persuasive of the three – was due in ‘the autumn’ but hadn’t appeared by Christmas.

The decision to remove National Curriculum levels looks set to have several unintended negative consequences, not least HMCI Wilshaw’s recent call for the reintroduction of national testing at KS1 and KS3.

I am still to be persuaded that this decision is in the best interest of high attainers.

 

Social Media

This year I have spent more time tweeting and less time producing round-ups of my Twitter activity.

At the time of writing, my follower count has reached 4,660 and I have published something approaching 18,700 Tweets on educational topics.

I try to inform my readers about wider developments in UK (especially English) education policy, keeping a particularly close eye on material published by the Government and by Parliament.

I continue to use #gtchat (global) and #gtvoice (UK) to hashtag material on gifted education and related issues. I look out particularly for news about developments worldwide. I publish material that seems interesting or relevant, even though I might disagree with it. I try to avoid promotional material or anything that is trying to sell you something.

I began 2013 intending to produce round-ups on ‘a quarterly-cum-termly basis’ but have managed only two editions:

The next volume is already overdue but I simply can’t face the grinding effort involved in the compilation process. I may not continue with this sequence in 2014.

I was also invited to answer the question:

ResearchED was a conference organised via Twitter which took place in September.

The post argued for a national network of UK education bloggers. This hasn’t materialised, although the status and profile of edublogging has improved dramatically during 2013, partly as a consequence of the interest taken by Michael Gove.

There are many more blogs and posts than a year ago, several co-ordinated through Blogsync and/or reblogged via The Echo Chamber.

Precious few bloggers enter the field of gifted education, though honourable mentions must go to Distilling G&T Ideas and Headguruteacher.

Elsewhere in the world, not too many gifted education bloggers are still generating a constant flow of material.

Exceptions include Lisa Conrad, who maintains two blogs in the US: Gifted Parenting Support and Global #gtchat Powered by TAGT. There are also Kari Kolberg, who produces Krummelurebloggen (in Norwegian), and Javier Touron, who writes Talento y Educacion (in Spanish).

I need urgently to revisit my Blogroll. I might also write a post about the general state of global gifted education blogging in the early part of 2014.

 

Reference

I have made only limited progress this year with the reference pages on this Blog:

  • Who’s Who?  remains embryonic. I had plans to force myself to produce a handful of entries each day, but managed only two days in succession! There isn’t a great deal of intellectual challenge in this process – life may be too short!
  • Key Documents is a mixed bag. The UK pages are fully stocked. You should be able to find every significant national publication since 2000. The Rest of the World section is still largely empty.

Rightly or wrongly, the production of blog posts is taking priority.

 

Analytics

Compared with 2012, the number of page views has increased by over 30%, although the number of posts is down by 12.5%. I’m happy with that.

Some 40% of views originate in the UK. Other countries displaying significant interest include the US, Singapore, Australia, India, Hong Kong, Saudi Arabia, New Zealand, Canada and Spain. Altogether there have been visits from 169 countries.

The most popular posts published this year are, in order of popularity:

  • Whither National Curriculum Assessment Without Levels?
  • What the KS2/KS4 Transition Matrices Show About High Attainers’ Performance
  • High Attaining Students in the 2012 Secondary School Performance Tables
  • Analysis of the Primary Assessment and Accountability Consultation Document and
  • A Summer of Love for English Gifted Education Episode 2: Ofsted’s ‘The Most Able Students’

.

Visuals

I have changed the theme of my Blog twice this year – initially to Zoren and more recently to Highwind. I wanted a clearer, spacier look and a bigger font.

During the course of the year I have alternated between using my photographs within posts and producing work that is largely free of illustration. I have mixed feelings about this.

It seems somehow incongruous to intersperse unrelated photographs within a post about educational matters, but the stock of education-relevant non-copyrighted illustration is severely limited. Then again, screeds of unbroken text can be rather dreary to the eye.

So readers can expect some more views of Western Australia (especially) during 2014! Here’s one to whet your appetite.

.

Flora 2 by Gifted Phoenix


 

The Future

I close 2013 in a pessimistic mood. Despite the more favourable domestic policy climate, I am markedly less optimistic about the future of gifted education than I was at the start of the year.

Disillusion is setting in, reinforced by negligible progress towards the objectives I hold most dear.

The ‘closed shop’ is as determinedly closed as ever; vested interests are shored up; governance is weak. There is fragmentation and vacuum where there should be inclusive collaboration for the benefit of learners. Too many are on the outside, looking in. Too many on the inside are superannuated and devoid of fresh ideas.

Every so often I witness dispiriting egotism, duplicity or even vengefulness. Disagreements fester because one or both of the parties is unwilling to work towards resolution.

The world of gifted education is often not a happy place – and while it remains that way there is no real prospect of achieving significant improvements in the education and life chances of gifted learners.

To mix some metaphors, it may soon be time to cut my losses, stop flogging this moribund horse and do something else instead.

Happy New Year!

.

GP

December 2013

PISA 2012: International Comparison of High Achievers’ Performance

.

This post examines what PISA 2012 can tell us about the comparative performance of high achievers in England, other English-speaking countries and those that top the PISA rankings.

Introductory Brochure for PISA 2012 by Kristjan Paur


It draws on a similar range of evidence to that deployed in my post on the PISA 2009 results (December 2010).

A more recent piece, ‘The Performance of Gifted High Achievers in TIMSS, PIRLS and PISA’ (January 2013) is also relevant.

The post reviews:

  • How the PISA 2012 Assessment Framework defines reading, mathematical and scientific literacy and its definitions of high achievement in each of the three core domains.
  • How average (headline) performance on the three core measures has changed in each jurisdiction compared with PISA 2006 and PISA 2009.
  • By comparison, how high achievers’ performance – and the balance between high and low achievers’ performance – has changed in each jurisdiction over the same period.
  • How jurisdictions compare on the ‘all-rounder’ measure, derived from achievement of a high performance threshold on all three assessments.

The twelve jurisdictions included in the main analysis are: Australia, Canada, England, Finland, Hong Kong (China), Ireland, New Zealand, Shanghai (China), Singapore, South Korea, Taiwan and the USA.

The post also compares the performance of the five home countries against the high achievement thresholds. I have foregrounded this analysis, which appears immediately below, preceded only by the headline (but potentially misleading) ‘top 10’ high achiever rankings for 2012.

.

Headlines

 .

World Leaders against PISA’s High Achievement Benchmarks

The top 10 performers in PISA 2012 against the high achievement benchmarks (Level 5 and above), in reading, maths and science respectively, are set out in Table 1 below.

The 2009 rankings are shown in brackets and the 2012 overall average rankings in bold, square brackets. I have also included England’s rankings.

.

Table 1

Rank Reading Maths Science
1 Shanghai (1) [1] Shanghai (1) [1] Shanghai (1) [1]
2 Singapore (3) [3] Singapore (2) [2] Singapore (2) [3]
3 Japan (5) [4] Taiwan (4) [4] Japan (5) [4]
4 Hong Kong (9) [2] Hong Kong (3) [3] Finland (3) [5]
5 S Korea (6) [5] S Korea (5) [5] Hong Kong (6) [2]
6 N Zealand (2) [13] Liechtenstein (13) [8] Australia (7) [16]
7 Finland (4) [6] Macao (15) [6] N Zealand (4) [18]
8 Canada (7=) [8] Japan (8) [7] Estonia (17) [6]
9 France (13) [21] Switzerland (6) [9] Germany (8) [12]
10 Belgium (10) [16] Belgium (9) [15] Netherlands (9) [14]
England 19th (19) [23] England 24th (32) [25] England 11th (12) [18]

 .

On the basis of these crude rankings alone, it is evident that Shanghai has maintained its ascendancy across all three domains.

Singapore has reinforced its runner-up position by overtaking New Zealand in reading. Hong Kong and Japan also make it into the top ten in all three domains.

Notable improvements in the rankings have been made by:

  • Japan, Hong Kong and France in reading
  • Liechtenstein and Macao in maths
  • Japan and Estonia in science

.

Jurisdictions falling down the rankings include:

  • Australia, New Zealand and Finland in reading
  • Finland and Switzerland in maths
  • Canada and New Zealand in science.

Those whose high achiever rankings significantly exceed their average rankings include:

  • New Zealand, France and Belgium in reading
  • Belgium in maths
  • Australia, New Zealand, Germany and the Netherlands in science

The only one of the top ten jurisdictions exhibiting the reverse pattern with any degree of significance is Hong Kong, in science.

On this evidence, England has maintained its relatively strong showing in science and a mid-table position in reading, but it has slipped several places in maths.

Comparing England’s rankings for high achievers with its rankings for average performance:

  • Reading 19th versus 23rd
  • Maths 24th versus 25th
  • Science 11th versus 18th

This suggests that England is substantively stronger at the top end of the achievement spectrum in science, slightly stronger in reading and almost identical in maths. (The analysis below explores whether this is borne out by the proportions of learners achieving the relevant PISA thresholds.)

Overall, these rankings suggest that England is a respectable performer at the top end, but nothing to write home about. It is not deteriorating, relatively speaking – with the possible exception of mathematics – but it is not improving significantly either. The imbalance is not atypical and it requires attention, but only as part of a determined effort to build performance at both ends.

.

Comparing the Home Countries’ Performance

Table 2 below shows how each home country has performed at Level 5 and above in each of the three core PISA assessments since 2006.

.

Table 2

  2012 Level 5+ 2009 Level 5+ 2006 Level 5+
  Read Maths Sci Read Maths Sci Read Maths Sci
England 9.1 12.4 11.7 8.1 9.9 11.6 9.2 11.2 14.0
N Ireland 8.3 10.3 10.3 9.3 10.3 11.8 10.4 12.2 13.9
Scotland 7.8 10.9 8.8 9.2 12.3 11.0 8.5 12.1 12.5
Wales 4.7 5.3 5.7 5.0 5.0 7.8 6.4 7.2 10.9
UK 8.8 11.9 11.1 8.0 9.9 11.4 9.0 11.2 13.8
OECD average 8.4 12.6 8.4 7.6 12.7 8.5 8.6 13.3 9.0

.

In 2012, England is ahead of the other home countries in all three domains. Northern Ireland is runner-up in reading and science, Scotland in maths. Wales is a long way behind the other four in all three assessments.

Only England tops the OECD average in reading. All the home countries fall below the OECD average in maths, though all but Wales are above it in science.

Compared with 2006, England’s performance has changed little in reading, increased somewhat in maths (having fallen back betweentimes) and fallen quite significantly in science.

In comparison, Northern Ireland is on a downward trend in all three domains, as is Scotland (though it produced small improvements in maths and reading in 2009). Wales has fallen back significantly in science, though somewhat less so in reading and maths.

It seems that none of the home countries is particularly outstanding when it comes to the performance of their high achievers, but England is the strongest of the four, while Wales is clearly the weakest.
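These trend judgements can be sanity-checked with a few lines of Python. The sketch below simply transcribes the 2012 and 2006 Level 5+ percentages from Table 2 and prints the percentage-point change for each home country and domain:

```python
# Percentages of 15-year-olds at Level 5+ (transcribed from Table 2).
level5_2012 = {
    "England":   {"Read": 9.1,  "Maths": 12.4, "Sci": 11.7},
    "N Ireland": {"Read": 8.3,  "Maths": 10.3, "Sci": 10.3},
    "Scotland":  {"Read": 7.8,  "Maths": 10.9, "Sci": 8.8},
    "Wales":     {"Read": 4.7,  "Maths": 5.3,  "Sci": 5.7},
}
level5_2006 = {
    "England":   {"Read": 9.2,  "Maths": 11.2, "Sci": 14.0},
    "N Ireland": {"Read": 10.4, "Maths": 12.2, "Sci": 13.9},
    "Scotland":  {"Read": 8.5,  "Maths": 12.1, "Sci": 12.5},
    "Wales":     {"Read": 6.4,  "Maths": 7.2,  "Sci": 10.9},
}

# Percentage-point change, 2006 -> 2012, per country and domain.
for country in level5_2012:
    deltas = {d: round(level5_2012[country][d] - level5_2006[country][d], 1)
              for d in level5_2012[country]}
    print(country, deltas)
```

The output bears out the narrative: England is broadly flat in reading, up in maths and down in science, while the other three countries decline in every domain.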

A slightly different perspective can be gained by comparing high and low performance in 2012.

Table 3 below shows that the proportion of low achievers is comfortably larger than the proportion of high achievers. This is true of all the home countries and all subjects, though the difference is less pronounced in science across the board and also in Scotland. Conversely, the imbalance is much more significant in Wales.


Table 3

2012 Reading Maths Science
  L5+6 L1+below L5+6 L1+below L5+6 L1+below
England 9.1 16.7 12.4 21.7 11.7 14.9
N Ireland 8.3 16.7 10.3 24.1 10.3 16.8
Scotland 7.8 12.5 10.9 18.2 8.8 12.1
Wales 4.7 20.6 5.3 29.0 5.7 19.4
UK 8.8 16.7 11.9 21.8 11.1 15.0
OECD average 8.4 8.4 12.6 23.0 8.4 17.8


The ‘tail’ in reading is significantly higher than the OECD average in all four countries but – with the exception of Wales – somewhat lower in science.

In maths, the ‘tail’ is higher than the OECD average in Wales and Northern Ireland, but below average in England and Scotland.

The average figures suggest that, across the OECD as a whole, the top and bottom are broadly balanced in reading, there is a small imbalance in science towards the bottom end and a more significant imbalance in maths, again towards the bottom end.

By comparison, the home countries have a major issue at the bottom in reading, but are less significantly out of line in maths and science.

Overall, there is some evidence here of a longish tail of low achievement, but with considerable variation according to country and domain.

The bottom line is that all of the home countries have significant issues to address at both the top and the bottom of the achievement distribution. Any suggestion that they need to concentrate exclusively on low achievers is not supported by this evidence.


Francois Peron National Park by Gifted Phoenix 2013



Background to PISA


What is PISA?

The Programme for International Student Assessment (PISA) is a triennial OECD survey of the performance of 15 year-old students which typically covers maths, science and reading. Science was the main focus in 2006, reading in 2009 and maths in 2012.

PISA 2012 also included a computer-based assessment of problem-solving and a financial literacy assessment. However, some jurisdictions did not participate in the problem-solving exercise owing to ‘technical issues’ and financial literacy was undertaken by some countries only, as an optional extra.

Fifty-eight jurisdictions took part in PISA 2006 and 74 in PISA 2009 (65 undertook the assessment in 2009 and a further nine did so in 2010).

A total of 65 jurisdictions took part in PISA 2012.

According to the OECD’s own FAQ:

  • PISA tests reading, mathematical and scientific literacy ‘in terms of general competencies, that is, how well students can apply the knowledge and skills they have learned at school to real-life challenges. PISA does not test how well a student has mastered a school’s specific curriculum.’
  • Student performance in each field is comparable between assessments – one cannot reasonably argue therefore that a drop in performance is attributable to a more difficult assessment.
  • Each participating jurisdiction receives an overall score in each subject area – the average of all its students’ scores. The average score among OECD countries is set at 500 points (with a standard deviation of 100 points).
  • Participating jurisdictions are ranked in each subject area according to their mean scores, but:

‘[It] is not possible to assign a single exact rank in each subject to each country…because PISA tests only a sample of students from each country and this result is then adjusted to reflect the whole population of 15-year-old students in that country. The scores thus reflect a small measure of statistical uncertainty and it is therefore only possible to report the range of positions (upper rank and lower rank) within which a country can be placed.’

Outside the confines of reports by the OECD and its national contractors, this is honoured more in the breach than the observance.

  • Scores are derived from scales applied to each subject area. Each scale is divided into levels, Level 1 being the lowest and Level 6 typically the highest.
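The rank-range caveat can be made concrete. The sketch below uses invented mean scores and standard errors (not actual PISA figures) to show how 95% confidence intervals around mean scores translate into an upper and lower plausible rank rather than a single position:

```python
# Illustrative only: invented mean scores and standard errors, not PISA data.
# A country's "rank range" spans every position consistent with the 95%
# confidence intervals around its own and its rivals' mean scores.
countries = {
    # name: (mean score, standard error of the mean)
    "A": (523, 2.0),
    "B": (521, 2.4),
    "C": (512, 1.8),
    "D": (500, 3.1),
}

def rank_range(name, data, z=1.96):
    """Return (best, worst) plausible rank for `name`, where 1 is highest."""
    mean, se = data[name]
    lo, hi = mean - z * se, mean + z * se
    # Best rank: only rivals whose interval sits wholly above ours must beat us.
    best = 1 + sum(1 for n, (m, s) in data.items()
                   if n != name and m - z * s > hi)
    # Worst rank: any rival whose interval overlaps or exceeds ours might beat us.
    worst = 1 + sum(1 for n, (m, s) in data.items()
                    if n != name and m + z * s > lo)
    return best, worst

for name in countries:
    print(name, rank_range(name, countries))
```

Countries A and B are statistically indistinguishable here, so each can only be placed somewhere in the top two; C and D are cleanly separated and receive exact ranks.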

Further background detail on the 2012 assessments is set out in the ‘PISA 2012 Assessment and Analytical Framework’ (2013).

This explains that the framework for assessing maths was completely revised ahead of the 2012 cycle and ‘introduces three new mathematical processes that form the basis of developments in the reporting of PISA mathematics outcomes’, whereas those for science and reading were unchanged (the science framework was revised when it was the main focus in 2006 and ditto for reading in 2009).

The Framework clarifies the competency-based approach summarised in the FAQ:

‘PISA focuses on competencies that 15-year-old students will need in the future and seeks to assess what they can do with what they have learnt – reflecting the ability of students to continue learning throughout their lives by applying what they learn in school to non-school environments, evaluating their choices and making decisions. The assessment is informed, but not constrained, by the common denominator of national curricula. Thus, while it does assess students’ knowledge, PISA also examines their ability to reflect, and to apply their knowledge and experience to real-life issues in a reflective way. For example, in order to understand and evaluate scientific advice on food safety, an adult would need not only to know some basic facts about the composition of nutrients, but also to be able to apply that information.’

It explains that between 4,500 and 10,000 students drawn from 150 schools are typically tested in each jurisdiction.

Initial reports suggested that England would not take part in the 2012 assessments of problem-solving and financial literacy, but it subsequently emerged that this decision had been reversed in respect of problem-solving.


Setting PISA Outcomes in Context

There are plenty of reasons why one should not place excessive weight on PISA outcomes:

  • The headline rankings carry a significant health warning, which remains important, even though it is commonly ignored.

‘As the PISA 2000 and PISA 2003 samples for the United Kingdom did not meet the PISA response-rate standards, no trend comparisons are possible for these years.’ (p.1)

Hence, for the UK at least, reliable comparisons with pre-2006 results are off the table.

‘The pressure from policymakers for advice based on PISA interacts with this unhealthy mix of policy and technical people. The technical experts make sure that the appropriate caveats are noted, but the warnings are all too often ignored by the needs of the policy arm of PISA. As a result, PISA reports often list the known problems with the data, but then the policy advice flows as though those problems didn’t exist. Consequently, some have argued that PISA has become a vehicle for policy advocacy in which advice is built on flimsy data and flawed analysis.’

  • PISA is not the only game in town. TIMSS and PIRLS are equally significant, though relatively more focused on content knowledge, whereas PISA is primarily concerned with the application of skills in real life scenarios.
  • There are big political risks associated with worshipping at the PISA altar for, if the next set of outcomes is disappointing, the only possible escape route is to blame the previous administration, a strategy that wears increasingly thin with the electorate the longer the current administration has been in power.


It would be quite wrong to dismiss PISA results out of hand, however. They are a significant indicator of the comparative performance of national (and regional) education systems. But they are solely an indicator, rather than a statement of fact.


What is assessed – and what constitutes high achievement – in each domain

The Assessment and Analytical Framework provides definitions of each domain and level descriptors for each level within the assessments.


Mathematical Literacy

The PISA 2012 mathematics framework defines mathematical literacy as:

‘An individual’s capacity to formulate, employ and interpret mathematics in a variety of contexts. It includes reasoning mathematically and using mathematical concepts, procedures, facts and tools to describe, explain and predict phenomena. It assists individuals to recognise the role that mathematics plays in the world and to make the well-founded judgments and decisions needed by constructive, engaged and reflective citizens.’

Three aspects of maths are identified:

  • Mathematical processes and the fundamental capabilities underlying them. Three processes are itemised: formulating situations mathematically; employing mathematical concepts, facts, procedures and reasoning; and interpreting, applying and evaluating mathematical outcomes. The capabilities are: communication; mathematizing (transforming a real life problem to a mathematical form); representation; reasoning and argument; devising problem-solving strategies; using symbolic, formal and technical language and operations; and using mathematical tools.
  • Content knowledge, comprising four elements: change and relationships; space and shape; quantity; and uncertainty and data.
  • The contexts in which mathematical challenges are presented: personal; occupational; societal and scientific.

Six levels are identified within the PISA 2012 mathematics scale. The top two are described thus:

  • ‘At Level 6 students can conceptualise, generalise and utilise information based on their investigations and modelling of complex problem situations. They can link different information sources and representations and flexibly translate among them. Students at this level are capable of advanced mathematical thinking and reasoning. These students can apply their insight and understandings along with a mastery of symbolic and formal mathematical operations and relationships to develop new approaches and strategies for attacking novel situations. Students at this level can formulate and precisely communicate their actions and reflections regarding their findings, interpretations, arguments and the appropriateness of these to the original situations.’
  • ‘At Level 5 students can develop and work with models for complex situations, identifying constraints and specifying assumptions. They can select, compare and evaluate appropriate problem-solving strategies for dealing with complex problems related to these models. Students at this level can work strategically using broad, well-developed thinking and reasoning skills, appropriate linked representations, symbolic and formal characterisations and insight pertaining to these situations. They can reflect on their actions and formulate and communicate their interpretations and reasoning.’


Reading literacy

Reading Literacy is defined as:

‘An individual’s capacity to understand, use, reflect on and engage with written texts, in order to achieve one’s goals, to develop one’s knowledge and potential, and to participate in society.’

The assessment ‘is built on three major task characteristics’:

  • Situation – the context or purpose for which reading takes place, which may be personal (practical and intellectual interests), public (activities and concerns of society), educational (for learning purposes) or occupational (accomplishment of a task).
  • Text – the range of material that is read, which may be print or digital. In the case of digital text, the environment may be authored (the reader is receptive), message based, or mixed. In the case of both print and digital text, the format may be continuous (sentences and paragraphs), non-continuous (eg graphs, lists), mixed or multiple, while the text type may be description, narration, exposition, argumentation, instruction or transaction.
  • Aspect – how readers engage with the text, which includes accessing and retrieving; integrating and interpreting; and reflecting and evaluating.

Separate proficiency scales are provided for print and digital reading respectively. Both describe achievement in terms of the task rather than the student.

The print reading scale has six levels (Level One is subdivided into two). The top levels are described as follows:

  • Level 6: Tasks at this level typically require the reader to make multiple inferences, comparisons and contrasts that are both detailed and precise. They require demonstration of a full and detailed understanding of one or more texts and may involve integrating information from more than one text. Tasks may require the reader to deal with unfamiliar ideas, in the presence of prominent competing information, and to generate abstract categories for interpretations. Reflect and evaluate tasks may require the reader to hypothesise about or critically evaluate a complex text on an unfamiliar topic, taking into account multiple criteria or perspectives, and applying sophisticated understandings from beyond the text. A salient condition for access and retrieve tasks at this level is precision of analysis and fine attention to detail that is inconspicuous in the texts.
  • Level 5: Tasks at this level that involve retrieving information require the reader to locate and organise several pieces of deeply embedded information, inferring which information in the text is relevant. Reflective tasks require critical evaluation or hypothesis, drawing on specialised knowledge. Both interpretative and reflective tasks require a full and detailed understanding of a text whose content or form is unfamiliar. For all aspects of reading, tasks at this level typically involve dealing with concepts that are contrary to expectations.

For digital reading there are only four levels, categorised as 2-5. Level 5 is described thus:

‘Tasks at this level typically require the reader to locate, analyse and critically evaluate information, related to an unfamiliar context, in the presence of ambiguity. They require generating criteria to evaluate the text. Tasks may require navigation across multiple sites without explicit direction, and detailed interrogation of texts in a variety of formats.’


Scientific literacy

Scientific literacy is defined as:

‘An individual’s scientific knowledge and use of that knowledge to identify questions, to acquire new knowledge, to explain scientific phenomena, and to draw evidence-based conclusions about science-related issues, understanding of the characteristic features of science as a form of human knowledge and enquiry, awareness of how science and technology shape our material, intellectual, and cultural environments, and willingness to engage in science-related issues, and with the ideas of science, as a reflective citizen.’

The domain consists of four interrelated aspects:

  • Context – life situations involving science and technology. Contexts are personal, social or global and may relate to health, natural resources, environment, hazard or the frontiers of science and technology.
  • Knowledge – knowledge of the natural world (covering physical systems, living systems, earth and space systems and technology systems) and knowledge about science itself (scientific enquiry and scientific explanations).
  • Competencies, of which three are identified: identify scientific issues, explain phenomena scientifically and use scientific evidence.
  • Attitudes, including an interest in science, support for scientific enquiry and a motivation to act responsibly towards the natural world.

A 6-level proficiency scale is defined with the top levels explained as follows:

  • At Level 6, students can consistently identify, explain and apply scientific knowledge and knowledge about science in a variety of complex life situations. They can link different information sources and explanations and use evidence from those sources to justify decisions. They clearly and consistently demonstrate advanced scientific thinking and reasoning, and they use their scientific understanding in support of solutions to unfamiliar scientific and technological situations. Students at this level can use scientific knowledge and develop arguments in support of recommendations and decisions that centre on personal, social or global situations.
  • At Level 5, students can identify the scientific components of many complex life situations, apply both scientific concepts and knowledge about science to these situations, and can compare, select and evaluate appropriate scientific evidence for responding to life situations. Students at this level can use well-developed inquiry abilities, link knowledge appropriately and bring critical insights to situations. They can construct explanations based on evidence and arguments based on their critical analysis.


Denham Sunset by Gifted Phoenix



 Changes in Average Performance in Reading, Maths and Science

The OECD published PISA outcomes for maths, science and reading on 3 December 2013.

Similarly, the PISA National Report on England, published simultaneously, covers the three core assessments.

This section looks briefly at the headline average scores and rankings across the selected sample of twelve jurisdictions, principally to enable comparisons to be drawn with the subsequent analysis of high achievers’ performance.

I apologise in advance for any transcription errors. Please let me know if you spot any and I will correct the tables accordingly.


Reading

Table 4 below gives the headline average numerical scores and ranks in reading from PISA 2006, 2009 and 2012 respectively.


Table 4

Country 2012 2009 2006
score rank score rank score rank
Australia 512↓ 13↓ 515↑ 9↓ 513 7
Canada 523↓ 8↓ 524↓ 6↓ 527 4
Finland 524↓ 6↓ 536↓ 3↓ 547 2
Hong Kong 545↑ 2↑ 533↓ 4↓ 536 3
Ireland 523↑ 7↑ 496↓ 21↓ 517 6
S Korea 536↓ 5↓ 539↓ 2↓ 556 1
New Zealand 512↓ 13↓ 521 7↓ 521 5
Shanghai 570↑ 1= 556 1 N/A N/A
Singapore 542↑ 3↑ 526 5 N/A N/A
Taiwan 523↑ 8↑ 495↓ 23↓ 496 16
UK (England) 500↑ 23↑ 495↓ 25↓ 496 17
US 498↓ 24↓ 500 17 N/A N/A
OECD Average 496↑ 493↓ 495


Shanghai has retained the ascendancy it established in 2009, adding a further 14 points to its average 2009 score. Whereas it was only 17 points beyond its nearest competitor in 2009, that lead has now been extended to 25 points.

South Korea’s performance has fallen slightly and it has been leapfrogged in the rankings by Hong Kong (up 12 points), Singapore (up 16 points), and Japan (not included in the table).

Two countries making even more significant improvements are Taiwan (up 28 points) and Ireland (up 27 points). Conversely, the performance of Finland (down 12 points) and New Zealand (down 9 points) has noticeably declined. Finland’s performance has been declining since 2006.

Results remain broadly unchanged in Australia, Canada, England, South Korea and the USA. South Korea has been unable to make up the ground it lost in 2009.

Ireland’s huge improvement from a very similar starting point in 2009 throws England’s lack of progress into sharper relief, although Ireland is largely clawing back the ground it lost in 2009, having performed relatively well in 2006.

England, like the US, continues to perform slightly above the OECD average, but has fallen further behind the Asian Tigers. The gap with the world’s leader in each assessment is now 70 points (up from 60 in 2006).


Maths

Table 5 below sets out scores and rankings in maths since PISA 2006.


Table 5

Country 2012 2009 2006
  score rank score rank score rank
Australia 504↓ 19↓ 514↓ 15↓ 520 13
Canada 518↓ 13↓ 527= 10↓ 527 7
Finland 519↓ 12↓ 541↓ 6↓ 548 2
Hong Kong 561↑ 3= 555↑ 3 547 3
Ireland 501↑ 20↑ 487↓ 32↓ 501 22
S Korea 554↑ 5↓ 546↓ 4 547 4
New Zealand 500↓ 23↓ 519↓ 13↓ 522 11
Shanghai 613↑ 1= 600 1 N/A N/A
Singapore 573↑ 2= 562 2 N/A N/A
Taiwan 560↑ 4↑ 543↓ 5↓ 549 1
UK (England) 495↑ 25↑ 493↓ 27↓ 495 24
US 481↓ 36↓ 487↑ 31↑ 474 35
OECD Average 494↓   496↓   497  


The overall picture is rather similar to that for reading.

Shanghai (up 13 points) and Singapore (up 11 points) continue to stretch away at the head of the field. Taiwan (up 17 points) has also made significant improvement and is now close behind Hong Kong.

There has been relatively more modest improvement in Hong Kong and South Korea (which has been overtaken by Taiwan).

Elsewhere, Ireland has again made significant headway and is back to the level it achieved in 2006. But Finland’s score has plummeted 22 points. New Zealand is not far behind (down 19). There have also been significant falls in the performance of Australia (down 10) Canada (down 9) and the US (down 6).

The US is now trailing 13 points below the OECD average, having failed to sustain the substantial improvement it made in 2009.

In England meanwhile, results are largely unchanged, though now just above the OECD average rather than just below it.

The gap between England and world leader Shanghai has reached 118 points, compared with a gap in 2006 between England and world leader Taiwan of 54 points. The gap between England and its main Commonwealth competitors has narrowed, but only as a consequence of the significant declines in the latter.


Science

Table 6 below provides the same data in respect of science.


Table 6

Country 2012 2009 2006
  score rank score rank score rank
Australia 521↓ 16↓ 527= 10↓ 527 8
Canada 525↓ 10↓ 529↓ 8↓ 534 3
Finland 545↓ 5↓ 554↓ 2↓ 563 1
Hong Kong 555↑ 2↑ 549↑ 3↓ 542 2
Ireland 522↑ 15↑ 508 20 508 20
S Korea 538= 7↓ 538↑ 6↑ 522 11
New Zealand 516↓ 18↓ 532↑ 7 530 7
Shanghai 580↑ 1= 575 1 N/A N/A
Singapore 551↑ 3↑ 542 4 N/A N/A
Taiwan 523↑ 13↓ 520↓ 12↓ 532 4
UK (England) 516↑ 18↓ 515↓ 16↓ 516 14
US 497↓ 28↓ 502↑ 23↑ 489 29
OECD Average 501=   501↑   498  


Shanghai is again out in front, having repeated the clean sweep it achieved in 2009.

However, it has managed only a 5-point improvement, while Singapore has improved by 9 points, Hong Kong by 6 points and Taiwan by 3 points. South Korea’s score is unchanged from 2009.

New Zealand has dropped by 16 points and Finland by 9 points compared with 2009. There have been comparatively smaller declines in Australia and Canada, while Ireland has once again improved dramatically, by 14 points, and – in this case – the improvement is not simply clawing back ground lost in 2009.

England remains comfortably above the OECD average, but has made negligible improvement since 2006. US performance has dropped back below the OECD average as it has lost some of the ground it made up in 2009.

The gap between England and the world leaders is comparable with that in reading and significantly lower than in maths. It now stands at 64 points, compared with just 47 points in 2006.


Overall

Overall, the Asian Tigers have consolidated their positions by maintaining improvement in all three domains, though South Korea appears to be struggling to maintain the success of earlier years.

Finland and New Zealand are in worrying decline while Ireland is making rapid progress in the opposite direction.


The US results are stagnant, remaining comparatively poor, particularly in maths.

England has broadly maintained its existing performance profile, neither improving nor declining significantly. But, it is conspicuously losing ground on the world leaders, especially in maths. Other than in science it is close to the OECD average.

There is nothing here to give comfort to either the previous Government or the present incumbents. There might be some limited relief – even a degree of schadenfreude – in the fact that several better-placed nations are falling back more severely. But of course one cannot win the ‘global race’ by simply standing still.


Floral by Gifted Phoenix



Changes in High Achievers’ Performance

So much for the average headline figures.

The remainder of this post is focused on high achievement data. The ensuing sections once more examine reading, maths and science in that order, followed by a section on all-rounders.


Reading

Table 7 shows how the percentage achieving the higher levels in reading has changed since PISA 2006, providing separate columns for Level 6 and for Levels 5 and 6 combined (there was no Level 6 in 2006).


Table 7

Country 2012 2009 2006
Level 6 Levels 5+6 Level 6 Levels 5+6 Level 5
Australia 1.9 11.7 2.1 12.8 10.6
Canada 2.1 12.9 1.8 12.8 14.5
Finland 2.2 13.5 1.6 14.5 16.7
Hong Kong 1.9 16.8 1.2 12.4 12.8
Ireland 1.3 11.4 0.7 7.0 11.7
S Korea 1.6 14.2 1.0 12.9 21.7
New Zealand 3.0 13.9 2.9 15.8 15.9
Shanghai 3.8 25.1 2.4 19.4 N/A
Singapore 5.0 21.2 2.6 15.7 N/A
Taiwan 1.4 11.8 0.4 5.2 4.7
UK (England) 1.3 9.1 1.0 8.1 9.2
US 1.0 7.9 1.5 9.9 N/A
OECD Average 1.1 8.4 1.0 7.0 8.6

 

This reveals that:

  • In 2012, Singapore has a clear lead on its competitors at Level 6, but it is overtaken by Shanghai at Level 5 and above. New Zealand also remains comparatively strong at Level 6, but falls back significantly when Levels 5 and 6 are combined.
  • The other Asian Tigers do not perform outstandingly well at Level 6: Hong Kong, South Korea and Taiwan are all below 2.0%, behind Canada and Finland. However, all but Taiwan outscore their competitors when Levels 5 and 6 are combined.
  • Hong Kong, Shanghai, Singapore and Taiwan are all making fairly strong progress over time. Patterns are rather less discernible for other countries, though there is a downward trend in the US.
  • In Finland, New Zealand and Canada – countries that seem to be falling back overall – the percentage of Level 6 readers continues to improve. This might suggest that the proportion of the highest performers in reading is not significantly affected when national performance begins to slide.
  • When judged against these world leaders, England’s comparative performance is brought into much clearer perspective. At Level 6 it is not far behind Taiwan, South Korea and even Hong Kong. But, at Level 5 and above, the gap is somewhat more pronounced. England is improving, but very slowly.
  • The comparison with Taiwan is particularly stark. In 2006, England had roughly twice as many students performing at Level 5. By 2009 Taiwan had caught up some of this ground and, by 2012, it had overtaken.

Table 8 compares changes since PISA 2006 in national performance at Level 5 and above with changes at Level 1 and below.

This is intended to reveal the balance between top and bottom – and whether this sample of world-leading and other English-speaking jurisdictions is making consistent progress at either end of the spectrum.


 Table 8

Country Levels 5 (and 6 from 2009) Level 1 (or equivalent) and below
2006 2009 2012 2006 2009 2012
Australia 10.6 12.8 11.7 13.4 14.3 14.2
Canada 14.5 12.8 12.9 11.0 10.3 10.9
Finland 16.7 14.5 13.5 4.8 8.1 11.3
Hong Kong 12.8 12.4 16.8 7.2 8.3 6.8
Ireland 11.7 7.0 11.4 12.2 17.2 9.7
S Korea 21.7 12.9 14.2 5.7 5.8 7.6
New Zealand 15.9 15.8 13.9 14.6 14.3 16.3
Shanghai N/A 19.4 25.1 N/A 4.1 2.9
Singapore N/A 15.7 21.2 N/A 12.4 9.9
Taiwan 4.7 5.2 11.8 14.3 15.6 11.5
UK (England) 9.2 8.1 9.1 18.9 18.4 16.7
US N/A 9.9 7.9 N/A 17.7 16.7
OECD Average 8.6 7.0 8.4 20.1 18.8 18

 

We can see that:

  • The countries with the highest proportion of students at Level 5 and above tend to have the lowest proportion at Level 1 and below. In Shanghai in 2012, there is a 22 percentage point gap between these two populations and fewer than 3 in every hundred fall into the lower attaining group.
  • Singapore is much closer to Shanghai at the top end than it is at the bottom. But even Shanghai seems to be making faster progress at the top than at the bottom, which might suggest that it is approaching the point at which the proportion of low achievers cannot be further reduced.
  • Compared with Hong Kong and South Korea, Singapore has a higher proportion of both high achievers and low achievers.
  • Whereas Taiwan had three times as many low achievers as high achievers in 2006, by 2012 the proportions were broadly similar, but progress at the top end is much faster than at the bottom.
  • The decline in Finland has less to do with performance at the top end (which has fallen by three percentage points) than with performance at the bottom (which has increased by more than six percentage points).
  • Canada has consistently maintained a higher percentage of high achievers than low achievers, but the reverse is true in Australia. In New Zealand the percentage at the top is declining and the percentage at the bottom is increasing. The gap between the two has narrowed slightly in England, but not significantly so.
  • To catch up with Shanghai, England has to close a gap of some 16 percentage points at the top end, compared with one of around 14 percentage points at the bottom.

The PISA National Report on England offers some additional analysis, noting that 18 jurisdictions had a higher proportion of pupils than England at Level 5 or above in 2012, including all those that outperformed England overall (with the exception of Estonia and Macao), and also France and Norway.

The National Report relies more heavily on comparing the performance of learners at the 5th and 95th percentiles in each country, arguing that:

‘This is a better measure for comparing countries than using the lowest and highest scoring pupils, as such a comparison may be affected by a small number of pupils in a country with unusually high or low scores.’

This is true in the sense that a minimum sample of 4,500 PISA participants would result in fewer than 100 at Level 6 in many jurisdictions.
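A quick back-of-envelope check makes the point, taking England's 1.3% Level 6 share in reading from Table 7 as the worked example:

```python
# Back-of-envelope: how many sampled students sit behind a Level 6 estimate?
min_sample = 4500        # PISA's minimum sample size per jurisdiction
level6_percent = 1.3     # England's Level 6 share in reading, PISA 2012

students_at_level6 = min_sample * level6_percent / 100
print(students_at_level6)  # roughly 58-59 students, comfortably under 100
```

With so few students behind the estimate, a handful of unusual scripts could shift the headline percentage noticeably, which is the National Report's stated rationale for preferring percentile comparisons.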

On the other hand, the National Report fails to point out that analysis on this basis is not particularly informative about comparative achievement of the criterion-referenced standards denoted by the PISA thresholds.

It says rather more about the spread of performance in each country and rather less about direct international comparisons.

Key points include:

  • In England the score of learners at the 5th percentile was 328, compared with 652 at the 95th percentile. This difference of 324 points is slightly larger than the OECD average difference of 310 points. More than two-thirds of OECD countries had a smaller difference between these percentiles.
  • Compared with PISA 2009, England’s score at the 95th percentile increased by six points to 652, while its score at the 5th percentile fell by six points to 328. The resulting attainment gap (324 points) is wider than in 2009 (312) but narrower than in 2006 (337). Thirteen OECD countries reported a wider spread of attainment than England.
  • Of countries outperforming England, only Japan (325 points), Singapore (329 points) Belgium (339 points) and New Zealand (347 points) demonstrated a similar or wider spread of attainment. Shanghai had the lowest difference (259 points) followed by Estonia (263).
  • The strongest performing jurisdictions at the 95th percentile were Singapore (698), Shanghai (690) and Japan (689), compared with 652 for England.
  • Amongst jurisdictions ranked higher than England, only the Netherlands, Liechtenstein, Estonia and Macao secured a lower score at the 95th percentile. Only Belgium reported a lower score at the 5th percentile.
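As a rough sanity check on these spread figures: PISA scales are constructed around an OECD mean of 500 and a standard deviation of 100, so under a simple normal approximation (a sketch only; real national score distributions are not exactly normal) the expected 5th-to-95th percentile spread follows directly:

```python
from statistics import NormalDist

# Normal approximation to a PISA scale: OECD mean 500, standard deviation 100.
# Illustrative only; actual national score distributions are skewed.
dist = NormalDist(mu=500, sigma=100)
p5 = dist.inv_cdf(0.05)    # about 335.5
p95 = dist.inv_cdf(0.95)   # about 664.5
spread = p95 - p5
print(round(spread))       # -> 329
```

The roughly 329-point spread implied by the approximation sits close to the reported OECD average difference of 310 and England's 324, so the percentile gaps above are broadly what the scale's construction would lead one to expect.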


Maths

Turning to maths, Table 9 illustrates changes in the pattern of high achievement since 2006, again showing the percentages performing at Level 6 and at Levels 5 and 6 combined.


Table 9

Country 2012 2009 2006
  Level 6 Levels 5+6 Level 6 Levels 5+6 Level 6 Levels 5+6
Australia 4.3 14.8 4.5 16.4 4.3 16.4
Canada 4.3 16.4 4.4 18.3 4.4 18
Finland 3.5 15.2 4.9 21.6 6.3 24.4
Hong Kong 12.3 33.4 10.8 30.7 9 27.7
Ireland 2.2 10.7 0.9 6.7 1.6 10.2
S Korea 12.1 30.9 7.8 25.5 9.1 27.1
New Zealand 4.5 15.0 5.3 18.9 5.7 18.9
Shanghai 30.8 55.4 26.6 50.7 N/A N/A
Singapore 19.0 40.0 15.6 35.6 N/A N/A
Taiwan 18.0 37.2 11.3 28.5 11.8 31.9
UK (England) 3.1 12.4 1.7 9.9 2.5 11.2
US 2.2 9.0 1.9 9.9 1.3 7.7
Average 3.3 12.6 3.1 12.7 3.3 13.4

.

The variations between countries tend to be far more pronounced than in reading:

  • There is a huge 28 percentage point spread in performance at Level 6 within this sample – from 2% to 30% – compared with a three percentage point spread in reading. The spread at Level 5 and above is also significantly larger – 46 percentage points compared with 17 percentage points in reading.
  • Shanghai has an 11 percentage point lead over its nearest competitor at Level 6 and an even larger 15 percentage point lead for Level 5 and above. Moreover it has improved significantly on both counts since 2009. Well over half its sample is now performing at Level 5 or above and almost a third are at Level 6.
  • Singapore and Taiwan are the next best performers, relatively close together. Both are improving but, following a small dip in 2009, Taiwan is improving at a faster rate – faster even than Shanghai.
  • Hong Kong and South Korea also have similar 2012 profiles, as they did back in 2006. South Korea also lost ground in 2009, but is now improving at a faster rate than Hong Kong.
  • Finland appears to be experiencing quite significant decline: the proportion of Level 6 performers in 2012 is not far short of half what it was in 2006 and performance above Level 5 has fallen by more than nine percentage points. This is a somewhat different pattern to reading, in that the top performers are also suffering from the overall decline.

.

.

  • Australia, Canada and New Zealand have maintained broadly the same performance over time, though all are showing a slight falling off at Level 5 and above, and in New Zealand this also applies at Level 6.
  • After a serious slump in 2009, Ireland has now overtaken its 2006 position. Meanwhile, the US has been making some progress at Level 6 but is less convincing at Level 5 and above.
  • Once again, this comparison does not particularly flatter England. It is not too far behind the Commonwealth countries and declining Finland at Level 6, but the gap is slightly larger at Level 5 and above. Moreover, England has consistently performed below the OECD average and remains in that position.
  • There are, however, some grounds for domestic celebration, in that England has improved by 2.5 percentage points at Level 5 and above, and by 1.4 percentage points at Level 6. This rate of improvement bears comparison with Hong Kong, albeit from a much lower base. It suggests a narrowing gap between England and its Commonwealth counterparts.

Table 10 gives the comparison with achievement at the bottom end of the distribution, setting out the percentages performing at different levels.

.

Table 10

Country Levels 5 and 6 Levels 1 and below
  2006 2009 2012 2006 2009 2012
Australia 16.4 16.4 14.8 13.0 15.9 18.6
Canada 18 18.3 16.4 10.8 11.4 13.8
Finland 24.4 21.6 15.2 5.9 7.8 12.2
Hong Kong 27.7 30.7 33.4 9.5 8.8 8.5
Ireland 10.2 6.7 10.7 16.4 20.9 16.9
S Korea 27.1 25.5 30.9 8.8 8.1 9.1
New Zealand 18.9 18.9 15.0 14.0 15.5 22.6
Shanghai N/A 50.7 55.4 N/A 4.8 3.7
Singapore N/A 35.6 40.0 N/A 9.8 8.3
Taiwan 31.9 28.5 37.2 11.9 12.8 12.8
UK (England) 11.2 9.9 12.4 19.9 19.8 21.7
US 7.7 9.9 9.0 28.1 23.4 25.9
Average 13.4 12.7 12.6 21.3 22.0 23.0

.

Key points include:

  • The same pattern is discernible amongst the strongest performers as was evident with reading: those with the highest percentages at the top end tend to have the lowest percentages at the bottom. If anything this distinction is even more pronounced. Shanghai records a 52 percentage point gap between its highest and lowest performers and the latter group is only slightly larger than the comparable group in the reading assessment.
  • Amongst the Asian Tigers, the ratio between top and bottom is around 3:1 or better in favour of the top. For most of the other countries in the sample, there is never more than an 8 percentage point gap between top and bottom, but this stretches to 9 in the case of England and 17 for the USA. Needless to say, the low achievers are in the majority in both cases.
  • Although the percentages for top and bottom in Australia are broadly comparable, it has shifted since 2006 from a position where the top end was in the majority by 3 percentage points to almost a mirror image of that pattern. In New Zealand, the lower achievers have increased by almost 9 percentage points, more than double the rate of decline at the top end, as its ‘long tail’ grows significantly longer.
  • Apart from Shanghai, only Singapore, Hong Kong and South Korea have fewer than 10% in the lower performing category. Despite its reputation as a meritocratic environment, Singapore gets much closer to Shanghai at the bottom of the distribution than it does at the top. The same is true of Hong Kong and South Korea.
  • It is also noticeable that none of the Tigers is making extraordinary progress at the bottom end. Hong Kong has reduced this population by 1.0 percentage point since 2006, Singapore by 1.5 points and Shanghai by only 1.1 points since 2009. The percentage has increased in South Korea and Taiwan. Improvement has been significantly stronger at the top of the distribution. Again this might suggest that the Tigers are closing in on the point where they cannot improve further at the bottom end.
  • In Finland, the percentage achieving the higher levels has fallen by over 9 percentage points since 2006, while the increase at the lower levels is over 6 percentage points. This compares with a 3 point fall at the top and a 6 point rise at the bottom in reading. The slump amongst Finland’s high achievers is clearly more pronounced in maths.
  • England’s 9.3 percentage point gap between the top and bottom groups in 2012 is slightly larger than the 8.7 point gap in 2006. It has a whopping 43 percentage point gap to make up on Shanghai at the top end, and an 18 point gap at the bottom. England is just on the right side of the OECD average at the bottom and just on the wrong side at the top.
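The top-bottom gaps cited in these bullets can be reproduced directly from Table 10 (a minimal sketch, with the relevant figures transcribed from the table above):

```python
# Shares at Levels 5 and 6 ("top") and Level 1 and below ("bottom"),
# transcribed from Table 10 (percentages of the cohort).
top = {"England": {2006: 11.2, 2012: 12.4}, "Shanghai": {2012: 55.4}}
bottom = {"England": {2006: 19.9, 2012: 21.7}, "Shanghai": {2012: 3.7}}

def gap(country, year):
    """Percentage-point excess of low achievers over high achievers."""
    return round(bottom[country][year] - top[country][year], 1)

print(gap("England", 2012))  # 9.3
print(gap("England", 2006))  # 8.7
# England's distance from Shanghai at each end of the distribution:
print(round(top["Shanghai"][2012] - top["England"][2012], 1))        # 43.0
print(round(bottom["England"][2012] - bottom["Shanghai"][2012], 1))  # 18.0
```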

.

.

The National Report notes that all jurisdictions ahead of England in the rankings had a higher percentage of learners at Level 5 or above.

As for percentiles:

  • The difference between the 5th percentile (335 points) and the 95th percentile (652 points) was 316 in England. The average difference for OECD countries was 301, only slightly lower than that.
  • Ten countries had a greater difference than this, five of them amongst those with the highest overall mean scores; the others were Israel, Belgium, Slovakia, New Zealand and France.
  • Whereas the difference between the lowest and highest percentiles has increased very slightly across all OECD countries, this is more pronounced in England, increasing from 285 points in 2009 to 316 points in 2012. This is attributable to decreasing scores at the 5th percentile (350 in 2006, 349 in 2009 and 335 in 2012) compared with changes at the 95th percentile (643 in 2006, 634 in 2009 and 652 in 2012).

.

Science

Table 11 compares the performance of this sample of PISA participants at the higher levels in the science assessment on the last three occasions.

.

Table 11

Country 2012 2009 2006
  Level 6 Levels 5 + 6 Level 6 Levels 5+6 Level 6 Levels 5+6
Australia 2.6 13.5 3.1 14.6 2.8 14.6
Canada 1.8 11.3 1.6 12.1 2.4 14.4
Finland 3.2 17.1 3.3 18.7 3.9 20.9
Hong Kong 1.8 16.7 2 16.2 2.1 15.9
Ireland 1.5 10.8 1.2 8.7 1.1 9.4
S Korea 1.1 11.7 1.1 11.6 1.1 10.3
New Zealand 2.7 13.4 3.6 17.6 4 17.6
Shanghai 4.2 27.2 3.9 24.3 N/A N/A
Singapore 5.8 22.7 4.6 19.9 N/A N/A
Taiwan 0.6 8.4 0.8 8.8 1.7 14.6
UK (England) 1.9 11.7 1.9 11.6 3.0 14.0
US 1.1 7.4 1.3 9.2 1.5 9.1
Average 1.2 8.4 1.1 8.5 1.3 8.8

.

In science, the pattern of high achievement has more in common with reading than maths. It shows that:

  • There is again a relatively narrow spread of performance between this sample of jurisdictions – just over five percentage points at Level 6 and approaching 20 percentage points at Level 5 and above.
  • As in reading, Singapore outscores Shanghai at Level 6, but is outperformed by Shanghai at Level 5 and above. Both are showing steady improvement, but Singapore’s improvement at Level 6 is more pronounced than Shanghai’s.
  • Finland remains the third best performer, although the proportion of learners achieving at both Level 6 and Level 5 plus has been declining slightly since 2006.
  • Another similarity with reading is that Australia, Finland and New Zealand all perform significantly better at Level 6 than Hong Kong, South Korea and Taiwan. Hong Kong alone performs equally well at Level 5 and above. None of these three Asian Tigers has made significant progress since 2006.
  • In Australia, Canada, New Zealand and the US there has also been relatively little progress over time – indeed some evidence to suggest a slight decline. Conversely, Ireland seems to be moving forward again after a slight dip at Level 5 and above in 2009.

.

.

  • England was a strong performer in 2006, broadly comparable with many of its competitors. But it fell back significantly in 2009 and, unlike in maths and (to a lesser extent) reading, has made no substantive progress since then: the proportions are merely holding up. However England continues to perform somewhat higher than the OECD average. There is an interesting parallel with Taiwan, although that country dipped even further than England in 2009.

Table 12 provides the comparison with the proportions achieving the lower thresholds.

.

Table 12

Country Levels 5 and 6 Levels 1 and Below
  2006 2009 2012 2006 2009 2012
Australia 14.6 14.6 13.5 12.8 12.6 13.6
Canada 14.4 12.1 11.3 10.0 9.5 10.4
Finland 20.9 18.7 17.1 4.1 6.0 7.7
Hong Kong 15.9 16.2 16.7 8.7 6.6 5.6
Ireland 9.4 8.7 10.8 15.5 15.1 11.1
S Korea 10.3 11.6 11.7 11.2 6.3 6.7
New Zealand 17.6 17.6 13.4 13.7 13.4 16.3
Shanghai N/A 24.3 27.2 N/A 3.2 2.7
Singapore N/A 19.9 22.7 N/A 11.5 9.6
Taiwan 14.6 8.8 8.4 11.6 11.1 9.8
UK (England) 14.0 11.6 11.7 16.7 14.8 14.9
US 9.1 9.2 7.4 24.4 18.1 18.2
Average 8.8 8.5 8.4 19.3 18.0 17.8

 .

  • Amongst the top performers the familiar pattern reappears. In 2012 Shanghai has 27% in the top categories against 2.7% in the bottom categories. This is very similar to reading (25.1% against 2.9%). At the bottom end, Shanghai’s nearest competitors are Hong Kong and South Korea, while Singapore and Taiwan are each approaching 10% at these levels. This is another similarity with reading (whereas, in maths, Singapore is more competitive at the lower end).
  • Since 2009, Shanghai has managed only a comparatively modest 0.5% reduction in the proportion of its students at the bottom end, compared with an increase of almost 3% at the top end. This may lend further support to the hypothesis that it is approaching the point at which further bottom end improvement is impossible.
  • No country has made consistently strong progress at the bottom end, though Ireland has made a significant improvement since 2009. There has been steady if unspectacular improvement in Hong Kong, Taiwan and Singapore. South Korea, having achieved a major improvement in 2009, has found itself unable to continue this positive trend.
  • Finland’s negative trend is consistent since 2006 at both ends of the achievement spectrum, though the decline is not nearly as pronounced as in maths. In science Finland is maintaining a ratio of 2:1 in favour of the performers at the top end, while percentages at top and bottom are now much closer together in both reading and maths.
  • There are broadly similar negative trends at top and bottom alike in the Commonwealth countries of Australia, Canada and New Zealand, although they have fallen back in fits and starts. In New Zealand the balance between top and bottom has shifted from being 4 percentage points in favour of the top end in 2006, to 3 percentage points in favour of the bottom end by 2012.
  • A similar gap in favour of lower achievers also exists in England and is unchanged from 2009. By comparison with the US (which is a virtual mirror image of the top-bottom balance in Finland, Singapore or South Korea) it is in a reasonable position, rather similar to New Zealand, now that it has fallen back.
  • England has a 15.5 percentage point gap to make up on Shanghai at the top end of the distribution, compared with a 12.2 percentage point gap at the bottom.

The PISA 2012 National Study reports that only the handful of jurisdictions shown in Table 11 above has a larger percentage of learners achieving Level 6. Conversely, England has a relatively large number of low achievers compared with these jurisdictions.

Rather tenuously, it argues on this basis that:

‘Raising the attainment of lower achievers would be an important step towards improving England’s performance and narrowing the gap between highest and lowest performers.’

When it comes to comparison of the 5th and 95th percentiles:

  • The score at the 5th percentile (343) and at the 95th percentile (674) gives a difference of 331 points, larger than the OECD average of 304 points. Only eight jurisdictions had a wider distribution, among them Israel, New Zealand, Luxembourg, Slovakia, Belgium, Singapore and Bulgaria.
  • The OECD average difference between the 5th and 95th percentiles has reduced slightly (from 311 in 2006 to 304 in 2012) and there has also been relatively little change in England.

.

Top-Performing All-Rounders

Volume 1 of the OECD’s ‘PISA 2012 Results’ document provides additional data about all-round top performers achieving Level 5 or above in each of the three domains.

.

[Diagram: PISA 2012 top performers]

The diagram shows that 4.4% of learners across OECD countries achieve this feat.

This is up 0.3 percentage points on the PISA 2009 figure revealed in this PISA in Focus publication.

Performance on this measure in 2012, compared with 2009, amongst the sample of twelve jurisdictions is shown in Table 13 below. (NB the UK figure is for the UK as a whole, not just England.)

.

Table 13

Country 2012 2009
%age rank %age rank
Australia 7.6 7 8.1 6
Canada 6.5 9 6.8 8
Finland 7.4 8 8.5 4
Hong Kong 10.9 4 8.4 5
Ireland 5.7 15 3.2 23
S Korea 8.1 5 7.2 7
New Zealand 8.0 6 9.9 3
Shanghai 19.6 1 14.6 1
Singapore 16.4 2 12.3 2
Taiwan 6.1 10 3.9 17
UK 5.7 15 4.6 14
US 4.7 18 5.2 11
Average 4.4 4.1

 .

In terms of percentage point increases, the fastest progress on this measure is being made by Hong Kong, Ireland, Shanghai, Singapore and Taiwan. Shanghai has improved a full five percentage points and one in five of its students now achieve this benchmark.

The UK is making decent progress, particularly compared with Australia, Canada, Finland, New Zealand and the US, which are moving in the opposite direction.
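The rates of change described above can be derived from Table 13 (a minimal sketch; shares transcribed from the table, ordered fastest improver first):

```python
# All-rounder shares from Table 13 as (2009, 2012) percentages.
share = {
    "Shanghai": (14.6, 19.6), "Singapore": (12.3, 16.4), "Hong Kong": (8.4, 10.9),
    "Taiwan": (3.9, 6.1), "Ireland": (3.2, 5.7), "UK": (4.6, 5.7),
    "Finland": (8.5, 7.4), "New Zealand": (9.9, 8.0), "US": (5.2, 4.7),
}

# Percentage-point change between the two cycles for each jurisdiction.
change = {c: round(y12 - y09, 1) for c, (y09, y12) in share.items()}

# Rank by change, fastest improvers first.
for country, delta in sorted(change.items(), key=lambda kv: -kv[1]):
    print(f"{country}: {delta:+.1f}")
```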

The Report notes:

‘Among countries with similar mean scores in PISA, there are remarkable differences in the percentage of top-performing students. For example, Denmark has a mean score of 500 points in mathematics in PISA 2012 and 10% of students perform at high proficiency levels in mathematics, which is less than the average of around 13%. New Zealand has a similar mean mathematics score of 500 points, but 15% of its students attain the highest levels of proficiency, which is above the average…these results could signal the absence of a highly educated talent pool for the future.

Having a large proportion of top performers in one subject is no guarantee of having a large proportion of top performers in the others. For example, Switzerland has one of the 10 largest shares of top performers in mathematics, but only a slightly-above-average share of top performers in reading and science.

Across the three subjects and across all countries, girls are as likely to be top performers as boys. On average across OECD countries, 4.6% of girls and 4.3% of boys are top performers in all three subjects…To increase the share of top-performing students, countries and economies need to look at the barriers posed by social background…the relationship between performance and students’… and schools’ organisation, resources and learning environment.’ (p65)

.

Denizen by Gifted Phoenix


 

Conclusions

Priorities for Different Countries

On the basis of this evidence, it is possible to draw up a profile of the performance of different countries across the three assessments at these higher levels, and so make a judgement about each jurisdiction’s prospects of developing ‘a highly educated talent pool for the future’. The twelve jurisdictions in our sample might be advised as follows:

  • Shanghai should be focused on establishing ascendancy at Level 6 in reading and science, particularly if there is substance to the suspicion that scope for improvement at the bottom of the spectrum is now rather limited. Certainly it is likely to be easier to effect further improvement at the very top.
  • Singapore has some ground to catch up with Shanghai at Level 6 in maths. It has improved by over three percentage points since 2009, but Shanghai has advanced at least as fast, so there is still some way to go. Otherwise it should concentrate on strengthening its position at Level 5 and above, where Shanghai is also conspicuously stronger.
  • Hong Kong needs to focus on Level 6 in reading and science, but perhaps also in maths where it has been extensively outpaced by Taiwan since 2009. At levels 5 and above it faces strong pressure to maintain proximity with Shanghai and Singapore, as well as marking the charge made by Taiwan in reading and maths. Progress in science is relatively slow.
  • South Korea should also pay attention to Level 6 in reading and science. It is improving faster than Hong Kong at Level 6 in maths but is also losing ground on Taiwan. That said, although South Korea now seems back on track at Level 5 and above in maths, progress remains comparatively slow in reading and science, so both Levels 5 and 6 need attention.
  • Taiwan has shown strong improvement in reading and maths since 2009, but is deteriorating in science at both Levels 5 and 6. It still has much ground to pick up at Level 6 in reading. Its profile is not wildly out of kilter with Hong Kong and South Korea.
  • Finland is bucking a downward trend at Level 6 in reading and slipping only slightly in science, so the more noticeable decline is in maths. However, the ground lost is proportionately greater at Level 5 and above, once again more prominently in maths. As Finland fights to stem a decline at the lower achievement levels, it must take care not to neglect those at the top.
  • Australia seems to be slipping back at both Levels 5 and 6 across all three assessments, while also struggling at the bottom end. There are no particularly glaring weaknesses, but it needs to raise its game across the board.
  • Canada is just about holding its own at Level 6, but performance is sliding back at Level 5 and above across all three domains. This coincides with relatively little improvement and some falling back at the lower end of the achievement distribution. It faces a similar challenge to Finland’s, although not so pronounced.
  • New Zealand can point to few bright points in an otherwise gloomy picture, one of which is that Level 6 performance is holding up in reading. Elsewhere, there is little to celebrate in terms of high achievers’ performance. New Zealand is another country that, in tackling more serious problems with the ‘long tail’, should not take its eye off the ball at the top.

.

.

  • The US is also doing comparatively well in reading at Level 6, but is otherwise either treading water or slipping back a little. Both Level 6 and Level 5 and above need attention. The gap between it and the world’s leading countries continues to increase, suggesting that it faces future ‘talent pool’ issues unless it can turn round its performance.
  • Ireland is a good news story, at the top end as much as the bottom. It has caught up lost ground and is beginning to push beyond where it was in 2006. Given Ireland’s proximity, the home countries might want to understand more clearly why their nearest neighbour is improving at a significantly faster rate. That said, Ireland has significant room for improvement at both Level 6 and Level 5 and above.
  • England’s performance at Level 6 and Level 5 and above has held up surprisingly well compared with 2009, especially in maths. When the comparison is solely historical, there might appear to be no real issue. But many other countries are improving at a much stronger rate and so England (as well as the other home countries) risks being left behind in the ‘global race’ declared by its Prime Minister. The world leaders now manage three times as many Level 6 performers in science, four times as many in reading and ten times as many in maths. It must withstand the siren voices urging it to focus disproportionately at the bottom end.

.

Addressing These Priorities

It is far more straightforward to pinpoint these different profiles and priorities than to recommend convincingly how they should be addressed.

The present UK Government believes firmly that its existing policy direction will deliver the improvements that will significantly strengthen its international competitiveness, as judged by PISA outcomes. It argues that it has learned these lessons from careful study of the world’s leading performers and is applying them carefully and rigorously, with due attention to national needs and circumstances.

.

.

But – the argument continues – it is too soon to see the benefits of its reforms in PISA 2012, such is the extended lag time involved in improving the educational outcomes of 15 year-olds. According to this logic, the next Government will reap the significant benefits of the present Government’s reform programme, as revealed by PISA 2015.

Recent history suggests that this prediction must be grounded more in hope than expectation, not least because establishing causation between indirect policy interventions and improved test performance must surely be the weakest link in the PISA methodology.

But, playing devil’s advocate for a moment, we might reasonably conclude that any bright spots in England’s performance are attributable to interventions that the previous Government got right between five and ten years ago. It would not be unreasonable to suggest that the respectable progress made at the top PISA benchmarks is at least partly attributable to the national investment in gifted education during that period.

We might extend this argument by suggesting a similar relationship between progress in several of the Asian Tigers at these higher levels and their parallel investment in gifted education. Previous posts have drawn attention to the major programmes that continue to thrive in Hong Kong, Singapore, South Korea and Taiwan.

Shanghai might have reached the point where success in mainstream education renders investment in gifted education unnecessary. On the other hand, such a programme might help it to push forward at the top in reading and science – perhaps the only conspicuous chink in its armour. There are lessons to be learned from Singapore. (Gifted education is by no means dormant on the Chinese Mainland and there are influential voices pressing the national government to introduce more substantive reforms.)

Countries like Finland might also give serious consideration to more substantive investment in gifted education geared to strengthening high attainment in these core domains. There is increasing evidence that the Finns need to rethink their approach.

.

.

The relationship between international comparison studies like PISA and national investment in gifted education remains poorly researched and poorly understood, particularly how national programmes can most effectively be aligned with and support such assessments.

The global gifted education community might derive some much-needed purpose and direction by establishing an international study group to investigate this issue, providing concrete advice and support to governments with an interest.

.

GP

December 2013

Gifted Phoenix Twitter Round-up Volume 12: Giftedness and Gifted Education

.

Here is a slightly overdue termly round-up of activity on the Gifted Phoenix Twitter feed.

The sheer volume of activity undertaken over the four month period since my last review – attributable to my efforts to cover domestic education policy alongside global gifted activity – has led me to experiment with separating those two strands.

So this section of Volume 12 is dedicated to giftedness and gifted education over the period February 24 to July 3 2013.

Two further sections are devoted to wider education policy, organised on a thematic basis.

The material is organised into the following categories:

  • Global coverage, including sub-sections for each continent. As ever, this broadly reflects the distribution of activity worldwide, with little happening in Africa and a lot in the US.
  • UK coverage, including a discrete sub-section on Ofsted’s ‘Most Able Students’ survey, published in June 2013.
  • Thematic coverage, containing sub-sections on Intelligence and Neuroscience, Creativity and Innovation, Twice-Exceptional and Gifted Research.
  • Gifted Commentary, with subsections devoted to Yours Truly, Twitter chats and other posts.

Because the timespan covered by this review is relatively long, I have decided to keep the broad chronological order rather than grouping tweets thematically within sections. This means that readers will need to search a little more – for example for the limited non-US coverage within the sub-section devoted to The Americas.

As usual I have relied almost exclusively on my own Tweets, including only those that carry a hyperlink. I have not checked that all links remain live. I have included a few retweets and modified tweets originated by others.

.

GP

July 2013

.

Giftedness and Gifted Education Around the World

.

Global

A Learnist board on gifted education: http://t.co/hXvqfPPpsX

The Open Education Database includes a single offering on gifted education: http://t.co/n5HtaP66gm Frankly that’s pathetic

Confirmation that @LesLinks is the new World Council President: http://t.co/A56sCrvGGi – I shall have to mind my Ps and Qs!

Looks as though ICIE’s 2014 Conference is in Chennai, India: http://t.co/RQZKuG5SDr – Usual suspects involved

IRATDE’s latest journal – Talent Development and Excellence Vol 5 No 1 (2013): http://t.co/OWuqQf1CQ2

Inside view of WCGTC Conference preparations: http://t.co/UO5srcTllP – I hadn’t appreciated that Denmark is hosting in 2015

World Council Conference in Kentucky is up to 350 acceptances: http://t.co/o4C2b81NbJ so they need a last-minute surge

World Council 2015 Gifted Conference in Denmark will be located in Odense, August 10-14: http://t.co/UT3KQX689u No direct flights?

.

Africa

Guardian feature on Sheikh School, the ‘Eton of Somaliland’: http://t.co/OaOaYIMihD

.

.

.

Americas

A more hostile position on the expansion of Renzulli academies in Connecticut: http://t.co/jaICAcZPT2

About US NAGC’s Administrator’s Toolbox for Gifted Education http://t.co/Dv5vCrzgXG – which is here http://t.co/DXQO69mgGL

The row about NYC’s gifted programme rumbles on and on…and on: http://t.co/JGXD0nFjZH

New Executive Director of US federal initiative to secure Educational Excellence for African Americans: http://t.co/qXjabm2die

How does Insight Help Gifted Children? http://t.co/Y3OcICZctn – Piece on Esther Katz Rosen Early Career Research Grants

Paper on impact on gifted learners of inclusion policy in British Columbia: http://t.co/9lYbM7LVnm

Evidence of a backlash against those proposed new Renzulli academies in Connecticut: http://t.co/quLDWY2v4L

Article on college readiness of gifted students by CTY’s Director: http://t.co/5voDJ6Vius

Senator Chuck Grassley continues his support for high ability students in the USA: http://t.co/oZMw19VfPd

Legal action threatened over gifted education in NY State: http://t.co/E1TlcAxf8D

.

.

Chuck Grassley press release on latest introduction of the Talent Act: http://t.co/Fu1CYSzBPN

MT @ljconrad: Vanderbilt Programs for Talented Youth Newsletter http://t.co/er6yARhY3l

CEC press release on the latest edition of the Talent Act: http://t.co/wRv4oKHdux

Missouri Senate progressing bill to establish a gifted and talented advisory council: http://t.co/NJEhvpAQSr

More on ability grouping in the US: http://t.co/tYNQ1KuqM6 and http://t.co/AbuYy17epj

The Socialist Worker perspective on gifted education in New York City: http://t.co/qPNdy6qmO3

Recap of an event to discuss gifted education issues in Ohio: http://t.co/f7CEH1wKEy

State-wide review of gifted education in Pennsylvania moves a step closer: http://t.co/fbNx3hge1x

Louisiana gifted funding plan under fire: http://t.co/jaQoi3Rnkz and this from Ravitch http://t.co/zX0G9xZhBk

A bit more negative reaction to Louisiana proposal to link gifted funding to test scores: http://t.co/VnhdKFEJdR

Why my grandson, 4, won’t be taking a gifted ed test: http://t.co/qUzOBnPKuN

Meanwhile discussion continues over Delaware’s grants for gifted education bill: http://t.co/gkVWfM8hxB

The Renzulli Academy planned in New London is relegated to an incubator programme: http://t.co/GNKGWRyg1k

NEPC review says recent ‘Does Sorting Students Improve Test Scores’ paper too poor to inform tracking policy: http://t.co/obpw7NfPD9

Pro-acceleration legislation enacted in Colorado: http://t.co/7vFyB4NtwV

Belin-Blank on Grassley’s Talent Act: http://t.co/BYXt7WCICc

Rapper Wale (next album ‘Gifted’) to perform at WKU, home of the World Council. A publicist’s dream! http://t.co/Bym9lN029e

Florida’s approach to gifted education begins to focus more strongly on equity issues: http://t.co/kRL2iloSGM

Jann Leppien lands that Gifted Chair at Whitworth U (reserved for someone of a Christian persuasion) http://t.co/xg6dke3EMQ

Report on Talent Management in US Education: http://t.co/kPKMYUcltd – They and we could start the process with school-age students

Loveless reviews the US history of tracking and ability grouping and calls for more research: http://t.co/AIZLQyFWqM

A couple of reports on initial impact of changes to tests for the NYC gifted programme: http://t.co/YzdRaTxakf and http://t.co/oOOwOJiVvY

January 2013 CTYI doctoral thesis about impact of the Centre for Academic Achievement (CAA): http://t.co/ODL52gznL7

Iowa elementary school teacher says gifted learners deserve attention too: http://t.co/horeVbLOZI

NGLB – No Gifted Left Behind: http://t.co/8CFuut2zyi  - a view from Illinois

Ohio’s new report cards include gifted learners. Simulation based on old data suggests shortcomings: http://t.co/LrEffMShs7

Pearson make clunking great horlicks of NY gifted test http://t.co/opewyB0e3F and http://t.co/8sp7rTPdNN Humble pie abounds

Belin-Blank Director refers you to her paywalled research http://t.co/lQED8Dzg8o I want it freely accessible

State report card shows some high performing Ohio districts don’t cut the mustard with gifted ed: http://t.co/51ziRRRbfa

Rumblings continue over Pearson’s testing issues in NYC http://t.co/QIBSIMhDac Apparently it’s being called TestingGATE (ho ho)

Democrat sources argue for reform to NYC’s gifted programme: http://t.co/k8yjUjIWJq

A call for stronger gifted education in Baltimore: http://t.co/h76lnFLwMh

Pearson’s gifted assessment contract with NYC reportedly under threat as a second error is uncovered http://t.co/4teFDmhdMY

More from across the Atlantic on grouping by ability: http://t.co/cgTGYl3Kv3

African-Americans and Hispanics are heavily under-represented in Virginia’s gifted programmes: http://t.co/KpMfDhwqbs

Profile of Sue Khim: http://t.co/iacJ6ZyNXd the founder of Brilliant: http://t.co/80kxzDpjGE

Following the testing debacle, NYC gifted admissions process now faces a parental lawsuit: http://t.co/lLTyd8ZepR

Brief feature on the founder of a Center for Talent Attention, presumably based in Mexico: http://t.co/sDRh0bbQfn

New London has rejected a Renzulli Academy: http://t.co/P7UE2nxaa6 – but is it the last word? http://t.co/DbEsQjYk7C

Latest NEPC Policy Brief is resolutely anti-tracking and so won’t go unchallenged: http://t.co/vDL5sWqYqb

Looks as though @donnayford is launching a blog: http://t.co/HBUkwt6oJC

Gatton Academy at WKU has a relationship with Harlaxton College in Grantham http://t.co/Dx0OIJdaEC


Debate about the pros and cons of ability grouping continues: http://t.co/XIkM2DME8I

Finding America’s Missing AP/IB Students http://t.co/zp3g0tuGkb Education Trust says they can help tackle excellence gaps

More about ability grouping, from the NYT: http://t.co/E5wGV5DLzY

US NAGC press release on inclusion of gifted learners in draft ESEA Reauthorisation Bill: http://t.co/vao9Go0imf

Another view on ability grouping/tracking in US http://t.co/NqkxWKI8qS Will Ofsted report on ‘most able’ reignite debate here?

Another contribution to US debate on ability grouping: http://t.co/WZ59o5vYLx

Fixing America’s Talent Problem (mostly higher education focused): http://t.co/SHHG8n5Smi

CEC press release on the latest moves to introduce a US TALENT Act: http://t.co/8VTwT2f3Wt

NAGC’s Press Release on the Talent Act: http://t.co/gWK0WVBy0m

A real slanging match in the comments on: ‘The Anti-Gifted Sentiment Behind Closing the Gap’: http://t.co/D8lXzc7a76

A giftedness blog in British Columbia has come back to life: http://t.co/Bpvs6bvgzh

Ending the neglect of Illinois’ gifted students: http://t.co/mEVTVGWiiB

This page carries a link to a powerpoint on gifted education (for women) in Costa Rica: http://t.co/yAdl2mOnQa

NYC gifted education again: http://t.co/BQtVIm1Qnk  (including the judge who needs a crash course in gifted education)


Asia

Crown Prince of Abu Dhabi donates $10m to Permata Pintar gifted programme in Malaysia: http://t.co/wW1IetQpik – Jealous!

There’s a talk in Cambridge next week on gifted education in Kazakhstan: http://t.co/dIhXx7J8RY

Next round of gifted education awards in the Philippines: http://t.co/j2RakCyxBd

New Wikipedia entry on the High School for Gifted Students at Hanoi University of Science: http://t.co/YxC1XdFxg4

Positive outcomes of Malaysia’s Permata Pintar Gifted programme via @noorsyakina: http://t.co/E20rqM9yrn

Over in Hong Kong, HKAGE is running a student conference on giftedness and creativity in November: http://t.co/kXbbHsEAOX

Hong Kong Academy for Gifted Education needs Associate Director for Student Programmes and Services: http://t.co/gkvdkumROH

A very brief item on Kuwaiti gifted education from the national news agency: http://t.co/Y3pp2uLaGa

Interesting feature on giftedness from the Bangkok Post: http://t.co/BunytKPhTI  (don’t be put off by the awful stock photo)

Recording of that Cambridge seminar I referenced on gifted education in Kazakhstan: http://t.co/F4DOdMdXdP


Bloom Nepal sounds like a valuable gifted education initiative in that country: http://t.co/IOwuufRCCL

Is Vietnam’s national gifted education programme a waste of money? http://t.co/0w2EBYNK8o

A Talent School of Academic and Arts (TSAA) is opening in Makati, Philippines: http://t.co/FqPGmUIrrZ

A Glance at Gifted Education in Singapore: http://t.co/eDSr2TBN7u

Brief piece on gifted education in Bahrain: http://t.co/Y65v1z2OP3

Mawhiba (gifted education in Saudi Arabia) is supporting over 12,000 students in its third phase: http://t.co/WiyZVElRTW

Evaluating the Effects of the Oasis Enrichment Model (on gifted education in Saudi Arabia): http://t.co/SGRmBNiRUz


Feature on China’s School for the Gifted Young: http://t.co/aBOMYNSig6 with an interesting opening line

RT @noorsyakina: First Lady of Mozambique visits Permata Pintar in Malaysia http://t.co/dnANlt76BU

There’s now a National Association of Gifted Education in India. Here’s its test website: http://t.co/dPGDamYkXu

Expansion of Saudi Mawhiba gifted summer school plus international girls’ programme involving CTY http://t.co/KiJ1jmUbmT

Bahraini students will take part in the Mawhiba-CTY girls only summer school: http://t.co/dkdufvfgYW

UKM in Malaysia has signed a MoU with Kazakhstan University including gifted education collaboration http://t.co/hA9sB4GbV7

The Eden Center: A Haven for Korea’s Highly Gifted Kids: http://t.co/GHeLfktoQz

A piece on teaching mathematically gifted Muslim girls from India: http://t.co/a90Tm7wzkN

Kazakhstan: Nazarbayev Intellectual Schools needs teachers (to teach in English) http://t.co/vjjkIed6Ty

Jakarta Post features an academy for poor but gifted students in Sumatra: http://t.co/Tf71eMTGar

China has launched a first Regional Talent Competitiveness Report: http://t.co/h9IvtwqDDO and http://t.co/lPk4yXgGjL


Australasia

Gifted Kids in NZ has appointed a new chair: http://t.co/obXU0lMRb9

RT @jofrei: Gifted Resources March newsletter can be read online at http://t.co/Gratg3fUQm

Gifted education is a focus in state elections in Western Australia: http://t.co/jyAqQFFn3F

Feature on gifted education in the Bay of Plenty in New Zealand: http://t.co/YYjYqfPT9C

Brief Massey University press release on an upcoming regional gifted education conference in NZ: http://t.co/mpzOjxNeiI


Guidance from New Zealand about developing Professional Learning Networks in Gifted Education: http://t.co/utariVh8f9

MT @jofrei: Gifted Resources March No 2 Newsletter can be read online at http://t.co/88FWthytvj

New article from New Zealand comparing enrichment and acceleration: http://t.co/m1g8aRSuXH

TKI Gifted in NZ is now advertising the World Council Conference, shifted from NZ to Kentucky: http://t.co/e9Mnm2TqyQ

Bit of a coup for GERRIC, who are running gifted teacher education courses for ESF in Hong Kong: http://t.co/p6tDaQX0A9

MT @jofrei: Gifted Resources April Newsletter can be read online at http://t.co/pOueUEIB8t

State Government’s response to the Inquiry into Victorian gifted education begins to emerge: http://t.co/PNr9auuYJK

University of New England (Australia) seeks Lecturer in School Pedagogy/Gifted Education: http://t.co/ePU1pZqZIz


MT @jofrei: VAGTC EmpowerED Conference report on Gifted Resources blog http://t.co/G0OjSqlF6B

Extended differentiated Instruction presentation from recent gifted conference in Victoria, Australia: http://t.co/dm1E47L7BV

New Zealand’s Got Talent. The Role of Schools in Talent Development: http://t.co/0EKvjkF6Mn – Unites arguments I support and oppose

Time for the annual New Zealand Gifted Awareness Blog Tour: http://t.co/XdT1KDi54K

Welcome to the NZGAW Blog Tour 2013: http://t.co/oXGeftp9fY

‘Your MP is Probably Gifted’: http://t.co/9Fu48aVyFb – a timely comment from New Zealand Gifted Awareness Week

Australian Mensa is worried about what happens to gifted students in Australian universities: http://t.co/M1Zt0KI2WY

A contribution to the ability grouping debate (the one in NZ this time): http://t.co/6Avu7y0R8Y

Young members of Mensa New Zealand: http://t.co/pCD2xPyiQj – a world away from Child Genius!

Picture this: gifted (from NZGAW): http://t.co/e97yIHukn2

Gifted Kids at [NZ] Parliament: http://t.co/aftRSrW7EF – Green Party support for NZGAW

NZ Labour Party supports Gifted Awareness Week: http://t.co/8v4g56ksrT

Another NZGAW offering – Kiwi learners reflect on what it means to be gifted: http://t.co/bQSsLacDnT

Investigation into the Identification of Maori Gifted and Talented Students (from NZGAW): http://t.co/vc0zILWe53

RT @ljconrad: AUS: Gifted Resources Newsletter June 2013 (pdf) from @jofrei  http://t.co/RNxEYa8nu6

Interesting progress report on New South Wales’ Virtual Selective High School, xsel: http://t.co/KEjmg9nTZU


Europe

It’s Ireland’s 3rd National Gifted Awareness Week soon! Are you a potential sponsor? http://t.co/I9iVAuZouM

European Talent Centre website has ended its hibernation; features an essay by Roland Persson http://t.co/Lgeqj4REBe

Summary of the recent EU Hearing on Talent Support: http://t.co/Umk3BKVOeN – No comment.

The EU Talent Centre has finally published volume 2 of International Horizons of Talent Support: http://t.co/MYTWMSAmwt

ECHA is calling for bids to host its 2016 conference http://t.co/98uHStXGsz and http://t.co/oUzo2GGgPW – Deadline 10 April

Maltese Education Department reforms to support high achievers. Report: http://t.co/A4JYf7xlYI – coverage: http://t.co/psoqdt8bo3

Potential Plus and Silverman on Tour in Denmark: http://t.co/8ZUBGlrcpS

Contributions to Denmark’s 2013 Symposium on gifted including contributions from Potential Plus: http://t.co/nSYL0CAnYe

RT @GTNIrl: What if Giftedness was not defined as SEN in Ireland? http://t.co/JVQSh5AL5J


EESC Opinion Unleashing the potential of children and young people with high intellectual abilities in EU: http://t.co/J9E0hwsukb

MT @Dazzlld: Some news from the Irish Gifted Education Blog: http://t.co/ay09uPNlPU

MT @peter_lydon: Gifted And Talented Network Ireland helps parents of gifted children to support each other http://t.co/1253SSD58e

Gifted education arrives in Gozo: http://t.co/FxsZuuS6bE

RT @Begabungs: The first Gifted Awareness Week in Germany – June 3rd to June 9th 2013 https://t.co/uMtCNEKEES

Supply of Turkish gifted education inadequate to meet demand (courtesy of @ljconrad): http://t.co/MfPzEzTHpa

CTYI/DCU setting up Irish Centre for Gifted Research with support from College of William and Mary: http://t.co/R0HpozNZyr

Armenian scholarship fund for gifted learners at Dilijan International School: http://t.co/8tLFu5KNTb and: http://t.co/rUU5kh6e4Z

MT @Begabungs: Article from France! Thank you France! http://t.co/ZSbBrhlC2A

Legislative Strategies to Promote Talent in Romania (full text via PDF link): http://t.co/lFWXzY5kLQ

RT @Begabungs: The Development of Giftedness and Talent in 21st Century October 5th – 6th, 2013 Toulouse http://t.co/16J0IxGqXj


UK Coverage


News and Developments

Dance and Drama Awards Guide for 2013/14 (New Students): http://t.co/tFqhAJEHFf

Dear Treasury: economic growth is driven by human capital. Jerrim makes strong case for investment in high achievers http://t.co/ZJLRxj49gl

TES on How to Meet the Needs of Child Prodigies http://t.co/UcL9k1MtUB plus article featuring my alter ego:  http://t.co/wRp4Q8JqHn

A positive profile of Chetham’s, part of the MDS and an important part of our gifted education provision: http://t.co/32dtGvkOS1

Gove concedes that ‘there is much more that we can do’ to support high achievers: http://t.co/ZGicrHF4sn (Col 652) We’re all ears

Will removal of a flexi-schooling option impact disproportionately on gifted learners? Evidence?: http://t.co/PO7c9E1TGB

New Ofsted Report on Schools’ Use of Early Entry to GCSE Examinations (March 2013): http://t.co/5yWot7W64K

TES: Familiar portrayal of Chinese education ethos http://t.co/WtDQUxnkaz Author (a head) wants to ban use of ‘gifted and talented’

Adonis is new chair of trustees at IPPR: http://t.co/vBrUPuECIA so maybe they’ll show some interest in future of gifted education


Cridland speech to #ascl2013 asks whether gifted learners get the challenge and support they need: http://t.co/fDqdQx0mLM

Q. How can education best contribute to Cameron’s ‘global race’? A. Partly by investing in tomorrow’s high achievers: http://t.co/jaiOGpmi15

Concern at the plight of EAL support – will hit the oft-forgotten EAL gifted learners: http://t.co/7wgrOeYasU

Reports on safeguarding at Chetham’s: http://t.co/jZW9f4zhc2  and http://t.co/omukgQctTI  - will there be wider implications for MDS?

@judeenright Amazingly I’ve just had a pingback from a post on Dux you published 362 days ago!: http://t.co/nBduvBiE1p

Will Gilbert’s audit push Thurrock to improve gifted education? This mum hopes so: http://t.co/351GRxdWOH – I won’t hold my breath

RT @DMUVC: Hundreds of secondary school pupils have been on campus for DMU Gifted and Talented programme http://t.co/lJEPoRRadi

New DfE research on KS2 Level 6 Tests: http://t.co/2FdvVGeoKY – Critical of lack of guidance; doesn’t mention disappearance of L6

“It is the unfortunate nature of state schools that gifted children are often limited”: http://t.co/u8nHPUqy0G

Somewhere in England there’s a school that thinks NAGTY still exists: http://t.co/uny4HkWtUc – It closed in 2007

Sutton Trust’s future strategy features Open Access (bad) and Helping the Highly Able (depends how) – see p5 http://t.co/cUt0swcx09

TES says Government is no longer promoting setting: http://t.co/ZkVd71YWKY – but what will Ofsted say about impact on highly able?

How Level 6 tests are viewed in secondaries: http://t.co/Ie7nzkOWOA Gifted learners suffer badly from this poor transition practice


Waiting to see whether and how high attainers will be accommodated in TechBacc: http://t.co/vZRkJOHy83 and http://t.co/ZDW6dxiJ7h

Cybersecurity’s the latest industry to harness the power of gifted learners: http://t.co/yPMEixt8Bk

We had the school that thought NAGTY still existed; now we have the College seeking to re-energise YG&T: http://t.co/ACToxSXdnd

Still no TES this morning so you’ll have to make do with my new post on KS2 L6 and prospects for a Summer of Love: http://t.co/EWOMHD0sql

THE article on Universities’ sponsorship of academies http://t.co/py3vPFq91f and my piece on 16-19 maths free schools http://t.co/UQDNCNXwuX

Collaborative support for gifted education in Dudley: http://t.co/7UElTLGmCH

The importance of cross-phase collaboration: http://t.co/ge0Gpe6fp7 – critical for gifted learners as the KS2 L6 report showed

Abuse enquiries spreading across MDS schools: http://t.co/bLYoPGN2yE – Presumably some central action is under consideration

One of Labour’s policy forums urged review of gifted education policy: http://t.co/ynPH77YZJN (more detail in linked Word doc)

Cambridge University will be sponsoring the Villiers Park Scholars Programme in Hastings: http://t.co/sc1JivwZHd

IGGY’s reached 2,500 members: http://t.co/a3rqhHEsN2 and http://t.co/75yQ79Q89K That’s slower progress than I’d anticipated

My post on IGGY discusses its membership/targets: http://t.co/ruSQuV6EUO ‘3,000 members’ claimed in 2012 v ‘over 2,500’ now?

Kings College 16-19 Maths School’s appointed a Head http://t.co/NQGPXxxClo My progress report on 16-19 Maths Schools http://t.co/UQDNCNXwuX

This TES report states explicitly that sixteen 16-19 maths schools are planned: http://t.co/NQGPXxxClo – Would like to know the source for that

Hoping for crossover between Ofsted’s upcoming reports on highly able and gap-narrowing. Excellence gaps need closing http://t.co/giEK2eymau


Estyn’s Report on KS2/3 Science says more able pupils are insufficiently stretched: http://t.co/17iWrTpNEP

TES on threat to NASA’s space education budget: http://t.co/Am5IouEKeK – would be a significant loss to gifted education

Timely publicity for Government-supported Cyber Security Talent Search for KS4 students: http://t.co/o2VvpILuOn GCHQ is a sponsor!

Thought-provoking piece ahead of ‘Child Genius’: http://t.co/x0W3a67Z6y Penultimate paragraph is the killer

Latest edition of the gtvoice Newsletter: http://t.co/Ht0vtI8Sn8 Mentions two very important meetings in this ‘Summer of Love’

Congratulations to Horndean Technology College for being one of 8 lead schools for more able http://t.co/4RHBV3CuDM Not sure whose scheme?

Sweeteners for university sponsors of 16-19 maths free schools http://t.co/cEcP8nIKs3 My analysis of progress to date http://t.co/UQDNCNXwuX

Here’s a brief report on Fair Access issues, especially some news about the Dux Award Scheme: http://t.co/krPc7Uweo4

STA received 240 complaints re non-registration of KS2 pupils for Level 6 tests post-deadline: http://t.co/zYAuduZ0ST (Col 531W)


Ofsted Report

Still wondering why Ofsted’s rapid response gifted education survey: http://t.co/PyLE23L00o – isn’t yet listed here: http://t.co/ZItPkhyO0u

HMCI still bigging up Ofsted’s upcoming report on highly able: http://t.co/QeNdhwvv2A Identification, tracking sure, but streaming?

Telegraph says Ofsted’s ‘Most Able Pupils’ report will issue next week, but no new details of likely content http://t.co/wg9OhTcmvn

Telegraph calls the Ofsted Able pupils Report ‘damning’; Ofsted will now routinely check whether their needs are met: http://t.co/a2wB03Gc9a

Guardian coverage of the Ofsted Able Pupils Survey launch says it based on visits to 41 non-selective schools: http://t.co/ymHoV9RefL

Independent on Ofsted Able Pupils Survey: some schools not identifying most able (which was a requirement up to 2011): http://t.co/jDmAn41lfH

BBC coverage of Ofsted Able Pupils Report leads on failure to translate L5 to A*; HMCI advocates setting/streaming: http://t.co/xeHiKGKBvB

Sutton Trust wants Government to fund trials of best ways to support gifted learners: http://t.co/nKlvxkMMV4 So a job for the EEF Sir Peter?

This short piece on gifted education and Learning Schools should’ve been published elsewhere today. It wasn’t: http://t.co/6MOnrWm6do


In which I propose a National Network of Learning Schools (to complement the Teaching Schools Network): http://t.co/6MOnrWm6do

RT @dandoj: Interesting Ofsted story on schools failing to challenge the brightest – particularly true for the poorest http://t.co/S6rTQaVKbX

@rchak100 @brianlightman @dylanwiliam There’s more data than you can shake a stick at in my analysis here: http://t.co/J0Kt7Aegpl

Ofsted Report on the Most Able Pupils now published: http://t.co/SrxQMNn1vP plus press release http://t.co/Rpdd3li9q2

Ofsted report says in only 20% of 2327 lessons observed were able pupils supported well or better: http://t.co/Td95IwjFIn (p7)

Also surprised that Ofsted most able report is silent on school-to-school collaboration. My own modest proposal here: http://t.co/6MOnrWm6do

Key Finding 1: In many schools expectations of most able are too low: http://t.co/Td95IwjFIn

Key Finding 2: In non-selective schools 65% of those achieving L5 in Eng and Ma didn’t get GCSE A*/A (2012): http://t.co/Td95IwjFIn

Key Finding 3: School leaders ‘haven’t done enough to create a culture of scholastic excellence’: http://t.co/Td95IwjFIn

Key Finding 3 (cont) Schools don’t routinely give same attention to most able as they do to those struggling http://t.co/Td95IwjFIn

Key Finding 4: Transition arrangements don’t ensure high attainers maintain momentum into Year 7: http://t.co/Td95IwjFIn

Key Finding 5: KS3 teaching is insufficiently focused on the most able: http://t.co/Td95IwjFIn

Key Finding 6: Many students become used to under-challenge. Parents and teachers accept this too readily: http://t.co/Td95IwjFIn

Key Finding 7: KS3 curriculum and early GCSE entry are key weaknesses; homework insufficiently challenging: http://t.co/Td95IwjFIn

Key Finding 8: Inequalities amongst most able aren’t being addressed satisfactorily. Particularly FSM boys: http://t.co/Td95IwjFIn

Key Finding 8 (cont): Few schools are using Pupil Premium to support most able from disadvantaged backgrounds http://t.co/Td95IwjFIn

Key Finding 9: Many schools aren’t using assessment, tracking and targeting effectively with most able: http://t.co/Td95IwjFIn

Key Finding 10: Too few schools worked with families to remove cultural/financial obstacles to HE admission http://t.co/Td95IwjFIn

Key Finding 11: Most 11-16 schools visited were insufficiently focused on progression to HE: http://t.co/Td95IwjFIn

Key Finding 12: Schools’ knowledge/expertise on application to top universities not always up-to-date: http://t.co/Td95IwjFIn

Ofsted Recommendation 1: DfE should ensure parents get annual report showing if their children are on track http://t.co/Td95IwjFIn


Ofsted Recommendation 3: DfE should promote new destinations data on progression to (leading) universities: http://t.co/Td95IwjFIn

Ofsted Recommendation 4: Schools should develop ethos so needs of most able are championed by school leaders http://t.co/Td95IwjFIn

Ofsted Recommendation 5: Schools should develop skills/confidence/attitudes to succeed at best universities: http://t.co/Td95IwjFIn

Ofsted Recommendation 6: Schools should improve primary/secondary transfer and plan KS3 lessons accordingly: http://t.co/Td95IwjFIn

Ofsted Recommendation 7: Schools should ensure work remains challenging /demanding throughout KS3: http://t.co/Td95IwjFIn

Ofsted Recommendation 8: Senior school leaders should check mixed ability teaching is challenging enough: http://t.co/Td95IwjFIn

Ofsted Recommendation 9: Schools should check that homework is sufficiently challenging for most able: http://t.co/Td95IwjFIn

Ofsted Recommendation 10: Schools should give parents of more able better information more frequently: http://t.co/Td95IwjFIn

Ofsted Recommendation 10 (cont) schools should raise parents’ expectations for more able where necessary: http://t.co/Td95IwjFIn

Ofsted Recommendation 11: Schools should work with (poor) families to overcome obstacles to HE progression: http://t.co/Td95IwjFIn

Ofsted Recommendation 12: Schools should develop more expertise to support progression to top universities: http://t.co/Td95IwjFIn

Ofsted Recommendation 13: Schools should publish more widely a list of university destinations of students: http://t.co/Td95IwjFIn


Ofsted Commitment 2: Will focus inspection more on use of Pupil Premium for most able disadvantaged learners http://t.co/Td95IwjFIn

Ofsted Commitment 3: Will report inspection findings more clearly in school, 6th form and college reports: http://t.co/Td95IwjFIn

Ofsted has today called for a new progress measure from KS2 to KS4/5 for most able pupils: http://t.co/Td95IwjFIn

What the Unions think of Ofsted’s Most Able Students Report – NAHT: http://t.co/3zwKMnEJTF

What the Unions think of Ofsted’s Most Able Students Report – ASCL: http://t.co/9V18fM4eXE

What the Unions think of Ofsted’s Most Able Students Report – NUT: http://t.co/RAnjMTlSFG

What the Unions think of Ofsted’s Most Able Students Report: NASUWT – http://t.co/2w0Png0y7c

What the Unions think of Ofsted’s Most Able Students Report – Voice: http://t.co/xWRPFZK8lu

What the Unions think of Ofsted’s Most Able Students Report – ATL: http://t.co/4qVn0Ii1vR

Potential Plus (formerly NAGC) press release on Ofsted’s Most Able Pupils Report: http://t.co/7gqpdttJBz

David Laws video response to Ofsted Most Able Students Report: http://t.co/9Og9iuQlLs – no new commitments

Twigg: ‘David Cameron and Michael Gove have no plan for gifted children’: http://t.co/wc8VS43rPj but no commitments

Review of today’s Ofsted report on most able by @pwatsonmontrose: http://t.co/SeMs66WQ3e (thanks for the links Patrick!)

Inspired by ASCL I’ve just checked what the 2012 KS2/4 Transition Matrices say about high attainers’ performance: http://t.co/96vG1mezxX

Apropos Ofsted’s Most Able report 2012 Transition Matrices show only 50% of KS2 L5A in Maths got GCSE A*: http://t.co/YkCno8Digi

Apropos Ofsted’s Most Able Students report 2012 Transition Matrices show only 47% of KS2 L5A in English got GCSE A*: http://t.co/YkCno8Digi

IoE reminds us that some GS have an issue with able learners (and inter-departmental variation’s also problematic): http://t.co/81S00tlnhK

Sutton Trust blog on today’s Ofsted report: http://t.co/1k0KpUAACH  still wondering when we’ll hear outcome of their own call for proposals

Skidmore thinks the answer is setting (and streaming?): http://t.co/8rTqoVPR8z  Will his Select Committee explore these issues?

RT @RealGeoffBarton: From last night: ‘Pass the G&T’: my blog on a depressing day for Ofsted and state education: http://t.co/nzC4QMpFAp

This Telegraph commentary on the ‘Most Able’ Report asks whether Gove(rnment) will step up to the challenges it poses http://t.co/uRXNxl2eF4

Standard predicts that schools will introduce predictive GCSE ‘report cards’ following yesterday’s Ofsted report: http://t.co/rK5F5k9fLb

Wilby questions evidence base behind Ofsted’s ‘Most Able’ Report but this evidence shows he hasn’t read it thoroughly http://t.co/h6RduI3O0K

Spectator insists Ofsted’s ‘Most Able’ report vindicates Govian policy: http://t.co/ZziGgqmlhQ But is the challenge/support balance optimal?

RT @federicacocco: My factcheck on evidence behind Ofsted’s latest report on bright children in Comprehensive schools http://t.co/Xvuamw1Yt3

And, further to Factcheck, this is what the fine level transition matrices tell us about high attainers’ progression http://t.co/96vG1mezxX

So What Does Gifted Mean Anyway? http://t.co/mNY6mut1Ty ID’s part of assessment; teaching to the top’s admirable and integral to ID

RT @headguruteacher: NEW POST Today: My take on the OfSTED report: The Anatomy of High Expectations http://t.co/qqvvWVgiEB

Huge thanks to everyone who promoted my megapost on Ofsted’s ‘Most Able’ Report: http://t.co/J7BTMsfGdt Especially @headguruteacher

Stephen ‘Up to two-thirds of teachers do not at heart approve of special programmes for the most able’: http://t.co/5aRjGDNYoh

Telegraph take on yesterday’s ‘Most Able’ Ofsted report: http://t.co/x51mfCwb9z  – Nothing here about supporting schools to improve


Thematic Coverage

 

Intelligence and Neuroscience

Reasoning Training Increases Brain Connectivity Associated with High-Level Cognition by @sbkaufman: http://t.co/uJJX9XAfPG

A dose of realism over genetic selection for high IQ: http://t.co/c1ufVl1pMS

Two contrasting views of Obama’s new BRAIN initiative supporting neuroscience: http://t.co/noY9ry09by and http://t.co/bDsYRPHQ7m


A round-up of developments in working memory research: http://t.co/f4tlxChwxm

MT @NAGCBritain: Schooling Makes You Smarter: What teachers need to know about IQ: http://t.co/ATTRhNriT7

In Defence of Working Memory Training: http://t.co/RA0mWpzp4E

Intelligence can’t be explained by the size of one’s frontal lobes! http://t.co/41NP8kSOSJ

Yet another warning that research on the relationship between IQ and race is incendiary: http://t.co/WLMkMESmbG

Informative piece on the pernicious influence of ‘IQ fundamentalism’ in the wake of Richwine: http://t.co/CpFnGhCXx5

The impact of transcranial random noise stimulation on cognitive function: http://t.co/agCIcawpFX (I kid you not)

Intelligence as a function of other people’s perceptions: http://t.co/RSGk24ykRM

The distinction between intelligence and rationality: http://t.co/kUnP2az3hR

More about eugenics and cognitive genomics: http://t.co/LrMPD26kPg

Motion Filtering Ability Correlated to High IQ: http://t.co/yowFYoSObk

‘Intelligence is largely a hereditary trait’ states @toadmeister on meritocracy: http://t.co/NwxhBBmsHX That’s highly contestable

Neat post on Intelligence, Genetics and Environment drawing on Nisbett et al’s 2012 paper: http://t.co/85ocdVn46y

Eight ways of looking at intelligence: http://t.co/PQ4VzX9jU2

Redefining Intelligence: Q and A with @sbkaufman: http://t.co/utkNwRaHSG

MT @WendaSheard: An antidote to neuromyths perpetrated in K-12 ed conferences and publications. http://t.co/cMleiKfk2r


Creativity and Innovation

Start with small steps when nurturing the next Van Gogh (about fostering creativity in learning): http://t.co/i8Q5lfpsrv

A simply outstanding piece about domain dependency and ‘epistemic chameleons’: http://t.co/wEBg595RwB

Creativity lies in combining ordinary things in extraordinary ways: http://t.co/9r8jWh9AYT

OECD post on creativity: http://t.co/eD9YygkcDo and associated Education for Innovation in Asia conference papers: http://t.co/pNSGuPKeHV

Intuition as the basis for creativity: http://t.co/fddE5v5bl1

Profiling Serial Creators by @sbkaufman http://t.co/ozUtLRwdD0

I do so agree with this dismissal of Robinson’s TED flummery: http://t.co/J9WsNqqffn  - gets far more attention than it deserves

Turning adversity into creative growth: http://t.co/U8yZya0rbD

@BSheermanMP @DrSpenny I spent some time trying to get a grip on Robinson’s take on talent: http://t.co/op4PaJF2Vq – wasn’t impressed

Does education marginalise spatial thinkers? http://t.co/7596uBqu0y

RT @HuntingEnglish: Why We Should Mistrust Ken Robinson http://t.co/iqPyBVKgCi – Glad I’m not the only one!


Twice-exceptional

The Invisible Side of ‘Special Needs’ Gifted Students: http://t.co/VDzO4Bhgpt

Twice-Exceptional: When Exceptions are the Norm: http://t.co/55szvEVsca

Belin-Blank presentation on Parenting Twice-Exceptional Children: http://t.co/6bPRNeNX9f

Belin-Blank has funding from the Jack Kent Cooke Foundation to support twice-exceptional students: http://t.co/7eUZ1GKDhM

Twice-exceptional, from an Indian perspective: http://t.co/Vxd8UXnhYy

Raising the Autistic Gifted Child: http://t.co/zYvUgg9Zz8

Belin-Blank on twice-exceptionality: http://t.co/M6gIucLSqv  featuring their resources


Gifted Research

You can access this morning’s study of ability grouping and summer born children here: http://t.co/kIgY8ZgVm8 (link at bottom)

Here’s the associated IoE press release about the MCS ability grouping and summer born children paper: http://t.co/ODOkRyJ1mR

Research showing gender differences largest in maths but smallest in reading amongst high attainers http://t.co/BVdSoqSDQL

Brown Center pieces on the incidence of ability grouping and tracking and advanced 8th Grade maths courses: http://t.co/xkRjC49dVf

Elite Athletes Also Excel at Some Cognitive Tasks: http://t.co/UzcWxtBLZv

Why Gifted Low Income Students Don’t Go To the Best Colleges: http://t.co/IIq6o1viJl

School makes you smarter: http://t.co/iKQ91VSJNg

Defining Mathematical Giftedness in Elementary School Settings: http://t.co/Kx51V6bPfO

US follow-up study finds similar academic growth rates for high-achieving students at high and low income schools: http://t.co/j8N4IbAiL5


How important is maths ability for scientific success? http://t.co/IgfDfVviwI

Brand spanking new post on The Limited Accessibility of Gifted Education Research: http://t.co/joOIDs23dJ

More on Wai’s study on the relationship between wealth and ability: http://t.co/ry5c1QQZxX

So much for 10,000 hours of deliberate practice: http://t.co/SnRXDKtC7S – hard work doesn’t deliver for everyone

The Complexity of Greatness (including more about deliberate practice) from @sbkaufman: http://t.co/OKoEbwKLdg

Are gender differences increasing in mathematical ability at the upper end? http://t.co/uK06d8JKNp

Interesting piece of open access research (hooray) on Renzulli Learning: http://t.co/YEhLzFpmBO Relevant to other providers

2 Indian publications: Introductory Reading on Giftedness in Children http://t.co/dr1UPJOp4B Case Profiles http://t.co/Lql3tnP7YV


Gifted Commentary

 

Gifted Phoenix

A huge(ly ambitious) new blogpost: The Economics of Gifted Education Revisited: http://t.co/jaiOGpmi15

@jakeanders @drbeckyallen What did you make of http://t.co/jaiOGpmi15 – What prospect of serious analysis of smart fraction from your ilk?

The @GiftedPhoenix Manifesto for Gifted Education: http://t.co/7a1Fhr99uK

MT @peter_lydon: The most important statement on Gifted education this year  http://t.co/DhOTt7Ee61 I’m seriously flattered. Thanks!

Peter Lydon blogs on (and reproduces) The Gifted Phoenix Manifesto for Gifted Education: http://t.co/DhOTt7Ee61

RT @peter_lydon: Special #gtie Chat on Sunday 9pm GMT ‘The Gifted Phoenix Manifesto for Gifted Education’. http://t.co/DhOTt7Ee61

Explore The Gifted Phoenix Manifesto for Gifted Education via #gtie at 21.00GMT on Sunday 24 March: http://t.co/FqKGNOvNM0

Here’s a selective, reordered Storify transcript of last night’s #gtie chat on the Gifted Phoenix Manifesto: http://t.co/UppAtGsjum

I’ve also included some tweets in the Gifted Phoenix Manifesto post, to give the flavour of #gtie discussion http://t.co/FqKGNOvNM0

Fascinating and troubling equally that positive reaction to my Gifted Manifesto is all from outside the UK! http://t.co/7a1Fhr99uK

Planning a 16-19 maths free school? Want to know more about the KCL or Exeter projects? Here’s some essential reading http://t.co/UQDNCNXwuX

GEI has now published the dialogue between Barry Hymer and yours truly (£, but the original is on my blog): http://t.co/dypWHntkp9

This first post in my new ‘Summer of Love’ series is mainly about Key Stage 2 Level 6 tests: http://t.co/EWOMHD0sql

.

.

My new post is a transatlantic exploration of support for high-ability low-income learners building on US NAGC’s work http://t.co/XREYgg8bmO

My new post on Indian Gifted Education: http://t.co/TWgmrtPQLu

I’ve finalised my brief post of yesterday about the future of Dux Awards, now renamed Future Scholar Awards http://t.co/krPc7Uweo4

.

Twitter Chats

MT @gtchatmod: #gtchat transcript: Coping When Extended Family Doesn’t Get Giftedness http://t.co/92FJzIaS4O

MT @gtchatmod: Storify transcript of #gtchat: Book Lists for Gifted Learners http://t.co/AmZSlPbwrM

RT @gtchatmod: “Do gifted learners think differently?” will be our #gtchat topic Friday @11PM UK http://t.co/Iqo1hvXFDF

MT @gtchatmod: Storify record of last night’s #gtchat: Do gifted learners think differently? http://t.co/AMhVQbF3lB

MT @gtchatmod: Storify transcript of last night’s #gtchat: The Value of Twitter Chats http://t.co/YliKuZ4nUq

MT @gtchatmod: Storify transcript of last night’s #gtchat: Organising the Gifted Learner http://t.co/VwhdUgDjq9

Transcripts of yesterday’s #gtchats: http://t.co/Uv1fLChSI2 and http://t.co/Ibw8nF6VAS

Transcript of last week’s #gtchat on Teaching Strategies for Underachievers: http://t.co/9xukcShsn7

MT @gtchatmod: New post: “The Misdiagnosis Initiative: An Interview with Dr. James Webb” http://t.co/cyPHeWakeC

RT @gtchatmod: Transcript for “Asynchronous Transitioning to Adulthood” now available @ #gtchat blog. http://t.co/luHZLW1uTx

RT @gtchatmod: Transcript for Supporting Exhausted Parents of Gifted Children? now available @ #gtchat blog http://t.co/09wdOxNKpd

MT @gtchatmod: Transcript from 5pm 28 June #gtchat on ‘Rigour’ now available at http://t.co/yuu0JRScDp

RT @gtchatmod: “A Multi-Talent’s Growth with Dr. Edith Johnston” New post on #gtchat Blog! http://t.co/QVpHuENlxm

MT @Frazzlld: Transcript from tonight’s #gtie chat (March 3): http://t.co/Ot8gjKSJDL

MT @Frazzlld: Thanks, everyone, for a great #gtie chat. Here’s “The Trouble With Boys” transcript: http://t.co/QIMghMo2B3

Transcripts of recent #gtie chats on Gifted Support Groups: http://t.co/owAyU1gRV3  and http://t.co/Oa3r8h1uz4

RT @GTNIrl: Support for Teachers of Gifted Students (#gtie transcript) http://t.co/3H6KCk1GoO

MT @CatherinaFisher: For those who missed #gtie chat on Sunday: Social Media and Gifted Education Awareness http://t.co/2DoNvQq4Kk

.

Other Posts

RT @ljconrad: New post @GPS, “Preaching to the Choir: They Need to Hear the Message, Too!” http://t.co/FQ96YvmRji

MT @ljconrad: “Best Practices in Gifted Parenting” is my new post @Gifted Parenting Support http://t.co/ikxXwjd6VR

.

.

Social Challenges of Gifted Adolescents: http://t.co/ef1N0sQtOO

Sorry but…Your Exceptional Child Might Not Be Gifted: http://t.co/1P6UP7qIkF

‘Studying to be Gifted’: http://t.co/1vf7yJcyxL

Gifted Kid Syndrome: http://t.co/RUOCqKbjt0 – I really like the directness of this; others won’t

How to create a science prodigy (from @JonathanLWai): http://t.co/HuOyAaf3qA

Gifted Children: Skipping Grades: http://t.co/1Gv5quZ9Va

Never trust a journalist who puts the word gifted in quotation marks: http://t.co/qOihMCcf43

The Lowest Common Denominator: http://t.co/t6YMu3kLqK

MT @ljconrad: The Socialization Question, Homeschooled and Gifted Children: http://t.co/bUzKjneASS

Giftedness should not be confused with mental disorder: http://t.co/tQSZ34XCqJ

Using the ‘G word’ with kids: http://t.co/bwx1Dh4eco

(More on) Gifted and Racially Balanced Education: http://t.co/IJTYtNxLcq

Giftedness and Non-Conformity: http://t.co/acTmwyby5Z – Reading that is just like looking in the mirror

Transcending Race in Gifted Programs: Are We There Yet? http://t.co/WvBKX9p6cw

Do Schools for the Gifted Promote Segregation? (I refuse to adopt the quotation marks): http://t.co/L84DUSOGHg

.

.

MT @karlaarcher: “Giftedness and Boredom, Part Two: Tackling the Issue Head On”  http://t.co/QfIy3BIfxD

MT @ljconrad: “An Educational Paradigm Shift for Low-Income Gifted Students” http://t.co/3XakgtI3uG

Your Child is Gifted: A Parent’s Reaction: http://t.co/O2u1KeMqDl

Do GATE Programmes Take Resources Away From Needier Students? http://t.co/NCORWJIoEc

“Live life to the fullest and rejoice in your moments of triumph because you are the best you there will ever be”: http://t.co/Cj2VWax1g9

Why is it challenging to be challenged in public schools? http://t.co/waaJKkl5qb

Gifted Doesn’t Equal Segregation: http://t.co/jpDUKhzeUd

The Misunderstood Face of Giftedness: http://t.co/c92FH39d6p

Harnessing the power of social media to advocate for gifted education: http://t.co/5ETcVaf97x

What Does ‘Gifted’ Look Like? http://t.co/HJEzyF121X

Australian opinion piece on gifted learners: http://t.co/28mAFmCb50 – has more than a whiff of suspect old-fogeydom

Choosing the right college for gifted students: http://t.co/tNwYKlypRq – much wisdom in this post

Why isn’t my child as clever as me? http://t.co/fzjeIe3jAT – nice counterbalance to parents worried about the reverse scenario

.

.

Gifted children need help too: http://t.co/x85YbzXqSe – a piece from South Dakota

We mustn’t neglect gifted students: http://t.co/CAVe77l4dc – a call to arms by P O-K and the Tennessee Association

The illusion of the gifted child: http://t.co/mUFbVOL9h8 – is actually about ways of improving gifted education

Gifted Children…how can we start? http://t.co/7ZsDhLWO0x – A blogpost from Mexico

MT @BYOTNetwork: BYOT in the Gifted Classroom: A Perfect Fit! Guest post by @abkeyser http://t.co/vxk7T9waem

Does the gifted label help or harm? An ongoing conversation on Reddit: http://t.co/yHnzfM2heS

More gifted myth debunking: http://t.co/ToGoV4mzb9

Cretal reports back to Planet Zoran on Earth’s approach to education (courtesy of @sbkaufman): http://t.co/9MLFIAO9vm

Changing the label on gifted programmes: http://t.co/PzibyqZyCT – the pros and cons

The Grown-up Gifted Child: http://t.co/I2oLmUstEU

20 Reasons why it’s Awesome Growing Up Gifted: http://t.co/SN61yShixU

Problem-based learning and gifted students (from CTD): http://t.co/uBGxg98YMk

Paula O-K on flexible ability grouping: http://t.co/2MljhIIxlM

Making Room for Talent: http://t.co/tqbedBHg3q

Sharing the Gifted and Talented Curriculum: http://t.co/qHTqcIKbEb

The gifted child’s lament: How to adjust to an unjust world: http://t.co/LLqHHFUu7w

Is Talent a Defunct Concept? http://t.co/zC8axv2AvZ – Some would have you believe so but it’s more complex than that

Is divergent thinking valued in your gifted child’s classroom? http://t.co/smIwhKzZ7m

How parents can challenge stereotypes and misconceptions about giftedness: http://t.co/fs9DuiYE0W Have problems with para 3

Some teacher appreciation from Unwrapping the Gifted: http://t.co/1ZoAe1Es9d

RT @Begabungs: Day 1 – Gifted Awareness Week in Germany 3rd-9th June 2013 http://t.co/yPlDIxfYt9

Pros and cons of pull-out versus in-school enrichment: http://t.co/E1PgcPmGkT

.

.

The Matthew Effect in Educational Technology: http://t.co/NuZWT3OdJu (including an aside about identifying gifted learners)

RT @Begabungs: Interview with Prof. James Webb (USA) http://t.co/0R1SiLioLf

Social Development of Gifted Children: http://t.co/MAiv80JmFw – Highly recommended (because I agree with the analysis)

The Dichotomies of Giftedness: http://t.co/EzY5U47yyr

RT @ljconrad: New post at Gifted Parenting Support, “Are You Nurturing Your Gifted Child?” http://t.co/ZK1PYP2fww

The Parent Challenge (NZGAW contribution from @Dazzlld and @Frazzlld): http://t.co/eO8QlGPh4J

@donnayford Hi Donna. Do you now advocate selection/ID solely on the basis of attainment? This made sense to me: http://t.co/fr0lmTHnpn

What to say to your gifted child about being gifted: http://t.co/WlPmoybuM0

How best are the gifted lifted? Lots of common sense in this post: http://t.co/NBorF6LUDa

24/7 Challenge (for NZ Gifted Awareness Week): http://t.co/EIro3ThjYR

Debate on Ofsted’s Most Able Report has resonance in US and worldwide http://t.co/hXV1hIksb5 – kudos to @ljconrad (and Tom Bennett)

Advocacy Versus Curriculum: http://t.co/BUFvLK636I

‘G is for Gifted and that’s good enough for me’: http://t.co/eJ0nwH6xgJ

RT @ljconrad: New post at GPS: “The High Ability – Gifted Conundrum” http://t.co/1wlBx0BH9u

The contribution that chess can make to gifted education (from NZGAW): http://t.co/5uK6olCNlU

Stop underestimating children: http://t.co/AVx7CFz503

The gift of independent learning projects: http://t.co/QQ1cPMpoE4

Is Your Child Ungifted? by @sbkaufman – Required reading for all gifted advocates: http://t.co/LGryTncjPo

RT @peter_lydon: Are you a gifted advocate? Add your name http://t.co/dk57ygQFJ0 Find other tweeps http://t.co/Cc06yCYZjq

Choosing Your Battles (from NZGAW): http://t.co/0LFJaaMjG0 – Messages for the NZ Government and Ministry of Education

Differentiating Homework for Gifted Students (from NZGAW): http://t.co/mUDuBedsys

Giftedness in our classrooms – removing the ceiling- an Iowa perspective: http://t.co/qCFbXME2Ve

My Gifted Education Soapbox: http://t.co/SpFpDAvc7M

.

.

Gifted education in schools through blended learning (in German): http://t.co/Q5kNpP8llU

RT @jtoufi: Is an education system geared to talent development possible? http://t.co/zyQWtlUHPR

RT @jtoufi: Promoting talent in Europe: White paper from Austria http://t.co/qOdbg5XJnT

RT @jtoufi: Francoys Gagne in My Friends’ corner http://t.co/eQEjzoOng7

Joseph Renzulli in My friends’ corner: http://t.co/PqIIXbW04k

RT @jtoufi: Karen Rogers in My Friends’ corner http://t.co/2HWZU3fGi5

RT @jtoufi: Rena Subotnik, Paula Olszewski-Kubilius and Frank Worrell in My Friends’ corner http://t.co/GaXvzd48qN

MT @jtoufi: Diane Montgomery in My Friends’ corner http://t.co/zoaQPp9a4Z (that’s the English DM by the way)

RT @jtoufi: What is happening around the world with attention to developing the most able? http://t.co/GtZRlHqRUd

RT @jtoufi: It is time to rebuild the education we want: Talent, School, Technology http://t.co/f8z44G5AJk

RT @jtoufi: Transforma Talento: a report you have to read http://t.co/3wrnsRdXeo

RT @jtoufi: The State of the Nation: or how to take talent development seriously! http://t.co/RPJnK83Acp

RT @jtoufi: Differentiating curriculum and instruction. The NAGC tells us about it http://t.co/5J2erRhNtf

.

.

.

A Summer of Love for English Gifted Education? Episode 2: Ofsted’s ‘The Most Able Students’

.

This post provides a close analysis of Ofsted’s Report: ‘The most able students: Are they doing as well as they should in our non-selective secondary schools?’ (June 2013)

.

.

summer of love 1967 by 0 fairy 0


This is the second post in a short series, predicated on the assumption that we are currently enjoying a domestic ‘summer of love’ for gifted education.

According to this conceit, the ‘summer of love’ is built around three official publications, all of them linked in some way with the education of gifted learners, and various associated developments.

Part One in the series introduced the three documents:

  • An Investigation of Key Stage 2 Level 6 tests (the ‘KS2 L6 Investigation’);
  • An Ofsted Survey of how schools educate their most able pupils (still unpublished at that point); and
  • A planned ‘Investigation of school and college level strategies to raise the aspirations of high-achieving disadvantaged pupils to pursue higher education’, with the report programmed for publication in September 2013.

It provided a full analysis of the KS2 L6 Investigation and drew on the contractual specification for the Investigation of aspiration-raising strategies to set out what we know about its likely content and coverage.

It also explored the pre-publicity surrounding Ofsted’s Survey, which was conducted exclusively through HMCI Wilshaw’s media appearances. (There was no official announcement on Ofsted’s own website, though the Survey did at least feature in its schedule of forthcoming publications.)

Part One also introduced a benchmark for the ‘The most able students’, in the shape of a review of Ofsted’s last foray into this territory – a December 2009 Survey called ‘Gifted and talented pupils in schools’.

I will try my best not to repeat too much material from Part One in this second Episode, so if you feel a little at sea without this background detail, I strongly recommend that you start with the middle section of that first post before reading this one.

I will also refer you, at least once, to various earlier posts of mine, including three I wrote on the day ‘The most able students’ was published:

  • My Twitter Feed – A reproduction of the real time Tweets I published immediately the Report was made available online, summarising its key points and recommendations and conveying my initial reactions and those of several influential commentators and respondents. (If you don’t like long posts, go there for the potted version!);

Part Two is dedicated almost exclusively to analysis of ‘The most able students’ and the reaction to its publication to date.

It runs a fine-tooth comb through the content of the Report, comparing its findings with those set out in Ofsted’s 2009 publication and offering some judgement as to whether it possesses the ‘landmark’ qualities boasted of it by HMCI in media interviews and/or whether it justifies the criticism heaped on it in some quarters.

It also matches Ofsted’s findings against the Institutional Quality Standards (IQS) for Gifted Education – the planning and improvement tool last refreshed in 2010 – to explore what that reveals about the coverage of each document.

For part of my argument is that, if schools are to address the issues exposed by Ofsted, they will need help and support to do so – not only a collaborative mechanism such as that proposed in ‘Driving Gifted Education Forward’, but also some succinct, practical guidance that builds on the experience developed during the lifetime of the late National Gifted and Talented Programme.

For – if you’d like a single succinct take-away from this analysis – I firmly believe that it is now timely for the IQS to be reviewed and updated to better reflect current policy and the new evidence base created in part by Ofsted and the other two publications I am ‘celebrating’ as part of the Summer of Love.

Oh, and if you want to find out more about my ‘big picture’ vision, may I refer you finally to the Gifted Phoenix Manifesto for Gifted Education.

But now it’s high time I began to engage you directly with what has proved to be a rather controversial text.

.

Ofsted’s Definition of ‘Most Able’

The first thing to point out is that Ofsted’s Report is focused very broadly in one sense, but rather narrowly in another.

The logic-defying definition of ‘most able students’ Ofsted adopts – for the survey that informs the Report – is tucked away in a footnote divided between the bottom of pages 6 and 7 of the Report.

This says:

For the purpose of this survey ‘most able’ is defined as the brightest students starting secondary school in Year 7 attaining Level 5 or above, or having the potential to attain Level 5 and above, in English (reading and writing) and/or mathematics at the end of Key Stage 2. Some pupils who are new to the country and are learning English as an additional language, for example, might not have attained Level 5 or beyond at the end of Key Stage 2 but have the potential to achieve it.

It is hard to reconcile this definition with the emphasis in the title of the Report on ‘the most able students’, which suggests a much narrower population at one extreme of an ability distribution (not an attainment distribution, although most of the Report is actually about high attaining students, something quite different).

In fact, Ofsted’s sample includes:

  • All pupils achieving Level 5 and above in English – 38% of all pupils taking end KS2 tests in 2012 achieved this.
  • All pupils achieving Level 5 and above in maths – 39% of all pupils achieved this in 2012.
  • We also know that 27% of pupils achieved Level 5 or above in both English and maths in 2012. This enables us to deduce that approximately 11% of pupils managed Level 5 only in English and approximately 12% only in maths.
  • So adding these three together we get 27% + 11% + 12% = 50%. In other words, we have already included exactly half of the entire pupil population and have so far counted only ‘high attaining’ pupils.
  • But we also need to include a further proportion of pupils who ‘have the potential’ to achieve Level 5 in one or other of these subjects but do not do so. This sub-population is unquantifiable, since Ofsted gives only the example of EAL pupils, rather than the full range of qualifying circumstances it has included. A range of different special needs might also cause a learner to be categorised thus. So might a particularly disadvantaged background (although that rather cuts across other messages within the Report). In practice, individual learners are typically affected by the complex interaction of a whole range of different factors, including gender, ethnic and socio-economic background, special needs, month of birth – and so on. Ofsted fails to explain which factors it has decided are within scope and which outside, or to provide any number or percentage for this group that we can tack on to the 50% already deemed high attainers.
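The overlap arithmetic in the bullets above is simple inclusion–exclusion, and can be checked in a few lines (the percentages are the 2012 Key Stage 2 figures quoted above):

```python
# Inclusion-exclusion on the 2012 Key Stage 2 results quoted above.
english = 38  # % of pupils achieving Level 5+ in English
maths = 39    # % achieving Level 5+ in maths
both = 27     # % achieving Level 5+ in both subjects

english_only = english - both              # 11% - Level 5 in English alone
maths_only = maths - both                  # 12% - Level 5 in maths alone
either = both + english_only + maths_only  # 50% - exactly half the cohort

print(english_only, maths_only, either)
```

Which is how we arrive at half the entire pupil population before the unquantifiable ‘potential’ group is even considered.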

Some might regard this lack of precision as unwarranted in a publication by our national Inspectorate, finding reason therein to ignore the important findings that Ofsted presents later in the Report. That would be unfortunate.

Not only is Ofsted’s definition very broad, it is also idiosyncratic, even in Government terms, because it is not the same as the slightly less generous version in the Secondary School Performance Tables, which is based on achievement of Level 5 in Key Stage 2 tests of English, maths and science.

So, according to this metric, Ofsted is concerned with the majority of pupils in our secondary schools – several million in fact.

But ‘The Most Able Students’ is focused exclusively on the segment of this population that attends non-selective 11-16 and 11-18 state schools.

We are told that only 160,000 students from a total of 3.235m in state-funded secondary schools attend selective institutions.

Another footnote adds that, in 2012, of 116,000 students meeting Ofsted’s ‘high attainers’ definition in state-funded schools who took GCSEs in English and maths, around 100,000 attended non-selective schools, compared with 16,000 in selective schools (so some 86% in the non-selective sector).
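The ‘some 86%’ in that footnote is easy to verify from the figures quoted:

```python
# 2012 figures quoted in the footnote above.
high_attainers = 116_000  # students meeting Ofsted's 'high attainers' definition
non_selective = 100_000   # of whom, those attending non-selective schools

share = non_selective / high_attainers * 100
print(f"{share:.1f}% of high attainers are in non-selective schools")
```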

This imbalance is used to justify the exclusion of selective schools from the evidence base, even though some further direct comparison of the two sectors might have been instructive – possibly even supportive of the claim that there is a particular problem in comprehensive schools that is not found in selective institutions. Instead, we are asked to take this claim largely on trust.

.

Exeter1 by Gifted Phoenix


.

The Data-Driven Headlines

The Report includes several snippets of data-based evidence to illustrate its argument, most of which relate to subsets of the population it has rather loosely defined, rather than that population as a whole. This creates a problematic disconnect between the definition and the data.

One can group the data into three categories: material relating to progression between Key Stages 2 and 4, material relating to achievement of AAB+ grades at A level in the so-called ‘facilitating subjects’, and material drawn from international comparison studies. The first predominates.

.

Data About Progression from KS2 to KS4

Ofsted does not explain up front the current expectation that pupils should make at least three full levels of progress between the end of Key Stage 2 and the end of Key Stage 4, or explore the fact that this assumption must disappear when National Curriculum levels go in 2016.

The conversion tables say that pupils achieving Level 5 at the end of Key Stage 2 should manage at least a Grade B at GCSE. Incidentally – and rather confusingly – that also includes pupils who are successful in the new Level 6 tests.

Hence the expectation does not apply to some of the very highest attainers who, rather than facing extra challenge, need only make two levels of progress in (what is typically) five years of schooling.

I have argued consistently that three levels of progress is insufficiently challenging for many high attainers. Ofsted makes that assumption too – even celebrates schools that push beyond it – but fails to challenge the source or substance of that advice.
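The progression rule described above can be sketched by placing KS2 levels and GCSE grades on a single points scale. The grade-to-level equivalences below are an illustrative assumption on my part (chosen so that Level 5 plus three levels lands on Grade B, as the conversion tables indicate), not an official mapping:

```python
# Illustrative sketch of the "levels of progress" expectation discussed above.
# The GCSE-grade-to-level equivalences are an assumption for illustration only.
GCSE_AS_LEVEL = {"D": 6, "C": 7, "B": 8, "A": 9, "A*": 10}
LEVEL_AS_GCSE = {level: grade for grade, level in GCSE_AS_LEVEL.items()}

def expected_grade(ks2_level: int, levels_of_progress: int = 3) -> str:
    """Grade implied by making the given number of levels of progress from KS2."""
    return LEVEL_AS_GCSE[ks2_level + levels_of_progress]

print(expected_grade(5))     # Level 5 pupils are expected to reach at least a B
print(expected_grade(4))     # Level 4 pupils, at least a C
print(expected_grade(6, 2))  # the Level 6 anomaly: the same Grade B floor is only two levels
```

On this sketch the problem in the paragraph above is plain: a Level 6 pupil held to the same Grade B floor needs only two levels of progress, not three.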

We are supplied with the following pieces of data, all relating to 2012:

  • 65% of ‘high attainers’ in non-selective secondary schools – not according to Ofsted’s definition above, but the narrower one of those achieving Key Stage 2 Level 5 in both English and maths – did not achieve GCSEs at A/A* in both those subjects. (Achieving A/A* in both would be equivalent to 4 or 5 levels of progress in the two subjects combined.) This group includes over 65,000 students (see pages 4, 6, 8, 12).
  • Within the same population, 27% of students did not achieve GCSEs at B or above in both English and maths. (Grade B corresponds to the expected 3+ levels of progress.) (This accounts for just over 27,000 students.) (see pages 4, 6 and 12).
  • On the basis of this measure, 42% of FSM-eligible students did not achieve GCSEs at B or above in both English and maths, whereas the comparable figure for non-FSM students was 25%, giving a gap between FSM and non-FSM (rather than between FSM and all students) of 17%. We are not told what the gap was at A*/A, or for the ‘survey population’ as a whole (page 14).
  • Of those who achieved Level 5 in English (only) at Key Stage 2, 62% of those attending non-selective state schools did not achieve an A* or A Grade at GCSE (so making 4 or 5 levels of progress) and 25% did not achieve a GCSE B grade or higher (so making 3+ levels of progress) (page 12)
  • Of those who achieved Level 5 in maths (only) at Key Stage 2, 53% did not achieve A*/A at GCSE (4 or 5 levels of progress) and 22% did not achieve B or higher (3+ levels of progress) (page 12)
  • We are also given the differentials between boys and girls on several of these measures, but not the percentages for each gender. In English, for A*/A and for B and above, the gap is 11% in favour of girls. In maths, the gap is 6% in favour of girls at A*/A and 5% at B and above. In English and maths combined, the gap is 10% in favour of girls for A*/A and B and above alike (page 15).
  • As for ethnic background, we learn that non-White British students outperformed White British students by 2% in maths and 1% in English and maths together, but the two groups performed equally in English at Grades B and above. The comparable data for Grades A*/A show non-White British outperforming White British by 3% in maths and again 1% in English and maths together, while the two groups again performed equally in English (page 16)
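The 17% gap in the third bullet above is a straight subtraction of the two percentages quoted (page 14 figures):

```python
# FSM gap at Grade B and above in both English and maths (2012, page 14).
fsm_missing_b = 42      # % of FSM-eligible high attainers not reaching B+ in both subjects
non_fsm_missing_b = 25  # % of non-FSM high attainers not reaching B+
gap = fsm_missing_b - non_fsm_missing_b
print(f"FSM / non-FSM gap: {gap} percentage points")
```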

What can we deduce from this? Well, not to labour the obvious, but what is the point of setting out a definition, however exaggeratedly inclusive, only to move to a different definition in the data analysis?

Why bother to spell out a definition based on achievement in English or maths, only to rely so heavily on data relating to achievement in English and maths?

There are also no comparators. We cannot see how the proportion of high attainers making expected progress compares with the proportion of middle and low attainers doing so, so there is no way of knowing whether there is a particular problem at the upper end of the spectrum. We can’t see the comparable pattern in selective schools either.

There is no information about the trend over time – whether the underperformance of high attainers is improving, static or deteriorating compared with previous years – and how that pattern differs from the trend for middle and low attainers.

The same applies to the information about the FSM gap, which is confined solely to English and maths, and solely to Grade B and above, so we can’t see how their performance compares between the two subjects and for the top A*/A grades, even though that data is supplied for boys versus girls and white versus non-white British.

The gender, ethnic and socio-economic data is presented separately so we cannot see how these different factors impact on each other. This despite HMI’s known concern about the underperformance of disadvantaged white boys in particular. It would have been helpful to see that concern linked across to this one.

Overall, the findings do not seem particularly surprising. The large gaps between the percentages of students achieving four and three levels of progress respectively are to be expected, given the orthodoxy that students need only make a minimum of three levels of progress rather than the maximum progress of which they are capable.

The FSM gap of 17% at Grade B and above is actually substantially lower than the gap at Grade C and above, which stood at 26.2% in 2011/12. Whether the A*/A gap demonstrates a further widening at the top end remains shrouded in mystery.

Although it is far too soon to have progression data, the report almost entirely ignores the impact of Level 6 on the emerging picture. And it forbears to mention the implications for any future data analysis – including trend analysis – of the decision to dispense with National Curriculum levels entirely with effect from 2016.

Clearly additional data of this kind might have overloaded the main body of the Report, but a data Annex could and should have been appended.

.

Why Ignore the Transition Matrices?

There is a host of information available about the performance of high attaining learners at Key Stage 4 and Key Stage 5 respectively, much of which I drew on for this post back in January 2013.

This applies to all state-funded schools and makes the point about high attainers’ underachievement in spades.

It reveals that, to some extent at least, there is a problem in selective schools too:

‘Not surprisingly (albeit rather oddly), 89.8% of students in selective schools are classified as ‘above Level 4’, whereas the percentage for comprehensive schools is 31.7%. Selective schools do substantially better on all the measures, especially the EBacc where the percentage of ‘above Level 4’ students achieving this benchmark is double the comprehensive school figure (70.7% against 35.0%). More worryingly, 6.6% of these high-attaining pupils in selective schools are not making the expected progress in English and 4.1% are not doing so in maths. In comprehensive school there is even more cause for concern, with 17.7% falling short of three levels of progress in English and 15.3% doing so in maths.’

It is unsurprising that selective schools tend to perform relatively better than comprehensive schools in maximising the achievement of high attainers, because they are specialists in that field.

But, by concentrating exclusively on comprehensive schools, Ofsted gives the false impression that there is no problem in selective schools when there clearly is, albeit not quite so pronounced.

More recently, I have drawn attention to the enormous contribution that can be added to this evidence base by the Key Stage 2 to 4 Transition Matrices available in the Raise Online library.

.

Transition Matrices and student numbers: English (top) and maths (bottom)

.

.

These have the merit of analysing progress to GCSE on the basis of National Curriculum sub-levels, and illustrate the very different performance of learners who achieve 5C, 5B and 5A respectively.

This means we are able to differentiate within the hugely wide Ofsted sample and begin to see how GCSE outcomes are affected by the strength of learners’ KS2 level 5 performance some five years previously.

The tables above show the percentages for English and maths respectively, for those completing GCSEs in 2012. I have also included the tables giving the pupil numbers in each category.

We can see from the percentages that:

  • Of those achieving 5A in English, 47% go on to achieve an A* in the subject, whereas for 5B the percentage is 20% and for 5C as low as 4%.
  • Similarly, of those achieving 5A in Maths, 50% manage an A*, compared with 20% for those with 5B and only 6% for those with 5C.
  • Of those achieving 5A in English, 40% achieve Grade A, so there is a fairly even split between the top two grades. Some 11% achieve a Grade B and just 1% a Grade C.
  • In maths, 34% of those with 5A at KS2 go on to secure a Grade A, so there is a relatively heavier bias in favour of A* grades. A slightly higher 13% progress to a B and 3% to a Grade C.
  • The matrices show that, when it comes to the overall group of learners achieving Level 5, in English 10% get A*, 31% get A and 36% a B. Meanwhile, in maths, 20% get an A*, 31% an A and 29% a B. This illustrates perfectly the very significant advantage enjoyed by those with a high Level 5 compared with Level 5 as a whole.
  • More worryingly, the progression made by learners who achieve upper Level 4s at Key Stage 2 tends to outstrip the progression of those with 5Cs. In English, 70% of those with 5C made 3 levels of progress and 29% made 4 levels of progress. For those with 4A, the comparable percentages were 85% and 41% respectively. For those with 4B they were 70% (so equal to the 5Cs) and 21% respectively.
  • Turning to maths, the percentages of those with Level 5C achieving three and four levels of progress were 67% and 30% respectively, while for those with 4A they were 89% and 39% respectively and for 4B, 76% (so higher) and 19% (lower) respectively.
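The grade distributions quoted above can be set out as a small lookup table, which makes the drop-off from 5A to 5C easy to see (figures transcribed from the 2012 transition matrices; only the A* rates are quoted above for 5B and 5C, so only those appear here):

```python
# Grade shares (%) from the 2012 KS2-to-GCSE transition matrices quoted above.
english_tm = {
    "5A": {"A*": 47, "A": 40, "B": 11, "C": 1},
    "5B": {"A*": 20},
    "5C": {"A*": 4},
}
maths_tm = {
    "5A": {"A*": 50, "A": 34, "B": 13, "C": 3},
    "5B": {"A*": 20},
    "5C": {"A*": 6},
}

# The steep decline in A* rates as the Level 5 sub-level weakens:
for subject, tm in (("English", english_tm), ("maths", maths_tm)):
    rates = [tm[sub].get("A*") for sub in ("5A", "5B", "5C")]
    print(subject, "A* rate for 5A/5B/5C:", rates)
```

Tabulated like this, the sub-level differentiation that Ofsted’s broad Level 5 category conceals is immediately visible.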

This suggests that, while there is undeniably an urgent and important issue at the very top, with half or fewer of 5As being translated into A* Grades, the bulk of the problem seems to be at the lower end of Level 5, where there is a conspicuous dip compared with both comparatively higher and comparatively lower attainers.

I realise that there are health warnings attached to the transition matrices, but one can immediately see how this information significantly enriches Ofsted’s relatively simplistic analysis.

.

Data About A Level Achievement and International Comparisons

The data supplied to illustrate progression to A level and international comparisons is comparatively limited.

For A Level:

  • In 2012, 334 (so 20%) of a total of 1,649 non-selective 11-18 schools had no students achieving AAB+ Grades at A Level including at least two of the facilitating subjects. A footnote tells us that this applies only to 11-18 schools entering at least five pupils at A level. There is nothing about the controversy surrounding the validity of the ‘two facilitating subjects’ proviso (pages 4, 6, 14).
  • Sutton Trust data is quoted from a 2008 publication suggesting that some 60,000 learners who were in the top quintile (20%) of performers in state schools at ages 11, 14 and 16 had not entered higher education by the age of 18; also that those known to have been eligible for FSM were 19% less likely than others to enter higher education by age 19. The most significant explanatory factor was ‘the level and nature of the qualifications’ obtained by those who had been FSM-eligible (page 15).
  • A second Sutton Trust report is referenced showing that, from 2007-2009, students from independent schools were over twice as likely to gain admission to ‘one of the 30 most highly selective universities’ as students from non-selective state schools (48.2% compared with 18%). However, this ‘could not be attributed solely to the schools’ average A level or equivalent results’ since 58% of applicants from the 30 strongest-performing comprehensive schools on this measure were admitted to these universities, compared with 87.1% from the highest-performing independent schools and 74.1% from the highest-performing grammar schools (pages 16-17).
  • The only international comparisons data is drawn from PISA 2009. The Report uses performance against the highest level in the tests of reading, maths and science respectively. It notes that, in reading, England ranked 15th on this measure, though above the OECD average; in maths England ranked 33rd, somewhat below the OECD average; and in science England was a strong performer, somewhat above the OECD average (page 17).

Apart from the first item, all this material is now at least four years old.

There is no attempt to link KS2 progression to KS5 achievement, which would have materially strengthened the argument (and which is the focus of one of the Report’s central recommendations).

Nor is there any effort to link the PISA assessment to GCSE data, by explaining the key similarities and differences between the two instruments and exploring what that tells us about particular areas of strength and weakness for high attainers in these subjects.

There is, again, a wealth of pertinent data available, much of it presented in previous posts on this blog.

Given the relatively scant use of data in the Report, and the significant question marks about the manner in which it has been applied to support the argument, it is hardly surprising that much of the criticism levelled at Ofsted can be traced back to this issue.

All the material I have presented on this blog is freely available online and was curated by someone with no statistical expertise.

While I cannot claim my analysis is error-free, it seems to me that Ofsted’s coverage of the issue is impoverished by comparison. Not only is there too little data, there is too little of the right data to exemplify the issues under discussion.

But, as I have already stated, that is not sufficient reason to condemn the entire Report out of hand.

.

Exeter2 by Gifted Phoenix


 

The Qualitative Dimension of the Report

The Evidence Base

If you read some of the social media criticism heaped upon ‘The most able students’ you would be forgiven for thinking that the evidence base consisted entirely of a few dodgy statistics.

But Ofsted also drew on:

  • Field visits to 41 non-selective secondary schools across England, undertaken in March 2013. The sample (which is reproduced as an Annex to the Report) was drawn from each of Ofsted’s eight regions and included schools of different sizes and ‘type’, in ‘different geographical contexts’. Twenty-seven were 11-18 schools, two were described as 11-19 schools, 11 were 11-16 schools and one admitted pupils at 14. Eighteen were academy converters. Inspectors spent a day in each school, discussing issues with school leaders, staff and pupils (asking similar questions to check sources against each other) and they ‘investigated analyses of the school’s [sic] current data’. We know that:

‘Nearly all of the schools visited had a broadly average intake in terms of their students’ prior attainment at the end of Key Stage 2, although this varied from year group to year group.’

Three selective schools were also visited ‘to provide comparison’ but – rather strangely – that comparative evidence was not used in the Report.

  • A sample of 2,327 lesson observation forms collected from Section 5 inspections of a second sample of 109 non-selective secondary schools undertaken in academic year 2012/13. We are not told anything about the selection of this sample, so we have no idea how representative it was.
  • A survey of 93 responses made by parents and carers to a questionnaire Ofsted placed on the website of the National Association for Able Children in Education (NACE). Ofsted also ‘sought the views of some key external organisations and individuals’ but these are not named. I have been able to identify just one organisation and one individual who were approached, which perhaps betrays a rather thin sample.

I have no great problem with the sample of schools selected for the survey. Some have suggested that 41 is too few. It falls short of the 50 mentioned in HMCI’s pre-publicity but it is enough, especially since Ofsted’s last Report in December 2009 drew on evidence from just 26 primary and secondary schools.

The second sample of lesson observations is more suspect, in that no information is supplied about how it was drawn. So it is entirely possible that it included all observations from those schools whose inspections were critical of provision for high attainers, or that all the schools were rated as underperforming overall, or against one of Ofsted’s key measures. There is a sin of omission here.

The parental survey is very small and, since it was filtered through a single organisation that focuses predominantly on teacher support, is likely to have generated a biased sample. The failure to engage a proper cross-section of organisations and individuals is regrettable: in these circumstances one should either consult many or none at all.

.

Survey Questions

Ofsted is comparatively generous with information about its Survey instrument.

There were two fundamental questions, each supported by a handful of supplementary questions:

‘Are the most able students in non-selective state secondary schools achieving as well as they should?’ (with ‘most able’ defined as set out above). This was supported by four supplementary questions:

  • Are comprehensive schools challenging bright students in the way that the independent sector and selective system do?
  • Do schools track progression effectively enough? Do they know how their most able students are doing? What enrichment programme is offered to the most able students and what is its impact?
  • What is the effect of mixed ability classes on the most able students?
  • What is the impact of early entry at GCSE on the most able students?

‘Why is there such disparity in admissions to the most prestigious universities between a small number of independent and selective schools and the great majority of state-maintained non-selective schools and academies?’ This was supported by three supplementary questions:

  • What is the quality of careers advice and its impact on A level students, particularly in terms of their successful application to top universities? Are students receiving good advice and support on how to complete their UCAS forms/personal statements?
  • Are the most able students from disadvantaged backgrounds as likely as the most able students from more affluent families to progress to top universities, and if not why?
  • What are successful state schools doing to increase application success rates and what lessons can be learnt?

Evidence from the 41 non-selective schools was collected under six broad themes:

  • ‘the leadership of the school
  • the achievement of the most able students throughout the school
  • the transfer and transition of these students from their primary schools and their induction into secondary school
  • the quality of teaching, learning and assessment of the most able students
  • the curriculum and extension activities offered to the most able student
  • the support and guidance provided for the most able students, particularly when they were choosing subjects and preparing for university.’

But the survey also ‘focused on five key elements’ (page 32), which are virtually identical to the last five themes above.

.

Analysis of Key Findings

 

Top Level Conclusions

Before engaging in detail with the qualitative analysis from these sources, it is worth pausing to highlight two significant quantitative findings which are far more telling than those generated by the data analysis foregrounded in the Report.

Had I the good fortune to have reviewed the Report’s key findings prior to publication, I would have urged far greater prominence for:

  • ‘The 2,327 lesson observation evidence forms… showed that the most able students in only a fifth of these lessons were supported well or better.’
  • ‘In around 40% of the schools visited in the survey, the most able students were not making the progress of which they were capable. In a few of the schools visited, teachers did not even know who the most able students were.’

So, in a nutshell, one source of evidence suggests that, in 80% of lessons, support for the most able students is either inadequate or requires improvement.

Another source suggests that, in 40% of schools, the most able students are underachieving in terms of progress while, in a few schools, their identity is unknown.

And these findings apply not to a narrow group of the very highest attaining learners but, on the basis of Ofsted’s own definition, to over 50% of pupils!

Subject to the methodological concerns above, the samples appear sufficiently robust to be extrapolated to all English secondary schools – or the non-selective majority at least.

We do not need to apportion blame, or make schools feel that this is entirely their fault. But this is scandalous – indeed so problematic that it surely requires a concerted national effort to tackle it.

We will consider below whether the recommendations set out in the Report match that description, but first we need to engage with some of the qualitative detail.

The analysis below looks in turn at each of the six themes, in the order that they appear in the main body of the Report.

.

Theme 1 – Achievement of the Most Able Students

Key finding: ‘The most able students in non-selective secondary schools are not achieving as well as they should. In many schools, expectations of what the most able students should achieve are too low.’

Additional points:

  • [Too] many of the students in the problematic 40% of surveyed schools ‘failed to attain the highest levels at GCSE and A level’.
  • Academic progress in KS3 required improvement in 17 of the 41 schools. Data was neither accurate nor robust in seven of the 41. Progress differed widely by subject.
  • At KS4, the most able were making less progress than other students in 19 of the 41 schools.
  • At KS5, the most able were making ‘less than expected progress’ in one or more subjects at 17 of the 41 schools.

.

Theme 2 – Leadership and Management

Key Finding: ‘Leaders in our secondary schools have not done enough to create a culture of scholastic excellence, where the highest achievement in academic work is recognised as vitally important. Schools do not routinely give the same attention to the most able as they do to low-attaining students or those who struggle at school.’

Additional points:

  • Nearly all school leaders claimed to be ambitious for their most able students, but this was not realised in practice in over 40% of the sample.
  • In less effective schools initiatives were usually new or rudimentary and had not been evaluated.
  • Students were taught mainly in mixed ability groups in about a third of the schools visited. Setting was typically restricted to core subjects and often introduced for English and science relatively late in KS3.
  • This had no detrimental effect in ‘the very best schools’ but, in the less effective, work was typically pitched to average attainers.
  • Seven schools had revised their policy on early GCSE entry because of a negative impact on the number of the most able achieving top grades.
  • Leaders in the best schools showed high aspirations for their most able students, providing high-quality teaching and work matched to their needs. Results were well above average and high proportions achieved A*/A grades at GCSE and A level.
  • The best leaders ensure their high aspirations are understood throughout the school community, set high expectations embodied in stretching targets, recruit strong staff and deploy them as specialists and create ‘a dynamic, innovative learning environment’.

.

Theme 3 – Transfer and Transition

Key Finding: ‘Transition arrangements from primary to secondary school are not effective enough to ensure that students maintain their academic momentum into Year 7. Information is not used carefully so that teachers can plan to meet the most able students’ needs in all lessons from the beginning of their secondary school career.’

Additional points:

  • The quality of transition is much too variable. Arrangements were weak in over one quarter of schools visited. Work was repeated in KS3 or was insufficiently challenging. Opportunities were missed to extend and consolidate previous learning.
  • Simple approaches were most effective; these were easier to implement in schools with few primary feeders or long-established cluster arrangements.
  • In the best examples secondary schools supported the most able before transfer, through specialist teaching and enrichment/extension activities.
  • In many schools activities were typically generic rather than targeted at the most able, and many leaders did not know how effective they were for this group.
  • In over a quarter of schools the most able ‘did not get off to a good start’ in Year 7 because expectations were too low, work was insufficiently demanding and pupils were under-challenged.
  • Overall, inspectors found serious weaknesses in transition practice.
  • Effective practice includes: pre-transfer liaison with primary teachers and careful discussion about the most able; gathering a wide range of data to inform setting or class groups; identifying the most able early and implementing support for them to maintain their momentum; and fully evaluating pre-transfer activities and adapting them in the light of that.

.

Exeter3 by Gifted Phoenix


.

Theme 4 – The Quality of Teaching, Learning and Assessment

Key Findings:

‘Teaching is insufficiently focused on the most able at KS3. In over two-fifths of the schools visited for the survey, students did not make the progress that they should, or that they were capable of, between the ages of 11 and 14. Students said that too much work was repetitive and undemanding in KS3. As a result, their progress faltered and their interest in school waned.

Many students became used to performing at a lower level than they are capable of. Parents or carers and teachers accepted this too readily. Students did not do the hard work and develop the resilience needed to perform at a higher level because more challenging tasks were not regularly demanded of them. The work was pitched at the middle and did not extend the most able. School leaders did not evaluate how well mixed-ability group teaching was challenging the most able students.’

Additional points:

  • The reasons for slow progress varied between schools and subjects but included: failure to recognise and challenge the most able; variability in approaches across subjects and year groups; inconsistent application of school policy; and lack of focus by senior and middle leaders.
  • Weaker provision demonstrated: insufficient tracking of the most able, inadequate rapid intervention strategies, insufficiently differentiated homework, failure to apply Pupil Premium funding and little evaluation of the impact of teaching and support.
  • In a few schools the organisation of classes inhibited progress, as evidenced by limited knowledge of the effectiveness of differentiation in mixed ability settings and lack of challenge, particularly in KS3.
  • Eight schools had moved recently to grouping by ability, particularly in core subjects. Others indicated they were moving towards setting, streaming or banding most subjects. Schools’ data showed this beginning to have a positive impact on outcomes.

.

Theme 5 – Curriculum and Extension Activities

Key Findings:

‘The curriculum and the quality of homework required improvement. The curriculum in KS3 and early entry to GCSE examination are among the key weaknesses found by inspectors. Homework and the programme of extension activities for the most able students, where they existed, were not checked routinely for their impact or quality. Students said that too much homework was insufficiently challenging; it failed to interest them, extend their thinking or develop their skills.

Inequalities between different groups of the most able students are not being tackled satisfactorily. The attainment of the most able students who are eligible for FSM, especially the most able boys, lags behind that of other groups. Few of the schools visited used the Pupil Premium funding to support the most able students from the poorest backgrounds.

Assessment, tracking and targeting are not used sufficiently well in many schools. Some of the schools visited paid scant attention to the progress of their most able students.’

Additional points:

  • In over a quarter of schools visited, aspects of the curriculum, including homework, required improvement. In two schools the curriculum failed to meet the needs of the most able.
  • In one in seven schools, leaders had made significant changes recently, including more focus on academic subjects and more setting.
  • But schools did not always listen to feedback from their most able students. Many did not ask students how well the school was meeting their needs or how to improve further.
  • In weaker schools students were rarely given extension work. Sixth form students reported insufficient opportunities to think reflectively and too few suggestions for wider, independent reading.
  • Many in less effective schools felt homework could be more challenging. Few were set wider research or extension tasks.
  • While some leaders said extra challenge was incorporated in homework, many students disagreed. Few school leaders were aware of the homework provided to these students. Many schools had limited strategies for auditing and evaluating its quality.
  • Most school leaders said a wide range of extension tasks, extra-curricular and enrichment activities was provided for the most able, but these were usually for all students. Targeted activities, when undertaken, were rarely evaluated.
  • Research suggests it is important to provide access to such activities for the most able students where parents are not doing so. Schools used Pupil Premium for this in only a few instances.
  • The Premium was ‘generally spent on providing support for all underachieving and low-attaining students rather than on the most able students from disadvantaged backgrounds’.
  • Strong, effective practice was exemplified by a curriculum well-matched to the needs of most able students, a good range and quality of extra-curricular activity, effective use of the Pupil Premium to enrich students’ curriculum and educational experience and motivating and engaging homework, tailored to students’ needs, designed to develop creativity and independence.
  • In over a third of schools visited, tracking of the most able was ‘not secure, routine or robust’. Intervention was often too slow.
  • In weaker schools, leaders focused mainly on the C/D borderline; stronger schools focused on A*/A grades too, believing their pupils could do better than ‘the B grade that is implied by the expected progress measure’.
  • Some schools used assessment systems inconsistently, especially in some KS3 foundation subjects where there was insufficient or inaccurate data. In one in five schools, targets for the most able ‘lacked precision and challenge’.
  • In a fifth of schools, senior leaders had introduced improved monitoring systems to hold staff to account, but implementation was often at a very early stage. Only in the best schools were such systems well established.
  • The most effective included lesson observation, work scrutiny, data analysis and reviews of teacher planning. In the better schools students knew exactly what they needed to do to attain the next level/grade and received regular feedback on progress.
  • The most successful schools had in place a wide range of strategies, including: ensuring staff had detailed knowledge of the most able, their strengths and interests; providing, through comprehensive assessment, challenging programmes and high-quality support that met students’ needs; and tracking rigorously by year, department and key stage, combined with swift intervention where needed.
  • Many leaders had not introduced professional development focused on the most able students. Their needs had not been tackled by staff in over one fifth of schools visited, so teachers had not developed the required skills to meet their needs, or up-to-date knowledge of the Year 6 curriculum and assessment arrangements. Stronger schools were learning with and from their peers and had formed links with a range of external agencies.

.

Theme 6 – Support and Guidance for University Entry

Key Findings:

‘Too few of the schools worked with families to support them in overcoming the cultural and financial obstacles that stood in the way of the most able students attending university, particularly universities away from the immediate local area. Schools did not provide much information about the various benefits of attending different universities or help the most able students to understand more about the financial support available.

Most of the 11-16 schools visited were insufficiently focused on university entrance. These schools did not provide students with sufficiently detailed advice and guidance on all the post-16 options available.

Schools’ expertise in and knowledge about how to apply to the most prestigious universities was not always current and relevant. Insufficient support and guidance were provided to those most able students whose family members had not attended university.’

Additional points:

  • Support and guidance varied in quality, accuracy and depth. Around half of schools visited ‘accepted any university as an option’. Almost a quarter had much to do to convince students and their families of the benefits of higher education, and began doing so too late.
  • Data provided by 26 of the 29 11-18 schools showed just 16 students went to Oxbridge in 2011, one eligible for FSM, but almost half came from just two of the schools. Nineteen had no students accepted at Oxbridge. The 2012 figures showed some improvement with 26 admitted to Oxbridge from 28 schools, three of them FSM-eligible.
  • In 2011, 293 students went to Russell Group universities, but only six were FSM eligible. By 2012 this had increased to 352, including 30 eligible for FSM, but over a quarter of the 352 came from just two schools.
  • Factors inhibiting application to prestigious universities included pressure to stay in the locality, cost (including fees), aversion to debt and low expectations. Almost half of the schools visited tackled this through partnership with local universities.
  • Schools did not always provide early or effective careers advice or information about the costs and benefits of attending university.
  • Some schools showed a lack of up-to-date intelligence about universities and their entrance requirements, but one third of those visited provided high quality support and guidance.
  • Some schools regarded going to any university as the indicator of success, disagreeing that it was appropriate to push students towards prestigious universities, rather than the ‘right’ institution for the student.
  • Most of the 11-16 schools visited were insufficiently focused on university entrance. They did not provide sufficiently detailed advice on post-16 options and did not track students’ destinations effectively, either post-16 or post-18.
  • The best schools: provided early on a planned programme to raise students’ awareness of university education; began engaging with students and parents about this as soon as they entered the school; provided support and guidance about subject choices, entry requirements and course content; supported UCAS applications; enabled students to visit a range of universities; and used alumni as role models.

.

Exeter4 by Gifted Phoenix


 

Ofsted’s Recommendations

There are two sets of recommendations in the Report, each with an associated commentary about the key constituents of good and bad practice. The first is in HMCI’s Foreword; the second in the main body of the Report.

.

HMCI’s Version

This leads with material from the data analysis, rather than some of the more convincing data from the survey, or at least a judicious blend of both sources.

He rightly describes the outcomes as unacceptable and inconsistent with the principle of comprehensive education, though his justification for omitting selective schools from the analysis is rather less convincing, especially since he is focused in part on narrowing the gap between the two as far as admission to prestigious universities is concerned.

Having pointed up deficiencies at whole school level and in lessons he argues that:

‘The term ‘special needs’ should be as relevant to the most able as it is to those who require support for their learning difficulties’

This is rather out of left field and is not repeated in the main body or the official recommendations. There are pros and cons to such a route – and it would anyway be entirely inappropriate for a group comprising over 50% of the secondary population.

HMCI poses ‘three key challenges’:

‘First, we need to make sure that our most able students do as well academically as those of our main economic competitors. This means aiming for A* and A grades and not being satisfied with less. Not enough has changed since 2009, when the PISA tests found that England’s teenagers were just over half as likely as those from other developed nations to reach the highest levels in mathematics in international tests.

The second challenge is to ensure, from early on, that students know what opportunities are open to them and develop the confidence to make the most of these. They need tutoring, guidance and encouragement, as well as a chance to meet other young people who have embraced higher education. In this respect, independent schools as well as universities have an important role to play in supporting state schools.

The third challenge is to ensure that all schools help students and families overcome cultural barriers to attending higher education. Many of our most able students come from homes where no parent or close relative has either experienced, or expects, progression to university. Schools, therefore, need to engage more effectively with the parents or carers of these students to tackle this challenge.’

This despite the fact that comparison with international competitors is almost entirely lacking from the Report, save for one brief section on PISA data.

The role of independent schools is also underplayed, while the role of universities is seen very much from the schools’ perspective – there is no effort to link together the ‘fair access’ and ‘most able’ agendas in any meaningful fashion.

Parental engagement is also arguably under-emphasised or, at least, confined almost exclusively to the issue of progression.

.

Ofsted’s Version

The ‘official’ text provides a standard overarching bullet point profile of poor and strong provision respectively.

  • Poor provision is characterised by: ‘fragile’ primary/secondary transfer; placement in groups where teaching is not challenging; irregular progress checks; a focus on D/C borderline students at the expense of the more able; and failure to prepare students well for A levels.
  • Strong provision features: leadership determined to improve standards for all students; high expectations of the most able amongst students, families and teachers; effective transition to sustain the momentum of the most able; early identification to inform tailoring of teaching and the curriculum; curricular flexibility to permit challenge and extension; grouping to support stretch from the start of secondary school; expert teaching, formative assessment and purposeful homework; effective training and capacity for teachers to learn from each other; close monitoring of progress to inform rapid intervention where necessary; and effective support for application to prestigious universities.

A series of 13 recommendations is provided, alongside three Ofsted commitments. Ten of the 13 are aimed at schools and three at central Government.

I have set out the recommendations in the table below, alongside those from the previous Report, published in 2009.

 

| 2009 Report | 2013 Report |
| --- | --- |
| Central Government | Central Government |
| Ensure planned catalogue of learning and professional development opportunities meets the needs of parents, schools and LAs | DfE to ensure parents receive annual report recording whether students are on track to achieve as well as they should in national tests and exams |
| Ensure LAs hold schools more rigorously to account for the impact of their G&T provision | DfE to develop progress measures from KS2 to KS4 and KS5 |
| | DfE to promote new destination data showing progression to (Russell Group) universities |
| | Ofsted will focus inspections more closely on teaching and progress of most able, their curriculum and the information, advice and guidance provided to them |
| | Ofsted will consider in more detail during inspection how well Pupil Premium is used to support disadvantaged most able |
| | Ofsted will report inspection findings about this group more clearly in school, sixth form and college reports |
| Local Authorities | Local Authorities |
| Hold schools more rigorously to account for the impact of their G&T provision | |
| Encourage best practice by sharing with schools what works well and how to access appropriate resources and training | |
| Help schools produce clearer indicators of achievement and progress at different ages | |
| Schools | Schools |
| Match teaching to pupils’ individual needs | Develop culture and ethos so needs of most able are championed by school leaders |
| Listen to pupil feedback and act on it | Help most able to leave school with best qualifications by developing skills, confidence and attitudes needed to succeed at the best universities |
| Inform parents and engage them more constructively | Improve primary-secondary transfer so all Year 7 teachers know which students achieved highly and what aspects of the curriculum they studied in Year 6, and use this to inform KS3 teaching |
| Use funding to improve provision through collaboration | Ensure work remains challenging throughout KS3 so most able make rapid progress |
| Ensure lead staff have strategic clout | Ensure leaders evaluate mixed ability teaching so most able are sufficiently challenged and make good progress |
| Ensure rigorous audit and evaluation processes | Evaluate homework to ensure it is sufficiently challenging |
| | Give parents better and more frequent information about what their children should achieve and raise expectations where necessary |
| | Work more closely with families, especially first generation HE applicants and FSM-eligible, to overcome cultural and financial obstacles to HE application |
| | Develop more knowledge and expertise to support applications to the most prestigious universities |
| | Publish more widely the university destinations of their students |

TABLE 1: COMPARING OFSTED RECOMMENDATIONS IN 2009 AND 2013

The comparison serves to illustrate the degree of crossover between the two Reports – and the extent to which the issues raised in the former remain pertinent four years on.

Several of the 2009 recommendations in the left-hand column are still outstanding and are not addressed in the latest Report. There is nothing about providing support for schools from the centre; and nothing whatsoever about the role of the ‘middle tier’, however that is composed. Ofsted’s new Report might have been enriched by some cross-reference to its predecessor.

The three recommendations directed at the centre are relatively limited in scope – fundamentally restricted to elements of the status quo and probably demanding negligible extra work or resource:

  • The reference to an annual report to parents could arguably be satisfied by the existing requirements, which are encapsulated in secondary legislation.
  • It is not clear whether promoting the new destination measures requires anything more than their continuing publication – the 2013 version is scheduled for release this very week.
  • The reference to development of progress measures may be slightly more significant but probably reflects work already in progress. The consultation document on Secondary School Accountability proposed a progress measure based on a new ‘APS8’ indicator, calculated through a Value Added method and using end of KS2 results in English and maths as a baseline:

‘It will take the progress each pupil makes between Key Stage 2 and Key Stage 4 and compare that with the progress that we expect to be made by pupils nationally who had the same level of attainment at Key Stage 2 (calculated by combining results at end of Key Stage 2 in English and mathematics).’

However this applies only to KS4, not KS5, and we are still waiting to discover how the KS2 baseline will be graded from 2016 when National Curriculum levels disappear.
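In outline, a Value Added measure of this kind is simple arithmetic: each pupil’s actual KS4 points are compared with the national average achieved by pupils who started from the same KS2 baseline. The sketch below is purely illustrative – the point scores, baseline bands and function name are my own assumptions, not the DfE’s actual methodology:

```python
from collections import defaultdict

def value_added(pupils):
    """Each pupil is a (ks2_baseline, ks4_points) pair.

    A pupil's VA score is their KS4 points minus the average
    KS4 points of all pupils sharing their KS2 baseline.
    (Illustrative only - not the DfE's published method.)
    """
    # Group KS4 outcomes by KS2 starting point.
    by_baseline = defaultdict(list)
    for ks2, ks4 in pupils:
        by_baseline[ks2].append(ks4)
    # 'Expected' progress = mean outcome for that starting point.
    expected = {ks2: sum(v) / len(v) for ks2, v in by_baseline.items()}
    return [ks4 - expected[ks2] for ks2, ks4 in pupils]

# Pupils from KS2 baseline 4 average 50 points; baseline 5 averages 60,
# so each pupil's VA score is measured against their own cohort.
scores = value_added([(4, 48), (4, 52), (5, 58), (5, 62)])
```

The essential point for the most able is visible even in this toy version: a high attainer can post strong raw results yet still record negative Value Added if peers with the same KS2 starting point progressed further.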

This throws attention back on the Secretary of State’s June 2012 announcement, so far unfulfilled by any public consultation:

‘In terms of statutory assessment, however, I believe that it is critical that we both recognise the achievements of all pupils, and provide for a focus on progress. Some form of grading of pupil attainment in mathematics, science and English will therefore be required, so that we can recognise and reward the highest achievers as well as identifying those that are falling below national expectations. We will consider further the details of how this will work.’

.

The Balance Between Challenge and Support

It is hard to escape the conclusion that Ofsted believe inter-school collaboration, the third sector and the market can together provide all the support that schools could need (while the centre’s role is confined to providing commensurate challenge through a somewhat stiffened accountability regime).

After four years of school-driven gifted education, I am not entirely sure I share their confidence that schools and the third sector can rise collectively to that challenge.

They seem hamstrung at present by insufficient central investment in capacity-building and by key players’ unwillingness to collaborate on updating existing guidance and providing support. The infrastructure is limited and fragmented, and leadership is lacking.

As I see it, there are two immediate priorities:

  • To provide and maintain the catalogue of learning opportunities and professional support mentioned in Ofsted’s 2009 report; and
  • To update and disseminate national guidance on what constitutes effective whole school gifted and talented education.

The latter should in my view be built around an updated version of the Quality Standards for gifted education, last refreshed in 2010. It should be adopted once more as the single authoritative statement of effective practice which more sophisticated tools – some, such as the Challenge Award, with fairly hefty price tags attached – can adapt and apply as necessary.

The Table appended to this post maps the main findings in both the 2009 and 2013 Ofsted Reports against the Standards. I have also inserted a cross in those sections of the Standards which are addressed by the main text of the more recent Report.

One can see from this how relevant the Standards remain to discussion of what constitutes effective whole school practice.

But one can also identify one or two significant gaps in Ofsted’s coverage, including:

  • identification – and the issues it raises about the relationship between ability and attainment
  • the critical importance of a coherent, thorough, living policy document incorporating an annually updated action plan for improvement
  • the relevance of new technology (such as social media)
  • the significance of support for affective issues, including bullying, and
  • the allocation of sufficient resources – human and financial – to undertake the work.

.

Exeter5 by Gifted Phoenix


 

Reaction to the Report

I will not trouble to reproduce some of the more vituperative comment from certain sources, since I strongly suspect much of it to be inspired by personal hostility to HMCI and to gifted education alike.

  • To date there has been no formal written response from the Government, although David Laws recorded one or two interviews, such as this one, which simply reflect existing reforms to accountability and qualifications. At the time of writing, the DfE page on Academically More Able Pupils has not been updated to reflect the Report.
  • The Opposition criticised the Government for having ‘no plan for gifted and talented children’ but did not offer any specific plan of their own.
  • The Sutton Trust called the Report ‘A wake-up call to Ministers’ adding:

‘Schools must improve their provision, as Ofsted recommends. But the Government should play its part too by providing funding to trial the most effective ways to enable our brightest young people to fulfil their potential. Enabling able students to fulfil their potential goes right to the heart of social mobility, basic fairness and economic efficiency.’

Contrary to my expectations, there was no announcement arising from the call for proposals the Trust itself issued back in July 2012 (see Word attachment at bottom). A subsequent blog post called for:

‘A voluntary scheme which gives head teachers an incentive – perhaps through a top-up to their pupil premium or some other matched-funding provided centrally – to engage with evidence based programmes which have been shown to have an impact on the achievement of the most able students.’

‘We warned the Government in 2010 when it scrapped the gifted and talented programme that this would be the result. Many schools are doing a fantastic job in supporting these children. However we know from experience that busy schools will often only have time to focus on the latest priorities. The needs of the most able children have fallen to the bottom of the political and social agenda and it’s time to put it right to the top again.’

‘It is imperative that Ofsted, schools and organisations such as NACE work in partnership to examine in detail the issues surrounding this report. We need to disseminate more effectively what works. There are schools that are outstanding in how they provide for the brightest students. However there has not been enough rigorous research into this.’

  • Within the wider blogosphere, Geoff Barton was first out of the traps, criticising Ofsted for lack of rigour, interference in matters properly left to schools, ‘fatuous comparisons’ and ‘easy soundbites’.
  • The same day Tom Bennett was much more supportive of the Report and dispensed some commonsense advice based firmly on his experience as a G&T co-ordinator.
  • Then Learning Spy misunderstood Tom’s suggestions about identification, asking ‘how does corralling the boffins and treating them differently’ serve the aim of high expectations for all? He far preferred Headguruteacher’s advocacy for a ‘teach to the top’ curriculum, which is eminently sensible.
  • Accordingly, Headguruteacher contributed The Anatomy of High Expectations which drew out the value of the Report for self-evaluation purposes (so not too different to my call for a revised IQS).
  • Finally Chris Husbands offered a contribution on the IoE Blog which also linked Ofsted’s Report to the abolition of National Curriculum levels, reminding us of some of the original design features built in by TGAT but never realised in practice.

Apologies to any I have missed!

As for yours truly, I included the reactions of all the main teachers’ associations in the collection of Tweets I posted on the day of publication.

I published Driving Gifted Education Forward, a single page proposal for the kind of collaborative mechanism that could bring about system-wide improvement, built on school-to-school collaboration. It proposes a network of Learning Schools, complementing Teaching Schools, established as centres of excellence with a determinedly outward-looking focus.

And I produced a short piece about transition matrices which I have partly integrated into this post.

Having all but completed this extended analysis, have I changed the initial views I Tweeted on the day of publication?

.

.

Well, not really. My overall impression is of a curate’s egg, whose better parts have been largely overlooked because of the opprobrium heaped on the bad bits.

.


Bishop: ‘I’m afraid you’ve got a bad egg Mr Jones’, Curate: ‘Oh, no, my Lord, I assure you that parts of it are excellent!’

.

The Report might have had a better reception had the data analysis been stronger, had the most significant messages been given comparatively greater prominence and had the tone been somewhat more emollient towards the professionals it addresses, with some sort of undertaking to underwrite support – as well as challenge – from the centre.

The commitments to toughen up the inspection regime are welcome but we need more explicit details of exactly how this will be managed, including any amendments to the framework for inspection and supporting guidance. Such adjustments must be prominent and permanent rather than tacked on as an afterthought.

We – all of us with an interest – need to fillet the key messages from the text and integrate them into a succinct piece of guidance as I have suggested, but carefully so that it applies to every setting and has built-in progression for even the best-performing schools. That’s what the Quality Standards did – and why they are still needed. Perhaps Ofsted should lead the revision exercise and incorporate them wholesale into the inspection framework.

As we draw a veil over the second of these three ‘Summer of Love’ publications, what are the immediate prospects for a brighter future for English gifted education?

Well, hardly incandescent sunshine, but rather more promising than before. Ofsted’s Report isn’t quite the ‘landmark’ HMCI Wilshaw promised and it won’t be the game changer some of us had hoped for, but it’s better than a poke in the eye with the proverbial blunt stick.

Yet the sticking point remains the capacity of schools, organisations and individuals to set aside their differences, secure the necessary backing and work together to bring about the improvements called for in the Report.

Without such commitment too many schools will fail to change their ways.

.

GP

June 2013

.

.

ANNEX: MAPPING KEY FINDINGS FROM THE 2009 AND 2013 REPORTS AGAINST THE IQS

IQS Element | IQS Sub-element | Ofsted 2009 | Ofsted 2013
Standards and progress | Attainment levels high and progress strong | Schools need more support and advice about standards and expectations | Most able aren’t achieving as well as they should. Expectations are too low. 65% who achieved KS2 L5 in English and maths failed to attain GCSE A*/A grades. Teaching is insufficiently focused on the most able at KS3. Inequalities between different groups aren’t being tackled satisfactorily
 | SMART targets set for other outcomes | – | x
Effective classroom provision | Effective pedagogical strategies | Pupils experienced an inconsistent level of challenge | x
 | Differentiated lessons | – | x
 | Effective application of new technologies | – | –
Identification | Effective identification strategies | – | x
 | Register is maintained | – | –
 | Population is broadly representative of intake | – | –
Assessment | Data informs planning and progression | Assessment, tracking and targeting not used sufficiently well in many schools | –
 | Effective target-setting and feedback | – | x
 | Strong peer and self-assessment | – | –
Transfer and transition | Effective information transfer between classes, years and institutions | – | Transition doesn’t ensure students maintain academic momentum into Year 7
Enabling curriculum entitlement and choice | Curriculum matched to learners’ needs | Pupils’ views not reflected in curriculum planning | The KS3 curriculum is a key weakness, as is early GCSE entry
 | Choice and accessibility to flexible pathways | – | –
Leadership | Effective support by SLT, governors and staff | Insufficient commitment in poorer performing schools | School leaders haven’t done enough to create a culture of scholastic excellence. Schools don’t routinely give the same attention to most able as low-attaining or struggling students.
Monitoring and evaluation | Performance regularly reviewed against challenging targets | Little evaluation of progression by different groups | x
 | Evaluation of provision for learners to inform development | – | x
Policy | Policy is integral to school planning, reflects best practice and is reviewed regularly | Many policies generic versions from other schools or the LA; too much inconsistency and incoherence between subjects | –
School ethos and pastoral care | Setting high expectations and celebrating achievement | – | Many students become used to performing at a lower level than they are capable of. Parents and teachers accept this too readily.
 | Support for underachievers and socio-emotional needs | – | –
 | Support for bullying and academic pressure/opportunities to benefit the wider community | – | –
Staff development | Effective induction and professional development | – | x
 | Professional development for managers and whole staff | – | x
Resources | Appropriate budget and resources applied effectively | – | –
Engaging with the community, families and beyond | Parents informed, involved and engaged | Less than full parental engagement | Too few schools supporting families in overcoming cultural and financial obstacles to attending university
 | Effective networking and collaboration with other schools and organisations | Schools need more support to source best resources and training; limited collaboration in some schools; little local scrutiny/accountability | Most 11-16 schools insufficiently focused on university entrance. Schools’ expertise and knowledge of prestigious universities not always current and relevant
Learning beyond the classroom | Participation in a coherent programme of out-of-hours learning | Link with school provision not always clear; limited evaluation of impact | Homework and extension activities were not checked routinely for impact and quality

My Twitter Feed Summarising Key Points from Ofsted’s Report ‘The Most Able Students’

.

Here is the record of my Tweets from this morning summarising the main points from Ofsted’s newly-published Survey Report: ‘The Most Able Students’.

.

[The embedded tweets are not reproduced here; they covered, in turn:]

OFSTED’S KEY FINDINGS

OFSTED’S RECOMMENDATIONS

OFSTED’S COMMITMENTS

OVERALL ASSESSMENT

GOVERNMENT RESPONSE

OPPOSITION RESPONSE

POTENTIAL PLUS PRESS RELEASE

SUTTON TRUST PRESS RELEASE

WHAT THE UNIONS THINK