Unpacking the Primary Assessment and Accountability Reforms

This post examines the Government response to consultation on primary assessment and accountability.

It sets out exactly what is planned, what further steps will be necessary to make these plans viable, and the implementation timetable.

It is part of a sequence of posts I have devoted to this topic.

Earlier posts in the series include The Removal of National Curriculum Levels and the Implications for Able Pupils’ Progression (June 2012) and Whither National Curriculum Assessment Without Levels? (February 2013).

The consultation response contrives to be both minimal and dense. It is necessary to unpick each element carefully, to consider its implications for the package as a whole and to reflect on how that package fits in the context of wider education reform.

I have organised the post so that it considers sequentially:

  • The case for change, including the aims and core principles, to establish the policy frame for the planned reforms.
  • The impact on the assessment experience of children aged 2-11 and how that is likely to change.
  • The introduction of baseline assessment in Year R.
  • The future shape of end of KS1 and end of KS2 assessment respectively.
  • How the new assessment outcomes will be derived, reported and published.
  • The impact on floor standards.

Towards the end of the post I have also provided a composite ‘to do’ list containing all the declared further steps necessary to make the plan viable, with a suggested deadline for each.

And the post concludes with an overall judgement on the plans, in the form of a summary of key issues and unanswered questions arising from the earlier commentary. Impatient readers may wish to jump straight to that section.

I am indebted to Warwick Mansell for his previous post on this topic. I shall try hard not to parrot the important points he has already made, though there is inevitably some overlap.

Readers should also look to Michael Tidd for more information about the shape and content of the new tests.

What has been published?

The original consultation document ‘Primary assessment and accountability under the new national curriculum’ was published on 17 July 2013 with a deadline for response of 17 October 2013. At that stage the Government’s response was due ‘in autumn 2013’.

The response was finally published on 27 March, some four months later than planned and only five months prior to the introduction of the revised national curriculum which these arrangements are designed to support.

It is likely that the Government will have decided that 31 March was the latest feasible date to issue the response, so they were right up against the wire.

It was accompanied by:

  • A press release which focused on the full range of assessment reforms – for primary, secondary and post-16.

Shortly before the response was published, the reply to a Parliamentary question asked on 17 March explained that test frameworks were expected to be included within it:

‘Guidance on the nature of the revised key stage 1 and key stage 2 tests, including mathematics, will be published by the Standards and Testing Agency in the form of test framework documents. The frameworks are due to be released as part of the Government’s response to the primary assessment and accountability consultation. In addition, some example test questions will be made available to schools this summer and a full sample test will be made available in the summer of 2015.’ (Col 383W)


In the event, these documents – seven in all – did not appear until 31 March and there was no reference to any of the three commitments above in what appeared on 27 March.

Finally, the Standards and Testing Agency published on 3 April a guidance page on national curriculum tests from 2016. At present it contains very little information but further material will be added as and when it is published.

Partly because the initial consultation document was extremely ‘drafty’, the reaction of many key external respondents to the consultation was largely negative. One imagines that much of the period since 17 October has been devoted to finding the common ground.

Policy makers will have had to do most of their work after the consultation document was issued, because they were not ready beforehand.

But the length of the delay in issuing the response would suggest that they also encountered significant dissent amongst internal stakeholders – and that the eventual outcome is likely to be a compromise of sorts between these competing interests.

Such compromises tend to have observable weaknesses and/or put off problematic issues for another day.

A brief summary of consultation responses is included within the Government’s response. I will refer to this at relevant points during the discussion below.


The Case for Change


Aims

The consultation response begins – as did the original consultation document – with a section setting out the case for reform.

It provides a framework of aims and principles intended to underpin the changes that are being set in place.

The aims are:

  • The most important outcome of primary education is to ‘give as many pupils as possible the knowledge and skills to flourish in the later phases of education’. This is a broader restatement of the ‘secondary ready’ concept adopted in the original consultation document.
  • The primary national curriculum and accountability reforms ‘set high expectations so that all children can reach their potential and are well prepared for secondary school’. Here the ‘secondary ready’ hurdle is more baldly stated. The parallel notion is that all children should do as well as they can – and that they may well achieve different levels of performance. (‘Reach their potential’ is disliked by some because it is considered to imply a fixed ceiling for each child and fixed mindset thinking.)
  • To raise current threshold expectations. These are set too low, since too few learners (47%) with KS2 level 4C in both English and maths go on to achieve five or more GCSE grades A*-C including English and maths, while 72% of those with KS2 level 4B do so. So the new KS2 bar will be set at this higher level, but with the expectation that 85% of learners per school will jump it, 13 percentage points more than the current national figure. Meanwhile the KS4 outcome will also change, to achievement across eight GCSEs rather than five, quite probably at a more demanding level than the present C grade. In the true sense, this is a moving target.
  • ‘No child should be allowed to fall behind’. This is a reference to the notion of ‘mastery’ in its crudest sense, though the model proposed will not deliver this outcome. We have noted already a reference to ‘as many children as possible’ and the school-level target – initially at least – will be set at 85%. In reality, a significant minority of learners will progress more slowly and will fall short of the threshold at the end of KS2.
  • The new system ‘will set a higher bar’ but ‘almost all pupils should leave primary school well-placed to succeed in the next phase of their education’. Another nuanced version of ‘secondary ready’ is introduced. This marks a recognition that some learners will not jump over the higher bar. In the light of subsequent references to 85%, ‘almost all’ is rather over-optimistic.
  • ‘We also want to celebrate the progress that pupils make in schools with more challenging intakes’. Getting ‘nearly all pupils to meet this standard…’ (the standard of secondary readiness?) ‘…is very demanding, at least in the short term’. There will therefore be recognition of progress ‘from a low starting point’ – even though these learners have, by definition, been allowed to fall behind and will continue to do so.

So there is something of a muddle here, no doubt engendered by a spirit of compromise.

The black and white distinction of ‘secondary-readiness’ has been replaced by various verbal approximations, but the bottom line is that there will be a defined threshold denoting preparedness that is pitched higher than the current threshold.

And the proportion likely to fall short is downplayed – there is apparent unwillingness at this stage to acknowledge that, as the norm, up to 15% of learners in each school will undershoot the threshold – substantially more in schools with ‘challenging intakes’.

What this boils down to is a desire that all will achieve the new higher hurdle – and that all will be encouraged to exceed it if they can – tempered by recognition that this is presently impossible. No child should be allowed to fall behind but many inevitably will do so.

It might have been better to express these aims in the form of future aspirations – and in terms of our collective efforts to bridge the gap between present reality and those ambitious aspirations.

Principles

The section concludes with a new set of principles governing pedagogy, assessment and accountability:

  • ‘Ongoing, teacher-led assessment is a crucial part of effective teaching;
  • Schools should have the freedom to decide how to teach their curriculum and how to track the progress that pupils make;
  • Both summative teacher assessment and external testing are important;
  • Accountability is key to a successful school system, and therefore must be fair and transparent;
  • Measures of both progress and attainment are important for understanding school performance; and
  • A broad range of information should be published to help parents and the wider public know how well schools are performing.’

These are generic ‘motherhood and apple pie’ statements and so largely uncontroversial. I might have added a seventh – that schools’ in-house assessment and reporting systems must complement summative assessment and testing, including by predicting for parents the anticipated outcomes of the latter.

Perhaps interestingly, there is no repetition of the defence for the removal of national curriculum levels. Instead, the response concentrates on the support available to schools.

It mentions discussion with an ‘expert group on assessment’ about ‘how to support schools to make best use of the new assessment freedoms’. We are not told the membership of this group (which, as far as I know, has not been made public) or the nature of its remit.

There is also a link to information about the Assessment Innovation Fund, which will provide up to 10 grants of up to £10,000 that schools and organisations can use to develop packages that share their innovative practice with others.

 

Children’s experience of assessment up to the end of KS2

The response mentions the full range of national assessments that will impact on children between the ages of two and 11:

  • The statutory progress check at two years of age.
  • A new baseline assessment undertaken within a few weeks of the start of Year R, introduced from September 2015.
  • An Early Years Foundation Stage Profile undertaken in the final term of the year in which children reach the age of five. A revised profile was introduced from September 2012. It is currently compulsory but will be optional from September 2016. The original consultation document said that the profile would no longer be moderated and data would no longer be collected. Neither of those commitments is repeated here.
  • The Phonics Screening Check, normally undertaken in Year 1. The possibility of making these assessments non-statutory for all-through primary schools, suggested in the consultation document, has not been pursued: 53% of respondents opposed this idea, whereas 32% supported it.
  • End of KS1 assessment and
  • End of KS2 assessment.

So a total of six assessments are in place between the ages of two and 11. At least four – and possibly five – will be undertaken between ages two and seven.

It is likely that early years professionals will baulk at this amount of assessment, no matter how sensitively it is designed. But the cost and inefficiency of the model are also open to criticism.

The Reception Baseline

Approach

The original consultation document asked whether:

  • KS1 assessment should be retained as a baseline – 45% supported this and 41% were opposed.
  • A baseline check should be introduced at the start of Reception – 51% supported this and 34% were opposed.
  • Such a baseline check should be optional – 68% agreed and 19% disagreed.
  • Schools should be allowed to choose from a range of commercially available materials for this baseline check – 73% said no and only 15% said yes.

So, whereas views were mixed on where the baseline should be set, there were substantial majorities in favour of any Year R baseline check being optional and following a single, standard national format.

The response argues that Year R is the most sensible point at which to position the baseline since that is:

‘…the earliest point that nearly all children are in school’.

What happens in respect of children who are not in school at this point is not discussed.

There is no explanation of why the Government has disregarded the clear majority of respondents by choosing to permit a range of assessment approaches, so this decision must be ideologically motivated.

The response says ‘most’ are likely to be administered by teaching staff, leaving open the possibility that some options will be administered externally.

Design

Such assessments will need to be:

‘…strong predictors of key stage 1 and key stage 2 attainment, whilst reflecting the age and abilities of children in Reception’.

Presumably this means predictors of attainment in each of the three core subjects – English, maths and science – rather than any broader notion of attainment. The challenge inherent in securing a reasonable predictor of attainment across these domains seven years further on in a child’s development should not be under-estimated.

The response points out that such assessment tools are already available for use in Year R, some are used widely and some schools have long experience of using them. But there is no information about how many of these are deemed already to meet the description above.

In any case, new criteria need to be devised which all such assessments must meet. Some degree of modification will be necessary for all existing products and new products will be launched to compete in the market.

There is an opportunity to use this process to ratchet up the Year R Baseline beyond current expectations, so matching the corresponding process at the end of KS2. The consultation response says nothing about whether this is on the cards.

Interestingly, in his subsequent ‘Unsure start’ speech about early years inspection, HMCI refers to:

‘…the government’s announcement last week that they will be introducing a readiness-for-school test at age four. This is an ideal opportunity to improve accountability. But I think it should go further.

I hope that the published outcomes of these tests will be detailed enough to show parents how their own child has performed. I fear that an overall school grade will fail to illuminate the progress of poor children. I ask government to think again about this issue.’

The terminology – ‘readiness for school’ – is markedly blunter than the references to a reception baseline in the consultation response. There is nothing in the response about the outcomes of these tests being published, nor anything about ‘an overall school grade’.

Does this suggest that decisions have already been made that were not communicated in the consultation response?


Timeline, options, questions

Several pieces of further work are required in short order to inform schools and providers about what will be required – and to enable both to prepare for introduction of the assessments from September 2015. All these should feature in the ‘to do’ list below.

One might reasonably have hoped that – especially given the long delay – some attempt might have been made to publish suggested draft criteria for the baseline alongside the consultation response. The fact that even preliminary research into existing practice has not been undertaken is a cause for concern.

Although the baseline will be introduced from September 2015, there is a one-year interim measure which can only apply to all-through primary schools:

  • They can opt out of the Year R baseline measure entirely, relying instead on KS1 outcomes as their baseline; or
  • They can use an approved Year R baseline assessment and have this cohort’s progress measured at the end of KS2 (which will be in 2022) by either the Year R or the KS1 baseline, whichever demonstrates the most progress.
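
To make the second option above concrete: the choice amounts to taking whichever of two progress comparisons shows the cohort in the better light. The sketch below is illustrative only – the function names, the figures and the assumption that progress is simply the gap between the cohort’s average KS2 scaled score and the national average for cohorts with the same baseline are all mine, since the response does not define how progress will be calculated.

```python
# Illustrative sketch only: the response does not say how progress will be
# quantified, so this 'value added' gap is a placeholder assumption.

def value_added(cohort_avg_ks2: float, national_avg_for_same_baseline: float) -> float:
    """Cohort progress relative to pupils nationally with the same baseline."""
    return cohort_avg_ks2 - national_avg_for_same_baseline

def better_baseline(cohort_avg_ks2: float,
                    expected_from_reception: float,
                    expected_from_ks1: float) -> tuple:
    """Report whichever baseline demonstrates the most progress for the cohort."""
    options = {
        "Year R baseline": value_added(cohort_avg_ks2, expected_from_reception),
        "KS1 baseline": value_added(cohort_avg_ks2, expected_from_ks1),
    }
    best = max(options, key=options.get)
    return best, options[best]

# Hypothetical 2022 cohort: average KS2 scaled score of 103, against expected
# averages of 101 (Reception-baseline peers) and 104 (KS1-baseline peers).
print(better_baseline(103, expected_from_reception=101, expected_from_ks1=104))
# -> ('Year R baseline', 2): the school would elect to be judged from Reception.
```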

In the period up to and including 2021, progress will continue to be measured from the end of KS1. So learners who complete KS2 in 2021 for example will be assessed on progress since their KS1 tests in 2017.

Junior and middle schools will also continue to use a KS1 baseline.

Arrangements for infant and first schools are still to be determined, another rather worrying omission at this stage in proceedings.

It is also clear that all-through primary schools (and infant/first schools?) will continue to be able to opt out from the Year R baseline from September 2016 onwards, since the response says:

‘Schools that choose not to use an approved baseline assessment from 2016 will be judged on an attainment floor standard alone’.

Hence the Year R baseline check is entirely optional and a majority of schools could choose not to undertake it.

However, they would need to be confident of meeting the demanding 85% attainment threshold in the floor standard.

They might be wise to postpone that decision until the pitch of the progress expectation is determined. For neither the Year R baseline nor the amount of progress that learners are expected to make from their starting point in Year R is yet defined.

This latter point applies at the average school level (for the purposes of the floor standard) and in respect of the individual learner. For example, if a four year-old is particularly precocious in, say, maths, what scaled scores must they register seven years later to be judged to have made sufficient progress?

There are several associated questions that follow on from this.

Will it be in schools’ interests to acknowledge that they have precocious four year-olds at all? Will the Year R baseline reinforce the tendency to use Reception to bring all children to the same starting point in readiness for Year 1, regardless of their precocity?

Will the moderation arrangements be hard-edged enough to stop all-through primary schools gaming the system by artificially depressing their baseline outcomes?

Who will undertake this moderation and how much will it cost? Will not the decision to permit schools to choose from a range of measures unnecessarily complicate the moderation process and add to the expense?

The consultation response neither poses these questions nor supplies answers.

The future shape of end KS1 and end KS2 assessment


What assessment will take place?

At KS1 learners will be assessed in:

  • Reading – test plus teacher assessment
  • Writing – test (of grammar, punctuation and spelling) plus teacher assessment
  • Speaking and listening – teacher assessment
  • Maths – test plus teacher assessment
  • Science – teacher assessment

The new test of grammar, punctuation and spelling did not feature in the original consultation and has presumably been introduced to strengthen the marker of progress to which four year-olds should aspire at age seven.

The draft test specifications for the KS1 tests in reading, GPS and maths outline the requirements placed on the test developers, so it is straightforward to compare the specifications for reading and maths with the current tests.

The GPS test will include a 20 minute written grammar and punctuation task; a 20 minute test comprising short grammar, punctuation and vocabulary questions; and a 15 minute spelling task.

There is a passing reference to further work on KS1 moderation which is included in the ‘to do’ list below.

At KS2 learners will be assessed in:

  • Reading – test plus teacher assessment
  • Writing – test (of grammar, punctuation and spelling) plus teacher assessment
  • Maths – test plus teacher assessment
  • Science – teacher assessment plus a science sampling test.

Once again, the draft test specifications – reading, GPS, maths and science sampling – describe the shape of each test and the content they are expected to assess.

I will leave it to experts to comment on the content of the tests.


Academies and free schools

It is important to note that the framing of this content – by means of detailed ‘performance descriptors’ – means that the freedom academies and free schools enjoy in departing from the national curriculum will be largely illusory.

I raised this issue back in February 2013:

  • ‘We know that there will be a new grading system in the core subjects at the end of KS2. If this were to be based on the ATs as drafted, it could only reflect whether or not learners can demonstrate that they know, can apply and understand ‘the matters, skills and processes specified’ in the PoS as a whole. Since there is no provision for ATs that reflect sub-elements of the PoS – such as reading, writing, spelling – grades will have to be awarded on the basis of separate syllabuses for end of KS2 tests associated with these sub-elements.
  • This grading system must anyway be applied universally if it is to inform the publication of performance tables. Since some schools are exempt from National Curriculum requirements, it follows that grading cannot be derived directly from the ATs and/or the PoS, but must be independent of them. So this once more points to end of KS2 tests based on entirely separate syllabuses which nevertheless reflect the relevant part of the draft PoS. The KS2 arrangements are therefore very similar to those planned at KS4.’

I have more to say about the ‘performance descriptors’ below.


Single tests for all learners

A critical point I want to emphasise at this juncture – not mentioned at all in the consultation document or the response – is the test development challenge inherent in producing single papers suitable for all learners, regardless of their attainment.

We know from the response that the P-scales will be retained for those who are unable to access the end of key stage tests. (Incidentally, the content of the P-scales will remain unchanged so they will not be aligned with the revised national curriculum, as suggested in the consultation document.)

There will also be provision for pupils who are working ‘above the P-scales but below the level of the test’.

Now the P-scales are for learners working below level 1 (in old currency). This is the first indication I have seen that the tests may not cater for the full range from Level 1-equivalent to Level 6-equivalent and above. But no further information is provided.

It may be that this is a reference to learners who are working towards level 1 (in old currency) but do not have SEN.

The 2014 KS2 ARA booklet notes:

‘Children working towards level 1 of the national curriculum who do not have a special educational need should be reported to STA as ‘W’ (Working below the level). This includes children who are working towards level 1 solely because they have English as an additional language. Schools should use the code ‘NOTSEN’ to explain why a child working towards level 1 does not have P scales reported. ‘NOTSEN’ replaces the code ‘EAL’ that was used in previous years.’

The consultation document said:

‘We do not propose to develop an equivalent to the current level 6 tests, which are used to challenge the highest-attaining pupils. Key stage 2 national curriculum tests will include challenging material (at least of the standard of the current level 6 test) which all pupils will have the opportunity to answer, without the need for a separate test’.

The draft test specifications make it clear that the tests should:

‘provide a suitable challenge for all children and give every child the opportunity to achieve as high a standard…as possible.’

Moreover:

‘In order to improve general accessibility for all children, where possible, questions will be placed in order of difficulty.’

The development of single tests covering this span of attainment – from level 1 to above level 6 – in which the questions are posed in order of difficulty and even the highest attainers must answer every question, seems to me a very tall order, especially in maths.

More than that, I urgently need persuading that this is not a waste of high attainers’ time and poor assessment practice.


How assessment outcomes will be derived, reported and published

Deriving assessment outcomes

One of the reasons cited for replacing national curriculum levels was the complexity of the system and the difficulty parents experienced in understanding it.

The Ministerial response to the original report from the National Curriculum Expert Panel said:

‘As you rightly identified, the current system is confusing for parents and restrictive for teachers. I agree with your recommendation that there should be a direct relationship between what children are taught and what is assessed. We will therefore describe subject content in a way which makes clear both what should be taught and what pupils should know and be able to do as a result.’

The consultation document glossed the same point thus:

‘Schools will be able to focus their teaching, assessment and reporting not on a set of opaque level descriptions, but on the essential knowledge that all pupils should learn.’

However, the consultation response introduces for the first time the concept of a ‘performance descriptor’.

This term is defined in the glossaries at the end of each draft test specification:

‘Description of the typical characteristics of children working at a particular standard. For these tests, the performance descriptor will characterise the minimum performance required to be working at the appropriate standard for the end of the key stage.’

Essentially this is a collective term for something very similar to old-style level descriptions.

Except that, in the case of the tests, they are all describing the same level of performance.

They have been rendered necessary by the odd decision to provide only a single generic attainment target for each programme of study. But, as noted back in February 2013, the test developers need a more sophisticated framework on which to base their assessments.

According to the draft test specifications, they will also be used:

‘By a panel of teachers to set the standards on the new tests following their first administration in May 2016’.

When it comes to teacher assessment, the consultation response says:

‘New performance descriptors will be introduced to inform the statutory teacher assessments at the end of key stage one [and]…key stage two.’

But there are two models in play simultaneously.

In four cases – science at KS1 and reading, maths and science at KS2 – there will be ‘a single performance descriptor of the new expected standard’, in the same way as there are in the test specifications.

But in five cases – reading, writing, speaking and listening, and maths at KS1; and writing at KS2:

‘teachers will assess pupils as meeting one of several performance descriptors’.

These are old-style level descriptors by another name. They perform exactly the same function.

The response says that the KS1 teacher assessment performance descriptors will be drafted by an expert group for introduction in autumn 2014. It does not mention whether KS2 teacher assessment performance descriptors will be devised in the same way and to the same timetable.


Reporting assessment outcomes to parents

When it comes to reporting to parents, there will be three different arrangements in play at both KS1 and KS2:

  • Test results will be reported by means of scaled scores (of which more in a moment).
  • One set of teacher assessments will be reported by selecting from a set of differentiated performance descriptors.
  • A second set of teacher assessments will be reported according to whether learners have achieved a single threshold performance descriptor.

This is already significantly more complex than the previous system, which applied the same framework of national curriculum levels across the piece.

It seems that KS1 test outcomes will be reported as straightforward scaled scores (though this is only mentioned on page 8 of the main text of the response and not in Annex B, which compares the new arrangements with those currently in place).

But, in the case of KS2:

‘Parents will be provided with their child’s score alongside the average for their school, the local area and nationally. In the light of the consultation responses, we will not give parents a decile ranking for their child due to concerns about whether decile rankings are meaningful and their reliability at individual pupil level.’

The consultation document proposed a tripartite reporting system comprising:

  • A scaled score for each KS2 test, derived from raw test marks and built around a ‘secondary readiness standard’. This standard would be set at a scaled score of 100, which would remain unchanged. It was suggested for illustrative purposes that a scale based on the current national curriculum tests might run from 80 to 130.
  • An average scaled score in each test for other pupils nationally with the same prior attainment at the baseline. Comparison of a learner’s scaled score with the average scaled score would show whether they had made more or less progress than the national average.
  • A national ranking in each test – expressed in terms of deciles – showing how a learner’s scaled score compared with the range of performance nationally.

The latter has been dispensed with, given that 35% of consultation respondents disagreed with it, but there were clearly technical reservations too.

In its place, the ‘value added’ progress measure has been expanded so that there is a comparison with other pupils in the learner’s own school and the ‘local area’ (which presumably means local authority). This beefs up the progression element in reporting at the expense of information about the attainment level achieved.

So at the end of KS2 parents will receive scaled scores and three average scaled scores for each of reading, writing and maths – twelve scores in all – plus four performance descriptors, of which three will be singleton threshold descriptors (reading, maths and science) and one will be selected from a differentiated series (writing). That makes sixteen assessment outcomes altogether, provided in four different formats.
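
To illustrate the arithmetic, here is a rough sketch of the bundle a parent might receive for one child under these plans. Every figure and label is invented for illustration; the actual scale, descriptors and presentation are still to be determined.

```python
# Hypothetical end-of-KS2 report for one child: every figure is invented.
test_subjects = ["reading", "writing (GPS test)", "maths"]

child_score  = {"reading": 104, "writing (GPS test)": 99, "maths": 108}
school_avg   = {"reading": 102, "writing (GPS test)": 101, "maths": 103}
local_avg    = {"reading": 101, "writing (GPS test)": 100, "maths": 101}
national_avg = {"reading": 100, "writing (GPS test)": 100, "maths": 100}

scaled_scores = []
for subject in test_subjects:
    # Four scaled scores per test: the child's own plus three averages.
    scaled_scores += [child_score[subject], school_avg[subject],
                      local_avg[subject], national_avg[subject]]

# Plus four teacher assessment outcomes: three single threshold descriptors
# (reading, maths, science) and one chosen from a differentiated series (writing).
performance_descriptors = ["reading", "maths", "science", "writing"]

print(f"{len(scaled_scores)} scaled scores + {len(performance_descriptors)} "
      "performance descriptors = 16 outcomes in four formats")
```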

The consultation response tells us nothing more about the range of the scale that will be used to provide scaled scores. We do not even know if it will be the same for each test.

The draft test specifications say that:

‘The exact scale for the scaled scores will be determined following further analysis of trialling data. This will include a full review of the reporting of confidence intervals for scaled scores.’

But they also contain this worrying statement:

‘The provision of a scaled score will aid in the interpretation of children’s performance over time as the scaled score which represents the expected standard will be the same year on year. However, at the extremes of the scaled score distribution, as is standard practice, the scores will be truncated such that above and below a certain point, all children will be awarded the same scaled score in order to minimise the effect for children at the ends of the distribution where the test is not measuring optimally.’

This appears to suggest that scaled scores will not accurately describe performance at the extremes of the distribution, because the tests will not accurately measure such performance. This might be describing a statistical truism, but it again raises the question of whether the highest attainers are being short-changed by the selected approach.
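
What truncation might mean in practice is sketched below, assuming for illustration the 80 to 130 scale floated in the original consultation document and invented cut-points; neither the scale nor the cut-points has been published.

```python
# Illustrative only: assumes the 80-130 scale floated in the original
# consultation document and invented truncation points of 85 and 125.
TRUNCATE_LOW, TRUNCATE_HIGH = 85, 125   # hypothetical cut-points

def reported_scaled_score(measured_score: float) -> float:
    """Clamp scores so everyone beyond a cut-point receives the same value."""
    return max(TRUNCATE_LOW, min(TRUNCATE_HIGH, measured_score))

# Two very high attainers whose underlying performance differs...
print(reported_scaled_score(127), reported_scaled_score(130))   # 125 125
# ...are reported identically - the author's concern for the highest attainers.
```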


Publication of assessment outcomes

The response introduces the idea that ‘a suite of indicators’ will be published on each school’s own website in a standard format. These are:

  • The average progress made by pupils in reading, writing and maths. (This is presumably relevant to both KS1 and KS2 and to both tests and teacher assessment.)
  • The percentage of pupils reaching the expected standard in reading, writing and mathematics at the end of key stage 2. (This is presumably relevant to both tests and teacher assessment.)
  • The average score of pupils in their end of key stage 2 assessments. (The final word suggests teacher assessment as well as tests, even though there will not be a score from the former.)
  • The percentage of pupils who achieve a high score in all areas at the end of key stage 2. (Does ‘all areas’ imply something more than statutory tests and teacher assessments? Does it mean treating each area separately, or providing details only of those who have achieved high scores across all areas?)

The latter is the only reference to high attainers in the entire response. It does not give any indication of what will count as a high score for these purposes. Will it be designed to catch the top-third of attainers or something more demanding, perhaps equivalent to the top decile?

A decision has been taken not to report the outcomes of assessment against the P-scales because the need to contextualise such information is perceived to be relatively greater.

And, as noted above, HMCI let slip the fact that the outcomes of reception baselines would also be published, but apparently in the form of a single overall grade.

We are not told when these requirements will be introduced, but presumably they must be in place to report the outcomes of assessments undertaken in spring 2016.

Additionally:

‘So that parents can make comparisons between schools, we would like to show each school’s position in the country on these measures and present these results in a manner that is clear for all audiences to understand. We will discuss how best to do so with stakeholders, to ensure that the presentation of the data is clear, fair and statistically robust.’

This suggests inclusion in the 2016 School Performance Tables, but this is not stated explicitly.

Indeed, apart from references to the publication of progress measures in the 2022 Performance Tables, there is no explicit coverage of their contribution in the response, nor any reference to the planned supporting data portal, or how data will be distributed between the Tables and the portal.

The original consultation document gave several commitments on the future content of performance tables. They included:

  • How many of a school’s pupils are amongst the highest attaining nationally, by showing the percentage of pupils achieving a high scaled score in each subject.
  • Measures to show the attainment and progress of learners attracting the Pupil Premium.
  • Comparison of each school’s performance with that of schools with similar intakes.

None are mentioned here, nor are any of the suggestions advanced by respondents taken up.

Floor standards

Changes are proposed to the floor standards with effect from September 2016.

This section of the response begins by committing to:

‘…a new floor standard that holds schools to account both on the progress that they make and on how well their pupils achieve.’

But the plans set out subsequently do not meet this description.

The progress element of the current floor standard relates to any of reading, writing or mathematics but, under the new floor standard, it will relate to all three of these together.

An all-through primary school must demonstrate that:

‘…pupils make sufficient progress at key stage 2 from their starting point…’

As we have noted above, all-through primaries can opt to use the KS1 baseline or the Year R baseline in 2015. Moreover, from 2016 they can choose not to use the Year R baseline and be assessed solely on the attainment measure in the floor standards (see below).

Junior and middle schools obviously apply the KS1 baseline, while arrangements for infant and first schools have yet to be finalised.

What constitutes ‘sufficient progress’ is not defined. Annex C of the response says:

‘For 2016 we will set the precise extent of progress required once key stage 2 tests have been sat for the first time.’

Presumably this will be progress from KS1 to KS2, since progress from the Year R baseline will not be introduced until 2023.

The attainment element of the new floor standards is for schools to have 85% or more of pupils meeting the new, higher threshold standard at the end of KS2 in all of reading, writing and maths. The text says explicitly that this threshold is ‘similar to a level 4b under the current system’.

Annex C clarifies that this will be judged by the achievement of a scaled score of 100 or more in each of the reading and maths tests, plus teacher assessment that learners have reached the expected standard in writing (so the GPS test does not count in the same way, simply informing the teacher assessment).

As noted above, this is a far bigger ask than the current requirement that 65% of learners meet the expected (and lower 4c) standard. The summary at the beginning of the response refers to it as ‘a challenging aspiration’:

‘Over time we expect more and more schools to achieve this standard.’

The statement in the first paragraph of this section of the response led us to believe that these two requirements – for progress and attainment respectively – would be combined, so that schools would be held to account for both (unless, presumably, they exercised their right to opt out of the Year R baseline assessment).

But this is not the case. Schools need only achieve one or the other.

It follows that schools with a very high performing intake may exceed the floor standards on the basis of all-round high attainment alone, regardless of the progress made by their learners.

The reason for this provision is unclear, though one suspects that schools with an extremely high attaining intake, whether at Reception or Year 3, will be harder pressed to achieve sufficient progress, presumably because some ceiling effects come into play at the end of KS2.

This in turn might suggest that the planned tests do not have sufficient headroom for the highest attainers, even though they are supposed to provide similar challenge to level 6 and potentially extend beyond it.

Meanwhile, schools with less than stellar attainment results will be obliged to follow the progress route to jump the floor standard. This too will be demanding because all three domains will be in play.
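
As I read it, the floor standard logic reduces to clearing either the attainment route or, for schools using an approved baseline, the progress route. The sketch below follows the figures given in the response for attainment, but the ‘sufficient progress’ threshold is undefined and so appears only as an unquantified parameter.

```python
# Floor standard check as I read the response: the 85% attainment figure is
# stated there; the 'sufficient progress' threshold is not yet defined, so it
# is left as a parameter with no authoritative value.

def meets_attainment_route(pct_at_expected_standard: float) -> bool:
    """85% or more of pupils reaching the expected standard in all of reading,
    writing and maths (scaled score of 100+ in the reading and maths tests,
    plus teacher assessment of the expected standard in writing)."""
    return pct_at_expected_standard >= 85.0

def meets_progress_route(school_progress, sufficient_progress: float) -> bool:
    """Only available to schools using an approved baseline (else pass None)."""
    return school_progress is not None and school_progress >= sufficient_progress

def above_floor(pct_at_expected_standard: float,
                school_progress, sufficient_progress: float) -> bool:
    # Schools need only clear one route or the other.
    return (meets_attainment_route(pct_at_expected_standard)
            or meets_progress_route(school_progress, sufficient_progress))

# A high-attaining intake clears the floor on attainment alone, whatever
# progress its pupils make (illustrative figures, assumed progress bar of 0.0).
print(above_floor(91.0, school_progress=-1.5, sufficient_progress=0.0))  # True
```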

There will have been some internal modelling undertaken to judge how many schools would be likely to fall short of the floor standards given these arrangements and it would be very useful to know these estimates, however unreliable they prove to be.

In their absence, one suspects that the majority of schools will be below the floor standards, at least initially. That of course materially changes the nature and purpose of the standards.

To Do List

The response and the draft specifications together contain a long list of work to be carried out over the next two years or so. I have included below my best guess as to the latest possible date for each decision to be completed and communicated:

  • Decide how progress will be measured for infants and first schools between the Year R baseline and the end of KS1 (April 2014)
  • Make available to schools a ‘small number’ of sample test questions for each key stage and subject (Summer 2014)
  • Work with experts to establish the criteria for the Year R baseline (September 2014)
  • KS1 [and KS2?] teacher assessment performance descriptors to be drafted by an expert group (September 2014)
  • Complete and report outcomes of a study with schools that already use Year R baseline assessments (December 2014)
  • Decide how Year R baseline assessments will be moderated (December 2014)
  • Publish a list of assessments that meet the Year R baseline criteria (March 2015)
  • Decide how Year R baseline results will be communicated to parents and to Ofsted (March 2015)
  • Make available to schools a full set of sample materials including tests and mark schemes for all KS1 and KS2 tests (September 2015)
  • Complete work with Ofsted and teachers to improve KS1 moderation (September 2015)
  • Provide further information to enable teachers to assess pupils at the end of KS1 and KS2 who are ‘working above the P-scales but below the level of the test’ (September 2015)
  • Decide whether to move to external moderation of P-scale teacher assessment (September 2015)
  • Agree with stakeholders how to compare schools’ performance on a suite of assessment outcomes published in a standard format (September 2015)
  • Publish all final test frameworks (Autumn 2015)
  • Introduce new requirements for schools to publish a suite of assessment outcomes in a standard format (Spring 2016)
  • Panels of teachers use the performance descriptors to set the standards on the new tests following their first administration in May 2016 (Summer 2016)
  • Define what counts as sufficient progress from the Year R baseline to end KS1 and end KS2 respectively (Summer 2016)

Conclusion

Overall the response is rather more cogent and coherent than the original consultation document, though there are several inconsistencies and many sins of omission.

Drawing together the key issues emerging from the commentary above, I would highlight twelve key points:

  • The declared aims express the policy direction clumsily and without conviction. The ultimate aspirations are universal ‘secondary readiness’ (though expressed in broader terms), ‘no child left behind’ and ‘every child fulfilling their potential’ but there is no real effort to reconcile these potentially conflicting notions into a consensual vision of what primary education is for. Moreover, an inconvenient truth lurks behind these statements. Because expectations have been raised so significantly – 4b equivalent rather than 4c; 85% over the attainment threshold rather than 65%; ‘sufficient progress’ rather than median progress, and across three domains rather than one – there will be much more failure in the short to medium term. More learners will fall behind and fall short of the thresholds; many more schools are likely to undershoot the floor standards. It may also prove harder for some learners to demonstrate their potential. It might have been better to acknowledge this reality and to frame the vision in terms of creating the conditions necessary for subsequent progress towards the ultimate aspirations.
  • Younger children are increasingly caught in the crossbeam from the twin searchlights of assessment and accountability. HMCI’s subsequent intervention has raised the stakes still further. This creates obvious tensions in the sector which can be traced back to disagreements over the respective purposes of early years and primary provision and how they relate to each other. (HMCI’s notion of ‘school readiness’ is no doubt as narrow to early years practitioners as ‘secondary readiness’ is to primary educators.) But this is not just a theoretical point. Additional demands for focused inspection, moderation and publication of outcomes all carry a significant price tag. It must be open to question whether the sheer weight of assessment activity is optimal and delivers value for money. Should a radical future Government – probably with a cost-cutting remit – have rationalisation in mind?
  • Giving schools the freedom to choose from a range of Year R baseline assessment tools also seems inherently inefficient and flies in the face of the clear majority of consultation responses. We are told nothing of the perceived quality of existing services, none of which can – by definition – satisfy these new expectations without significant adjustment. It will not be straightforward to construct a universal and child-friendly instrument that is a sufficiently strong predictor of Level 4b-equivalent performance in KS2 reading, writing and maths assessments undertaken seven years later. Moreover, there will be a strong temptation for the Government to pitch the baseline higher than current expectations, so matching the  realignment at the other end of the process. Making the Reception baseline assessment optional – albeit with strings attached – seems rather half-hearted, almost an insurance against failure. Effective (and expensive) moderation may protect against widespread gaming, but the risk remains that Reception teachers will be even more predisposed to prioritise universal school readiness over stretching their more precocious four year-olds.
  • The task of designing an effective test for all levels of prior attainment at the end of key stage 2 is equally fraught with difficulty. The P-scales will be retained (in their existing format, unaligned with the revised national curriculum) for learners with special needs working below the equivalent of what is currently level 1. There will also be undefined provision ‘for those working above the level of the P-scales but below the level of the test’, even though the draft test development frameworks say:

‘All eligible children who are registered at maintained schools, special schools, or academies (including free schools) in England and are at the end of key stage 2 will be required to take the…test, unless they have taken it in the past.’

And this applies to all learners other than those in the exempted categories set out in the ARA booklets. The draft specifications add that test questions will be placed in order of difficulty. I have grave difficulty in understanding how such assessments can be optimal for high attainers and fear that this is bad assessment practice.

  • On top of this there is the worrying statement in the test development frameworks that scaled scores will be ‘truncated’ at the extremes of the distribution. This does not fill one with confidence that the highest and lowest attainers will have their test performance properly recognised and reported.
  • The necessary invention of ‘performance descriptors’ removes any lingering illusion that academies and free schools have significant freedom to depart from the national curriculum, at least as far as the core subjects are concerned. It is hard to understand why these descriptors could not have been published alongside the programmes of study within the national curriculum.
  • The ‘performance descriptors’ in the draft test specifications carry all sorts of health warnings that they are inappropriate for teacher assessment because they cover only material that can be assessed in a written test. But there will be significant overlap between the test and teacher assessment versions, particularly in those that describe threshold performance at the equivalent of level 4b. For we know now that there will also be hierarchies of performance descriptors – aka level descriptors – for KS1 teacher assessment in reading, writing, speaking and listening and maths, as well as for KS2 teacher assessment in writing. Levels were so problematic that it has been necessary to reinvent them!
  • What with scaled scores, average scaled scores, threshold performance descriptors and ‘levelled’ performance descriptors, schools face an uphill battle in convincing parents that the reporting of test outcomes under this system will be simpler and more understandable. At the end of KS2 they will receive 16 different assessments in four different formats. (Remember that parents will also need to cope with schools’ approaches to internal assessment, which may or may not align with these arrangements.)
  • We are told about new requirements to be placed on schools to publish assessment outcomes, but the description is infuriatingly vague. We do not know whether certain requirements apply to both KS1 and 2, and/or to both tests and teacher assessment. The reference to ‘the percentage of pupils who achieve a high score in all areas at the end of key stage 2’ is additionally vague because it is unclear whether it applies to performance in each assessment, or across all assessments combined. Nor is the pitch of the high score explained. This is the only reference to high attainers in the entire response and it raises more questions than it answers.
  • We also have negligible information about what will appear in the school performance tables and what will be relegated to the accompanying data portal. We know there is an intention to compare schools’ performance on the measures they are required to publish and that is all. Much of the further detail in the original consultation document may or may not have fallen by the wayside.
  • The new floor standards have all the characteristics of a last-minute compromise hastily stitched together. The consultation document was explicit that floor standards would:

‘…focus on threshold attainment measures and value-added progress measures’

It anticipated that the progress measure would require average scaled scores of between 98.5 and 99.0, adding:

‘Our modelling suggests that a progress measure set at this level, combined with the 85% threshold attainment measure, would result in a similar number of schools falling below the floor as at present.’

But the analysis of responses fails to report at all on the question ‘Do you have any comments about these proposals for the Department’s floor standards?’ It does include the response to a subsequent question about including an average point score attainment measure in the floor standards (39% of respondents were in favour, 31% against). But the main text does not discuss this option at all. It begins by stating that both an attainment and a progress dimension are in play, but then describes a system in which schools can choose one or the other. There is no attempt to quantify ‘sufficient progress’ and no revised modelling of the impact of standards set at this level. We are left with the suspicion that a very significant proportion of schools will not exceed the floor. There is also a potential perverse incentive for schools with very high attaining intakes not to bother about progress at all.

  • Finally, the ‘to do’ list is substantial. Several of those with the tightest deadlines ought really to have been completed ahead of the consultation response, especially given the significant delay. There is nothing about the interaction between this work programme and that proposed by NAHT’s Commission on Assessment. Much of this work would need to take place on the other side of a General Election, while the lead time for assessing KS2 progress against a Year R baseline is a full nine years. This makes the project as a whole particularly vulnerable to the whims of future governments.

I’m struggling to find the right description for the overall package. I don’t think it’s quite substantial or messy enough to count as a dog’s breakfast. But, like a poorly airbrushed portrait, it flatters to deceive. Seen from a distance it appears convincing but, on closer inspection, there are too many wrinkles that have not been properly smoothed out.

GP

April 2014

 

 

What Has Become of the European Talent Network? Part Two


This is the second and concluding part of a post about progress by the European Talent Centre towards a European Talent Network.

Part One:

  • Provided an updated description of the Hungarian model for talent support and its increasingly complex infrastructure.
  • Described the origins of the European Talent project and how its scope and objectives have changed since its inception.
  • Outlined the project’s initial advocacy effort within the European Commission.

This second episode describes the evolution of the model for the European Network, continues the history of its advocacy effort and reviews the progress made by the European Centre in Budapest towards achieving its aims.

It concludes with an overall assessment of progress that highlights some key fault lines and weaknesses that, if addressed, would significantly improve the chances of overall success.

Initial Efforts to Design the European Network

A Draft Talent Points Plan 

At the 2012 ECHA Conference in Munster, a draft ‘Talent Points Plan’ was circulated which set out proposed criteria for EU Talent Points.

The following entities qualify for inclusion on the EU Talent Map:

  • ‘an already existing at least 2 year-old network connected to talent support
  • organizations/institutions focusing mainly on talent support: research, development, identification (eg schools, university departments, talent centers, excellence centers etc)
  • policy makers on national or international level (ministries, local authorities)
  • NGOs
  • business corporation with talent management programs (talent identification, corporate responsibility programs, creative climates)
  • parent organizations of gifted and talented children.’

But only organisations count as EU Talent Points. Each:

  • ‘has a strategy/action plan connected to talent (identification, support, research, carrier planning, etc…)
  • is willing to share at least one best/good practice, research results, video
  • is willing to share information on talent support (programs, conferences, talent days)
  • is open to be visited by other network members
  • is open to cooperate
  • accepts English as a common language while communicating in the network
  • is willing to update the data of home page 2 times/year.’ [sic]

My feedback on this draft urged a more flexible, inclusive approach – similar to what had been proposed earlier – as well as an online consultation of stakeholders to find out what they wanted from the Centre and the wider network.

Curiously, the ‘Towards a European Talent Support Network’ publication that was also distributed at the Conference took a somewhat different line, suggesting a more distributed network in which each country has its own Talent Support Centre:

‘The Talent Support Centres of the European countries could serve as regional hubs of this network building a contact structure going beyond their own country, while the core elements of our unique network could be the so-called European Talent Points… European Talent Centres are proposed to be registered by the Committee of the European Council of High Ability… A European Talent Centre should be an organization or a distinct part of a larger organization established for this purpose.’

This is a pronounced shift from the ‘networked hubs’ proposed previously.

The publication goes on to set out ‘proposed requirements for a European Talent Centre’. Each:

  • ‘has an expertise of at least one year to coordinate the talent support activity of minimum 10 thousand persons 
  • has minimum two full-time employees who are dedicated to the tasks listed below 
  • is able to provide high quality information on theoretical and practical issues of gifted education and talent support
  • is able to keep records on the talent support activity of its region including the registration, help and coordination of European Talent Points and making this information available on the web (in the form of a Talent Support Map of the region)
  • is willing to cooperate with other European Talent Centres and with ECHA
  • is willing and able to coordinate joint actions, international events, Talent Days and other meetings in the field of talent support
  • is open to be visited by representatives, experts, talented young people of other European Talent Centres
  • is able to help and influence decisions on regional, national and/or European policies concerning the gifted and talented.’

The document also offers an alternative version of the criteria for European Talent Points.

Whereas the draft I began with specified that only organisations could be placed on the EU Talent Map, this version offers a more liberal interpretation, saying that Talent Points may be:

  • ‘organizations/institutions focusing mainly on talent support: research, development, identification (e. g: schools, university departments, talent centres, excellence centres, NGOs, etc.)
  • talent-related policy makers on national or international level [sic] (ministries, local authorities)
  • business corporation with talent management programs (talent identification, corporate responsibility programs, creative climate)
  • organizations of gifted and talented people
  • organizations of parents of gifted and talented children, or
  • umbrella organization (network) of organizations of the above types’

Talent points are to be registered (not accredited) by the appropriate European talent centres, but it appears that the centres would not enjoy discretion in such matters because there is a second set of proposed requirements:

  • ‘Has a strategy/action plan connected to talent (identification, support, research, career planning, etc.)
  • Is able and willing to share information on its talent support practices and other talent-related matters with other European Talent Points (programs, conferences, Talent Days) including sending the necessary data to a European Talent Centre and sharing at least one best practice/research result on the web
  • Is open to cooperate with other European Talent Points including the hosting of visiting representatives, talented young people from other European Talent Points.’

 .

Problems with the Talent Points Plan

‘Towards a European Talent Support Network’ stipulates – for no apparent reason – that a European Talent Centre has to be an organisation or part of an organisation established specifically for this purpose. It cannot be subsumed seamlessly into the existing responsibilities of an organisation.

There is no reference to funding to cover the cost of this activity, so that is presumably to be provided, or at least secured, by the organisation in question.

The criteria for European centres seem to be seeking to clone the Budapest Centre. To locate one in every European country – roughly 50 countries – would be a tall order indeed, requiring a minimum of 100 FTE employees (at least two per centre).

The impact on the role and responsibilities of the Budapest Centre is not discussed. What would it do in this brave new world, other than to cover Hungary’s contribution to the network?

The only justification for ECHA’s involvement is presumably the reference earlier in ‘Towards a European Talent Support Network’:

‘Stemming from its traditions – and especially due to its consultative status as a non-governmental organization (NGO) at the Council of Europe –ECHA has to stand in the forefront in building a European Talent Support Network; a network of all people involved in talent support.’

ECHA News carries a report of the minutes of an ECHA committee meeting held in April 2013:

‘It was suggested that ECHA should be an accrediting organization for European Talent Centres and Talent Points. In the following discussion it was concluded that (1) it might be possible to establish a special accrediting committee; (2) Talent Centres would decide where Talent Points can be; (3) the proposal for European Talent Centres and European Talent Points criteria would be sent to additional key ECHA members (including National Correspondents) as discussion material. Criteria will be decided later.’

So ECHA would have control over deciding which entities could become European Talent Centres. This is despite the fact that ECHA is an entirely separate membership organisation with no formal responsibility for the EU Talent initiative.

This is not a sensible arrangement.

There is no explanation of why the network itself could not accredit its own members.

Turning back to the proposed requirements for European talent centres, these must be minimum requirements since there would otherwise be no need for an accreditation committee to take decisions.

Presumably the committee might impose its own additional criteria, to distinguish, for example, between two competing proposals for the same region.

The requirement for a year’s experience in relation to ‘co-ordinating’ talent support activity for at least 10,000 people is not explained. What exactly does it mean?

It might have been better to avoid quantitative criteria altogether. Certainly it is questionable whether even the present centre in Budapest meets this description.

And why the attempt to control inputs – the reference to at least two full-time staff – rather than outcomes? Surely the employment of sufficient staff is a matter that should be left to the centre’s discretion entirely.

The broad idea of a distributed network rather than a Budapest-centred network is clearly right, but the reasoning that puts ECHA in a controlling position with regard to the network is out of kilter with that notion, while the criteria themselves are inflexible and unworkable, especially since there is no budget attached to them.

When it comes to the Talent Points there are clear conflicts between the two versions. The first set of criteria outlined above is the more onerous, proposing an exclusive – rather than illustrative – list of those that can be included on the EU Talent Map.

It also allows existing networks to feature on the map, but only if they are at least two years old! And it stipulates an additional English language requirement and twice-yearly updating of the website homepage.

Only an entity with some serious difficulties could manage to share two sets of different draft criteria – each with its own profound problems – at precisely the same time!

Budapest by Night

.

The EU Advocacy Effort Continues

.

What Became of the Written Declaration?

Written Declarations are designed to stimulate debate. Once submitted by MEPs they are printed in all official EU languages and entered into a register. There is then a three month window in which other MEPs may sign them.

Those attracting signatures from a majority of MEPs are announced by the President in a plenary session of the European Parliament and forwarded for consideration to the bodies named in the text.

Those that do not attract sufficient signatures officially lapse.

.

The archive of written declarations shows that – despite the revisions outlined above and the best efforts of all those lobbying (including me) – WD 0034/2012 lapsed on 20 February 2013, having attracted 178 signatures. Since there are some 750 MEPs, that represents roughly 24% of the total – well short of the majority required.

 .

A Parliamentary Hearing

As part of this ultimately unsuccessful lobbying effort, the Hungarian MEP who – along with three colleagues – submitted the Written Declaration also hosted a Parliamentary Hearing on the support of talents in the European Union.

The programme lists the speakers as:

  • Anneli Pauli, a Finn, formerly a Deputy Director General of the European Commission’s Research and Innovation Directorate.
  • Laszlo Andor, a Hungarian and the EU Commissioner for Employment, Social Affairs and Inclusion. (Any contribution he made to the event is not included in the record, so he may or may not have been there.)
  • Peter Csermely, the current ECHA President and the man behind the EU Talent Centre.

There was no-one from the Commission’s Education Directorate involved.

The record of proceedings makes interesting reading, highlighting the Written Declaration, the economic value of talent development to the EU, the contribution it can make to research and innovation, the scope to support the inclusion of immigrants and minorities and the case for developing the European network.

Pauli is reported as saying that:

‘Talents are the heart of the future EU’s research area, thus they will work hard on it that the Horizon 2020 will offer enough support to them.’ [sic]

Horizon 2020 is the EU Framework Programme for Research and Innovation. There is no explicit home for talent support within it, so it remains to be seen how this commitment will materialise in practice.

She also says:

‘…that school education on talents and the creative education in school sciences should be strengthened’ [sic]

This presumably carried rather less authority, given her research-focused role – and given that, as we have seen, the Declaration was framed exclusively in terms of ‘non-formal learning’.

There is little explicit reference to the specifics of the European Talent project other than that:

‘…EU-wide talent-support units are needed, Europren [sic] Talent Points Network, a European Talent Day could be organised, or even a Year of Excellence and Talents could be implemented in the future too.’

We are not told how well attended the hearing was, nor do we have any information about its influence.

Only 13 more MEPs signed the WD between the Hearing and the deadline, and that was that.

An EU Thematic Working Group on Talent Support?

The 2013 publication ‘Towards a European Talent Support Network’ puts the best possible spin on the Written Declaration and the associated Hearing.

It then continues:

‘Confirming the importance of WD 34/2012, an EU Thematic Working Group on supporting talent and creativity was initiated by Prof. Péter Csermely. As a starting activity, the EU Thematic Working Group will work out the detailed agenda of discussions and possible EU member state co-operation in the area of talent support. This agenda may include items like:

  • Mutual information on measures to promote curricular and extra-curricular forms of talent support, including training for educational professionals to recognise and help talent;
  • Consideration of the development of an EU member state talent support network bringing together talent support communities, Talent Points and European Talent Centres in order to facilitate co-operation and the development and dissemination of the best talent support practices in Europe;
  • Consideration of celebration of the European Day of Talented;
  • Suggestions to the Commission to include talent support as a priority in future European strategies, such as the strategies guiding the European Research Area and the European Social Fund.’

The proposed status of this group is not discussed, so it is unclear whether it will be an expert group under the aegis of the Commission, or an independent group established with funding from Erasmus Plus or another EU programme.

If it is the latter, we will have to wait some time for it to be established; if it is the former, it does not yet feature in the Commission’s Register.

In either case, we are some nine months on from the publication of the document that brought us this news and there is still no indication of whether this group exists, when it will start work, or who its members are or will be.

 .

A European Economic and Social Committee (EESC) Opinion

At about the same time as a draft Written Declaration was circulated in January 2012, the Bureau of the EU’s European Economic and Social Committee was recommending that the Committee proper should undertake a fresh programme of ‘own initiative opinions’ (so the weakest category of NLA).

These included:

‘Unleashing the potential of young people with high intellectual abilities in the European Union’

Although the development process was undertaken during 2012, the final opinion was not published until January 2013.

The EESC describes itself thus:

‘The European Economic and Social Committee (EESC) is a consultative body that gives representatives of Europe’s socio-occupational interest groups and others, a formal platform to express their points of views on EU issues. Its opinions are forwarded to the Council, the European Commission and the European Parliament.’

Its 353 members are nominated by member governments and belong to an employers’ group, a workers’ group or a ‘various interests’ group. There are six sections, one of which is ‘Employment, Social Affairs and Citizenship’ (SOC).

EESC opinions are prepared by study groups which typically comprise 12 members including a rapporteur. Study groups may make use of up to four experts.

I cannot trace a relationship between the EESC’s opinion and the European Talent initiative.

The latter’s coverage does not mention any involvement and there is no information on the EU side about who prompted the process.

The focus of the opinion – high intellectual ability – is markedly out of kilter with the broader talent focus of the Talent Network, so it is highly likely that this activity originated elsewhere.

If that is the case then we can reasonably conclude that the European Talent initiative has not fulfilled its original commitment to an NLA.

Diligent online researchers can trace the development of this Opinion from its earliest stages through to eventual publication. There is a database of the key documents and also a list of the EESC members engaged in the process.

As far as I can establish the group relied on a single expert – one Jose Carlos Gibaja Velazquez, who is described as working in the ‘Subdirección General de Centros de Educación Infantil, Primaria y Especial, Comunidad de Madrid’ (the Community of Madrid’s sub-directorate for infant, primary and special education centres).

The link between Señor Gibaja and the EESC is explained here (translation into English here). I can find no link between him and the EU Talent Network.

EESC members of the study group were:

  • Beatrice Quin (France)
  • Teresa Tsizbierek (Poland)

An Early Draft of the Opinion

The earliest version of the Opinion is included in an information memo dated 7 January. This also cites the significance of the Europe 2020 Strategy:

‘One of the top priorities of the Europe 2020 Strategy is to promote smart growth, so that knowledge and innovation become the two key drivers of the European economy. In order to reach this goal, it is essential that the European Union take advantage of the potential of the available human capital, particularly of young people with high intellectual capacities, who make up around 3% of the population.’

But it clearly comes from a different perspective to that of the EU Talent Centre, which isn’t mentioned.

The ‘gist of the opinion’ at this early stage is as follows:

‘The EESC recommends that the European Commission and the Member States support further studies and research that would tap the potential of gifted children and young people in a wide variety of fields, aiming to facilitate employment and employability within the framework of the EU and, in a context of economic crisis, enhance specialist knowledge and prevent brain drain;

  • The Committee recommends that, in the future, greater consideration be given to each Member State’s existing models for and experience in working with highly gifted children, particularly those which benefit all of society, facilitate cohesion, reduce school failure and encourage better education in accordance with the objectives of the Europe 2020 strategy;
  • The Committee proposes improving educational care for children and young people with high abilities, in terms of the following aspects:

  - initial and ongoing training of teaching staff regarding the typical characteristics of highly able students, as well as the detection and educational care they need;

  - pooling of procedures for the early detection of high intellectual abilities among students in general and in particular among those from disadvantaged social backgrounds;

  - designing and implementing educational measures aimed at students with high intellectual abilities;

  - incorporating into teacher training the values of humanism, the reality of multiculturalism, the educational use of ICT and, lastly, the encouragement of creativity, innovation and initiative.’

Mount Bel Stone courtesy of Horvabe

.

What the Opinion Eventually Recommended

The final version of the Opinion was discussed by the EESC at its meeting on 16 January 2013 and was adopted ‘by 131 votes in favour, none against, with 13 abstentions’.

The analysis contained in the Opinion is by no means uncontentious and a close reading would generate a long list of reservations. But that would be tangential to the issue under discussion.

The recommendations are as follows (my emboldening):

‘The European Economic and Social Committee is aware that the issue of children and young people with high intellectual abilities has been fairly well researched, as a result of the studies conducted over the last decades and the extensive corpus of specialist scientific literature. However, given the importance of this topic, the EESC recommends that the European Commission and the Member States support further studies and research and adopt suitable measures to cater for diversity among all types of people. These should include programmes that would tap the potential of gifted children and young people in a wide variety of fields. The aims of this action would include facilitating employment and employability within the framework of the EU and, in a context of economic crisis, enhancing specialist knowledge and preventing brain drain to other parts of the world.

The Committee proposes nurturing the development and potential of children and young people with high abilities throughout the various stages and forms of their education, avoiding premature specialisation and encouraging schools to cater for diversity, and exploiting the possibilities of cooperative and non-formal learning.

The Committee recommends fostering education and lifelong learning, bearing in mind that each individual’s intellectual potential is not static but evolves differently throughout the various stages of his or her life.

The Committee recommends that, in the future, greater consideration be given to each Member State’s existing models for and experience in working with highly gifted children, particularly those which benefit all of society, facilitate cohesion, reduce school failure and encourage better education in accordance with the objectives of the Europe 2020 strategy.

The Committee highlights the need to detect, in the workplace, those workers (particularly young workers) who are able and willing to develop their intellectual capabilities and contribute to innovation, and to give them the opportunity to further their education in the field that best matches their ambitions and centres of interest.

The Committee proposes improving educational care for children and young people with high abilities, in terms of the following aspects:

  • initial and ongoing training of teaching staff regarding the typical characteristics of highly able students, as well as the detection and educational care they need;
  • pooling of procedures for the early detection of high intellectual abilities among students in general and in particular among those from disadvantaged social backgrounds;
  • designing and implementing educational measures aimed at students with high intellectual abilities. These measures should include actions inside and outside ordinary educational establishments;
  • incorporating into teacher training the values of humanism, the reality of multiculturalism, the educational use of ICT and, lastly, the encouragement of creativity, innovation and initiative.

Improving the care provided for highly able students should include their emotional education (which is particularly important during adolescence), the acquisition of social skills with a view to facilitating integration and inclusion in society, integration into the labour market, and fostering their teamwork skills.

Schemes and procedures for student exchanges and visits abroad should be tapped into so that gifted students can take part in them, particularly those from disadvantaged backgrounds.

Opportunities for exchanging information and good practices on detecting and caring for gifted students should be harnessed across the EU Member States.

Entrepreneurship should be fostered among children and young people with high abilities, with a view to encouraging responsibility and solidarity towards society overall.’

 .

More than One Opinion?

I have devoted significant attention to this apparently unrelated initiative because it shows that the EU lobbying effort in this field is poorly co-ordinated and pursuing substantively different objectives.

The EU Talent project failed to secure the NLA it was pursuing, but someone else has exploited the same route to influence – and for substantially different purposes.

What is worse, the EU Talent lobby seems to have failed entirely to secure any cross-reference to their efforts, despite there being two Hungarians on the study group. Did they try and fail or didn’t they try at all?

Perhaps fortunately, the Opinion seems to have been no more influential than the Written Declaration. One wonders whether the enormous energy and time invested in each of these processes was ultimately worthwhile.

 .

What progress has been made by the European Talent Project?

. 

The Mission Has Changed

The website version of the Centre’s mission is subtly different from the original version discussed earlier in this post.

The Centre now seeks:

  • ‘to provide talent support an emphasis commensurate with its importance in every European country [same]
  • to provide talented youngsters access to the most adequate forms of education in every Member State [same]
  • to make Europe attractive for the talented youth [same]
  • to create talent-friendly societies in every European country [same]
  • to accelerate the sharing of information on the topic [new]
  • to create a higher number of more efficient forms of talent support for the talented [new]
  • to make it easier for social actors interested in talent support to find each other through the European talent support network.’ [new]

The reference to voluntary experts has gone, to be replaced by a call for:

‘…partners – professionals, talents and talent supporters – willing to think and work together.’

‘Towards a European Talent Support Network’ offers a different version again.

The mission and role of the Centre have changed very slightly, to reflect the new orthodoxy of multiple European talent centres, describing the Budapest body as ‘the first European Talent Centre’.

Four long-term goals are outlined:

  • ‘to give talent support a priority role in the transformation of the sector of education;
  • To reduce talent loss to the minimum in Europe,
  • To accelerate the sharing of information on the topic by integrating talent support initiatives of the Member States of the EU into a network
  • To make it easier for social actors interested in talent support to find each other through the European talent support network.’

It adds some additional short term objectives for good measure:

  • ‘As a hub of a European network, try to trigger mechanisms which bring organizations and individuals together to facilitate collaboration, share best practices and resources
  • Draw the Talent Support Map of Europe
  • Organize conferences for professionals in the region
  • Do research on the field of talent support
  • Collect and share best practices.’

We have now encountered three different versions of a mission statement for an entity that is less than two years old.

It is not clear whether this represents an evolutionary process within the organisation – which might be more understandable if it were better documented – or a certain slipperiness and opportunistic shifting of position that makes it very difficult for outsiders to get a grip on exactly what the Centre is for.

In typical fashion, the document says that:

‘the activities of the Centre fall into four large groups: advocacy, research, organisation (conferences, meetings, Talent Days), contact-keeping (meeting delegations from all over the world) and sharing information.’

Forgive me, but isn’t that five groups?

We have dealt with advocacy already and unfortunately there is negligible information available about the ‘contact-keeping’ activity undertaken – ie the various delegations that have been met by the staff and what the outcomes have been of those meetings.

That leaves research, organisation and sharing information.

.

Esterhazy Castle

Advisory Board and Partners

Before leaving the Centre’s operations, it is important to note that a three-strong Advisory Board has been appointed.

All three are luminaries of ECHA, two of them serving on the current Executive Committee.

There is no explanation of the Board’s role, or how it was chosen, and no published record of its deliberations. It is not clear whether it is intended as a substitute for the advisory group that was originally envisaged, which was to have had much broader membership.

As noted above, there is also a new emphasis on ‘partners’. The full text of the reference on the website says:

‘We are looking for partners – professionals, talents and talent supporters – willing to think and work together. We sincerely hope that the success of the Hungarian example will not stop short at the frontiers of the country, but will soon make its way to European talent support co-operation.’

Four partners are currently listed – ECHA, the Global Centre for Gifted and Talented Children, IGGY and the World Council – but there is no explanation of the status conferred by partnership or the responsibilities expected of partners in return.

Are partners prospective European Talent Centres or do they have a different status? Must partners be talent points or not? We are not told.

Research

This is presumably a reference to the ‘Best Practices’ section of the Budapest Centre’s website, which currently hosts two collections of studies – ‘International Horizons of Talent Support’ Volumes 1 and 2 – and a selection of individual studies (17 at the time of writing).

 .

The quality of this material can best be described as variable. This study of provision in Ireland is relatively unusual, since most of the material is currently devoted to Central and Eastern Europe, but it gives a sense of what to expect.

There has been no effort to date to collect together already-published research and data about provision in different parts of Europe and to make that material openly accessible to readers. That is a major disappointment.

There is nothing in the collection that resembles an independent evaluation of the European Talent Initiative as a whole, or even an evaluation of the Hungarian NTP.

At best one can describe the level and quality of research-related activity as embryonic.

 .

Event Organisation

This Table shows what the Centre has achieved to date and what is planned for 2014:

.

                2011            2012                 2013   2014
Conference      Yes (Budapest)  Unofficial (Warsaw)  No     Yes (Budapest)
EU Talent Day   Yes             No                   No     Yes

 .

The 2014 Conference is the first official EU-wide event since the 2011 launch conference. The same is true of the 2014 EU Talent Day.

The Polish conference was initially planned for spring 2012, but failed to materialise. By July it was confirmed that there would only be ‘an unofficial follow-up’ in October. My December 2012 post described my personal and ultimately unsuccessful efforts to attend this event and summarised the proceedings.

The 2014 Conference Website insists that it will coincide with the Third EU Talent Day but I can find barely a trace of a Second, except in Estonia, where it was celebrated on 21 March 2012.

.

This is not a strikingly positive record.

The 2014 Conference website names an organising ‘international scientific committee’ that is heavily biased towards academics (eight of the eleven), ECHA luminaries (five of the eleven) and Hungarians (four of the eleven).

The programme features four academic keynotes about networks and networking.

The other keynotes involve Slovenia’s education minister and the EU Commissioner for Employment, Social Affairs and Inclusion (a Hungarian who was advertised to be part of the Parliamentary Hearing on the Written Declaration but, if he did attend, apparently made no contribution), plus one devoted to the ‘International Talent Competiveness Index’ [sic].

I think this must be INSEAD’s Global Talent Competitiveness Index.

INSEAD’s inaugural 2013 Report ranks Hungary 40th of 103 countries on this Index. (The UK is ranked 7th and the US 9th).

There are eight ‘break-up sessions’ [sic]:

  • The role of governments and the EU in creation a European Network[sic]
  • Digital Networks for Talented Youth
  • Social responsibility and organisational climate
  • Practice and Ethics of Networking
  • Multiple disadvanteged children [sic]
  • Parents’ networks in Europe
  • Counselling Centers [sic]
  • Civil networks for Talent Support

The expected outcome of the event is not specified. There is no scheduled opportunity to discuss the progress made to date by the EU Talent initiative, or the policy and implementation issues flagged up in this post. And there is no information about the mediation of the Conference via social media (though there are now Skype links next to the items in the programme).

 .

Talent Map and Resources

The website features a Resource Center [sic] which includes a database of ‘selected resources’. We are not told on what basis the selection has been made.

The database is built into the website and is not particularly accessible, especially if one compares it with the Hungarian equivalent. Indeed, the Talent Centre website is decidedly clunky by comparison.

The Talent Map is now better populated than it was, though inconsistently so. There are only two entries for Hungary, for example, while Romania has 11. There are only three in the UK and none in Ireland. Neither CTYI nor SNAP is mentioned.

It might have been better to pre-populate the map and then to indicate which entries had been ‘authorised’ by their owners.

From a presentational perspective the map is better than the database, though it should have a full page to itself.

Both the database and the map are still works in progress.

Overall Assessment and Key Issues Arising

In the light of this evidence, what are we to make of the progress achieved towards a European Talent Network over the last four years?

In my judgement:

  • The fundamental case for modelling a European Talent Network on the Hungarian National Talent Programme is unproven. The basic design of the NTP may reflect one tradition of consensus on effective practice, but the decision to stop at age 35 is unexplained and idiosyncratic. The full model is extremely costly to implement and relies heavily on EU funding. Even at current levels of funding, it is unlikely to be impacting on more than a relatively small minority of the target population. It is hard to see how it can become financially sustainable in the longer term. 
  • There is no detailed and convincing rationale for, or description of, how the model is being modified (into ‘Hungary-lite’) for European rollout. It is abundantly clear that this rollout will never attract commensurate funding and, compared with the NTP, it is currently being run ‘on a shoestring’. But, as currently envisaged, the rollout will require significant additional funding and the projected sources of this funding are unspecified. The more expensive the rollout becomes, the more unlikely it is to be financially sustainable. In short, the scalability to Europe of the modified Hungarian talent support model is highly questionable.
  • The shape and purpose of the overall European Talent initiative have changed substantively on several occasions during its short lifetime. There is only limited consistency between the goals being pursued now and those originally envisaged. There have been frequent changes to these goals along the way, several of them unexplained. It is not clear whether this is attributable to political opportunism and/or real confusion and disagreement within the initiative over what exactly it is seeking to achieve and how. There are frequent inconsistencies between different sources over exactly how aspects of the rollout are to be implemented. This causes confusion and calls into question the competence of those who are steering the process. Such ‘mission creep’ will radically reduce the chances of success.
  • The relationship with ECHA has always been problematic – and remains so. Fundamentally the European Talent Initiative is aiming to achieve what ECHA itself should have achieved, but failed. The suggestion that ECHA be given control over the accreditation of European Talent Centres is misguided. ECHA is a closed membership organisation rather than an open network and cannot be assumed to be representative of all those engaged in talent support throughout Europe. There is no reason why this process could not be managed by the network itself. In the longer term the continued co-existence of the Network and ECHA as separate entities becomes increasingly problematic. But any merger would demand radical reform of ECHA. Despite the injection of new blood into the ECHA Executive, the forces of conservatism within it remain strong and are unlikely to countenance such a radical step.
  • The progress achieved by the European Talent Centre during its relatively short existence has been less than impressive. That is partly attributable to the limited funding available and the fact that it is being operated on the margins of the Hungarian NTP. The funding it does attract comes with the expectation that it will be used to advertise the successes of the NTP abroad, so raising the status and profile of the domestic effort. There is a tension between this and the Centre’s principal role, which must be to drive the European rollout. 
  • The decision to move to a distributed model in which several European Talent Centres develop the network, rather than a centralised model driven by Budapest, is absolutely correct. (I was saying as much back in 2011.) However, the wider implications of this decision do not appear to have been thought through. I detect a worrying tendency to create bureaucracy for the sake of it, rather than focusing on getting things done.
  • Meanwhile, the Budapest Centre has made some headway with a Talent Map and a database of resources, but not nearly enough given the staffing and resource devoted to the task. The failure to deliver annual EU Conferences and Talent Days is conspicuous and worrying. Conversely, the effort expended on lobbying within the European Commission has clearly been considerable, though the tangible benefits secured from this exercise are, as yet, negligible.
  • For an initiative driven by networking, the quantity and quality of communication is poor. Independent evaluation studies of the Hungarian model do not seem to be available, at least not in English. There should be a fully costed draft specification for the European roll-out which is consulted upon openly and widely. Consultation seems confined currently to ECHA members which is neither inclusive nor representative. No opportunities are provided to challenge the direction of travel pursued by the initiative and its decision-making processes are not transparent. There is no evidence that it is willing to engage with critics or criticism of its preferred approach. The programme for the 2014 Conference does not suggest any marked shift in this respect.

An unkind critic might find sufficient evidence to level an accusation of talent support imperialism, albeit masked by a smokescreen of scientifically justified networkology.

I do not subscribe to that view, at least not yet. But I do conclude that the European Talent effort is faltering badly. It may limp on for several years to come, but it will never achieve its undoubted potential until the issues outlined above are properly and thoroughly addressed.

.

GP

March 2014

 

Challenging NAHT’s Commission on Assessment

.

This post reviews the Report of the NAHT’s National Commission on Assessment, published on 13 February 2014.

Since I previously subjected the Government’s consultation document on primary assessment and accountability to a forensic examination, I thought it only fair that I should apply the same high standards to this document.

I conclude that the Report is broadly helpful, but there are several internal inconsistencies and a few serious flaws.

Impatient readers may wish to skip the detailed analysis and jump straight to the summary at the end of the post which sets out my reservations in the form of 23 recommendations addressed to the Commission and the NAHT.

.

Other perspectives

Immediate reaction to the Report was almost entirely positive.

The TES included a brief Ministerial statement in its coverage, attributed to Michael Gove:

‘The NAHT’s report gives practical, helpful ideas to schools preparing for the removal of levels. It also encourages them to make the most of the freedom they now have to develop innovative approaches to assessment that meet the needs of pupils and give far more useful information to parents.’

ASCL and ATL both welcomed the Report, as did the National Governors’ Association, though there was no substantive comment from NASUWT or NUT.

The Blogosphere exhibited relatively little interest, although a smattering of posts began to expose some issues:

  • LKMco supported the key recommendations, but wondered whether the Commission might not be guilty of reinventing National Curriculum levels;
  • Mr Thomas Maths was more critical, identifying three key shortcomings, one being the proposed approach to differentiation within assessment;
  • Warwick Mansell, probably because he blogs for NAHT, confined himself largely to summarising the Report, which he found ‘impressive’, though he did raise two key points – the cost of implementing these proposals and how the recommendations relate to the as yet uncertain position of teacher assessment in the Government’s primary assessment and accountability reforms.

All of these points – and others – are fleshed out in the critique below.

.

Background

.

Remit, Membership and Evidence Base

The Commission was first announced in July 2013, when it was described as:

‘a commission of practitioners to shape the future of assessment in a system without levels.’

By September, Lord Sutherland had agreed to Chair the body and its broad remit had been established:

‘To:

  • establish a set of principles to underpin national approaches to assessment and create consistency;
  • identify and highlight examples of good practice; and
  • build confidence in the assessment system by securing the trust and support of officials and inspectors.’

Written evidence was requested by 16 October.

The first meeting took place on 21 October and five more were scheduled before the end of November.

Members’ names were not included at this stage (beyond the fact that NAHT’s President – a Staffordshire primary head – was involved) though membership was now described as ‘drawn from across education’.

Several members had in fact been named in an early October blog post from NAHT and a November press release from the Chartered Institute of Educational Assessors (CIEA) named all but one – NAHT’s Director of Education. This list was confirmed in the published Report.

The Commission had 14 members but only six of them – four primary heads, one primary deputy and one secondary deputy – could be described as practitioners.

The others included two NAHT officials in addition to the secretariat, one being General Secretary Russell Hobby, and one from ASCL; John Dunford, a consultant with several other strings to his bow, one of those being the Chairmanship of the CIEA; Gordon Stobart, an academic specialist in assessment with a long pedigree in the field; Hilary Emery, the outgoing Chief Executive of the National Children’s Bureau; and Sam Freedman of Teach First.

There were also unnamed observers from DfE, Ofqual and Ofsted.

The Report says the Commission took oral evidence from a wide range of sources. A list of 25 sources is provided but it does not indicate how much of their evidence was written and how much oral.

Three of these sources are bodies represented on the Commission, two of them schools. Overall seven are from schools. One source is Tim Oates, the former Chair of the National Curriculum Review Expert Panel.

The written evidence is not published and I could find only a handful of responses online, from:

Overall one has to say that the response to the call for evidence was rather limited. Nevertheless, it would be helpful for NAHT to publish all the evidence it received. It might also be worth NAHT consulting formally on key provisions in its Report.

 .

Structure of the Report and Further Stages Proposed

The main body of the Report is sandwiched between a foreword by the Chair and a series of Annexes containing case studies and historical and international background. This analysis concentrates almost entirely on the main body.

The 21 Recommendations are presented twice, first as a list within the Executive Summary and subsequently interspersed within a thematic commentary that summarises the evidence received and also conveys the Commission’s views.

The Executive Summary also sets out a series of Underpinning Principles for Assessment and a Design Checklist for assessment in schools, the latter accompanied by a set of five explanatory notes.

It offers a slightly different version of the Commission’s Remit:

‘In carrying out its task, the Commission was asked to achieve three distinct elements:

  • A set of agreed principles for good assessment
  • Examples of current best practice in assessment that meet these principles
  • Buy-in to the principles by those who hold schools to account.’

These are markedly less ambitious than their predecessors, having dropped the reference to ‘national approaches’ and any aspiration to secure support from officials and inspectors for anything beyond the Principles.

Significantly, the Report is presented as only the first stage in a longer process, an urgent response to schools’ need for guidance in the short term.

It recommends that further work should comprise:

  • ‘A set of model assessment criteria based on the new National Curriculum.’ (NAHT is called upon to develop and promote these. The text says that a model document is being commissioned but doesn’t reveal the timescale or who is preparing it);
  • ‘A full model assessment policy and procedures, backed by appropriate professional development’ that would expand upon the Principles and Design Checklist. (NAHT is called upon to take the lead in this, but there is no indication that they plan to do so. No timescale is attached.)
  • ‘A system-wide review of assessment’ covering ages 2-19. It is not explicitly stated, but one assumes that this recommendation is directed towards the Government. Again no timescale is attached.

The analysis below looks first at the assessment Principles, then the Design Checklist and finally the recommendations plus associated commentary. It concludes with an overall assessment of the Report as a whole.

.

Assessment Principles

As noted above, it seems that national level commitment is only sought in respect of these Principles, but there is no indication in the Report – or elsewhere for that matter – that DfE, Ofsted and Ofqual have indeed signed up to them.

Certainly the Ministerial statement quoted above stops well short of doing so.

The consultation document on primary assessment and accountability also sought comments on a set of core principles to underpin schools’ curriculum and assessment frameworks. It remains to be seen whether the version set out in the consultation response will match those advanced by the Commission.

The Report recommends that schools should review their own assessment practice against the Principles and Checklist together, and that all schools should have their own clear assessment principles, presumably derived or adjusted in the light of this process.

Many of the principles are unexceptionable, but there are a few interesting features that are directly relevant to the commentary below.

For it is of course critical to the internal coherence of the Report that the Design Checklist and recommendations are entirely consistent with these Principles.

I want to highlight three in particular:

  • ‘Assessment is inclusive of all abilities…Assessment embodies, through objective criteria, a pathway of progress and development for every child…Assessment objectives set high expectations for learners’.

One assumes that ‘abilities’ is intended to stand proxy for both attainment and potential, so that there should be ‘high expectations’ and a ‘pathway of progress and development’ for the lowest and highest attainers alike.

  • ‘Assessment places achievement in context against nationally standardised criteria and expected standards’.

This raises the question of whether the ‘model document’ containing assessment criteria commissioned by NAHT will be ‘nationally standardised’ and, if so, what standardisation process will be applied.

  • ‘Assessment is consistent…The results are readily understandable by third parties…A school’s results are capable of comparison with other schools, both locally and nationally’.

The implication behind these statements must be that results of assessment in each school are transparent and comparable through the accountability regime, presumably by means of the performance tables (and the data portal that we expect to be introduced to support them).

This cannot be taken as confined to statutory tests, since the text later points out that:

‘The remit did not extend to KS2 tests, floor standards and other related issues of formal accountability.’

It isn’t clear, from the Principles at least, whether the Commission believes that teacher assessment outcomes should also be comparable. Here, as elsewhere, the Report does a poor job of distinguishing between statutory teacher assessment and assessment internal to the school.

.

Design Checklist

 

Approach to Assessment and Use of Assessment

The Design Checklist is described as:

‘an evaluation checklist for schools seeking to develop or acquire an assessment system. They could also form the seed of a revised assessment policy.’

It is addressed explicitly to schools and comprises three sections covering, respectively, a school’s approach to assessment, method of assessment and use of assessment.

The middle section is by far the most significant and also the most complex, requiring five explanatory notes.

I deal first with the more straightforward first and third sections.

‘Our approach to assessment’ simply makes the point that assessment is integral to teaching and learning, while also setting expectations for regular, universal professional development and ‘a senior leader who is responsible for assessment’.

It is not clear whether this individual is the same as, or additional to, the ‘trained assessment lead’ mentioned in the Report’s recommendations.

I can find no justification in the Report for the requirement that this person must be a senior leader.

A more flexible approach would be preferable, in which the functions to be undertaken are outlined and schools are given flexibility over how those are distributed between staff. There is more on this below.

The final section ‘Our use of assessment’ refers to staff:

  • Summarising and analysing attainment and progress;
  • Planning pupils’ learning to ensure every pupil meets or exceeds expectations (Either this is a counsel of perfection, or expectations for some learners are pitched below the level required to satisfy the assessment criteria for the subject and year in question. The latter is much more likely, but this is confusing since satisfying the assessment criteria is also described in the Checklist in terms of ‘meeting…expectations’.)
  • Analysing data across the school to ensure all pupils are stretched while the vulnerable and those at risk make appropriate progress (‘appropriate’ is not defined within the Checklist itself but an explanatory note appended to the central section  – see below – glosses this phrase);
  • Communicating assessment information each term to pupils and parents through ‘a structured conversation’ and the provision of ‘rich, qualitative profiles of what has been achieved and indications of what they [ie parents as well as pupils] need to do next’; and
  • Celebrating a broad range of achievements, extending across the full school curriculum and encompassing social, emotional and behavioural development.

.

Method of Assessment: Purposes

‘Our method of assessment’ is by far the longest section, containing 11 separate bullet points. It could be further subdivided for clarity’s sake.

The first three bullets are devoted principally to some purposes of assessment. Some of this material might be placed more logically in the ‘Our Use of Assessment’ section, so that the central section is shortened and restricted to methodology.

The main purpose is stipulated as ‘to help teachers, parents and pupils plan their next steps in learning’.

So the phrasing suggests that assessment should help to drive forward the learning of parents and teachers, as well as that of pupils. I’m not sure whether this is deliberate or accidental.

Two subsidiary purposes are mentioned: providing a check on teaching standards and support for their improvement; and providing a comparator with other schools via collaboration and the use of ‘external tests and assessments’.

It is not clear why these three purposes are singled out. There is some overlap with the Principles but also a degree of inconsistency between the two pieces of documentation. It might have been better to cross-reference them more carefully.

In short, the internal logic of the Checklist and its relationship with the Principles could both do with some attention.

The real meat of the section is incorporated in the eight remaining bullet points. The first four are about what pupils are assessed against and when that assessment takes place. The last four explain how assessment judgements are differentiated, evidenced and moderated.

.

Method of Assessment: What Learners Are Assessed Against – and When

The next four bullets specify that learners are to be assessed against ‘assessment criteria which are short, discrete, qualitative and concrete descriptions of what a pupil is expected to know and be able to do.’

These are derived from the school curriculum ‘which is composed of the National Curriculum and our own local design’ (Of course that is not strictly the position in academies, as another section of the Report subsequently points out.)

The criteria ‘for periodic assessment are arranged into a hierarchy setting out what children are normally expected to have mastered by the end of each year’.

Each learner’s achievement ‘is assessed against all the relevant criteria at appropriate times of the school year’.

.

The Span of the Assessment Criteria

The first explanatory note (A) clarifies that the assessment criteria are ‘discrete, tangible descriptive statements of attainment’ derived from ‘the National Curriculum (and any school curricula)’.

There is no repetition of the provision in the Principles that they should be ‘nationally standardised’ but ‘there is little room for meaningful variety’, even though academies are not obliged to follow the National Curriculum and schools have complete flexibility over the remainder of the school curriculum.

The Recommendations have a different emphasis, saying that NAHT’s model criteria should be ‘based on the new National Curriculum’ (Recommendation 6), but the clear impression here is that they will encompass the National Curriculum ‘and any school curricula’ alike.

This inconsistency needs to be resolved. NAHT might be better off confining its model criteria to the National Curriculum only – and making it clear that even these may not be relevant to academies.

.

The Hierarchy of Assessment Criteria

The second explanatory note (B) relates to the arrangement of the assessment criteria

‘…into a hierarchy, setting out what children are normally expected to have mastered by the end of each year’.

This note is rather muddled.

It begins by suggesting that a hierarchy divided chronologically by school year is the most natural choice, because:

‘The curriculum is usually organised into years and terms for planned delivery’

That may be true, but only the Programmes of Study for the three core subjects are organised by year, and each clearly states that:

‘Schools are…only required to teach the relevant programme of study by the end of the key stage. Within each key stage, schools therefore have the flexibility to introduce content earlier or later than set out in the programme of study. In addition, schools can introduce key stage content during an earlier key stage if appropriate.’

All schools – academies and non-academies alike – therefore enjoy considerable flexibility over the distribution of the Programmes of Study between academic years.

(Later in the Report – in the commentary preceding the first six recommendations – the text mistakenly suggests that the entirety of ‘the revised curriculum is presented in a model of year-by-year progress’ (page 14). It does not mention the provision above.)

The note goes on to suggest that the Commission has chosen a different route, not because of this flexibility, but because ‘children’s progress may not fit neatly into school years’:

‘…we have chosen the language of a hierarchy of expectations to avoid misunderstandings. Children may be working above or below their school year…’

But this is not an absolute hierarchy of expectations – in the sense that learners are free to progress entirely according to ability (or, more accurately, their prior attainment) rather than in age-related lock steps.

In a true hierarchy of expectations, learners would be able to progress as fast or as slowly as they are able to, within the boundaries set by:

  • On one hand, high expectations, commensurate challenge and progression;
  • On the other hand, protection against excessive pressure and hot-housing and a judicious blending of faster pace with more breadth and depth (of which more below).

This is no more than a hierarchy by school year with some limited flexibility at the margins.

.

The timing of assessment against the criteria

The third explanatory note (C) confirms the Commission’s assumption that formal assessments will be conducted at least termly – and possibly more frequently than that.

It adds:

‘It will take time before schools develop a sense of how many criteria from each year’s expectations are normally met in the autumn, spring and summer terms, and this will also vary by subject’.

This is again unclear. It could mean that a future aspiration is to judge progress termly, by breaking down the assessment criteria still further – so that a learner who met the assessment criteria for, say, the autumn term is deemed to be meeting the criteria for the year as a whole at that point.

Without this additional layer of lock-stepping, presumably the default position for the assessments conducted in the autumn and spring terms is that learners will still be working towards the assessment criteria for the year in question.

The note also mentions in passing that:

‘For some years to come, it will be hard to make predictions from outcomes of these assessments to the results in KS2 tests. Such data may emerge over time, although there are question marks over how reliable predictions may be if schools are using incompatible approaches and applying differing standards of performance and therefore cannot pool data to form large samples.’

This is one of very few places where the Report picks up on the problems that are likely to emerge from the dissonance between internal and external statutory assessment.

But it avoids the central issue: the approach to internal assessment it advocates may not be entirely compatible with predicting future achievement in the KS2 tests. If so, its value is seriously diminished for parents and teachers alike, let alone the learners themselves. This issue also reappears below.

.

Method of Assessment: How Assessment Judgements are Differentiated, Evidenced and Moderated

The four final bullet points in this section of the Design Checklist explain that all learners will be assessed as either ‘developing’, ‘meeting’, or ‘exceeding’ each relevant criterion for that year’.

Learners deemed to be exceeding the relevant criteria in a subject for a given year ‘will also be assessed against the criteria in that subject for the next year.’

Assessment judgements are supported by evidence comprising observations, records of work and test outcomes and are subject to moderation by teachers in the same school and in other schools to ensure they are fair, reliable and valid.

I will set moderation to one side until later in the post, since that too lies outside the scope of methodology.

.

Differentiation against the hierarchy of assessment criteria

The fourth explanatory note (D) addresses the vexed question of differentiation.

As readers may recall, the Report by the National Curriculum Review Expert Panel failed abjectly to explain how they would provide stretch and challenge in a system that focused exclusively on universal mastery and ‘readiness to progress’, saying only that further work was required to address the issue.

Paragraph 8.21 implied that they favoured what might be termed an ‘enrichment and extension’ model:

‘There are issues regarding ‘stretch and challenge’ for those pupils who, for a particular body of content, grasp material more swiftly than others. There are different responses to this in different national settings, but frequently there is a focus on additional activities that allow greater application and practice, additional topic study within the same area of content, and engagement in demonstration and discussion with others…These systems achieve comparatively low spread at the end of primary education, a factor vital in a high proportion of pupils being well positioned to make good use of more intensive subject-based provision in secondary schooling.’

Meanwhile, something akin to the P Scales might come into play for those children with learning difficulties.

On this latter point, the primary assessment and accountability consultation document said DfE would:

‘…explore whether P-scales should be reviewed so that they align with the revised national curriculum and provide a clear route to progress to higher attainment.’

We do not yet know whether this will happen, but Explanatory Note B to the Design Checklist conveys the clear message that the P-Scales need to be retained:

‘…must ensure we value the progress of children with special needs as much as any other group. The use of P scales here is important to ensure appropriate challenge and progression for pupils with SEN.’

By contrast, for high attainers, the Commission favours what might be called a ‘mildly accelerative’ model whereby learners who ‘exceed’ the assessment criteria applying to a subject for their year group may be given work that enables them to demonstrate progress against the criteria for the year above.

I describe it as mildly accelerative because there is no provision for learners to be assessed more than one year ahead of their chronological year group. This is a fairly low ceiling to impose on such accelerative progress.

It is also unclear whether the NAHT’s model assessment criteria will cover Year 7, the first year of the KS3 Programmes of Study, to enable this provision to extend into Year 6.

The optimal approach for high attainers would combine the ‘enrichment and extension’ approach apparently favoured by the Expert Panel with an accelerative approach that provides a higher ceiling, to accommodate those learners furthest ahead of their peers.

High attaining learners could then access a customised blend of enrichment (more breadth), extension (greater depth) and acceleration (faster pace) according to their needs.

This is good curricular practice and it should be reflected in assessment practice too, otherwise the risk is that a mildly accelerative assessment process will have an undesirable wash-back effect on teaching and learning.

Elsewhere, the Report advocates the important principle that curriculum, assessment and pedagogy should be developed in parallel, otherwise there is a risk that one – typically assessment – has an undesirable effect on the others. This would be an excellent exemplar of that statement.

The judgement whether a learner is exceeding the assessment criteria for their chronological year would be evidenced by enrichment and extension activity as well as by pre-empting the assessment criteria for the year ahead. Exceeding the criteria in terms of greater breadth or more depth should be equally valued.

This more rounded approach, incorporating a higher ceiling, should also be supported by the addition of a fourth ‘far exceeded’ judgement, otherwise the ‘exceeded’ judgement has to cover far too wide a span of attainment, from those who are marginally beyond their peers to those who are streets ahead.

These concerns need urgently to be addressed, before NAHT gets much further with its model criteria.

.

The aggregation of criteria

In order to make the overall judgement for each subject, learners’ performance against individual assessment criteria has to be combined to give an aggregate measure.

The note says:

‘The criteria themselves can be combined to provide the qualitative statement of a pupil’s achievements, although teachers and schools may need a quantitative summary. Few schools appear to favour a pure binary approach of yes/no. The most popular choice seems to be a three phase judgement of working towards (or emerging, developing), meeting (or mastered, confident, secure, expected) and exceeded. Where a student has exceeded a criterion, it may make sense to assess them also against the criteria for the next year.’

This, too, raises some questions. The statement above is consistent with one of the Report’s central recommendations:

‘Pupil progress and achievement should be communicated in terms of descriptive profiles rather than condensed to numerical summaries (although schools may wish to use numerical data for internal purposes).’

Frankly it seems unlikely that such ‘condensed numerical summaries’ can be kept hidden from parents. Indeed, one might argue that they have a reasonable right to know them.

These aggregations – whether qualitative or quantitative – will be differentiated at three levels, according to whether the learner best fits a ‘working towards’, ‘meeting’ or ‘exceeding’ judgement for the criteria relating to the appropriate year in each programme of study.
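
The Report specifies no rule for performing that aggregation, which is precisely the gap. Purely by way of illustration – using thresholds of my own invention, not the Commission’s – here is a minimal sketch of what such a rule might look like.

```python
# Illustrative only: the Report specifies no aggregation rule, so the thresholds
# below (nothing still 'developing'; half or more criteria 'exceeding') are my
# own assumptions, included simply to show that some explicit rule is needed.

def aggregate(judgements):
    """Roll up per-criterion judgements into a single subject/year judgement."""
    values = list(judgements.values())
    if any(j == "developing" for j in values):
        return "working towards"        # not yet secure on every criterion
    exceeded = sum(j == "exceeding" for j in values)
    if exceeded >= len(values) / 2:     # assumed threshold
        return "exceeding"              # would trigger assessment against the next year's criteria
    return "meeting"

# Hypothetical pupil record for one subject and year:
pupil = {
    "criterion 1": "exceeding",
    "criterion 2": "meeting",
    "criterion 3": "exceeding",
}
print(aggregate(pupil))                 # -> 'exceeding' under the assumed rule
```

Whatever rule is eventually adopted, the point stands: without one, the ‘chicken and egg’ question below cannot be answered.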

I have just recommended that there needs to be an additional level at the top end, to remove undesirable ceiling effects that lower expectations and are inconsistent with the Principles set out in the Report. I leave it to others to judge whether, if this was accepted, a fifth level is also required at the lower end to preserve the symmetry of the scale.

There is also a ‘chicken and egg’ issue here. It is not clear whether a learner must already be meeting some of the criteria for the succeeding year in order to show they are exceeding the criteria for their own year – or whether assessment against the criteria for the succeeding year is one potential consequence of a judgement that they are exceeding the criteria for their own year.

This confusion is reinforced by a difference of emphasis between the checklist – which says clearly that learners will be assessed against the criteria for the succeeding year if they exceeded the criteria for their own – and the explanatory note, which says only that this may happen.

Moreover, the note suggests that this applies criterion by criterion – ‘where a student has exceeded a criterion’ – rather than after the criteria have been aggregated, which is the logical assumption from the wording in the checklist – ‘exceeded the relevant criteria’.

This too needs clarifying.

.

.

Recommendations and Commentary

I will try not to repeat in this section material already covered above.

I found that the recommendations did not always sit logically with the preceding commentary, so I have departed from the subsections used in the Report, grouping the material into four broad sections: further methodological issues; in-school and school-to-school support; national support; and phased implementation.

Each section leads with the relevant Recommendations and folds in additional material from different parts of the commentary. I have repeated recommendations where they are relevant to more than one section.

.

Further methodological issues

Recommendation 4: Pupils should be assessed against objective criteria rather than ranked against each other

Recommendation 5: Pupil progress and achievements should be communicated in terms of descriptive profiles rather than condensed to numerical summaries (although schools may wish to use numerical data for internal purposes).

Recommendation 6: In respect of the National Curriculum, we believe it is valuable – to aid communication and comparison – for schools to be using consistent criteria for assessment. To this end, we call upon NAHT to develop and promote a set of model assessment criteria based on the new National Curriculum.

The commentary discusses the evolution of National Curriculum levels, including the use of sub-levels and their application to progress as well as achievement. In doing so, it summarises the arguments for and against the retention of levels.

In favour of retention:

  • The system of levels provides a common language used by schools to summarise attainment and progress;
  • It is argued (by some professionals) that parents have grown up with levels and have an adequate grasp of what they mean;
  • The numerical basis of levels was useful to schools in analysing and tracking the performance of large numbers of pupils;
  • The decision to remove levels was unexpected and caused concern within the profession, especially as it was also announced that being ‘secondary ready’ was to be associated with the achievement of Level 4B;
  • If levels are removed, they must be replaced by a different common language, or at least ‘an element of compatibility or common understanding’ should several different assessment systems emerge.

In favour of removal:

  • It is argued (by the Government) that levels are not understood by parents and other stakeholders;
  • The numerical basis of levels does not have the richness of a more rounded description of achievement. The important narrative behind the headline number is often lost through over-simplification.
  • There are adverse effects from labelling learners with levels.

The Commission is also clear that the Government places too great a reliance on tests, particularly for accountability purposes. This has narrowed the curriculum and resulted in ‘teaching to the test’.

It also creates other perverse incentives, including the inflation of assessment outcomes for performance management purposes or, conversely, the deflation of assessment outcomes to increase the rate of progress during the subsequent key stage.

Moreover, curriculum, assessment and pedagogy must be mutually supportive. Even if the Government has not allowed the assessment tail to wag the curricular dog, the Report insists that:

‘…curriculum and assessment should be developed in tandem.’

Self-evidently, this has not happened, since the National Curriculum was finalised way ahead of the associated assessment arrangements which, in the primary sector, are still unconfirmed.

There is a strong argument that such assessment criteria should have been developed by the Government and made integral to the National Curriculum.

Indeed, in Chapter 7 of its Report on ‘The Framework for the National Curriculum’, the National Curriculum Expert Panel proposed that attainment targets should be retained, not in the form of level descriptors but as ‘statements of specific learning outcomes related to essential knowledge’ that would be ‘both detailed and precise’. They might be presented alongside the Programmes of Study.

The Government ignored this, opting for a single, very broad standard attainment target in each programme of study:

‘By the end of each key stage, pupils are expected to know, apply and understand the matters, skills and processes specified in the relevant programme of study.’

As I pointed out in a previous post, one particularly glaring omission from the Consultation Document on Primary Assessment and Accountability was any explanation of how Key Stage Two tests and statutory teacher assessments would be developed from these singleton ‘lowest common denominator’ attainment targets, especially in a context where academies, while not obliged to follow the National Curriculum, would undertake the associated tests.

We must await the long-delayed response to the consultation to see if it throws any light on this matter.

Will it commit the Government to producing a framework, at least for statutory tests in the core subjects, or will it throw its weight behind the NAHT’s model criteria instead?

I have summarised this section of the Report in some detail as it is the nearest it gets to providing a rational justification for the approach set out in the recommendations above.

The model criteria appear confined to the National Curriculum at this point, though we have already noted that is not the case elsewhere in the Report.

I have also discussed briefly the inconsistency in permitting the translation of descriptive profiles into numerical data ‘for internal purposes’, and I undertook to develop that point further, for there is a wider case that the Report does not entertain.

We know that there will be scores attached to KS2 tests, since those are needed to inform parents and for accountability purposes.

The Primary Assessment and Accountability consultation document proposed a tripartite approach:

  • Scaled scores to show attainment, built around a new ‘secondary-ready’ standard, broadly comparable with the current Level 4B;
  • Allocation to a decile within the range of scaled scores achieved nationally, showing attainment compared with one’s peers; and
  • Comparison with the average scaled score of those nationally with the same prior attainment at the baseline, to show relative progress.

Crudely speaking, the first of these measures is criterion-referenced while the second and third are norm-referenced.

We do not yet know whether these proposals will proceed – there has been some suggestion that deciles at least will be dropped – but parents will undoubtedly want schools to be able to tell them what scaled scores their children are on target to achieve, and how those compare with the average for those with similar prior attainment.

It will be exceptionally difficult for schools to convey that information within the descriptive profiles, insofar as they relate to English and maths, without adopting the same numerical measures.

It might be more helpful to schools if the NAHT’s recommendations recognised that fact. For the brutal truth is that, if schools’ internal assessment processes do not respond to this need, they will have to set up parallel processes that do so.

In order to derive descriptive profiles, there must be objective assessment criteria that supply the building blocks, hence the first part of Recommendation 4. But I can find nothing in the Report that explains explicitly why pupils cannot also be ranked against each other. This can only be a veiled and unsubstantiated objection to deciles.

Of course it would be quite possible to rank pupils at school level and, in effect, that is what schools will do when they condense the descriptive profiles into numerical summaries.

The real position here is that such rankings would exist, but would not be communicated to parents, for fear of ‘labelling’. But the labelling has already occurred, so the resistance is attributable solely to communicating these numerical outcomes to parents. That is not a sustainable position.

.

In-school and school-to-school support

Recommendation 1: Schools should review their assessment practice against the principles and checklist set out in this report. Staff should be involved in the evaluation of existing practice and the development of a new, rigorous assessment system and procedures to enable the school to promote high quality teaching and learning.

Recommendation 2: All schools should have clear assessment principles and practices to which all staff are committed and which are implemented. These principles should be supported by school governors and accessible to parents, other stakeholders and the wider school community.

Recommendation 3: Assessment should be part of all school development plans and should be reviewed regularly. This review process should involve every school identifying its own learning and development needs for assessment. Schools should allocate specific time and resources for professional development in this area and should monitor how the identified needs are being met.

Recommendation 7 (part): Schools should work in collaboration, for example in clusters, to ensure a consistent approach to assessment. Furthermore, excellent practice in assessment should be identified and publicised…

Recommendation 9: Schools should identify a trained assessment lead, who will work with other local leads and nationally accredited assessment experts on moderation activities.

Recommendation 16: All those responsible for children’s learning should undertake rigorous training in formative, diagnostic and summative assessment, which covers how assessment can be used to support teaching and learning for all pupils, including those with special educational needs. The government should provide support and resources for accredited training for school assessment leads and schools should make assessment training a priority.

Recommendation 20: Schools should be asked to publish their principles of assessment from September 2014, rather than being required to publish a detailed assessment framework, which instead should be published by 2016. The development of the full framework should be outlined in the school development plan with appropriate milestones that allow the school sufficient time to develop an effective model.

All these recommendations are perfectly reasonable in themselves, but it is worth reflecting for a while on the likely cost and workload implications, particularly for smaller primary schools:

Each school must have a ‘trained assessment lead’ who may or may not be the same as the ‘senior leader who is responsible for assessment’ mentioned in the Design Checklist. There is no list of responsibilities for that person, but it would presumably include:

  • Leading the review of assessment practice and the development of a new assessment system;
  • Leading the definition of the school’s assessment principles and practices and communicating these to governors, parents, stakeholders and the wider community;
  • Taking lead responsibility for the coverage of assessment within the school’s development plan and the regular review of that coverage;
  • Leading the identification and monitoring of the school’s learning and development needs for assessment;
  • Ensuring that all staff receive appropriate professional development – including ‘rigorous training in formative, diagnostic and summative assessment’;
  • Leading the provision of in-school and school-to-school professional development relating to assessment;
  • Allocating time and resources for all assessment-related professional development and monitoring its impact;
  • Leading collaborative work with other schools to ensure a consistent approach to assessment;
  • Disseminating effective practice;
  • Working with other local assessment leads and external assessment experts on moderation activities.

And, on top of this, there is a range of unspecified additional responsibilities associated with the statutory tests.

It is highly unlikely that this range of responsibilities could be undertaken effectively by a single person in less than half a day a week, and that is a bare minimum. There will also be periods of more intense pressure when a substantially larger time allocation is essential.

The corresponding salary cost for a ‘senior leader’ might be £3,000-£4,000 per year, not to mention the cost of undertaking the other responsibilities displaced.

There will also need to be a sizeable school budget and time allocation for staff to undertake reviews, professional development and moderation activities.

Moderation itself will carry a significant cost. Internal moderation may have the bigger opportunity cost, but external moderation will be more expensive in direct terms.

Explanatory note (E), attached to the Design Checklist, says:

‘The exact form of moderation will vary from school to school and from subject to subject. The majority of moderation (in schools large enough to support it) will be internal but all schools should undertake a proportion of external moderation each year, working with partner schools and local agencies.’

Hence the cost of external moderation will fall disproportionately on smaller schools with smaller budgets.

It would be wrong to suggest that this workload is completely new. To some extent these various responsibilities will be undertaken already, but the Commission’s recommendations are effectively a ratcheting up of the demand on schools.

Rather than insisting on these responsibilities being allocated to a single individual with other senior management responsibilities, it might be preferable to set out the responsibilities in more detail and give schools greater flexibility over how they should be distributed between staff.

Some of these tasks might require senior management input, but others could be handled by other staff, including paraprofessionals.

.

National support

Recommendation 7 (part): Furthermore, excellent practice in assessment should be identified and publicised, with the Department for Education responsible for ensuring that this is undertaken.

Recommendation 8 (part): Schools should be prepared to submit their assessment to external moderators, who should have the right to provide a written report to the head teacher and governors setting out a judgement on the quality and reliability of assessment in the school, on which the school should act. The Commission is of the view that at least some external moderation should be undertaken by moderators with no vested interest in the outcomes of the school’s assessment. This will avoid any conflicts of interest and provide objective scrutiny and broader alignment of standards across schools.

Recommendation 9: Schools should identify a trained assessment lead, who will work with other local leads and nationally accredited assessment experts on moderation activities.

Recommendation 11: The Ofsted school inspection framework should explore whether schools have effective assessment systems in place and consider how effectively schools are using pupil assessment information and data to improve learning in the classroom and at key points of transition between key stages and schools.

Recommendation 14: Further work should be undertaken to improve training for assessment within initial teacher training (ITT), the newly qualified teacher (NQT) induction year and on-going professional development. This will help to build assessment capacity and support a process of continual strengthening of practice within the school system.

Recommendation 15: The Universities’ Council for the Education of Teachers (UCET) should build provision in initial teacher training for delivery of the essential assessment knowledge.

Recommendation 16: All those responsible for children’s learning should undertake rigorous training in formative, diagnostic and summative assessment, which covers how assessment can be used to support teaching and learning for all pupils, including those with special educational needs. The government should provide support and resources for accredited training for school assessment leads and schools should make assessment training a priority.

Recommendation 17: A number of pilot studies should be undertaken to look at the use of information technology (IT) to support and broaden understanding and application of assessment practice.

Recommendation 19: To assist schools in developing a robust framework and language for assessment, we call upon the NAHT to take the lead in expanding the principles and design checklist contained in this report into a full model assessment policy and procedures, backed by appropriate professional development.

There are also several additional proposals in the commentary that do not make it into the formal recommendations:

  • Schools should be held accountable for the quality of their assessment practice as well as their assessment results, with headteachers also appraising teachers on their use of assessment. (The first part of this formulation appears in Recommendation 11 but not the second.) (p17);
  • It could be useful for the teaching standards to reflect further assessment knowledge, skills and understanding (p17);
  • A national standard in assessment practice for teachers would be a useful addition (p18);
  • The Commission also favoured the approach of having a lead assessor to work with each school or possibly a group of schools, helping to embed good practice across the profession (p18).

We need to take stock of the sheer scale of the infrastructure that is being proposed and its likely cost.

In respect of moderation alone, the Report is calling for sufficient external moderators, ‘nationally accredited assessment experts’ and possibly lead assessors to service some 17,000 primary schools.

Even if we assume that these roles are combined in the same person and that each person can service, say, 25 schools, that still demands something approaching a cadre of 700 people who also need to be supported, managed and trained.

If they are serving teachers there is an obvious opportunity cost. Providing a service of this scale would cost tens of millions of pounds a year.

Turning to training and professional development, the Commission is proposing:

  • Accredited training for some 17,000 school assessment leads (with an ongoing requirement to train new appointees and refresh the training of those who undertook it too far in the past);
  • ‘Rigorous training in formative, diagnostic and summative assessment, which covers how assessment can be used to support teaching and learning for all pupils, including those with special educational needs’ for everyone deemed responsible for children’s learning, so not just teachers. This will include hundreds of thousands of people in the primary sector alone.
  • Revitalised coverage of assessment in ITE and induction, on top of the requisite professional development package.

The Report says nothing of the cost of developing, providing and managing this huge training programme, which would cost some more tens of millions of pounds a year.

I am plucking a figure out of the air, but it would be reasonable to suggest that moderation and training costs combined might require an annual budget of some £50 million – and quite possibly double that. 
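
For transparency, here is the back-of-envelope arithmetic behind that guess, set out as a short sketch. Only the 17,000 schools and the 25-schools-per-moderator ratio come from the paragraphs above; every unit cost is an assumption of mine, since the Report offers none, so treat the output as an order-of-magnitude illustration rather than a costing.

```python
# Back-of-envelope costing of the moderation and training proposals.
# All unit costs are illustrative assumptions, not figures from the Report.

PRIMARY_SCHOOLS = 17_000
SCHOOLS_PER_MODERATOR = 25
moderators = PRIMARY_SCHOOLS // SCHOOLS_PER_MODERATOR      # = 680, the 'approaching 700' above

MODERATOR_ANNUAL_COST = 55_000      # assumed: salary, on-costs, management and training
moderation = moderators * MODERATOR_ANNUAL_COST            # ~GBP 37m

LEAD_TRAINING_COST = 1_200          # assumed: accredited training per assessment lead,
                                    # treated crudely as an annual figure (initial plus refresher)
WIDER_WORKFORCE = 250_000           # assumed: staff 'responsible for children's learning'
PER_HEAD_CPD = 60                   # assumed: marginal annual cost of assessment CPD
training = PRIMARY_SCHOOLS * LEAD_TRAINING_COST + WIDER_WORKFORCE * PER_HEAD_CPD   # ~GBP 35m

total = moderation + training
print(f"Moderation ~GBP {moderation/1e6:.0f}m; training and CPD ~GBP {training/1e6:.0f}m; "
      f"total ~GBP {total/1e6:.0f}m a year")
```

On these assumptions the total lands at roughly £70 million a year, which is why I suggest the true figure could comfortably exceed £50 million.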

Unless one argues that the testing regime should be replaced by a national sampling process – and while the Report says some of the Commission’s members supported that, it stops short of recommending it – there are no obvious offsetting savings.

It is disappointing that the Commission made no effort at all to quantify the cost of its proposals.

These recommendations provide an excellent marketing opportunity for some of the bodies represented on the Commission.

For example, the CIEA press release welcoming the Report says:

‘One of the challenges, and one that schools will need to meet, is in working together, and with local and national assessment experts, to moderate their judgements and ensure they are working to common standards across the country. The CIEA has an important role to play in training these experts.’

Responsibility for undertaking pilot studies on the role of IT in assessment is not allocated, but one assumes it would be overseen by central government and also funded by the taxpayer.

Any rollout from the pilots would have additional costs attached and would more than likely create additional demand for professional development.

The reference to DfE taking responsibility for sharing excellent practice is already a commitment in the consultation document:

‘…we will provide examples of good practice which schools may wish to follow. We will work with professional associations, subject experts, education publishers and external test developers to signpost schools to a range of potential approaches.’ (paragraph 3.8).

Revision of the School Inspection Framework will require schools to give due priority to the quality of their assessment practice, though Ofsted might reasonably argue that it is already there.

Paragraph 116 of the School Inspection Handbook says:

‘Evidence gathered by inspectors during the course of the inspection should include… the quality and rigour of assessment, particularly in nursery, reception and Key Stage 1.’

We do not yet know whether NAHT will respond positively to the recommendation that it should go beyond the model assessment criteria it has already commissioned by leading work to expand the Principles and Design Checklist into ‘a full model assessment policy and procedures backed by appropriate professional development’.

There was no reference to such plans in the press release accompanying the Report.

Maybe the decision could not be ratified in time by the Association’s decision-making machinery – but this did not prevent the immediate commissioning of the model criteria.

.

Phased Implementation

Recommendation 10: Ofsted should articulate clearly how inspectors will take account of assessment practice in making judgements and ensure both guidance and training for inspectors is consistent with this.

Recommendation 12: The Department for Education should make a clear and unambiguous statement on the teacher assessment data that schools will be required to report to parents and submit to the Department for Education. Local authorities and other employers should provide similar clarity about requirements in their area of accountability.

Recommendation 13: The education system is entering a period of significant change in curriculum and assessment, where schools will be creating, testing and revising their policies and procedures. The government should make clear how they will take this into consideration when reviewing the way they hold schools accountable as new national assessment arrangements are introduced during 2014/15. Conclusions about trends in performance may not be robust.

Recommendation 18: The use by schools of suitably modified National Curriculum levels as an interim measure in 2014 should be supported by the government. However, schools need to be clear that any use of levels in relation to the new curriculum can only be a temporary arrangement to enable them to develop, implement and embed a robust new framework for assessment. Schools need to be conscious that the new curriculum is not in alignment with the old National Curriculum levels.

Recommendation 20: Schools should be asked to publish their principles of assessment from September 2014, rather than being required to publish a detailed assessment framework, which instead should be published by 2016. The development of the full framework should be outlined in the school development plan with appropriate milestones that allow the school sufficient time to develop an effective model.

Recommendation 21: A system wide review of assessment should be undertaken. This would help to repair the disjointed nature of assessment through all ages, 2-19.

The Commission quite rightly identifies a number of issues caused by the implementation timetable, combined with continuing uncertainty over aspects of the Government’s plans.

At the time of writing, the response to the consultation document has still not been published (though it was due in autumn 2013) yet schools will be implementing the new National Curriculum from this September.

The Report says:

‘There was strong concern expressed about the requirement for schools to publish their detailed curriculum and assessment framework in September 2014.’

This is repeated in Recommendation 20, together with the suggestion that this timeline should be amended so that only a school’s principles for assessment need be published by this September.

I have been trying to pin down the source of this requirement.

Schedule 4 of The School Information (England) (Amendment) Regulations 2012 does not require the publication of a detailed assessment framework, referring only to

‘The following information about the school curriculum—

(a)  in relation to each academic year, the content of the curriculum followed by the school for each subject and details as to how additional information relating to the curriculum may be obtained;

(b)  in relation to key stage 1, the names of any phonics or reading schemes in operation; and

(c)  in relation to key stage 4—

(i) a list of the courses provided which lead to a GCSE qualification,

(ii) a list of other courses offered at key stage 4 and the qualifications that may be acquired.’

I could find no Government guidance stating unequivocally that this requires schools to carve up all the National Curriculum programmes of study into year-by-year chunks.  (Though there is no additional burden attached to publication if they have already undertaken this task for planning purposes.)

There are references to the publication of Key Stage 2 results (which will presumably need updating to reflect the removal of levels), but nothing on the assessment framework.

Moreover, the DfE mandatory timeline says that from the Spring Term of 2014:

‘All schools must publish their school curriculum by subject and academic year, including their provision of personal, social, health and economic education (PSHE).’

(The hyperlink returns one to the Regulations quoted above.)

There is no requirement for publication of further information in September.

I wonder therefore if this is a misunderstanding. I stand to be corrected if readers can point me to the source.

It may arise from the primary assessment and accountability consultation document, which discusses publication of curricular details and then proceeds immediately to discuss the relationship between curriculum and assessment:

‘Schools are required to publish this curriculum on their website…In turn schools will be free to design their approaches to assessment, to support pupil attainment and progression. The assessment framework must be built into the school curriculum, so that schools can check what pupils have learned and whether they are on track to meet expectations at the end of the key stage, and so that they can report regularly to parents.’ (paras 3.4-3.5)

But this conflation isn’t supported by the evidence above and, anyway, these are merely proposals.

That said, it must be assumed that the Commission consulted its DfE observer on this point before basing recommendations on this interpretation.

If the observer’s response was consistent with the Commission’s interpretation, then it is apparently inconsistent with all the material so far published by the Department!

It may be necessary for NAHT to obtain clarification of this point given the evidence cited above.

That aside, there are issues associated with the transition from the current system to the future system.

The DfE’s January 2014 ‘myths and facts’ publication says:

‘As part of our reforms to the national curriculum, the current system of “levels” used to report children’s attainment and progress will be removed from September 2014. Levels are not being banned, but will not be updated to reflect the new national curriculum and will not be used to report the results of national curriculum tests. Key Stage 1 and Key Stage KS2 [sic] tests taken in the 2014 to 2015 academic year will be against the previous national curriculum, and will continue to use levels for reporting purposes

Schools will be expected to have in place approaches to formative assessment that support pupil attainment and progression. The assessment framework should be built into the school curriculum, so that schools can check what pupils have learned and whether they are on track to meet expectations at the end of the key stage, and so that they can report regularly to parents. Schools will have the flexibility to use approaches that work for their pupils and circumstances, without being constrained by a single national approach.’

The reference here to having approaches in place – rather than the publication of a ‘detailed curriculum and assessment framework’ – would not seem wildly inconsistent with the Commission’s idea that schools should establish their principles by September 2014, and develop their detailed assessment frameworks iteratively over the two succeeding years. However, the Government needs to clarify the position.

Since Key Stage 2 tests will not dispense with levels until May 2016 (so levels will still feature in the December 2015 Performance Tables), there will be an extended interregnum in which National Curriculum Levels will continue to have official currency.

Moreover, levels may still be used in schools – they are not being banned – though they will not be aligned to the new National Curriculum.

The Report says:

‘…it is important to recognise that, even if schools decide to continue with some form of levels, the new National Curriculum does not align to the existing levels and level descriptors and this alignment is a piece of work that needs to be undertaken now.’ (p19).

However, the undertaking of this work does not feature in the Recommendations, unless it is implicit in the production by NAHT of ‘a full model assessment policy and procedures’, which seems unlikely.

One suspects that the Government would be unwilling to endorse such a process, even as a temporary arrangement, since what is to stop schools from continuing to use this new improved levels structure more permanently?

The Commission would appear to be on stronger ground in asking Ofsted to make allowances during the interregnum (which is what I think Recommendation 10 is about) especially given that, as Recommendation 13 points out, evidence of ‘trends in performance may not be robust’.

The point about clarity over teacher assessment is well made – and one hopes it will form part of the response to the primary assessment and accountability consultation document when that is eventually published.

The Report itself could have made progress in this direction by establishing and maintaining a clearer distinction between statutory and internal teacher assessment.

The consultation document itself made clear that KS2 writing would continue to be assessed via teacher assessment rather than a test, and, moreover:

‘At the end of each key stage schools are required to report teacher assessment judgements in all national curriculum subjects to parents. Teachers will judge whether each pupil has met the expectations set out in the new national curriculum. We propose to continue publishing this teacher assessment in English, mathematics and science, as Lord Bew recommended.’ (para 3.9)

But what it does not say is what requirements will be imposed to ensure consistency across this data. Aside from KS2 writing, will these teacher assessments also be subject to the new scaled scores, and potentially deciles too?

Until schools have answers to that question, they cannot consider the overall shape of their assessment processes.

The final recommendation, for a system-wide review of assessment from 2-19, is whistling in the wind, especially given the level of disruption already caused by the decision to remove levels.

Neither this Government nor the next is likely to act upon it.

 

Conclusion

The Commission’s Report moves us forward in broadly the right direction.

The Principles, Design Checklist and wider recommendations help to fill some of the void created by the decision to remove National Curriculum levels, the limited nature of the primary assessment and accountability consultation document and the inordinate delay in the Government’s response to that consultation.

We are in a significantly better place as a consequence of this work being undertaken.

But there are some worrying inconsistencies in the Report as well as some significant shortcomings to the proposals it contains. There are also several unanswered questions.

Not to be outdone, I have bound these up into a series of recommendations directed at NAHT and its Commission. There are 23 in all and I have given mine letters rather than numerals, to distinguish them from the Commission’s own recommendations.

  • Recommendation A: The Commission should publish all the written evidence it received.
  • Recommendation B: The Commission should consult on key provisions within the Report, seeking explicit commitment to the Principles from DfE, Ofqual and Ofsted.
  •  Recommendation C: The Commission should ensure that its Design Checklist is fully consistent with the Principles in all respects. It should also revisit the internal logic of the Design Checklist.
  • Recommendation D: So far as possible, ahead of the primary assessment and accountability consultation response, the Commission should distinguish clearly how its proposals relate to statutory teacher assessment, alongside schools’ internal assessment processes.
  • Recommendation E: NAHT should confirm who it has commissioned to produce model assessment criteria and to what timetable. It should also explain how these criteria will be ‘nationally standardised’.
  • Recommendation F: The Commission should clarify whether the trained assessment lead mentioned in Recommendation 9 is the same or different to the ‘senior leader who is responsible for assessment’ mentioned in the Design Checklist.
  • Recommendation G: The Commission should set out more fully the responsibilities allocated to this role or roles and clarify that schools have flexibility over how they distribute those responsibilities between staff.
  • Recommendation H:  NAHT should clarify how the model criteria under development apply – if at all – to the wider school curriculum in all schools and to academies not following the National Curriculum.
  • Recommendation I: NAHT should clarify how the model criteria under development will allow for the fact that in all subjects all schools enjoy flexibility over the positioning of content in different years within the same key stage – and can also anticipate parts of the subsequent key stage.
  • Recommendation J: NAHT should clarify whether the intention is that the model criteria should reflect the allocation of content to specific terms as well as to specific years.
  • Recommendation K: The Commission should explain how its approach to internal assessment will help predict future performance in end of Key Stage tests.
  • Recommendation L: The Commission should shift from its narrow and ‘mildly accelerative’ view of high attainment to accommodate a richer concept that combines enrichment (breadth), extension (depth) and acceleration (faster pace) according to learners’ individual needs.
  • Recommendation M: The Commission should incorporate a fourth ‘far exceeded’ assessment judgement, since the ‘exceeded’ judgement covers too wide a span of attainment.
  • Recommendation N: NAHT should clarify whether its model criteria will extend into KS3, to accommodate assessment against the criteria for at least year 7, and ideally beyond.
  • Recommendation O: The Commission should clarify whether anticipating criteria for a subsequent year is a cause or a consequence of being judged to be ‘exceeding’ expectations in the learner’s own chronological year.
  • Recommendation P: The Commission should confirm that numerical summaries of assessment criteria – as well as any associated ranking positions – should be made available to parents who request them.
  • Recommendation Q: The Commission should explain why schools should be forbidden from ranking learners against each other (or allocating them to deciles).
  • Recommendation R: The Commission should assess the financial impact of its proposals on schools of different sizes.
  • Recommendation S: The Commission should cost its proposals for training and moderation, identifying the burden on the taxpayer and any offsetting savings.
  • Recommendation T: NAHT should clarify its response to Recommendation 19, that it should lead the development of a full model assessment policy and procedures.
  • Recommendation U: The Commission should clarify with DfE its understanding that schools are required to publish a detailed curriculum and assessment framework by September 2014.
  • Recommendation V: The Commission should clarify with DfE the expectation that schools should have in place ‘approaches to formative assessment’ and whether the proposed assessment principles satisfy this requirement.
  • Recommendation W: The Commission should clarify whether it is proposing that work is undertaken to align National Curriculum levels with the new National Curriculum and, if so, who it proposes should undertake this.

So – good overall – subject to these 23 reservations!

Some are more significant than others. Given my area of specialism, I feel particularly strongly about those that relate directly to high attainers, especially L and M above.

Those are the two I would nail to the door of 1 Heath Square.

.

GP

March 2014

What Becomes of Schools That Fail Their High Attainers?*

.

This post reviews the performance and subsequent history of schools with particularly poor results for high attainers in the Secondary School Performance Tables over the last three years.


Seahorse in Perth Aquarium by Gifted Phoenix

It establishes a high attainer ‘floor target’ so as to draw a manageable sample of poor performers and, having done so:

  • Analyses the characteristics of this sample;
  • Explores whether these schools typically record poor performance in subsequent years or manage to rectify matters;
  • Examines the impact of various interventions, including falling below the official floor targets, being placed in special measures or deemed to have serious weaknesses following inspection, becoming an academy and receiving a pre-warning and/or warning notice;
  • Considers whether the most recent Ofsted reports on these schools do full justice to this issue, including those undertaken after September 2013 when new emphasis was placed on the performance of the ‘most able’.

The post builds on my previous analysis of high attainment in the 2013 School Performance Tables (January 2014). It applies the broad definition of high attainers used in the Tables, which I discussed in that post and have not repeated here.

I must emphasise at the outset that factors other than poor performance may partially explain particularly low scores in the Tables.

There may be several extenuating circumstances that are not reflected in the results. Sometimes these may surface in Ofsted inspection reports, but the accountability and school improvement regime typically imposes a degree of rough justice, and I have followed its lead.

It is also worth noting that the Performance Tables do not provide data for schools where the number of high attainers is five or fewer, because of the risk that individuals may be identifiable even though the data is anonymised.

This is unfortunate since the chances are that schools with very few high attainers will find it more difficult than others to address their needs. We may never know, but there is more on the impact of cohort size below.

Finally please accept my customary apology for any transcription errors. Do let me know if you notice any and I will correct them.

.

Drawing the Sample

The obvious solution would be to apply the existing floor targets to high attainers.

So it would include all schools recording:

  • Fewer than 35% (2011) or 40% (2012 and 2013) of high attainers achieving five or more GCSEs at grades A*-C or equivalent including GCSEs in English and mathematics and
  • Below median scores for the percentage of high attainers making at least the expected three levels of progress between Key Stages 2 and 4 in English and maths respectively.

But the first element is far too undemanding a threshold to apply to high attaining learners, and the overall target generates a tiny sample.

The only school failing to achieve it in 2013 was Ark Kings Academy in Birmingham, which recorded just six high attainers, forming 9% of the cohort (so only just above the level at which results would have been suppressed).

In 2012 two schools were in the same boat:

  • The Rushden Community College in Northamptonshire, with 35 high attainers (26% of the cohort), which became a sponsored academy with the same name on 1 December 2012; and
  • Culverhay School in Bath and North East Somerset, with 10 high attainers (19% of the cohort), which became Bath Community Academy on 1 September 2012.

No schools at all performed at this level in 2011.

A sample of just three schools is rather too unrepresentative, so it is necessary to set a more demanding benchmark which combines the same threshold and progress elements.

The problem is not with the progress measure. Far too many schools fail to meet the median level of performance – which stands at around 70% each year in both English and maths – even with their cadres of high attainers. Hence I need to pitch this element below the median to create a manageable sample.

I plumped for 60% or fewer high attainers making at least the expected progress between KS2 and KS4 in both English and maths. This captured 22 state-funded schools in 2013, 31 in 2012 and 38 in 2011. (It also enabled Ark Kings Academy to escape, by virtue of the fact that 67% of its high attainers achieved the requisite progress in English.)

For the threshold element I opted for 70% or fewer high attainers achieving five or more GCSEs at grades A*-C or equivalent including GCSEs in English and maths. This captured 19 state-funded schools in 2013, 29 in 2012 and 13 in 2011.
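
For anyone minded to replicate the exercise from the published Performance Tables data, the combined filter reduces to two conditions per school. The sketch below shows the idea; the column names are hypothetical placeholders of mine rather than the headings in the actual download, and the toy rows simply echo the shape of Table 2 further down.

```python
import pandas as pd

# Illustrative high attainer floor-target filter, as defined above:
#   - 60% or fewer making expected progress in BOTH English and maths, AND
#   - 70% or fewer achieving 5+ A*-C (or equivalent) including English and maths.
# Column names are hypothetical; map them onto the real Performance Tables headings.

def below_illustrative_floor(df: pd.DataFrame) -> pd.DataFrame:
    progress = (df["ha_exp_progress_english_pct"] <= 60) & (
        df["ha_exp_progress_maths_pct"] <= 60
    )
    threshold = df["ha_5plus_ac_incl_em_pct"] <= 70
    return df[progress & threshold]

# Toy data, just to show the call:
schools = pd.DataFrame([
    {"school": "Gloucester Academy", "ha_5plus_ac_incl_em_pct": 44,
     "ha_exp_progress_english_pct": 28, "ha_exp_progress_maths_pct": 50},
    {"school": "A school above the floor", "ha_5plus_ac_incl_em_pct": 95,
     "ha_exp_progress_english_pct": 88, "ha_exp_progress_maths_pct": 90},
])
print(below_illustrative_floor(schools)["school"].tolist())   # ['Gloucester Academy']
```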

.

[Venn diagram: schools caught by the progress element, the threshold element, and by both]

The numbers of state-funded schools that met both criteria were seven in 2013, eight in 2012 and five in 2011, so 20 in all.

I decided to feature this small group of schools in the present post while also keeping in mind the schools occupying each side of the Venn diagram. I particularly wanted to see whether schools which emerged from the central sample in subsequent years continued to fall short on one or other of the constituent elements.

The 20 schools in the main sample are listed in Table 1 below, which provides more detail about each of them.

.

Table 1: Schools Falling Below Illustrative High Attainer Floor Targets 2011-2013

Name | Type | LA | Status/Sponsor | Subsequent History
2011
Carter Community School | 12-16 mixed modern | Poole | Community | Sponsored academy (ULT) 1/4/13
Hadden Park High School | 11-16 mixed comp | Nottingham | Foundation | Sponsored Academy (Bluecoat School) 1/1/14
Merchants Academy | 11-18 mixed comp | Bristol | Sponsored Academy (Merchant Venturers/University of Bristol) | –
The Robert Napier School | 11-18 mixed modern | Medway | Foundation | Sponsored Academy (Fort Pitt Grammar School) 1/9/12
Bishop of Rochester Academy | 11-18 mixed comp | Kent | Sponsored Academy (Medway Council/Canterbury Christ Church University/Diocese of Rochester) | –
2012
The Rushden Community College | 11-18 mixed comp | Northants | Community | Sponsored Academy (The Education Fellowship) 12/12
Culverhay School | 11-18 boys comp | Bath and NE Somerset | Community | Bath Community Academy – mixed (Cabot Learning) 1/9/12
Raincliffe School | 11-16 mixed comp | N Yorks | Community | Closed 8/12 (merged with Graham School)
The Coseley School | 11-16 mixed comp | Dudley | Foundation | –
Fleetwood High School | 11-18 mixed comp | Lancs | Foundation | –
John Spendluffe Foundation Technology College | 11-16 mixed modern | Lincs | Academy converter | –
Parklands High School | 11-18 mixed | Liverpool | Foundation | Discussing academy sponsorship (Bright Tribe)
Frank F Harrison Engineering College | 11-18 mixed comp | Walsall | Foundation | Mirus Academy (sponsored by Walsall College) 1/1/12
2013
Gloucester Academy | 11-19 mixed comp | Glos | Sponsored Academy (Prospect Education/Gloucestershire College) | –
Christ the King Catholic and Church of England VA School | 11-16 mixed comp | Knowsley | VA | Closed 31/8/13
Aireville School | 11-16 mixed modern | N Yorks | Community | –
Manchester Creative and Media Academy for Boys | 11-19 boys comp | Manchester | Sponsored Academy (Manchester College/Manchester Council/Microsoft) | –
Fearns Community Sports College | 11-16 mixed comp | Lancs | Community | –
Unity College Blackpool | 5-16 mixed comp | Blackpool | Community | Unity Academy Blackpool (sponsored by Fylde Coast Academies)
The Mirus Academy | 3-19 mixed comp | Walsall | Sponsored Academy (Walsall College) | –

 .

Only one school appears twice over the three-year period albeit in two separate guises – Frank F Harrison/Mirus.

Of the 20 in the sample, seven were recorded in the relevant year’s Performance Tables as community schools, six as foundation schools, one was VA, one was an academy converter and the five remaining were sponsored academies.

Of the 14 that were not originally academies, seven have since become sponsored academies and one is discussing the prospect. Two more have closed, so just five – 25% of the sample – remain outside the academies sector.

All but two of the schools are mixed (the other two are boys’ schools). Four are modern schools and the remainder comprehensive.

Geographically they are concentrated in the Midlands and the North, with a few in the South-West and the extreme South-East. There are no representatives from London, the East or the North-East.

.

Performance of the Core Sample

Table 2 below looks at key Performance Table results for these schools. I have retained the separation by year and the order in which the schools appear, which reflects their performance on the GCSE threshold measure, with the poorest performing at the top of each section.

.

Table 2: Performance of schools falling below proposed high attainer floor targets 2011-2013

Name | No of HA | % HA | 5+ A*-C incl E+M (%) | 3+ LoP En (%) | 3+ LoP Ma (%) | APS (GCSE)
2011
Carter Community School | 9 | 13 | 56 | 56 | 44 | 304.9
Hadden Park High School | 15 | 13 | 60 | 40 | 20 | 144.3
Merchants Academy | 19 | 19 | 68 | 58 | 42 | 251.6
The Robert Napier School | 28 | 12 | 68 | 39 | 46 | 292.8
Bishop of Rochester Academy | 10 | 5 | 70 | 50 | 60 | 298.8
2012
The Rushden Community College | 35 | 26 | 3 | 0 | 54 | 326.5
Culverhay School | 10 | 19 | 30 | 40 | 20 | 199.3
Raincliffe School | 6 | 11 | 50 | 50 | 33 | 211.5
The Coseley School | 35 | 20 | 60 | 51 | 60 | 262.7
Fleetwood High School | 34 | 22 | 62 | 38 | 24 | 272.9
John Spendluffe Foundation Technology College | 14 | 12 | 64 | 50 | 43 | 283.6
Parklands High School | 13 | 18 | 69 | 23 | 8 | 143.7
Frank F Harrison Engineering College | 20 | 12 | 70 | 35 | 60 | 188.3
2013
Gloucester Academy | 18 | 13 | 44 | 28 | 50 | 226.8
Christ the King Catholic and Church of England VA School | 22 | 22 | 55 | 32 | 41 | 256.5
Aireville School | 23 | 23 | 61 | 35 | 57 | 267.9
Manchester Creative and Media Academy for Boys | 16 | 19 | 63 | 50 | 50 | 244.9
Fearns Community Sports College | 22 | 13 | 64 | 36 | 59 | 306.0
Unity College Blackpool | 21 | 18 | 67 | 57 | 52 | 277.1
The Mirus Academy | 23 | 13 | 70 | 57 | 52 | 201.4

.

The size of the high attainer population in these schools varies between 6 (the minimum for which statistics are published) and 35, with an average of just under 20.

The percentage of high attainers within each school’s cohort ranges from 5% to 26% with an average of slightly over 16%.

This compares with a national average in 2013 for all state-funded schools of 32.4%, almost twice the size of the average cohort in this sample. All 20 schools here record a high attainer population significantly below this national average.

This correlation may be significant – tending to support the case that high attainers are more likely to struggle in schools where they are less strongly concentrated – but it does not prove the relationship.

Achievement against the GCSE threshold measure falls as low as 3% (Rushden in 2012) but this was reportedly attributable to the school selecting ineligible English specifications.

Otherwise the poorest result is 30% at Culverhay, also in 2012, followed by Gloucester Academy (44% in 2013) and Raincliffe (50% in 2012). Only these four schools have recorded performance at or below 50%.

Indeed there is a very wide span of performance even amongst these small samples, especially in 2012 when it reaches an amazing 67 percentage points (40 percentage points excluding Rushden). In 2013 there was a span of 26 percentage points and in 2011 a span of 14 percentage points.

The overall average amongst the 20 schools is almost 58%. This varies by year. In 2011 it was 64%, in 2012 it was significantly lower at 51% (but rose to 58% if Rushden is excluded) and in 2013 it was 61%.

This compares with a national average for high attainers in state-funded schools of 94.7% in 2013. The extent to which some of these outlier schools are undershooting the national average is truly eye-watering.

Turning to the progress measures, one might expect even greater variance, given that so many more schools fail to clear this element of the official floor targets with their high attainers.

The overall average across these 20 schools is 41% in English and 44% in maths, suggesting that performance is slightly stronger in maths than English.

But in 2011 the averages were 49% in English and 42% in maths, reversing this general pattern and producing a much wider gap in favour of English.

In 2012 they were 36% in English and 38% in maths, but the English average improves to 41% if Rushden’s result is excluded. This again bucks the overall trend.

The overall pattern is driven by the 2013 figures, when the average for maths stood at 53% compared with 42% for English.

Hence, over the three years, we can see that the sharp drop in English in 2012 – most probably attributable to the notorious marking issue – was barely recovered in 2013. Conversely, a drop in maths in 2012 was followed by a sharp recovery in 2013.

The small sample size calls into question the significance of these patterns, but they are interesting nevertheless.

The comparable national averages among all state-funded schools in 2013 were 86.2% in English and 87.8% in maths. So the schools in this sample are typically operating at around half the national average levels. This is indeed worse than the comparable record on the threshold measure.
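The same unweighted averaging, applied to the progress columns of Table 2, reproduces the figures quoted above (give or take a percentage point of rounding) and illustrates the 'around half' comparison. This is only a sketch using the transcribed data, not the official methodology.

```python
# 3+ levels of progress (%) for high attainers, transcribed from Table 2 above
english = {
    2011: [56, 40, 58, 39, 50],
    2012: [0, 40, 50, 51, 38, 50, 23, 35],   # Rushden's 0% is the first value
    2013: [28, 32, 35, 50, 36, 57, 57],
}
maths = {
    2011: [44, 20, 42, 46, 60],
    2012: [54, 20, 33, 60, 24, 43, 8, 60],
    2013: [50, 41, 57, 50, 59, 52, 52],
}

def overall_mean(by_year):
    values = [v for year in by_year.values() for v in year]
    return sum(values) / len(values)

print("English overall:", round(overall_mean(english)))   # 41
print("Maths overall:", round(overall_mean(maths)))       # 44

# Compare with the 2013 national averages for high attainers
print(round(overall_mean(english) / 86.2, 2))              # ~0.48
print(round(overall_mean(maths) / 87.8, 2))                # ~0.5
```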

That said, the variation in these results is again huge – 35 percentage points in English (excluding Rushden) and as much as 52 percentage points in maths.

There is no obvious pattern in these schools’ comparative performance in English and maths. Ten schools scored more highly in English and nine in maths, with one school recording the same figure in both. English was in the ascendancy in 2011 and 2012, but maths supplanted it in 2013.

The final column in Table 2 shows the average point score (APS) for high attainers’ best eight GCSE results. There is once more a very big range, from 144.3 to 326.5 – over 180 points – compared with a 2013 national average for high attainers in state-funded schools of 377.6.

The schools at the bottom of the distribution are almost certainly relying heavily on GCSE-equivalent qualifications, rather than pushing their high attainers towards GCSEs.

Those schools that record relatively high APS alongside relatively low progress scores are most probably taking their high attaining learners with L5 at KS2 to GCSE grade C, but no further.
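A rough worked example shows why that reading is plausible. Assuming the GCSE points scale used in the Performance Tables of this period – where a grade C was worth 40 points and each higher grade added six, up to 58 for an A* – eight grade Cs equate to a capped best-eight score of 320, close to the upper end of the range in Table 2, whereas the national average of 377.6 implies a typical grade a little above a B.

```python
# Assumed points scale from the contemporaneous Performance Tables (not taken from this post):
# G=16, F=22, E=28, D=34, C=40, B=46, A=52, A*=58; best eight qualifications counted
POINTS = {"G": 16, "F": 22, "E": 28, "D": 34, "C": 40, "B": 46, "A": 52, "A*": 58}

print(8 * POINTS["C"])    # 320 - eight grade Cs, near the top of the range in Table 2
print(377.6 / 8)          # 47.2 - average points per subject, a little above a B (46)
```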

.

Changes in Performance from 2011 to 2013

Table 3, below, shows how the performance of the 2011 sample changed in 2012 and 2013, while Table 4 shows how the 2012 sample performed in 2013.

The numbers in green show improvements compared with the schools’ 2011 baselines and those in bold are above my illustrative high attainer floor target. The numbers in red are those which are lower than the schools’ 2011 baselines.

.

Table 3: Performance of the 2011 Sample in 2012 and 2013

Name | % HA (11/12/13) | 5+ A*-C incl E+M (11/12/13) | 3+ LoP En (11/12/13) | 3+ LoP Ma (11/12/13)
Carter Community School | 13 14 13 | 56 100 92 | 56 80 75 | 44 80 33
Hadden Park High School | 13 15 8 | 60 87 75 | 40 80 75 | 20 53 50
Merchants Academy | 19 16 20 | 68 79 96 | 58 79 88 | 42 47 71
The Robert Napier School | 12 12 11 | 68 83 96 | 39 59 92 | 46 62 80
Bishop of Rochester Academy | 5 7 8 | 70 83 73 | 50 67 47 | 60 75 53

.

All but one of the five schools showed little variation in the relative size of their high attainer populations over the three years in question.

More importantly, all five schools made radical improvements in 2012.

Indeed, all five exceeded the 5+ GCSE threshold element of my illustrative floor target in both 2012 and 2013 though, more worryingly, three of the five fell back somewhat in 2013 compared with 2012, which might suggest that short term improvement is not being fully sustained.

Four of the five exceeded the English progress element of the illustrative floor target in 2012, while the fifth – Robert Napier – missed by only one percentage point.

Four of the five also exceeded the floor in 2013, including Robert Napier, which made a 33 percentage point improvement compared with 2012. On this occasion, Bishop of Rochester was the exception, having fallen back even below its 2011 level.

In the maths progress element, all five schools made an improvement in 2012, with three of the five exceeding the floor target; the exceptions were Hadden Park and Merchants Academy.

But by 2013, only three schools remained above their 2011 baseline and only two – Merchants and Robert Napier – remained above the floor target.

None of the five schools would have remained below my floor target in either 2012 or 2013, by virtue of their improved performance on the 5+ GCSE threshold element, but there was significantly greater insecurity in the progress elements, especially in maths.

There is also evidence of huge swings in performance on the progress measures. Hadden Park improved progression in English by 40 percentage points between 2011 and 2012. Carter Community School almost matched this in maths, improving by 36 percentage points, only to fall back by a huge 47 percentage points in the following year.

Overall this would appear to suggest that this small sample of schools made every effort to improve against the threshold and progress measures in 2012 but, while most were able to sustain improvement – or at least control their decline – on the threshold measure into 2013, this was not always possible with the progress elements.

There is more than a hint of two markedly different trajectories, with one group of schools managing to sustain initial improvements from a very low base and the other group falling back after an initial drive.

Is the same pattern emerging amongst the group of schools that fell below my high attainer floor target in 2012?

.

Table 4: Performance of the 2012 Sample in 2013

Name | % HA (12/13) | 5+ A*-C incl E+M (12/13) | 3+ LoP En (12/13) | 3+ LoP Ma (12/13)
The Rushden Community College | 26 23 | 3 90 | 0 74 | 54 87
Culverhay School | 19 12 | 30 67 | 40 67 | 20 67
Raincliffe School | 11 - | 50 - | 50 - | 33 -
The Coseley School | 20 26 | 60 88 | 51 82 | 60 78
Fleetwood High School | 22 24 | 62 84 | 38 36 | 24 67
John Spendluffe Foundation Technology College | 12 15 | 64 100 | 50 61 | 43 83
Parklands High School | 18 11 | 69 78 | 23 56 | 8 56
Frank F Harrison Engineering College | 12 13 | 70 70 | 35 57 | 60 52

.

We must rule out Raincliffe, which closed, leaving seven schools under consideration.

Some of these schools experienced slightly more fluctuation in the size of their high attainer populations – and over the shorter period of two years rather than three.

Six of the seven managed significant improvements in the 5+ GCSE threshold with the remaining school – Frank F Harrison – maintaining its 2012 performance.

Two schools – Frank F Harrison and Culverhay – did not exceed the illustrative floor on this element. Meanwhile John Spendluffe achieved a highly creditable perfect score, comfortably exceeding the national average for state-funded schools. Rushden was not too far behind.

There was greater variability with the progress measures. In English, three schools remained below the illustrative floor in 2013 with one – Fleetwood High – falling back compared with its 2012 performance.

Conversely, Coseley improved by 31 percentage points to not far below the national average for state-funded schools.

In maths two schools failed to make it over the floor. Parklands made a 48 percentage point improvement but still fell short, while Frank F Harrison fell back eight percentage points compared with its 2012 performance.

On the other hand, Rushden and John Spendluffe are closing in on national average performance for state-funded schools. Both have made improvements of over 30 percentage points.

Of the seven, only Frank F Harrison would remain below my overall illustrative floor target on the basis of its 2013 performance.

Taking the two samples together, the good news is that many struggling schools are capable of making radical improvements in their performance with high attainers.

But question marks remain over the capacity of some schools to sustain initial improvements over subsequent years.

 .

What Interventions Have Impacted on these Schools?

Table 5 below reveals how different accountability and school improvement interventions have been brought to bear on this sample of 20 schools since 2011.

.

Table 5: Interventions Impacting on Sample Schools 2011-2014

Name | Floor targets | Most recent inspection | Ofsted rating | (Pre-) warning notice | Academised
2011
Carter Community School | FT 2011, FT 2013 | 29/11/12; NYI as academy | 2 | – | Sponsored
Hadden Park High School | FT 2011, FT 2012, FT 2013 | 13/11/13; NYI as academy | SM | – | Sponsored
Merchants Academy | FT 2011, FT 2012 | 9/6/11 | 2 | – | –
The Robert Napier School | FT 2011, FT 2012 | 17/09/09; NYI as academy | 3 | – | Sponsored
Bishop of Rochester Academy | FT 2011, FT 2013 | 28/6/13 | 3 | PWN 3/1/12 | –
2012
The Rushden Community College | FT 2012 | 10/11/10; NYI as academy | 3 | – | Sponsored
Culverhay School | FT 2011, FT 2012, (FT 2013) | 11/1/12; NYI as academy | SM | – | Sponsored
Raincliffe School | FT 2012 | 19/10/10 | 3 | – | Closed
The Coseley School | FT 2012 | 13/9/12 | SM | – | –
Fleetwood High School | FT 2012, FT 2013 | 20/3/13 | SWK | – | –
John Spendluffe Foundation Technology College | FT 2012 | 3/3/10; as academy 18/9/13 | 1; 2 | – | Academy converter 9/11
Parklands High School | FT 2011, FT 2012, FT 2013 | 5/12/13 | SM | – | Discussing sponsorship
Frank F Harrison Engineering College | FT 2011, FT 2012, (FT 2013) | 5/7/11; see Mirus Academy below | 3 | – | Now Mirus Academy (see below)
2013
Gloucester Academy | FT 2011, FT 2012, FT 2013 | 4/10/12 | SWK | PWN 16/9/13; WN 16/12/13 | –
Christ the King RC and CofE VA School | FT 2011, FT 2012, FT 2013 | 18/9/12 | SM | – | Closed
Aireville School | FT 2012, FT 2013 | 15/5/13 | SM | – | –
Manchester Creative and Media Academy for Boys | FT 2011, FT 2012, FT 2013 | 13/6/13 | SWK | PWN 3/1/12 | –
Fearns Community Sports College | FT 2011, FT 2013 | 28/6/12 | 3 | – | –
Unity College Blackpool | FT 2011, FT 2012, FT 2013 | 9/11/11; NYI as academy | 3 | – | Sponsored
The Mirus Academy | FT 2013 | 7/11/13 | SM | – | –

Key: FT = below the official floor targets in the year shown; SM = special measures; SWK = serious weaknesses; NYI = not yet inspected; PWN = pre-warning notice; WN = warning notice.

 .

Floor Targets

The first and obvious point to note is that every single school in this list fell below the official floor targets in the year in which it also undershot my illustrative high attainer targets.

It is extremely reassuring that none of the schools returning particularly poor outcomes with high attainers are deemed acceptable performers in generic terms. I had feared that a few schools at least would achieve this feat.

In fact, three-quarters of these schools have fallen below the floor targets in at least two of the three years in question, while eight have done so in all three years, two having changed their status by becoming academies in the final year (which, strictly speaking, prevents them from scoring the hat-trick). One has since closed.

Some schools appear to have been spared intervention by receiving a relatively positive Ofsted inspection grade despite their floor target records. For example, Carter Community School had a ‘good’ rating sandwiched between two floor target appearances, while Merchants Academy presumably received its good rating before subsequently dropping below the floor.

John Spendluffe managed an outstanding rating two years before it dropped below the floor target and was rated good – in its new guise as an academy – a year afterwards.

The consequences of falling below the floor targets are surprisingly unclear, as indeed are the complex rules governing the wider business of intervention in underperforming schools.

DfE press notices typically say something like:

'Schools below the floor and with a history of underperformance face being taken over by a sponsor with a track record of improving weak schools.'

But of course that can only apply to schools that are not already academies.

Moreover, LA-maintained schools may appeal to Ofsted against standards and performance warning notices issued by their local authorities; and schools and LAs may also challenge forced academisation in the courts, arguing that they have sufficient capacity to drive improvement.

As far as I can establish, it is nowhere clearly explained what exactly constitutes a ‘history of underperformance’, so there is inevitably a degree of subjectivity in the application of this criterion.

Advice elsewhere suggests that a school’s inspection outcomes and ‘the local authority’s position in terms of securing improvement as a maintained school’ should also be taken into account alongside achievement against the floor targets.

We do not know what weighting is given to these different sources of evidence, nor can we rule out the possibility that other factors – tangible or intangible – are also weighed in the balance.

Some might argue that this gives politicians the necessary flexibility to decide each case on its merits, taking careful account of the unique circumstances that apply rather than imposing a standard set of cookie-cutter judgements.

Others might counter that the absence of standard criteria – applied rigorously but with flexibility to take additional special circumstances into account – leaves such decisions unnecessarily open to dispute and is likely to generate costly and time-consuming legal challenge.

.

Academy Warning Notices

When it comes to academies:

‘In cases of sustained poor academic performance at an academy, ministers may issue a pre-warning notice to the relevant trust, demanding urgent action to bring about substantial improvements, or they will receive a warning notice. If improvement does not follow after that, further action – which could ultimately lead to a change of sponsor – can be taken. In cases where there are concerns about the performance of a number of a trust’s schools, the trust has been stopped from taking on new projects.’

‘Sustained poor academic performance’ may or may not be different from a ‘history of underperformance’ and it too escapes definition.

One cannot but conclude that it would be very helpful indeed to have some authoritative guidance, so that there is much greater transparency in the processes through which these various provisions are being applied, to academies and LA-maintained schools alike.

In the absence of such guidance, it seems rather surprising that only three of the academies in this sample – Bishop of Rochester, Gloucester and Manchester Creative and Media – have received pre-warning letters to date, while only Gloucester’s has been superseded by a full-blown warning notice. None of these mention specifically the underperformance of high attainers.

  • Bishop of Rochester received its notice in January 2012, but subsequently fell below the floor targets in both 2012 and 2013 and – betweentimes – received an Ofsted inspection rating of 3 (‘requires improvement’).
  • Manchester Creative and Media also received its pre-warning notice in January 2012. It too has been below the floor targets in both 2012 and 2013 and was deemed to have serious weaknesses in a June 2013 inspection.
  • Gloucester received its pre-warning notice much more recently, in September 2013, followed by a full warning notice just three months later.

These pre-warning letters invite the relevant Trusts to set out within 15 days what action they will take to improve matters, whereas the warning notices demand a series of specific improvements with a tight deadline. (In the case of Gloucester Academy the notice was issued on 16 December 2013, imposing a deadline of 15 January 2014. We do not yet know the outcome.)

Other schools in my sample have presumably been spared a pre-warning letter because of their relatively recent acquisition of academy status, although several other 2012 openers have already received them. One anticipates that more will attract such attention in due course.

 .

Ofsted Inspection

The relevant columns of Table 5 reveal that, of the 12 schools that are now academies (taking care to count Harrison/Mirus as one rather than two), half have not yet been inspected in their new guise.

As noted above, it is strictly the case that, when schools become academies – whether sponsored or via conversion – they are formally closed and replaced by successor schools, so the old inspection reports no longer apply to the new school.

However, this does not prevent many academies from referring to such reports on their websites – and they do have a certain currency when one wishes to see whether or not a recently converted academy has been making progress.

But, if we accept the orthodox position, there are only six academies with bona fide inspection reports: Merchants, Bishop of Rochester, John Spendluffe, Gloucester, Manchester Creative and Media and Mirus.

All five of the LA-maintained schools still open have been inspected fairly recently: Coseley, Fleetwood, Parklands, Aireville and Fearns.

This gives us a sample of 11 schools with valid inspection reports:

  • Two academies are rated ‘good’ (2) – Merchants and John Spendluffe;
  • One academy – Bishop of Rochester – and one LA-maintained school – Fearns – ‘require improvement’ (3);
  • Two academies – Gloucester and Manchester – and one LA-maintained school – Fleetwood – are inadequate (4), having serious weaknesses; and
  • One academy – Mirus – and three LA-maintained schools – Parklands, Coseley and Aireville – are inadequate (4) and in Special Measures.

The School Inspection Handbook explains the distinction between these two variants of ‘inadequate’:

‘A school is judged to require significant improvement where it has serious weaknesses because one or more of the key areas is ‘inadequate’ (grade 4) and/or there are important weaknesses in the provision for pupils’ spiritual, moral, social and cultural development. However, leaders, managers and governors have been assessed as having the capacity to secure improvement

…A school requires special measures if:

  • it is failing to give its pupils an acceptable standard of education and
  • the persons responsible for leading, managing or governing are not demonstrating the capacity to secure the necessary improvement in the school.’

Schools in each of these categories are subject to more frequent monitoring reports. Those with serious weaknesses are typically re-inspected within 18 months, while, for those in special measures, the timing of re-inspection depends on the school’s rate of improvement.

It may be a surprise to some that only seven of the 11 are currently deemed inadequate given the weight of evidence stacked against them.

There is some support for the contention that Ofsted inspection ratings, floor target assessments and pre-warning notices do not always link together as seamlessly as one might imagine, although apparent inconsistencies may sometimes arise from the chronological sequence of these different judgements.

But what do these 11 reports say, if anything, about the performance of high attainers? Is there substantive evidence of a stronger focus on ‘the most able’ in those reports that have been issued since September 2013?

.

The Content of Ofsted Inspection Reports

Table 6, below, sets out what each report contains on this topic, presenting the schools in the order of their most recent inspection.

One might therefore expect the judgements to be more specific and explicit in the three reports at the foot of the table, which should reflect the new guidance introduced last September. I discussed that guidance at length in this October 2013 post.

.

Table 6: Specific references to high attainers/more able/most able in inspection reports

Name | Date | Outcome | Comments
Merchants Academy | 29/6/11 | Good (2) | In Year 9… an impressive proportion of higher-attaining students… have been entered early for the GCSE examinations in mathematics and science. Given their exceptionally low starting points on entry into the academy, this indicates that these students are making outstanding progress in their learning and their achievement is exceptional. More-able students are fast-tracked to early GCSE entry and prepared well to follow the International Baccalaureate route.
Fearns Community Sports College | 28/6/12 | Requires improvement (3) | Setting has been introduced across all year groups to ensure that students are appropriately challenged and supported, especially more-able students. This is now beginning to increase the number of students achieving higher levels earlier in Key Stage 3.
The Coseley School | 13/9/12 | Special Measures (4) | Teaching is inadequate because it does not always extend students, particularly the more able. What does the school need to do to improve further? Raise achievement, particularly for the most able, by ensuring that:

  • work consistently challenges and engages all students so that they make good progress in lessons
  • challenging targets are set as a minimum expectation
  • students do not end studies in English language and mathematics early without having the chance to achieve the best possible grade
  • GCSE results in all subjects are at least in line with national expectations.

Target setting is not challenging enough for all ability groups, particularly for the more-able students who do not make sufficient progress by the end of Key Stage 4.

Gloucester Academy | 4/10/12 | Serious Weaknesses (4) | No specific reference
Fleetwood High School | 20/3/13 | Serious Weaknesses (4) | No specific reference
Aireville School | 15/5/13 | Special Measures (4) | Teachers tend to give the same task to all students despite a wide range of ability within the class. Consequently, many students will complete their work and wait politely until the teacher has ensured the weaker students complete at least part of the task. This limits the achievement of the more-able students and undermines the confidence of the least-able. There is now a good range of subjects and qualifications that meet the diverse needs and aspirations of the students, particularly the more-able students.
Manchester Creative and Media Academy for Boys | 13/6/13 | Serious Weaknesses (4) | The most-able boys are not consistently challenged to attain at the highest levels. In some lessons they work independently and make rapid progress, whereas on other occasions their work is undemanding. What does the academy need to do to improve further? Improve the quality of teaching in Key Stages 3 and 4 so that it is at least good, leading to rapid progress and raised attainment for all groups of boys, especially in English, mathematics and science by… ensuring that tasks are engaging and challenge all students, including the most-able. The most-able boys receive insufficient challenge to enable them to excel. Too many lessons do not require them to solve problems or link their learning to real-life contexts. In some lessons teachers’ planning indicates that they intend different students to achieve different outcomes, but they provide them all with the same tasks and do not adjust the pace or nature of work for higher- or lower-attaining students. This results in a slow pace of learning and some boys becoming frustrated.
Bishop of Rochester Academy | 28/6/13 | Requires improvement (3) | No specific reference
John Spendluffe Foundation Technology College | 18/9/13 | Good (2) | Not enough lessons are outstanding in providing a strong pace, challenge and opportunities for independent learning, particularly for the most able. The 2013 results show a leap forward in attainment and progress, although the most able could still make better progress. Leadership and management are not outstanding because the achievement of pupils, though improving quickly, has not been maintained at a high level over a period of time, and a small number of more-able students are still not achieving their full potential.
The Mirus Academy | 7/11/13 | Special Measures (4) | The academy’s early entry policy for GCSE has made no discernible difference to pupils’ achievement, including that of more able pupils.
Parklands High School | 5/12/13 | Special Measures (4) | The achievement of students supported by the pupil premium generally lags behind that of their classmates. All groups, including the most able students and those who have special educational needs, achieve poorly. Students who join the school having achieved Level 5 in national Key Stage 2 tests in primary school fare less well than middle attainers, in part due to early GCSE entry. They did a little better in 2013 than in 2012.

.

There is inconsistency within both parts of the sample – the first eight reports that pre-date the new guidance and the three produced subsequently.

Three of the eleven reports make no specific reference to high attainers/most able learners, all of them undertaken before the new guidance came into effect.

In three more cases the references are confined to early entry or setting, one of those published since September 2013.

Only four of the eleven make what I judge to be substantive comments:

  • The Coseley School (special measures) – where the needs of the most able are explicitly marked out as an area requiring improvement;
  • The Manchester Creative and Media Academy for Boys (serious weaknesses) – where attention is paid to the most able throughout the report;
  • John Spendluffe Foundation Technology College (good) – which includes some commentary on the performance of the most able; and
  • Parklands High School (special measures) – which also provides little more than the essential minimum coverage.

The first two predate the new emphasis on the most able, but they are comfortably the most thorough. It is worrying that not all reports published since September are taking the needs of the most able as seriously as they might.

One might expect that, unconsciously or otherwise, inspectors are less ready to single out the performance of the most able when a school is inadequate across the board, but the small sample above does not support this hypothesis. Some of the most substantive comments relate to inadequate schools.

It therefore seems more likely that the variance is attributable to the differing capacity of inspection teams to respond to the new emphases in their inspection guidance. This would support the case made in my previous post for inspectors to receive additional guidance on how they should interpret the new requirement.

.

Conclusion

This post established an illustrative floor target to identify a small sample of 20 schools that have demonstrated particularly poor performance with high attainers in the Performance Tables for 2011, 2012 or 2013.

It:

  • Compared the performance of these schools in the year in which they fell below the floor, noting significant variance by year and between institutions, but also highlighting the fact that the proportion of high attainers attending these schools is significantly lower than the national average for state-funded schools.
  • Examined the subsequent performance of schools below the illustrative floor in 2011 and 2012, finding that almost all made significant improvements in the year immediately following, but that some of the 2011 cohort experienced difficulty in sustaining this improvement across all elements into a second year. It seems that progress in English, maths or both is more vulnerable to slippage than the 5+ A*-C GCSE threshold measure.
  • Confirmed – most reassuringly – that every school in the sample fell below the official, generic floor targets in the year in which they also undershot my illustrative high attainer floor targets.
  • Reviewed the combination of assessments and interventions applied to the sample of schools since 2011, specifically the interaction between academisation, floor targets, Ofsted inspection and (pre)warning notices for academies. These do not always point in the same direction, although chronology can be an extenuating factor. New guidance about how these and other provisions apply and interact would radically improve transparency in a complex and politically charged field.
  • Analysed the coverage of high attainers/most able students in recent inspection reports on 11 schools from amongst the sample of 20, including three published after September 2013 when new emphasis on the most able came into effect. This exposed grave inconsistency in the scope and quality of the coverage, both before and after September 2013, which did not correlate with the grade of the inspection. Inspectors would benefit from succinct additional guidance.

In the process of determining which schools fell below my high attainers floor target, I also identified the schools that undershot one or other of the elements but not both. This wider group included 46 schools in 2011, 52 schools in 2012 and 34 schools in 2013.

Several of these schools reappear in two or more of the three years, either in their existing form or following conversion to academy status.

Together they constitute a ‘watch list’ of more than 100 institutions, the substantial majority of which remain vulnerable to continued underperformance with their high attainers for the duration of the current accountability regime.

The chances are that many will also continue to struggle following the introduction of the new ‘progress 8’ floor measure from 2015.

Perhaps unsurprisingly, the significant majority are now sponsored academies.

I plan to monitor their progress.

.

*Apologies for this rather tabloid title!

.

GP

February 2014

Gifted Education Activity in the Blogosphere and on Twitter

.

I have been doing some groundwork for an impending analysis of the coverage of gifted education (and related issues) in social media – and reflecting on how that has changed in the four years I have been involved.

As a first step I revised my Blogroll (normally found in the right hand margin, immediately below the Archives).

I decided to include only Blogs that have published three or more relevant posts in the last six months – and came up with the following list of 24, which I have placed in alphabetical order.

.

Begabungs

Belin-Blank Center

Distilling G and T Ideas

Dona Matthews

Gifted and Talented Ireland

Gifted Challenges

Gifted Education Perspectives

Gifted Exchange

Gifted Parenting Support

Global #gtchat powered by TAGT

headguruteacher  (posts tagged #gtvoice)

Irish Gifted Education Blog

Krummelurebloggen

Laughing at Chaos

Living the Life Fantastic

Ramblings of a Gifted Teacher

smarte barn

Talent Igniter

Talent Talk

Talento y Educacion

The Deep End

The Prufrock Press Blog

Unwrapping the Gifted

WeAreGifted2

.

This is rather a short list, which might suggest a significant falling off of blogging activity since 2010. I had to delete the majority of the entries in the previous version of the Blogroll because they were dormant or dead.

But I might have missed some deserving blogs, particularly in other languages. Most on this list are written in English.

If you have other candidates for inclusion do please suggest them through the comments facility below, or pass them on via Twitter.

You may have views about the quantity and quality of blogging activity – and whether there is an issue here that needs to be addressed. Certainly the apparent decline in gifted education blogging comes at a time when edublogging in England has never been more popular. Perhaps you have ideas for stimulating more posts.

On the other hand, you might take the view that blogging is increasingly irrelevant, given the inexorable rise of microblogging – aka Twitter – and the continued popularity of Facebook, let alone the long list of alternatives.

Speaking of Twitter, I thought it might be an interesting exercise to compile a public list of every feed I could find that references gifted education (or an equivalent term, whether in English or another language) in its profile.

The full list – which you can find at https://twitter.com/GiftedPhoenix/lists/gifted-education – contains 1,245 members at present.

I have embedded the timeline below, and you can also find it in the right hand margin, immediately below the Blogroll.

.

.

The list includes some leading academic authorities on the subject, but is dominated by gifted education teachers and the parents of gifted learners, probably in roughly equal measure.

The clear majority is based in the United States, but there is a particularly strong community in the Netherlands and reasonable representation in Australia, Canada, Spain and the UK. Several other countries are more sparsely represented.

(One authority – who shall remain nameless – has unaccountably blocked me, which prevents his inclusion in the list. But he has only produced eight tweets, the most recent over a year old, so I suppose he is no great loss.)

I cannot compare this with earlier lists, but it feels as though there has been a significant expansion of the gifted Twittersphere since I began in 2010.

That said I have no information yet about how many of the feeds are active – and just how active they are.

If I have inadvertently omitted you from the list, please Tweet to let me know. Please feel free to make use of the list as you wish, or to offer suggestions for how I might use it.

There will be further segmented lists in due course.

 

Postscript 13 January:

Many thanks for your really positive response. The blogroll now has 34 entries…and there’s always room for more.

If you’d like to subscribe to the Twitter list but are not sure how, here’s Twitter’s guide (see bottom of page).

If you’re not on the list but would like to be, please either follow me (making sure there’s a reference to gifted or similar in your profile) or send me a tweet requesting to be added.

You can follow or tweet me direct from this blog by going to the ‘Gifted Phoenix on Twitter’ embed in the right hand column.

 

.

GP

January 2014

cropped-p1010480.jpg

Presentation at Westminster Education Forum, 7 November 2013

.

Becky Francis put in a good word (thanks!) and I was invited to make a five minute pitch at a recent Westminster Education Forum event on Raising Pupil Attainment.

Fellow contributors on ‘Improving Provision for High Ability Pupils’ included:

  • Sally Coates (Burlington Danes)
  • Melanie Saunders (Hampshire LA)
  • Denise Yates (Potential Plus)

Here is my Powerpoint which is, I guess, as good a summary as I can manage of where I think we are, six months on from the publication of Ofsted’s ‘The most able students’.

The penultimate slide presents some key elements of a middle way between top-down national prescription and bottom-up school-driven improvement.

I have been advocating this for some time in my area of specialism, though it is potentially applicable to the entire business of school improvement.

No sign yet of any substantive support for this approach: too many are still too wedded to the bottom-up model, even though its shortcomings are daily becoming more conspicuous.

I plan to write more about that next year.

 .

Westminster Forum Presentation 7 November 2013 – slides 1 to 6 (without notes)

.

GP

December 2013

Gifted Phoenix 2012 Review and Retrospective

.

I thought it might be neat – as well as useful – to round out this year’s blogging with a mildly self-congratulatory review, looking back at the various posts I’ve written about giftedness and gifted education.

New Year Fireworks courtesy of RobW_

I have embedded links to every post, so this is also an index of sorts. If you missed anything first time round, now’s your chance to catch up before next year’s programme kicks off.

This is my 40th post of 2012. There were none in August (holidays) or in October (heavy research and some privately commissioned work). I published between three and six posts in each of the remaining ten months. I haven’t attempted an accurate word count, but my best guess is roughly 200,000.

Crikey.

 

National Studies

I’ve published four ‘signature’ features on national systems of gifted education:

  • South Korea – Parts One and Two;
  • Singapore – Parts One and Two;
  • New Zealand’s Excellence Gap – Parts One and Two; and
  • Israel.

The first two were studies of ‘Asian Tigers’, intended to showcase the particular significance of gifted education to a select group of jurisdictions that are so often held up as educational paragons for us to emulate as best we can. They complement an earlier series about gifted education in Hong Kong.

The New Zealand post was this year’s contribution to the NZ Gifted Awareness Week Blog Tour. It attracted a lot less attention (and, consequently, much less vituperation) than I had anticipated. The substance of my argument is that New Zealanders are over-focused on ethnic achievement gaps, including at the top end, rather than socio-economic achievement gaps (which will of course have a significant ethnic dimension).

The post on Israel was a huge task, given the immense range of background material available online. I knew that Israel had a long pedigree in the field, but hadn’t appreciated that it was quite so extensive. Much of this activity deserves to be better known and better understood – and I hope my post has made some small contribution to that end.

 .

The Directory of Gifted Education Centres

Four more of my posts during 2012 are contributions to an ongoing series about important centres for the delivery and support of gifted education:

  • Back in January I produced a postscript to my earlier work on the Hong Kong Academy for Gifted Education (HKAGE) analysing recently published data about the Academy’s effectiveness;

.

Theoretical Posts

A third group of posts can perhaps best be regarded as contributions to the theoretical underpinnings of gifted education.

  • At the beginning of the year I offered a piece called ‘Are All Children Gifted?’ – Parts One and Two – which was prompted by an initial discussion on Twitter. The first part set out my personal position, together with a frame for the consideration of statements of this kind. The second part analysed three different examples of the genre.
  • Later that spring I published ‘A Bold Step in Broadly the Right Direction…But There’s a Big But!’ This is my contribution to the vociferous and sometimes violent debate prompted by the publication of ‘Taking a Bold Step’ an article by Paula Olszewski-Kubilius, President of the US National Association for Gifted Children. Fundamentally, I argue for an inclusive, consensual position that can be supported by advocates of trait-based giftedness on one hand and gifted education as talent development on the other. But I place myself firmly in the latter camp, subject only to profound reservations over the idea that gifted education must be devoted to the nurturing of adult eminence.

.

Social Media

In the summer several posts were dedicated to consideration of the contribution that social media might make to gifted education.

I chaired a Symposium on this topic at the ECHA 2012 Conference in Munster, Germany. Two preparatory posts, published in July and September respectively, were concerned with the Symposium itself, including arrangements for a linked #gtchat on Twitter, designed to embody in practice some of the Symposium’s key messages.

There was also a substantive post, ‘Can Social Media Help Overcome the Problems We Face in Gifted Education’ – Parts One and Two. This considered how social media might be harnessed to support advocacy, learning, policy-making, professional development and research, offering several suggestions for worthwhile collaborative projects.

Finally, in October, I published a full review of the Conference as a whole, including reflections on the Symposium. This offered some potential learning points for the next conference in Ljubljana in two years’ time.

It is gratifying that the organisers have already been in touch expressing their willingness to act on such feedback. The Conference itself is called ‘Rethinking Giftedness: Giftedness in the Digital Age’, so this is perhaps the perfect opportunity to address some of these issues directly. I hope I can play an active part in that.

.


.

Posts Pertaining to English Gifted Education

Six of my posts dealt with the impact of English education policy on gifted learners, including high attainers.

  • In February I published a Policy Statement on the English School Performance Tables for GT Voice. This was drafted on behalf of the Board and revised in the light of comments received from other members. Later in the year, in early October, I resigned from the Board in protest at the very limited progress made since GT Voice was first established. I am still a member and – despite continuing forebodings – I very much hope that GT Voice can develop some real momentum in 2013.
  • The GT Voice Policy Statement was produced in response to the 2011 Performance Tables. In December I produced an analysis of the performance of High Attaining Pupils in the 2012 Primary Tables. There was evidence of real improvement between 2011 and 2012, though changes to statutory tests were a complicating factor and there is still considerable scope for further improvement in 2013 and beyond
  • Three posts dating from the early summer consider issues arising from the emerging outcomes of England’s National Curriculum Review. The first considered The Removal of National Curriculum Levels and the Implications for Able Pupils’ Progression. This was supplemented by a proposed Basic Framework for National Curriculum Assessment. A final post traced the clarification of Government policy over the secondary National Curriculum and replacement of existing GCSE qualifications taken at age 16. Initial media statements presaging full abolition of the secondary National Curriculum were succeeded by plans for a ‘skeleton’ comprising:

‘very, very short programmes of study that will give teachers “extreme” and “almost total” freedom over what is taught’.

Six months on, these are still to be published.

  • Two posts were dedicated to dissecting reports published by the Sutton Trust: the first considered its proposals for an Open Access Scheme; the second analysed a report on ‘Educating the Highly Able’. I’m afraid I found them equally unconvincing. The first depends on a substantial taxpayer investment in independent (private sector) schools at a time when budgets are stretched as never before, quite apart from the fact that it would also denude state schools of all their most able learners. The second fails entirely to acknowledge the proposals in the first. By defining high ability almost exclusively in terms of high attainment, its proposed course of action would serve only to increase the ‘excellence gap’ between disadvantaged gifted learners and their peers.

.

Twitter Round-ups

I provided eight comprehensive listings of Gifted Phoenix Tweets during 2012. The first seven were monthly reviews, but the eighth and last marked a shift to quarterly/termly round-ups:

Gifted Phoenix on Twitter provides comprehensive coverage of global gifted education news, as well as links to useful research, commentary and resources made freely available online.

My Twitter feed also offers balanced analysis of wider education policy here in England, while specialising in unearthing and sharing newsworthy educational material from public sector sources. This supports the cause of greater transparency, espoused by the Government and opposition parties alike. It also helps ‘proper’ educational journalists keep up to speed.

Gifted Phoenix published around 6,500 Tweets during 2012. It has over 3,000 followers including several very influential politicians and educationalists.

.

Key Documents

Finally, I published a brief post drawing readers’ attention to an evolving Key Documents section of this Blog.

My plan is to build incrementally a global library of freely available documents, wherever possible (ie where copyright provisions appear not to stand in the way) by storing a PDF on the site.

When future posts need to reference the documents in question, I can link to the copy on this Blog rather than relying on external URLs. This should significantly reduce the incidence of dead links.

Phase One of this project is now almost complete, in that the ‘Gifted Education in the United Kingdom’ section is fully stocked with uploadable PDFs. I shall begin to stock the ‘Gifted Education in the Rest of the World’ and ‘Research’ sections during the coming year.

.

Analytics

It is never wise to place too much faith in Blog analytics, but WordPress suggests my readership almost doubled in 2012 compared with the previous year.

There have been visits from 151 countries since 1 April. Some 48.5% of those visitors are resident in the United States or the United Kingdom.

The next largest readerships are located in Singapore, Saudi Arabia, India, Australia, Germany, France, Canada, Malaysia, Hong Kong, New Zealand, the Netherlands and the Philippines respectively.

The ten most read posts during the year (including some published before 2012) are:

Mawhiba: Gifted Education in Saudi Arabia (Part One)

Gifted Education in South Korea – Part One

The Removal of National Curriculum Levels and the Implications for Able Pupils’ Progression

Hong Kong Academy for Gifted Education: An In-Depth Analysis

Gifted Education in Singapore: Part 1

Gifted Education in Singapore: Part 2

The European Council for High Ability (ECHA)

Are Leonardo Schools a Good Model of Gifted Education?

USA: Maryland – Center for Talented Youth (CTY), Johns Hopkins University

.

Finally

As we move into 2013, may I take this opportunity to wish all my visitors and readers a very Happy New Year.

I have several very interesting posts planned for the early part of next year. I hope they will continue to meet your needs but, if you would like me to address a particular topic, please don’t hesitate to suggest it.

.

GP

December 2012

IGGY – The International Gateway for Gifted Youth

.

This post is an in-depth review of IGGY, a service for gifted learners hosted by the University of Warwick in England.

I met IGGY’s Academic Principal at the 2012 ECHA Conference in Munster, Germany and undertook to feature the new set-up in an upcoming post. This is the product of that commitment.

An earlier post, from July 2010, included some detail about IGGY’s activities in Africa, but it has radically changed its character since then.

This post traces the transformation of IGGY from, first and foremost, an international summer and winter school provider into an education social network. It attempts a balanced scrutiny of current provision, identifying weaknesses as well as strengths.

The IGGY logo is reproduced here with permission. I stipulated the blue version, for the pink is not at all to my taste. (I expect it goes down well with 13-19 year-olds but it’s far too vivid for me.)

.

.

IGGY’s Origins and Early Development

From 2002 until 2007, Warwick University held a contract with England’s education ministry to run the National Academy for Gifted and Talented Youth, but chose not to compete for the subsequent contract to run Young, Gifted and Talented (YG&T), which was won by CfBT and ran until March 2010.

This new contract was to support all learners aged 4-19 identified as gifted and talented by their schools and colleges, whereas NAGTY targeted the top 5% of 11-19 year-olds (an estimated population of 200,000).

NAGTY itself evolved from a summer school provider towards a blended learning model which relied increasingly on online provision, driven by the demands of scalability within a limited resource envelope. YG&T also faced the same imperative, compounded by the fact that it served a target group five times the size of NAGTY’s.

Both experienced major challenges in combining effective brokerage of third party learning opportunities with a vibrant online learning community. The comparative advantages of a social network model were already becoming apparent towards the end of the NAGTY contract and in the initial stages of YG&T, but the idea seemed ahead of its time.

Decision makers found it hard to grasp the opportunities presented by this model, but understood only too well the not inconsiderable threats it posed. The balance was not attractive to inherently risk-averse organisations. Some major risks were exposed which IGGY will also have to manage and, if necessary, overcome.

Back in August 2007, Warwick hosted the biennial World Council Conference with financial support from the Government. This was, in effect, NAGTY’s swansong.

But the University chose this opportunity to announce the creation of IGGY, a new international organisation ‘targeted at the top 5% of 11-19 year-olds from around the world’.

The implication was that Warwick would capitalise on the expertise it had developed in the NAGTY years, with the University itself as the primary beneficiary.

The press release said that a pilot programme for up to 1,000 students would begin in spring 2008, followed by a full launch in the UK and an unspecified Asian country the following autumn. Subsequent rollout would extend the programme into two or three additional countries by autumn 2009.

It promised an inaugural summer school for 150 participants in summer 2009, and an intention to offer similar events in more than one country in subsequent years.

These signature events were to become part of a blended learning offer:

‘At the heart of the “IGGY” experience will be a developing personalized online learning network: a community-led site where leading national and international Higher Education institutions, educators, companies and others will deliver content, provide expertise and offer students learning activities and development opportunities (both online and through events) to enhance their learning and social development and to both contribute to and support their mainstream educational progress.’

Contemporary materials still preserved on Warwick’s website throw more light on the original plans and how they developed over time.

A presentation from June 2008 defines IGGY’s bipartite offer:

  • A ‘collaborative online learning space’ backed up by an archive of material created by and for its members;
  • Face-to-face activities provided through international partners with a ‘summer university’ as the centrepiece. The first of these – a two week event – is scheduled to take place in Warwick in August 2008 with four courses on offer for about 100 participants. There will also be a ‘winter university’ probably hosted abroad.

The presentation notes that ‘IGGY is a key project within the university strategy’ citing multiple benefits for Warwick’s international profile and branding, its student recruitment and wider reputation.

However ‘initial University investment will be limited’ while fees will be deferred initially and subsequently kept low. This means the rate of expansion will be heavily dependent on income generated from partners. It was this equation which initially drove IGGY in a philanthropic direction (though always with an eye towards international recruitment in developing markets).

.

Progress from 2010

A second presentation from April 2010 says that membership has reached around 2,500, drawn from 40 different countries.

Four ‘IGGY universities’ have been held since 2008, two more are planned for August 2010 and there are initial plans for an event in either Australia or South Africa in 2011.

An imminent event located in Botswana is described as ‘the main focus’ in the short term. The parallel Warwick event is expected to cater for 125 students and will host a delegation from Brunei’s Ministry of Education.

A 2010 University Corporate Planning Statement states categorically that:

‘An IGGY U[niversity] will be run in partnership with Monash [University], Australia in 2011’

but I can find no record of it having taken place, probably because the IGGY vision was undergoing radical transformation by this point. (IGGY is not mentioned explicitly in a university partnership recently concluded between Warwick and Monash.)

Other initiatives have been pursued alongside these summer and winter schools, including a series of Junior Commissions – based on an existing Warwick Commission model. These support ten members to work collaboratively on a year-long research project. IGGY has also administered a Litro Short Story competition with prize money provided by a Warwick alumnus.

Although not mentioned in the presentation, a separate entity called IGGY Juniors had also evolved by this stage, targeted mainly at younger children.

The precise relationship between IGGY and IGGY Juniors remains unclear. The new IGGY website doesn’t mention IGGY Juniors, even as a partner, though there is a page on the University website.

This refers to the ‘Da Vinci Group’ as the supporting ‘online intellectual membership community’ for IGGY Juniors, with a membership fee of £35 per month.

But the self-same Da Vinci Group is advertised as a service provided through another body called OLP. Their website seems largely dormant, though some 2012 courses are advertised.

The University publicised some of these developments in a 2010 press notice selecting February 12 2010 as the date of its announcement:

‘The national (English) Young Gifted & Talented website currently says “The Young Gifted & Talented website will be closing at the end of Friday 12 February 2010”.  However on that very same day that gifted programme’s original home at the University of Warwick will announce a range of new opportunities for its global membership of gifted young people in its thriving International Gateway for Gifted Youth (IGGY).

The University of Warwick was host to the original “National Academy for Gifted and Talented Youth” for five years. Warwick moved beyond a focus on England alone and is now home to IGGY - a network of the world’s brightest and most creative young people aged 11-19.’

It is easy to suspect an element of schadenfreude in this statement, for the closure of the YG&T website marked the imminent end of its contract – and of Government-led investment in the education of gifted learners. This left the way open for IGGY to expand its domestic operation in an open market with negligible UK-based competition.

Whether IGGY could be described as ‘thriving’ at this stage is a moot point. Membership of 2,500 after three years is arguably a relatively poor return on the University’s investment. There are obvious problems of scalability with the face-to-face events.

Within the presentation, the online dimension is described as dependent on an ‘interim website’ which is old-fashioned and not designed on social media principles. Online presence is recognised as key to scalability and described as a priority over the coming year, but there are clear (if undeclared) tensions with the philanthropic direction of travel, because of the limited reach of sophisticated broadband-reliant social multimedia in sub-Saharan Africa.

There have been software trials involving Cisco and a project officer has been appointed but development appears to have been slow, perhaps because the University was not able to reconcile these competing ‘high-tech’ and ‘philanthropic’ aims.

While social networking is perceived as key to the future vision, the cost is prohibitive, so Warwick is exploring prospective partnerships. There are plans for a ‘rolling programme of themed online provision’ but partnership funding will still be necessary to achieve ‘a sustainable funding position’.

The points made in 2008 about limited scope for income generation from fees and a low ceiling on University subsidy are repeated verbatim. The accompanying notes read:

‘real progress made but still haven’t had that one big donation that would allow a step-change’.

This is perhaps understandable, because the benefit stream to prospective sponsors is not entirely clear. Moreover, they are being asked to subsidise an endeavour that places Warwick in a privileged position in the race to recruit potentially lucrative international students. One can imagine that several potential sponsors might prefer a model that distributes the benefits more widely.

.


.

IGGY Changes its Delivery Model

The Director of IGGY at this time was Warwick’s Deputy Registrar and former NAGTY Operations Director. The re-invigoration of IGGY can be linked to his return to Warwick as Registrar in February 2012.

Though IGGY’s new direction was already established by summer 2011, its former Director retained a role in its development while employed elsewhere.

An article on Warwick’s intranet from June 2011 confirms that IGGY has been working towards a predominantly online delivery model through partnership with IBM and CISCO.

Pre-testing began with existing members in May 2011 focused on computer programming, creative writing and global leadership. This was intended to pave the way for a more ambitious summer pilot, with the aim of launching the full service in September 2011.

A University strategic presentation dating from September 2011 reveals (in the associated speaking notes) that Warwick is sticking with its existing IT partners. Cisco has sponsored IGGY’s graphic designer while IBM has provided ‘Lotus Live software plus expertise’.

Promotion activities are scheduled to begin in autumn 2011, and declared targets at this stage are for IGGY to recruit:

  • 6,000 members by 2012, so more than doubling its 2010 membership;
  • 50,000 members by 2014, implying rapid eight-fold expansion over the two succeeding years; and
  • 40% of members from ‘low income homes internationally’ (this presumably applies domestically as well).

There may be the possibility of cross-subsidising members from poor backgrounds by charging the relatively wealthy a premium fee.

IGGY will also be a ‘key component’ in Warwick’s campaign to raise £50m (though it is noteworthy that it isn’t mentioned as such on the campaign pages).

But, by November 2011, there has been a significant change of tone. Warwick announces the appointment of a new Director who is to begin work the following month.

The aims are highly ambitious. The new Director:

‘served for almost 5 years as Channel 4’s Head of Education where she led a major strategic shift in Channel 4 Education from TV programmes to digital projects, successfully targeting teen audiences with innovative digital content. That experience will greatly assist her to realise IGGY’s next stage: a new online network offering significant, high quality content to over 100,000 gifted young people across the globe.’

No timetable is applied to the fulfilment of this latter ambition, which doubles the declared 2015 target.

Progress during the first half of 2012 was mostly low-key.

By April, IGGY membership had increased by 500 or so to 'over 3,000', but curiously the number of countries supplying members had fallen from 40 to 'over 30'. Maybe some of the summer and winter school beneficiaries were less attracted by predominantly online provision.

It is interesting to speculate whether an increase of 500 members in two years – even though it could be seen in a positive light as 20% growth – was viewed by Warwick as relatively underwhelming, especially since the number of countries represented had fallen by up to 25%.

It leaves Warwick needing to recruit 3,000 more members in eight months to satisfy its target of 6,000 members by the end of 2012.

A June 2012 feature on Merlin John’s Blog provides some interesting insights into how thinking is developing:

‘Students take up subscriptions with IGGY through the website, authorised by their teachers who are an important key to the service. IGGY will be a subscription service but will offer up to half of the memberships free to disadvantaged students. The subscription price is still to be confirmed but will be in the region of £120 a year with substantial discounts for schools,’

We will look at the final arrangements in more detail below.

The new Director undertook a series of meetings with UK gifted education interests, to update them on plans and lay the groundwork for mutually beneficial partnerships. I met her myself in April 2012 when plans were mentioned to run a ‘Global and Gifted Conference’. This duly took place on 4 July at Warwick, but no invitation arrived.

The Storify record says there were over 100 people present. Though billed as having 'a focus on new international research and developments in gifted education', there were just three presenters: Joan Freeman, Jonathan Hare (a freelance research scientist) and IGGY's newly-appointed Academic Principal.

The presentations were initially published but are no longer available online. There was relatively limited coverage of the topic specified. The fundamental purpose of the event is rather unclear, but it will not have positioned IGGY at the heart of contemporary debate about global gifted and talented education.

Two other announcements of note were made during the summer of 2012:

  • In June 2012 IGGY offered free membership to all 1,470 Year 9 students nominated for the education ministry’s Dux Award Scheme. The Ministry makes no reference to this in its own materials, so it is not officially endorsed. It would have been impossible to pass on student details to Warwick because of data protection restrictions, but maybe the list of participating schools was shared. We do not know how many Dux participants have taken up the offer, or to what extent this has contributed towards the achievement of IGGY’s membership targets.
  • In August 2012 IGGY and Warwick's Institute of Education jointly offered support for two part-time PhD research scholarships in gifted education, with funding to cover full fees (£2,340 in the current year) plus £500 per student for expenses. Doctoral supervision is to be shared between IGGY's Academic Principal and a WIE lecturer. The successful applicants are required to:

‘keep abreast of the latest research developments in gifted education; produce a 2,500 word report each quarter detailing their findings; contribute to IGGY’s annual conference; publish papers in academic journals and present at relevant conferences’.

This sounds like a cunning plan to strengthen IGGY’s gifted education expertise, so giving it the wherewithal to contribute to developing thinking in the field. It may also help to provide some evaluative capacity (the Academic Director’s job description requires him to develop systems to assess the impact of IGGY’s activities). It is similar in many respects to arrangements made during the NAGTY era, when an in-house research capability was evolved. While it may enable IGGY to develop a ‘thought leadership’ capability, there is a risk that these students may be perceived to have too close and reliant a relationship with IGGY to be entirely objective, especially if they are to be utilised as evaluators.

 .

The IGGY Relaunch

The new-style IGGY opened for business in September 2012 as planned.

Warwick’s internal news service reports that initial priority is being given to English, maths, science and history. Ten postgraduate mentors have been recruited and partnerships established with Severn Trent Water and the National Grid to ‘involve students in real-life projects and issues’.

We are not told whether these two organisations have provided financial support or if the relationship is confined to ‘in kind’ support.

Not only will IGGY offer free membership to students from disadvantaged backgrounds, it will also extend this to all eligible students at Warwickshire and Coventry schools (a sure-fire way of increasing numbers, though not entirely equitable). An earlier offer of free places for all until early 2013 seems to have fallen by the wayside.

There is provision for pilot schools, another mechanism allowing IGGY to recruit members en masse. The first is located in Leamington Spa, on Warwick’s doorstep. It is evident that the University is pulling out all the stops: expensive banner advertisements for IGGY appear in at least one national newspaper for several weeks.

The September 2012 announcement confirms that the likes of Cisco and IBM have been set aside in favour of ‘local games company Fishinabottle’. It is not clear whether this work was procured competitively.

Strangely, the Company fails to list IGGY amongst its clients, though it has released a press notice announcing the launch of the new website:

‘IGGY.net features full social profile building functionality, forums for discussion and debate and a “Knowledge” section, in which members can tackle challenges and take part in activities either collaboratively or as individuals. The site offers deeply rooted ‘gamification’ in its social aspects; members gain experience, earn awards and prizes and are attributed statuses as they progress in the world of IGGY. This ‘gamification’ drives engagement and encourages exploration, two of the most important factors in creating digital materials for the educational space.

The biggest challenge in creating IGGY was ensuring a safe and secure environment for our members. To that end, we developed an enrolment process whereby members are confirmed by both their parents and their school in order to gain access to the community. This provides accountability as well as strengthening the authenticity of IGGY’s membership.’

 A second phase of website development was launched in October 2012:

‘Members can now create their own profiles including a public or private blog, comment on articles and debates, build an activity page, earn points and achievements for the things they accomplish on the site, make friends and collaborate with other gifted students around the world.’

A further release is scheduled for late December 2012.

During October the IGGY Office moved to Senate House at the centre of Warwick’s campus. One might read into that an intention to make it more central to the University’s wider business, or possibly a determination on the part of Warwick’s senior management to keep a closer watch on proceedings – perhaps both simultaneously.

The University hosted a launch event at this time, captured by this podcast. Though clearly a profile-raising opportunity, it attracted negligible press coverage. No invitation reached Gifted Phoenix Towers.

At this point, the appointed Director is still in place, at the head of a staff of twelve. But some three weeks later she has been replaced.

Warwick announced that:

‘IGGY, the University’s online network for gifted students, is expanding and has appointed Adrian Hall as its Managing Director. Janey Walker becomes Director of Partnerships and will focus on building new relationships with funders and content partners… Adrian has been working with IGGY as Content and e-Learning Advisor since May 2012’

It is difficult to know what to make of this, though it cannot be a vote of confidence, nor can it mark complete satisfaction with the progress made during the preceding year.

The logo received a makeover at around this time and the change at the top also coincides with a big increase in complement: the staff now numbers 18, a 50% increase within a month.

Warwick retains most of the job details on its website, so it is possible to estimate approximate expenditure on salaries. Unfortunately, the posts advertised do not correspond exactly with the current organisational structure.

However, it seems likely that total salary expenditure is somewhere between £600,000 and £700,000 a year, implying annual expenditure on salary and on-costs of around £1m. That is a big investment for a single university, especially if sponsorship remains thin on the ground.

In early November it was reported that IGGY would organise a third Junior Commission in 2013. The supportive quote is supplied for the first time by Hall rather than Walker.

IGGY merits a brief reference in Warwick’s Access Agreement for 2012/13, setting out how it plans to support fair access to the University for those from disadvantaged backgrounds:

‘The International Gateway for Gifted and Talented Youth (IGGY) will offer free membership and access to its resources for eligible students from low participation neighbourhoods helping to raise their aspirations through on-line resources and networking events.’

The 2013/14 Agreement repeats this verbatim, but also mentions a relationship with Warwick's Goal Programme, the university's principal fair access initiative:

‘The programme recruits a new cohort of 100-200 disadvantaged students each year, giving them access over four years to a programme of bespoke activities and free access to the wider YG&T provision… All members of Goal automatically become part of IGGY’.

So more grist to the recruitment mill.

.

Recruitment Targets

Membership targets have been adjusted but also front-loaded:

‘IGGY aims to reach 15,000 gifted students in its first year and 50,000 after 3 years. Applicants will have to be endorsed as gifted by their schools. As well as UK students, IGGY is already recruiting new members as far afield as South Africa, Singapore and Saudi Arabia.’

One must assume that the ‘first year’ is now academic year 2012/13. If IGGY is to achieve 15,000 members by August 2013, that will require a five-fold increase in 16 months.

The use of the word ‘reach’ may hide a multitude of sins. Whether it is a looser construct than membership remains unclear, but making it so would arguably be statistical sleight of hand.

The 50,000 figure for 2015 is half the figure of 100,000 mentioned in the Academic Principal's job description dating from early 2012. So the autumn 2011 target has been doubled and then halved again, indicating some tempering of the University's ambitions by realism.

Another job description indicates that half of the 50,000 target for 2015 – ie 25,000 – must be international members. Obviously then, 25,000 must be drawn from the home countries.

There is no further reference to the original 40% target for learners – national and international – to come from low income households (‘up to half’ in the John blog post), although I have taken the latter figure into account as a continuing assumption when considering the implications for income generation below.

The volatility of these targets suggests they have been plucked from the air rather than based on any projection or realistic assessment of what is achievable.

The overall size of the global pool in which IGGY is fishing is almost impossible to calculate, but it is much easier to analyse the domestic market.

If we leave aside post-compulsory education (including around one million 16-19 year-olds) the total number of 13-19 year-olds in UK schools – maintained and independent – is around 3.55m.

Assuming 5% are eligible for IGGY membership that gives a potential pool of 177,500 school-based students. (There is no reference to further education on the website so I am assuming this is not currently a target.)

If we assume that 50% of IGGY members are to be drawn from the UK, this means that:

  • By September 2013, the target is to enrol 4.22% of all eligible students, or roughly one in every 24;
  • By September 2015, the target is to enrol 14.08% of students, or roughly one in every seven.
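For readers who want to vary these assumptions, here is a minimal sketch of the arithmetic in Python. The pool size, the 5% eligibility rate and the 50% UK share are the working assumptions set out above, not figures published by IGGY.

# Domestic recruitment arithmetic under the assumptions stated above
uk_pool_13_19 = 3_550_000      # estimated 13-19 year-olds in UK schools (maintained and independent)
eligibility_rate = 0.05        # 'top 5%' assumption
uk_share_of_target = 0.5       # assume half of all members are UK-based

eligible_pool = uk_pool_13_19 * eligibility_rate   # 177,500 students

for year, total_target in [(2013, 15_000), (2015, 50_000)]:
    uk_target = total_target * uk_share_of_target
    share = uk_target / eligible_pool
    print(year, f"{share:.1%}", "of the eligible pool, or roughly one in", round(eligible_pool / uk_target))
# 2013: about 4.2% (one in 24); 2015: about 14.1% (one in 7)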

But, as we shall see below, the 5% assumption is not really reflected in the eligibility criteria.

.

Eligibility Criteria

IGGY was originally intended for 11-19 year-olds, as was NAGTY before it, but the lower age limit has now been raised to 13. Why this step was taken is not explained, though it probably rests on the assumption that a social networking environment is relatively less suitable for 11-13 year-olds, while the associated risks are that much greater.

Prospective members need to demonstrate:

‘The potential to perform in the top 5% of their peers worldwide in at least one curriculum area’

But this is inherently unmeasurable, so a degree of subjectivity is inevitable.

Emphasis is seemingly placed on:

  • Ability rather than achievement and
  • Ability in one or more school curriculum subjects, as opposed to all-round ability, or talent in practical fields such as art, music, sport or leadership.

Within the UK, however, this translates into one more specific criterion:

‘The potential to achieve level 8 grades in SATs at the end of Key Stage 3 (year 9) and A*/A grades at GCSE and A level.’

These are of course attainment measures. Presumably students who have already achieved at least one Level 8 or one GCSE A grade automatically become eligible.

Only students not yet at the end of Year 9 and those with a string of Bs at GCSE must necessarily rely on showing potential, as opposed to achievement. There is scope to accommodate students who have underachieved in KS3 and/or KS4 assessments, provided they can supply evidence that they are expected to do better in future.

It is instructive to compare these measures with the 5% threshold.

  • In 2012, just 1% of pupils achieved level 8 in KS3 teacher assessment in English and science, but 8% did so in mathematics. As far as I am aware, national teacher assessment data is no longer collected for non-core subjects, but it will continue to be available in schools and so would qualify under these eligibility criteria;
  • In 2012, the percentage of entrants achieving a full course GCSE grade A/A* across the UK varied from 4.7% (Other Technology) to 61.4% (Classical Subjects). The average percentage across all subjects was 22.4%.

This suggests that the IGGY entry threshold is pitched extremely low, especially at KS4 – the more so when one reflects that it requires only (typically more generous) predicted grades rather than actual ones.

Of course that significantly improves the probability of recruiting members but, conversely, it threatens to dilute the academic experience of the many who join expecting to be challenged amongst their intellectual peers.

The reference to SATs, GCSEs and A levels is also rather Anglo-centric, suggesting that the other home countries are not a priority (or at least some neglect of their sensibilities).

For those outside the UK eligibility depends on ‘the potential to achieve top grades for their particular mode of assessment’, which is largely a subjective measure.

Four other, less precise evidential measures are mentioned (the criteria do not say so explicitly, but presumably each applicant need satisfy only one):

  • ‘in the top one or two students in the average class of 30 students in an averagely-performing school’;
  • ‘regularly outperforming their peers in assessments’;
  • ‘on the schools [sic] ‘gifted and talented register’;
  • ‘have been accelerated in school (eg moved up a year or started higher qualifications earlier than their peers)’.

Some of these are rather vague and variable. Some schools even manage to include all their pupils on a gifted and talented register, and not only selective schools either! The final criterion leaves open the possibility that some under 13s will after all be admitted.

Eligible students must have applications endorsed by their school and approved by their parents (or presumably their carers, though IGGY uses ‘parents’ as its standard terminology).

No evidence of ability is required:

‘We do not ask for written evidence that a student is gifted but we do require an email…to confirm they are gifted and would benefit from membership’.

Schools are encouraged to sign up groups of students and are incentivised to do so by receiving discounts on fees.

The registration process is kept as light-touch as possible:

‘If you want to register your students for IGGY membership contact us at info@IGGY.net. We will contact you to discuss how many students you want to enrol and whether any are eligible for free membership, and agree the overall cost. Your school will then be given the appropriate number of codes and you will allocate these to the individual students.

Your students have to register themselves online. An email will be sent to their parents asking them to confirm the student’s details and explaining they are joining IGGY. Once the parents have confirmed these details the student’s account will be activated.’

As far as I can establish, this is a once-only process so students, once admitted, remain members until they exceed the upper age limit. Those who move from one school to another, or who transfer at age 14 or 16, do not seem to require additional endorsement from their new institutions.

It follows that many institutions will not know, unless they check, that some of their students are IGGY members (unless IGGY approaches them for payment of the annual fee, having been refused by the student’s former school).

While this is no doubt attractive to schools – apart from the last detail above – it rather leaves open to question whether IGGY genuinely caters for the top 5%.

Pragmatically of course, IGGY has everything to gain from a liberal set of eligibility criteria, especially while it is striving to build up numbers. There is an associated risk though that membership becomes less attractive simply because it is less exclusive.

.

Pricing

Since 15 October 2012, IGGY has been charging members an annual subscription which it says is highly subsidised by the University. The current subscription is £120 per year for members resident within the UK and £200 per year for those resident elsewhere. These rates are not necessarily fixed.

This differential is justified on the grounds that:

‘It is more expensive for us to deliver student mentoring, arrange and deliver face to face events and generate content partnerships with organisations outside our UK base, and we do need to ensure that these additional costs are covered.’

This seems a little unfair, since most overseas members are likely to access the online environment rather than face-to-face experiences. It is unlikely that such events will be offered free at the point of delivery, so any additional costs could be recouped through the charges levied for them.

The FAQs written by the Academic Principal contain a section – 'Why is IGGY only offered online' – which tends to contradict the rationale given above.

In effect the price differential means that overseas members are cross-subsidising those resident in the UK. Such an arrangement could be open to challenge.

What IGGY calls ‘sponsored memberships’ are available for UK disadvantaged students if they are:

  • Eligible for free school meals
  • Children in care
  • Living 'in an area that has low participation in higher education'.

The latter provision can be applied wholesale where school-level applications are made. Other extenuating circumstances may be considered for individual applicants.

It is curious that this entitlement is not extended to all otherwise eligible learners aged under 16 in England who qualify for the Pupil Premium since that would be much simpler administratively for schools.

The inclusion of an area-based low HE participation criterion – both at individual and school level – will extend eligibility to relatively advantaged students who live in relatively disadvantaged areas, so generating significant deadweight.

Presumably the POLAR classification is applied, though it is open to question whether schools are always aware of their POLAR classification.

For members outside the UK, the definition of disadvantage is:

‘Based on whether students already receive educational financial support or if they are living in an area that has low participation in higher education.’

Quite what that means in practice is unclear, though overseas applicants faced with the higher basic fee are quite likely to find some evidence to back up a claim of disadvantage.

Schools that take advantage of the opportunity to register groups of students with IGGY can qualify for additional discounts.

A three month trial for up to 10 students attracts a one-off fee of £450. Otherwise discounts are on a sliding scale, depending on the number of students admitted.

It costs:

  • £1,200 to register up to ten students, then £100 per additional student;
  • £2,500 to register up to 25 students, then £80 per additional student;
  • £4,000 to register up to 50 students, then £60 per additional student;
  • £6,000 to register up to 100 students, then £40 per additional student.
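Worked through per student, the sliding scale makes the incentive explicit. Here is a minimal sketch in Python; the assumption that a school simply chooses whichever tier works out cheapest for its group size is mine, not something IGGY states.

# Published school tiers: (base fee in GBP, students covered, fee per additional student)
tiers = [(1200, 10, 100), (2500, 25, 80), (4000, 50, 60), (6000, 100, 40)]

def cheapest_cost(n_students):
    # Cost under each tier, taking the cheapest option available to the school
    costs = [base + max(0, n_students - covered) * extra
             for base, covered, extra in tiers]
    return min(costs)

for n in (10, 25, 50, 100):
    total = cheapest_cost(n)
    print(n, "students:", f"£{total:,} in total,", f"£{total // n} per student")
# 10 students cost £120 each, 25 cost £100 each, 50 cost £80 each, 100 cost £60 each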

So there is clearly an incentive to schools to maximise enrolments rather than limiting recruitment to students who genuinely fall within the top 5% by ability.

This provision also favours selective schools and those in the most advantaged areas with a heavy concentration of high attaining students.

The online guidance makes clear that some schools pass on membership fees to parents, whereas others pay subscriptions themselves or share the cost. Since schools qualify for discounts even when parents pay, there is scope here for institutions to play the system, passing on full fees to parents while only paying the discounted fees to IGGY.

If we ignore the impact of discounts, assume that 50% of places are free and 50% of the remaining 25,000 are recruited from abroad, the maximum annual fee income from 50,000 members is:

.

(12,500 x £120)  + (12,500 x £200) = £4.0m or £80 per student.

.

The maximum fee income from 15,000 students is:

.

(3,750 x £120)  + (3,750 x £200) = £1.2m or £80 per student.

.

Given the salary and on-costs outlined above, plus other development and running costs, it is likely that IGGY will not break even for some time.
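For anyone wishing to test these figures against different assumptions, here is a minimal sketch of the same calculation in Python, with the free-place share and the overseas share of paying members treated as adjustable parameters (0.5 each, as assumed above).

# Maximum annual fee income, ignoring school discounts
UK_FEE, OVERSEAS_FEE = 120, 200     # current annual subscriptions in GBP

def max_fee_income(members, free_share=0.5, overseas_share_of_payers=0.5):
    payers = members * (1 - free_share)
    overseas = payers * overseas_share_of_payers
    uk = payers - overseas
    income = uk * UK_FEE + overseas * OVERSEAS_FEE
    return income, income / members

for members in (15_000, 50_000):
    income, per_head = max_fee_income(members)
    print(f"{members:,} members: £{income:,.0f} (£{per_head:.0f} per member)")
# 15,000 members yield £1,200,000 and 50,000 yield £4,000,000 – £80 per member in both cases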

.

The Relationship with Schools and Partners

Schools are advised that they will receive ‘a content plan’ and ‘usage statistics’ though it is not quite clear whether these are generic or specific to each learner.

There is also an option to register as ‘IGGY Pilot Schools’. The financial basis of this arrangement is unspecified, as are the specific benefits for the schools concerned. Ten English pilot locations are currently named on the website, the majority located close to Warwick.

Trinity Lismore Catholic College in New South Wales, Australia is also mentioned, as are ‘Al-Hussan National Schools’ – three English-medium day schools in Saudi Arabia. Neither website seems to mention their relationship with IGGY. The Australian school does however feature its gifted and talented provision.

There is a revealing section of the IGGY website headed ‘How much work will this mean for teachers?’

The answer supplied is:

‘Apart from the initial conversations with IGGY to decide how many students to enrol, you won’t have to do much at all.’

But this is surely disingenuous, since the onus clearly rests on schools to ensure complementarity between members’ in-school experience and what IGGY provides.

The comparative inattention given to this crucial connection was a significant weakness of the NAGTY approach and there is a risk of repetition. IGGY would be much better served by acknowledging that the link is both necessary and critical. Services should be available to schools to make it easier for them – above and beyond usage statistics and a generic content plan showing what provision is available.

At the very least, there should be a portfolio service enabling students and their schools to build and access records of engagement with IGGY. This may be under development, however.

From January 2013, members will be able to undertake:

‘The University of Warwick approved IGGY Award accreditation at Bronze, Silver and Gold Level.’

It may be that this will include a portfolio service, since accreditation will require details of students’ online engagement with IGGY to be stored and verified.

No further details are available, including whether additional fees will be charged for the privilege. The idea is a good one in principle but the devil is in the detail. Quite what value the accreditation will have remains open to question. Warwick would no doubt like to see it feature on future university applications, but whether it will gain any significant currency remains to be seen.

IGGY claims ‘the support of top academics and businesses’ but there are only two declared business ‘content partners’ to date and the vast majority of the content  emanates from Warwick. The internal arrangements – and funding – necessary to support this activity are not made public. It would be interesting to know whether the costs are passed on to IGGY or expected to be swallowed by the faculties that generate them.

The two ‘content partners’ – Severn Trent Water and the National Grid – are not particularly forthcoming about the benefits they foresee, though presumably they might expect some business advantage from IGGY’s ‘junior think tank’ capability.

Four ‘gifted and talented partners’ have recently been added to the website – CTY Ireland, NACE, NAGC and Villiers Park – but only in the first and last  cases do we get any real insight into the nature of the partnership.

CTYI will share ‘good practice and research’ while Villiers Park will provide content in return for sponsored membership for those undertaking its Scholars’ Programme. (The site does carry a second Q and A supplied by NAGC comprising ‘the top ten questions they are asked by parents’. This might imply the future development of parental services in conjunction with NAGC and parallel professional services in collaboration with NACE.)

IGGY says ‘it is always looking for new partners’ but it seems to have a relatively narrow conceptualisation of what it is seeking. The benefits of partnership, other than reputational value, are far from clear, especially for those working outside the educational sector.

.

What Kind of Service Does IGGY Provide?

The website provides access to a range of open-access material which prospective members, their parents and schools can use to judge the nature and quality of what lies behind the subscription paywall. Another section carries an index of materials that members can access.

As we have seen from the Fishinabottle press release, IGGY has nailed its colours firmly to the ‘gamification’ mast. That will help to give it a more contemporary feel for users, but may also attract criticism from those who believe this approach has its own significant shortcomings.

I offer no assessment of the quality and educational relevance of the materials, or the ‘gamified’ structure – that is for others to judge – but much can be gleaned from other parts of its website about the nature of the service IGGY seeks to provide.

IGGY markets itself as providing the extra ‘challenge’ and ‘stimulation’ that learners might not receive through their mainstream education. It provides a supportive global network and community that boosts learning and self-esteem.

It promises to provide a weekly diet of new interactive content, challenges, debates and competitions. There will be a mixture of short puzzles and longer-term research projects. Students can opt to work alone or collaboratively. According to the Beta Website, the initial subject offer has been extended to include creative writing, maths, science, history and politics. There is as yet no timetable for extension beyond those fields.

I cannot find any substantive treatment of the different ways in which schools might utilise the service – whether exclusively for independent learning outside school hours, or integrated into lesson time, or within extended day activities. That is a missed opportunity from the marketing perspective.

A page of upcoming highlights is published regularly – it is not clear whether this is the same content plan promised to teachers, or whether they receive a more developed service.

Other parts of the service include:

‘A support network that includes University of Warwick academics and student mentors…Events, conferences and gatherings for members across the world…Support and advice for gifted students and university applicants.’

But the detail of what exactly is and will be provided under these heads is still rather sketchy, so members cannot see exactly what they will get for their money.

A series of 'FAQs for Students Parents and Teachers' authored by the Academic Principal admits that IGGY is 'primarily an online initiative':

‘The financial argument is simple. Face-to-face events are relatively expensive compared with online communities of the same scale, yet they only benefit a fraction of the number of people. In order to keep our membership fees as low as possible, to create the best content with the best academics, to allow students to connect with other international students and to make IGGY a sustainable community, we have decided to use an online model. However we do plan to offer some face to face events and will be asking the IGGY community what developments and events they want to see over the next year.’

The FAQs also describe the 'intended learning outcomes':

‘IGGY aims to encourage independent learning and critical thinking as well as getting students to work collaboratively…encourages students to have an international perspective and understand the impact of globalization… stimulates students to utilize social media and tools to advance their education… each IGGY member can tailor their involvement to match their own areas of interest and personalise their learning experience.’

Moreover:

‘The aim is to develop appropriate 21st century skills for IGGY members, including critical and creative thinking, communication, research and independent learning skills…IGGY’s learning principles are broadly aligned with Vygotsky’s social constructivist approach, which is based on learning through discovery and social interaction’.

Later on, the Q and A describes IGGY's service as fundamentally enrichment-based rather than accelerative, though with some degree of 'content-based acceleration'. Both dimensions need to feature in schools' understanding of their learners' experience, to ensure the right fit between IGGY and school provision.

Members are expected to take primary responsibility for their own learning. They score points for their involvement in activities and can record what they’ve undertaken via their profile page. (Whether this yet amounts to formal tracking of progress and achievements as claimed is open to question.)

Student mentors also provide feedback but it is not yet clear whether they will play any role in supporting accreditation for the upcoming Bronze, Silver and Gold awards.

In answer to a question about the quality assurance measures that apply in lieu of a test for IGGY membership, the Principal argues that ‘the research literature is currently showing a paradigm shift towards giftedness as a developmental concept’ hence the admissions criteria are deliberately flexible.

This is fair up to a point, but no actual quality assurance measures are cited. One presumes that the only real measure is the freedom for learners to leave IGGY (or, more accurately, become inactive) if they feel that it is not for them.

Some degree of selectivity is implied by a reference to the possibility that applications can be rejected, in which case candidates can re-apply after a period of twelve months. In reality, it seems likely that few if any applications will be rejected, given the generosity of the eligibility criteria.

Some of the terms and conditions for IGGY members appear rather draconian:

  • IGGY can’t be held accountable if the site is unavailable, regardless of the duration of the gap in service;
  • If usernames or passwords are made public, they can be disabled;
  • Users can print off only single copies of material on the site for personal use, though reproducing content for ‘non-commercial educational use’ also seems to be permitted. (The terms don’t say explicitly whether this allows a school to use the material with other pupils who are not members but, if so, such materials must not be altered in any way.)
  • Anything posted on the site can be used by Warwick for any purpose ‘in any media across the world’ as long as that is consistent with the declared privacy policy. They can change and adapt that material as they see fit. These rights aren’t exclusive, however, so others can be granted similar entitlement. (This presumably applies to any content provided by third parties.)
  • The terms of use can be changed at any time.

IGGY even seeks to control links to and from third party sites. Authors must:

‘Make sure you do it [ie link] in a way that is fair (and legal!) and doesn’t damage or take advantage of our reputation’.

They 'can withdraw permission to link to IGGY whenever we like', though that raises the question of whether permission is required in the first place.

One sincerely hopes that an honest, balanced and constructive review which highlights shortcomings as well as good points doesn’t amount to reputational damage…and that the hyperlinks in this post are unexceptionable.

.

Overall Assessment and Prospects for Success

Some of the commentary above may appear to have accentuated the negative, but I have deliberately been stress-testing some of the more vulnerable aspects of IGGY's delivery model.

It is early days, at least for the relaunch, and several of these issues should be ironed out as they emerge, provided there is careful monitoring.

The overall concept is sound and I strongly support the broad social networking model which IGGY has adopted:

‘Because social media can address so many of the problems faced by gifted learners, while also capitalising on their familiarity with the online environment, it is tempting to regard the relationship between gifted education (in this narrow sense) and social media as ‘a marriage made in heaven’.

But it is too early to speculate whether or not IGGY will be successful. The final judgment will need to take account of several factors, including:

  • Whether the social network is attractive and addictive enough to pull gifted learners away from Facebook and Playstation for worthwhile periods. Is it a viable alternative, or is it doomed to be a poor second-best, scorned by the majority because of its worthiness and endorsement by parents and teachers?
  • Assuming that IGGY is attractive enough to secure and maintain a substantial audience of 13-19 year-olds, what level of engagement it will engender in its users. Some members may treat IGGY like any other social network, dipping in and out as the mood takes them and valuing the experience primarily for the social interaction. Others may be more engaged with the learning activities, possibly even undertaking them on a systematic basis, so achieving the planned accredited awards. Like its precursors, IGGY’s success must be judged on the number of genuinely and consistently active members (rather than the number of members per se).
  • Whether a methodology is established to secure genuine and system-wide integration with learning in schools. Bolt-on enrichment has very limited value in itself – the added value is only derived when the enrichment activities become a fully integral part of the learners’ educational experience. But that requires significant input on the part of schools, with obvious implications for teacher time. IGGY will need to adjust its position on this and evolve effective tools to support school staff with this process.
  • Whether the educational benefits are confirmed through robust evaluation. This must be able to isolate convincingly the impact of IGGY from all other factors and quantify the benefits, not least the impact on individual and collective educational achievement and on fair access to competitive higher education.  Good evaluation is expensive and one dimension must necessarily be longitudinal. (Like all gifted and talented education interventions, there is a potential contribution to excellence and another to equity. Both are important and must be kept in balance.)
  • Whether IGGY can balance income and expenditure and so achieve longer term financial sustainability. Upfront and running costs are significant and IGGY is unlikely to reach financial equilibrium for some time. It would be interesting to see an evaluation of the monetary benefits likely to accrue to Warwick from this investment, and the probability of those being realised. Ultimately income has to depend on membership rather than sponsorship. There are several more established competitors worldwide, especially those located in the United States. It will be hard for IGGY to attract business away from them, so the alternative is to become established in new markets. The international business brings obvious benefits for Warwick and for learners, but there is a risk that it could deflect the organisation from an initial priority to secure its domestic audience.

There are several other conspicuous risks, not least the following four:

  •  IGGY is ‘high maintenance’ in that it relies on the availability of a never-ending flow of high-quality content, much of which has a cost attached. Should that stream ever falter – even when IGGY has built up a sizeable repository of old material – the value to members will decline significantly.
  • Online security is similarly ‘high maintenance’, carrying with it a huge reputational risk if there is ever a serious breach. IGGY has evolved a relatively light touch procedure which – while it does not inhibit recruitment – could potentially be compromised.
  • The domestic and global markets might evolve in a way that is unhelpful to IGGY. It is vulnerable to bigger generic players choosing to extend their services to gifted learners. Competition here in the UK is currently negligible. While it is open to question whether a continuing IGGY monopoly would be in the best interests of UK gifted learners, the evolving market for HE-driven MOOCs may pull demand away from IGGY if they are deliberately marketed towards younger students. (It is noteworthy that Warwick is a partner in Futurelearn, the new endeavour led by the Open University. The evolving relationship between IGGY and Futurelearn will be interesting to chart.)
  • IGGY is leaving no stone unturned to secure a critical mass of members in line with its targets, but this may compromise the value of the service to learners who are genuinely within the top 5% by ability. There are conspicuous advantages to open access on one hand and strict eligibility criteria on the other – and there may be some cause to suggest that IGGY has fallen between these two stools.

The acid test will be whether IGGY can successfully reconcile its twin imperatives – to improve significantly and measurably the education of a critical mass of gifted learners and, simultaneously, to generate the flow of benefits that will give Warwick University a competitive advantage over its peers.

The UK gifted education community is fragmented, competitive and highly suspicious. There is precious little effective collaboration. IGGY might usefully position itself to change that but, to be successful, it must be fully open and transparent in its proceedings and prepared to learn from the mistakes of the past, not least by opening itself up to constructive criticism emanating within and outside the gifted education community.

I said it was too early to speculate on IGGY’s chances of success but, if pressed (and setting aside my innate pessimism), I would put them close to 50/50 as things stand. We should have a much clearer picture in twelve months’ time.

.

GP

December 2012