What Has Become of the European Talent Network? Part Two

 .

This is the second and concluding part of a post about progress by the European Talent Centre towards a European Talent Network.

Part One:

  • Provided an updated description of the Hungarian model for talent support and its increasingly complex infrastructure.
  • Described the origins of the European Talent project and how its scope and objectives have changed since its inception.
  • Outlined the project’s initial advocacy effort within the European Commission.

This second episode describes the evolution of the model for the European Network, continues the history of its advocacy effort and reviews the progress made by the European Centre in Budapest towards achieving its aims.

It concludes with an overall assessment of progress that highlights some key fault lines and weaknesses that, if addressed, would significantly improve the chances of overall success.

Initial Efforts to Design the European Network

A Draft Talent Points Plan 

At the 2012 ECHA Conference in Münster, a draft ‘Talent Points Plan’ was circulated which set out proposed criteria for EU Talent Points.

The following entities qualify for inclusion on the EU Talent Map:

  • ‘an already existing at least 2 year-old network connected to talent support
  • organizations/institutions focusing mainly on talent support: research, development, identification (eg schools, university departments, talent centers, excellence centers etc)
  • policy makers on national or international level (ministries, local authorities)
  • NGOs
  • business corporation with talent management programs (talent identification, corporate responsibility programs, creative climates)
  • parent organizations of gifted and talented children.’

But only organisations count as EU Talent Points. Each:

  • ‘has a strategy/action plan connected to talent (identification, support, research, carrier planning, etc…)
  • is willing to share at least one best/good practice, research results, video
  • is willing to share information on talent support (programs, conferences, talent days)
  • is open to be visited by other network members
  • is open to cooperate
  • accepts English as a common language while communicating in the network
  • is willing to update the data of home page 2 times/year.’ [sic]

My feedback on this draft urged a more flexible, inclusive approach – similar to what had been proposed earlier – as well as an online consultation of stakeholders to find out what they wanted from the Centre and the wider network.

Curiously, the ‘Towards a European Talent Support Network’ publication that was also distributed at the Conference took a somewhat different line, suggesting a more distributed network in which each country has its own Talent Support Centre:

‘The Talent Support Centres of the European countries could serve as regional hubs of this network building a contact structure going beyond their own country, while the core elements of our unique network could be the so-called European Talent Points… European Talent Centres are proposed to be registered by the Committee of the European Council of High Ability… A European Talent Centre should be an organization or a distinct part of a larger organization established for this purpose.’

This is a pronounced shift from the ‘networked hubs’ proposed previously.

The publication goes on to set out ‘proposed requirements for a European Talent Centre’. Each:

  • ‘has an expertise of at least one year to coordinate the talent support activity of minimum 10 thousand persons 
  • has minimum two full-time employees who are dedicated to the tasks listed below 
  • is able to provide high quality information on theoretical and practical issues of gifted education and talent support
  • is able to keep records on the talent support activity of its region including the registration, help and coordination of European Talent Points and making this information available on the web (in the form of a Talent Support Map of the region)
  • is willing to cooperate with other European Talent Centres and with ECHA
  • is willing and able to coordinate joint actions, international events, Talent Days and other meetings in the field of talent support
  • is open to be visited by representatives, experts, talented young people of other European Talent Centres
  • is able to help and influence decisions on regional, national and/or European policies concerning the gifted and talented.’

The document also offers an alternative version of the criteria for European Talent Points.

Whereas the draft I began with specified that only organisations could be placed on the EU Talent Map, this version offers a more liberal interpretation, saying that Talent Points may be:

  • ‘organizations/institutions focusing mainly on talent support: research, development, identification (e. g: schools, university departments, talent centres, excellence centres, NGOs, etc.)
  • talent-related policy makers on national or international level [sic] (ministries, local authorities)
  • business corporation with talent management programs (talent identification, corporate responsibility programs, creative climate)
  • organizations of gifted and talented people
  • organizations of parents of gifted and talented children, or
  • umbrella organization (network) of organizations of the above types’

Talent points are to be registered (not accredited) by the appropriate European talent centres, but it appears that the centres would not enjoy discretion in such matters because there is a second set of proposed requirements:

  • ‘Has a strategy/action plan connected to talent (identification, support, research, career planning, etc.)
  • Is able and willing to share information on its talent support practices and other talent-related matters with other European Talent Points (programs, conferences, Talent Days) including sending the necessary data to a European Talent Centre and sharing at least one best practice/research result on the web
  • Is open to cooperate with other European Talent Points including the hosting of visiting representatives, talented young people from other European Talent Points.’

 .

Problems with the Talent Points Plan

‘Towards a European Talent Support Network’ stipulates – for no apparent reason – that a European Talent Centre has to be an organisation or part of an organisation established specifically for this purpose. It cannot be subsumed seamlessly into the existing responsibilities of an organisation.

There is no reference to funding to cover the cost of this activity, so that is presumably to be provided, or at least secured, by the organisation in question.

The criteria for European centres seem to be seeking to clone the Budapest Centre. To locate one in every European country – so roughly 50 countries – would be a tall order indeed, requiring a minimum of 100 FTE employees (two full-time staff in each of some 50 centres).

The impact on the role and responsibilities of the Budapest Centre is not discussed. What would it do in this brave new world, other than to cover Hungary’s contribution to the network?

The only justification for ECHA’s involvement is presumably the reference earlier in ‘Towards a European Talent Support Network’:

‘Stemming from its traditions – and especially due to its consultative status as a non-governmental organization (NGO) at the Council of Europe – ECHA has to stand in the forefront in building a European Talent Support Network; a network of all people involved in talent support.’

ECHA News carries a report of the minutes of an ECHA committee meeting held in April 2013:

‘It was suggested that ECHA should be an accrediting organization for European Talent Centres and Talent Points. In the following discussion it was concluded that (1) it might be possible to establish a special accrediting committee; (2) Talent Centres would decide where Talent Points can be; (3) the proposal for European Talent Centres and European Talent Points criteria would be sent to additional key ECHA members (including National Correspondents) as discussion material. Criteria will be decided later.’

So ECHA would control the decision as to which entities could become European Talent Centres, despite the fact that ECHA is an entirely separate membership organisation with no formal responsibility for the EU Talent initiative.

This is not a sensible arrangement.

There is no explanation of why the network itself could not accredit its own members.

Turning back to the proposed requirements for European talent centres, these must be minimum requirements since there would otherwise be no need for an accreditation committee to take decisions.

Presumably the committee might impose its own additional criteria, to distinguish, for example, between two competing proposals for the same region.

The requirement for a year’s experience in relation to ‘co-ordinating’ talent support activity for at least 10,000 people is not explained. What exactly does it mean?

It might have been better to avoid quantitative criteria altogether. Certainly it is questionable whether even the present centre in Budapest meets this description.

And why the attempt to control inputs – the reference to at least two full-time staff – rather than outcomes? Surely the employment of sufficient staff is a matter that should be left to the centre’s discretion entirely.

The broad idea of a distributed network rather than a Budapest-centred network is clearly right, but the reasoning that puts ECHA in a controlling position with regard to the network is out of kilter with that notion, while the criteria themselves are inflexible and unworkable, especially since there is no budget attached to them.

When it comes to the talent points there are clear conflicts between the two versions. The first set of criteria outlined above is the more onerous, proposing an exclusive – rather than illustrative – list of those that can be included on the EU Talent Map.

They also allow existing networks to feature on the map, but only if they are at least two years old! And they stipulate an English language requirement and twice-yearly updating of the website homepage.

Only an entity with some serious difficulties could manage to share two sets of different draft criteria – each with its own profound problems – at precisely the same time!


Budapest by Night

.

The EU Advocacy Effort Continues

.

What Became of the Written Declaration?

Written Declarations are designed to stimulate debate. Once submitted by MEPs they are printed in all official EU languages and entered into a register. There is then a three-month window in which other MEPs may sign them.

Those attracting signatures from a majority of MEPs are announced by the President in a plenary session of the European Parliament and forwarded for consideration to the bodies named in the text.

Those that do not attract sufficient signatures officially lapse.

.

The archive of written declarations shows that – despite the revisions outlined above and the best efforts of all those lobbying (including me) – WD 0034/2012 lapsed on 20 February 2013 having attracted 178 signatures. Since there are some 750 MEPs, that represents less than 25% of the total.

 .

A Parliamentary Hearing

As part of this ultimately unsuccessful lobbying effort, the Hungarian MEP who – along with three colleagues – submitted the Written Declaration also hosted a Parliamentary Hearing on the support of talents in the European Union.

The programme lists the speakers as:

  • Anneli Pauli, a Finn, formerly a Deputy Director General of the European Commission’s Research and Innovation Directorate.
  • Laszlo Andor, a Hungarian and EU Commissioner for Employment, Social Affairs and Inclusion. (Any contribution he made to the event is not included in the record, so he may or may not have been there.)
  • Peter Csermely, the current ECHA President and the man behind the EU Talent Centre.

There was no-one from the Commission’s Education Directorate involved.

The record of proceedings makes interesting reading, highlighting the Written Declaration, the economic value of talent development to the EU, the contribution it can make to research and innovation, the scope to support the inclusion of immigrants and minorities and the case for developing the European network.

Pauli is reported as saying that:

‘Talents are the heart of the future EU’s research area, thus they will work hard on it that the Horizon 2020 will offer enough support to them.’ [sic]

Horizon 2020 is the EU Framework Programme for Research and Innovation. There is no explicit home for talent support within Horizon 2020, so it remains to be seen how this support will materialise in practice.

She also says:

‘…that school education on talents and the creative education in school sciences should be strengthened’ [sic]

This presumably carried rather less authority considering her role – and considering that, as we have seen, the Declaration was framed exclusively in terms of ‘non-formal learning’.

There is little explicit reference to the specifics of the European Talent project other than that:

‘…EU-wide talent-support units are needed, Europren [sic] Talent Points Network, a European Talent Day could be organised, or even a Year of Excellence and Talents could be implemented in the future too.’

We are not told how well attended the hearing was, nor do we have any information about its influence.

Only 13 more MEPs signed the WD between the Hearing and the deadline, and that was that.

An EU Thematic Working Group on Talent Support?

The 2013 publication ‘Towards a European Talent Support Network’ puts the best possible spin on the Written Declaration and the associated Hearing.

It then continues:

‘Confirming the importance of WD 34/2012, an EU Thematic Working Group on supporting talent and creativity was initiated by Prof. Péter Csermely. As a starting activity, the EU Thematic Working Group will work out the detailed agenda of discussions and possible EU member state co-operation in the area of talent support. This agenda may include items like:

  • Mutual information on measures to promote curricular and extra-curricular forms of talent support, including training for educational professionals to recognise and help talent;
  • Consideration of the development of an EU member state talent support network bringing together talent support communities, Talent Points and European Talent Centres in order to facilitate co-operation and the development and dissemination of the best talent support practices in Europe;
  • Consideration of celebration of the European Day of Talented;
  • Suggestions to the Commission to include talent support as a priority in future European strategies, such as the strategies guiding the European Research Area and the European Social Fund.’

The proposed status of this group is not discussed, so it is unclear whether it will be an expert group under the aegis of the Commission, or an independent group established with funding from Erasmus Plus or another EU programme.

If it is the latter, we will have to wait some time for it to be established; if it is the former, it does not yet feature in the Commission’s Register.

In either case, we are some nine months on from the publication of the document that brought us this news and there is still no indication of whether this group exists, when it will start work or who its membership is/will be.

 .

A European Economic and Social Committee (EESC) Opinion

At about the same time as a draft Written Declaration was circulated in January 2012, the Bureau of the EU’s European Economic and Social Committee was recommending that the Committee proper should undertake a fresh programme of ‘own initiative opinions’ (so the weakest category of NLA).

These included:

‘Unleashing the potential of young people with high intellectual abilities in the European Union’

Although the development process was undertaken during 2012, the final opinion was not published until January 2013.

The EESC describes itself thus:

‘The European Economic and Social Committee (EESC) is a consultative body that gives representatives of Europe’s socio-occupational interest groups and others, a formal platform to express their points of views on EU issues. Its opinions are forwarded to the Council, the European Commission and the European Parliament.’

Its 353 members are nominated by member governments and belong to an employers’ group, a workers’ group or a ‘various interests’ group. There are six sections, one of which is ‘Employment, Social Affairs and Citizenship’ (SOC).

EESC opinions are prepared by study groups which typically comprise 12 members including a rapporteur. Study groups may make use of up to four experts.

I cannot trace a relationship between the EESC’s opinion and the European Talent initiative.

The latter’s coverage does not mention any involvement and there is no information on the EU side about who prompted the process.

The focus of the opinion – high intellectual ability – is markedly out of kilter with the broader talent focus of the Talent Network, so it is highly likely that this activity originated elsewhere.

If that is the case then we can reasonably conclude that the European Talent initiative has not fulfilled its original commitment to an NLA.

Diligent online researchers can trace the development of this Opinion from its earliest stages through to eventual publication. There is a database of the key documents and also a list of the EESC members engaged in the process.

As far as I can establish the group relied on a single expert – one Jose Carlos Gibaja Velazquez, who is described as ‘Subdirección General de Centros de Educación Infantil, Primaria y Especial, Comunidad de Madrid’ (the Madrid regional government’s sub-directorate general for infant, primary and special education schools).

The link between JCGV and the EESC is explained here (translation into English here). I can find no link between Señor Gibaja and the EU Talent Network.

EESC members of the study group were:

  • Beatrice Quin (France)
  • Teresa Tsizbierek (Poland)

An Early Draft of the Opinion

The earliest version of the Opinion is included in an information memo dated 7 January. This also cites the significance of the Europe 2020 Strategy:

‘One of the top priorities of the Europe 2020 Strategy is to promote smart growth, so that knowledge and innovation become the two key drivers of the European economy. In order to reach this goal, it is essential that the European Union take advantage of the potential of the available human capital, particularly of young people with high intellectual capacities, who make up around 3% of the population.’

But it is clearly coming from a different perspective to the EU Talent Centre, which isn’t mentioned.

The ‘gist of the opinion’ at this early stage is as follows:

‘The EESC recommends that the European Commission and the Member States support further studies and research that would tap the potential of gifted children and young people in a wide variety of fields, aiming to facilitate employment and employability within the framework of the EU and, in a context of economic crisis, enhance specialist knowledge and prevent brain drain;

  • The Committee recommends that, in the future, greater consideration be given to each Member State’s existing models for and experience in working with highly gifted children, particularly those which benefit all of society, facilitate cohesion, reduce school failure and encourage better education in accordance with the objectives of the Europe 2020 strategy;
  • The Committee proposes improving educational care for children and young people with high abilities, in terms of the following aspects:

  • initial and ongoing training of teaching staff regarding the typical characteristics of highly able students, as well as the detection and educational care they need;
  • pooling of procedures for the early detection of high intellectual abilities among students in general and in particular among those from disadvantaged social backgrounds;
  • designing and implementing educational measures aimed at students with high intellectual abilities;
  • incorporating into teacher training the values of humanism, the reality of multiculturalism, the educational use of ICT and, lastly, the encouragement of creativity, innovation and initiative.’


Mount Bel Stone courtesy of Horvabe

.

What the Opinion Eventually Recommended

The final version of the Opinion was discussed by the EESC at its meeting on 16 January 2013 and was adopted ‘by 131 votes in favour, none against, with 13 abstentions’.

The analysis contained in the Opinion is by no means uncontentious and a close reading would generate a long list of reservations. But that would be tangential to the issue under discussion.

The recommendations are as follows (my emboldening):

‘The European Economic and Social Committee is aware that the issue of children and young people with high intellectual abilities has been fairly well researched, as a result of the studies conducted over the last decades and the extensive corpus of specialist scientific literature. However, given the importance of this topic, the EESC recommends that the European Commission and the Member States support further studies and research and adopt suitable measures to cater for diversity among all types of people. These should include programmes that would tap the potential of gifted children and young people in a wide variety of fields. The aims of this action would include facilitating employment and employability within the framework of the EU and, in a context of economic crisis, enhancing specialist knowledge and preventing brain drain to other parts of the world.

The Committee proposes nurturing the development and potential of children and young people with high abilities throughout the various stages and forms of their education, avoiding premature specialisation and encouraging schools to cater for diversity, and exploiting the possibilities of cooperative and non-formal learning.

The Committee recommends fostering education and lifelong learning, bearing in mind that each individual’s intellectual potential is not static but evolves differently throughout the various stages of his or her life.

The Committee recommends that, in the future, greater consideration be given to each Member State’s existing models for and experience in working with highly gifted children, particularly those which benefit all of society, facilitate cohesion, reduce school failure and encourage better education in accordance with the objectives of the Europe 2020 strategy.

The Committee highlights the need to detect, in the workplace, those workers (particularly young workers) who are able and willing to develop their intellectual capabilities and contribute to innovation, and to give them the opportunity to further their education in the field that best matches their ambitions and centres of interest.

The Committee proposes improving educational care for children and young people with high abilities, in terms of the following aspects:

  • initial and ongoing training of teaching staff regarding the typical characteristics of highly able students, as well as the detection and educational care they need;
  • pooling of procedures for the early detection of high intellectual abilities among students in general and in particular among those from disadvantaged social backgrounds;
  • designing and implementing educational measures aimed at students with high intellectual abilities. These measures should include actions inside and outside ordinary educational establishments;
  • incorporating into teacher training the values of humanism, the reality of multiculturalism, the educational use of ICT and, lastly, the encouragement of creativity, innovation and initiative.

Improving the care provided for highly able students should include their emotional education (which is particularly important during adolescence), the acquisition of social skills with a view to facilitating integration and inclusion in society, integration into the labour market, and fostering their teamwork skills.

Schemes and procedures for student exchanges and visits abroad should be tapped into so that gifted students can take part in them, particularly those from disadvantaged backgrounds.

Opportunities for exchanging information and good practices on detecting and caring for gifted students should be harnessed across the EU Member States.

Entrepreneurship should be fostered among children and young people with high abilities, with a view to encouraging responsibility and solidarity towards society overall.’

 .

More than One Opinion?

I have devoted significant attention to this apparently unrelated initiative because it shows that the EU lobbying effort in this field is poorly co-ordinated and pursuing substantively different objectives.

The EU Talent project failed to secure the NLA it was pursuing, but someone else has exploited the same route to influence – and for substantially different purposes.

What is worse, the EU Talent lobby seems to have failed entirely to secure any cross-reference to their efforts, despite there being two Hungarians on the study group. Did they try and fail or didn’t they try at all?

Perhaps fortunately, the Opinion seems to have been as influential as the Written Declaration. One wonders whether the enormous energy and time invested in each of these processes was ultimately worthwhile.

 .

What progress has been made by the European Talent Project?

. 

The Mission Has Changed

The website version of the Centre’s mission is subtly different from the original version discussed earlier in this post.

The Centre now seeks:

  • ‘to provide talent support an emphasis commensurate with its importance in every European country [same]
  • to provide talented youngsters access to the most adequate forms of education in every Member State [same]
  • to make Europe attractive for the talented youth [same]
  • to create talent-friendly societies in every European country [same]
  • to accelerate the sharing of information on the topic [new]
  • to create a higher number of more efficient forms of talent support for the talented [new]
  • to make it easier for social actors interested in talent support to find each other through the European talent support network.’ [new]

The reference to voluntary experts has gone, to be replaced by a call for:

‘…partners – professionals, talents and talent supporters – willing to think and work together.’

‘Towards a European Talent Support Network’ offers a different version again.

The mission and role of the Centre have changed very slightly, to reflect the new orthodoxy of multiple European talent centres, describing the Budapest body as ‘the first European Talent Centre’.

Four long-term goals are outlined:

  • ‘to give talent support a priority role in the transformation of the sector of education;
  • To reduce talent loss to the minimum in Europe,
  • To accelerate the sharing of information on the topic by integrating talent support initiatives of the Member States of the EU into a network
  • To make it easier for social actors interested in talent support to find each other through the European talent support network.’

It adds some additional short term objectives for good measure:

  • ‘As a hub of a European network, try to trigger mechanisms which bring organizations and individuals together to facilitate collaboration, share best practices and resources
  • Draw the Talent Support Map of Europe
  • Organize conferences for professionals in the region
  • Do research on the field of talent support
  • Collect and share best practices.’

We have now encountered three different versions of a mission statement for an entity that is less than two years old.

It is not clear whether this represents an evolutionary process within the organisation – which might be more understandable if it were better documented – or a certain slipperiness and opportunistic shifting of position that makes it very difficult for outsiders to get a grip on exactly what the Centre is for.

In typical fashion, the document says that:

‘the activities of the Centre fall into four large groups: advocacy, research, organisation (conferences, meetings, Talent Days), contact-keeping (meeting delegations from all over the world) and sharing information.’

Forgive me, but isn’t that five groups?

We have dealt with advocacy already and unfortunately there is negligible information available about the ‘contact-keeping’ activity undertaken – ie the various delegations that have been met by the staff and what the outcomes have been of those meetings.

That leaves research, organisation and sharing information.

.


Esterhazy Castle

Advisory Board and Partners

Before leaving the Centre’s operations, it is important to note that a three-strong Advisory Board has been appointed.

All three are luminaries of ECHA, two of them serving on the current Executive Committee.

There is no explanation of the Board’s role, or how it was chosen, and no published record of its deliberations. It is not clear whether it is intended as a substitute for the advisory group that was originally envisaged, which was to have had much broader membership.

As noted above, there is also a new emphasis on ‘partners’. The full text of the reference on the website says:

‘We are looking for partners – professionals, talents and talent supporters – willing to think and work together. We sincerely hope that the success of the Hungarian example will not stop short at the frontiers of the country, but will soon make its way to European talent support co-operation.’

Four partners are currently listed – ECHA, the Global Centre for Gifted and Talented Children, IGGY and the World Council – but there is no explanation of the status conferred by partnership or the responsibilities expected of partners in return.

Are partners prospective European Talent Centres or do they have a different status? Must partners be talent points or not? We are not told.

Research

This is presumably a reference to the ‘Best Practices’ section of the Budapest Centre’s website, which currently hosts two collections of studies ‘International Horizons of Talent Support Volumes 1 and 2’ and a selection of individual studies (17 at the time of writing).

 .

The quality of this material can best be described as variable. This study of provision in Ireland is relatively unusual, since most of the material is currently devoted to Central and Eastern Europe, but it gives a sense of what to expect.

There has been no effort to date to collect together already-published research and data about provision in different parts of Europe and to make that material openly accessible to readers. That is a major disappointment.

There is nothing in the collection that resembles an independent evaluation of the European Talent Initiative as a whole, or even an evaluation of the Hungarian NTP.

At best one can describe the level and quality of research-related activity as embryonic.

 .

Event Organisation

This Table shows what the Centre has achieved to date and what is planned for 2014:

.

                 2011             2012                  2013   2014
Conference       Yes (Budapest)   Unofficial (Warsaw)   No     Yes (Budapest)
EU Talent Day    Yes              No                    No     Yes

 .

The 2014 Conference is the first official EU-wide event since the 2011 launch conference. The same is true of the 2014 EU Talent Day.

The Polish conference was initially planned for spring 2012, but failed to materialise. By July it was confirmed that there would only be ‘an unofficial follow-up’ in October. My December 2012 post described my personal and ultimately unsuccessful efforts to attend this event and summarised the proceedings.

The 2014 Conference Website insists that it will coincide with the Third EU Talent Day but I can find barely a trace of a Second, except in Estonia, where it was celebrated on 21 March 2012.

.

.

This is not a strikingly positive record.

The 2014 Conference website names an organising ‘international scientific committee’ that is heavily biased towards academics (eight of the eleven), ECHA luminaries (five of the eleven) and Hungarians (four of the eleven).

The programme features four academic keynotes about networks and networking.

The remaining contributions involve Slovenia’s education minister and the EU Commissioner for Employment, Social Affairs and Inclusion (a Hungarian who was advertised as part of the Parliamentary Hearing on the Written Declaration but, if he did attend, apparently made no contribution), plus one devoted to the ‘International Talent Competiveness Index’ [sic].

I think this must be INSEAD’s Global Talent Competitiveness Index.

INSEAD’s inaugural 2013 Report ranks Hungary 40th of 103 countries on this Index. (The UK is ranked 7th and the US 9th).

There are eight ‘break-up sessions’ [sic]:

  • The role of governments and the EU in creation a European Network[sic]
  • Digital Networks for Talented Youth
  • Social responsibility and organisational climate
  • Practice and Ethics of Networking
  • Multiple disadvanteged children [sic]
  • Parents’ networks in Europe
  • Counselling Centers [sic]
  • Civil networks for Talent Support

The expected outcome of the event is not specified. There is no scheduled opportunity to discuss the progress made to date by the EU Talent initiative, or the policy and implementation issues flagged up in this post. And there is no information about the mediation of the Conference via social media (though there are now Skype links next to the items in the programme).

 .

Talent Map and Resources

The website features a Resource Center [sic] which includes a database of ‘selected resources’. We are not told on what basis the selection has been made.

The database is built into the website and is not particularly accessible, especially if one compares it with the Hungarian equivalent. Indeed, the Talent Centre website is decidedly clunky by comparison.

The Talent Map is now better populated than it was, though inconsistently so. There are only two entries for Hungary, for example, while Romania has 11. There are only three in the UK and none in Ireland. Neither CTYI nor SNAP is mentioned.

It might have been better to pre-populate the map and then to indicate which entries had been ‘authorised’ by their owners.

From a presentational perspective the map is better than the database, though it should have a full page to itself.

Both the database and the map are still works in progress.

Overall Assessment and Key Issues Arising

In the light of this evidence, what are we to make of the progress achieved towards a European Talent Network over the last four years?

In my judgement:

  • The fundamental case for modelling a European Talent Network on the Hungarian National Talent Programme is unproven. The basic design of the NTP may reflect one tradition of consensus on effective practice, but the decision to stop at age 35 is unexplained and idiosyncratic. The full model is extremely costly to implement and relies heavily on EU funding. Even at current levels of funding, it is unlikely to be impacting on more than a relatively small minority of the target population. It is hard to see how it can become financially sustainable in the longer term. 
  • There is no detailed and convincing rationale for, or description of, how the model is being modified (into ‘Hungary-lite’) for European rollout. It is abundantly clear that this rollout will never attract commensurate funding and, compared with the NTP, it is currently being run ‘on a shoestring’. But, as currently envisaged, the rollout will require significant additional funding and the projected sources of this funding are unspecified. The more expensive the rollout becomes, the more unlikely it is to be financially sustainable. In short, the scalability to Europe of the modified Hungarian talent support model is highly questionable.
  • The shape and purpose of the overall European Talent initiative has changed substantively on several occasions during its short lifetime. There is only limited consistency between the goals being pursued now and those originally envisaged. There have been frequent changes to these goals along the way, several of them unexplained. It is not clear whether this is attributable to political opportunism and/or real confusion and disagreement within the initiative over what exactly it is seeking to achieve and how. There are frequent inconsistencies between different sources over exactly how aspects of the rollout are to be implemented. This causes confusion and calls into question the competence of those who are steering the process. Such ‘mission creep’ will radically reduce the chances of success.
  • The relationship with ECHA has always been problematic – and remains so. Fundamentally the European Talent Initiative is aiming to achieve what ECHA itself should have achieved, but failed. The suggestion that ECHA be given control over the accreditation of European Talent Centres is misguided. ECHA is a closed membership organisation rather than an open network and cannot be assumed to be representative of all those engaged in talent support throughout Europe. There is no reason why this process could not be managed by the network itself. In the longer term the continued co-existence of the Network and ECHA as separate entities becomes increasingly problematic. But any merger would demand radical reform of ECHA. Despite the injection of new blood into the ECHA Executive, the forces of conservatism within it remain strong and are unlikely to countenance such a radical step.
  • The progress achieved by the European Talent Centre during its relatively short existence has been less than impressive. That is partly attributable to the limited funding available and the fact that it is being operated on the margins of the Hungarian NTP. The funding it does attract comes with the expectation that it will be used to advertise the successes of the NTP abroad, so raising the status and profile of the domestic effort. There is a tension between this and the Centre’s principal role, which must be to drive the European rollout. 
  • The decision to move to a distributed model in which several European Talent Centres develop the network, rather than a centralised model driven by Budapest, is absolutely correct. (I was saying as much back in 2011.) However, the wider implications of this decision do not appear to have been thought through. I detect a worrying tendency to create bureaucracy for the sake of it, rather than focusing on getting things done.
  • Meanwhile, the Budapest Centre has made some headway with a Talent Map and a database of resources, but not nearly enough given the staffing and resource devoted to the task. The failure to deliver annual EU Conferences and Talent Days is conspicuous and worrying. Conversely, the effort expended on lobbying within the European Commission has clearly been considerable, though the tangible benefits secured from this exercise are, as yet, negligible.
  • For an initiative driven by networking, the quantity and quality of communication is poor. Independent evaluation studies of the Hungarian model do not seem to be available, at least not in English. There should be a fully costed draft specification for the European roll-out which is consulted upon openly and widely. Consultation seems confined currently to ECHA members which is neither inclusive nor representative. No opportunities are provided to challenge the direction of travel pursued by the initiative and its decision-making processes are not transparent. There is no evidence that it is willing to engage with critics or criticism of its preferred approach. The programme for the 2014 Conference does not suggest any marked shift in this respect.

An unkind critic might find sufficient evidence to level an accusation of talent support imperialism, albeit masked by a smokescreen of scientifically justified networkology.

I do not subscribe to that view, at least not yet. But I do conclude that the European Talent effort is faltering badly. It may limp on for several years to come, but it will never achieve its undoubted potential until the issues outlined above are properly and thoroughly addressed.

.

GP

March 2014

 

What Has Become of the European Talent Network? Part One

This post discusses recent progress by the European Talent Centre towards a European Talent Network.

It is a curtain-raiser for an imminent conference on this topic and poses the critical questions I would like to see addressed at that event.

It should serve as a briefing document for prospective delegates and other interested parties, especially those who want to dig beneath the invariably positive publicity surrounding the initiative.

It continues the narrative strand of posts I have devoted to the Network, concentrating principally on developments since my last contribution in December 2012.

 

The post is organised part thematically and part chronologically and covers the following ground:

  • An updated description of the Hungarian model for talent support and its increasingly complex infrastructure.
  • The origins of the European Talent project and how its scope and objectives have changed since its inception.
  • The project’s advocacy effort within the European Commission and its impact to date.
  • Progress on the European Talent Map and promised annual European Talent Days and conferences.
  • The current scope and effectiveness of the network, its support structures and funding.
  • Key issues and obstacles that need to be addressed.

To improve readability I have divided the text into two sections of broadly equivalent length. Part One is dedicated largely to bullets one to three above, while Part Two deals with bullets three to six.

Previous posts in this series

If I am to do justice to this complex narrative, I must necessarily draw to some extent on material I have already published in earlier posts. I apologise for the repetition, which I have tried to keep to a minimum.

On re-reading those earlier posts and comparing them with this, it is clear that my overall assessment of the EU talent project has shifted markedly since 2010, becoming progressively more troubled and pessimistic.

This seems to me justified by an objective assessment of progress, based exclusively on evidence in the public domain – evidence that I have tried to draw together in these posts.

However, I feel obliged to disclose the influence of personal frustration at this slow progress, as well as an increasing sense of personal exclusion from proceedings – which seems completely at odds with the networking principles on which the project is founded.

I have done my best to control this subjective influence in the assessment below, confining myself as far as possible to an objective interpretation of the facts.

However I refer you to my earlier posts if you wish to understand how I reached this point.

  • In April 2011 I attended the inaugural conference in Budapest, publishing a report on the proceedings and an analysis of the Declaration produced, plus an assessment of the Hungarian approach to talent support as it then was and its potential scalability to Europe as a whole.
  • In December 2012 I described the initial stages of EU lobbying, an ill-fated 2012 conference in Poland, the earliest activities of the European Talent Centre and the evolving relationship between the project and ECHA, the European Council for High Ability.

I will not otherwise comment on my personal involvement, other than to say that I do not expect to attend the upcoming Conference, judging that the benefits of attending would not outweigh the costs of doing so.

This post conveys more thoroughly and more accurately the points I would have wanted to make during the proceedings, were suitable opportunities provided to do so.

A brief demographic aside

It is important to provide some elementary information about Hungary’s demographics, to set in context the discussion below of its talent support model and the prospects for Europe-wide scalability.

Hungary is a medium-sized central European country with an area roughly one-third of the UK’s and broadly similar to South Korea or Portugal.

It has a population of around 9.88 million (2013) about a sixth of the size of the UK population and similar in size to Portugal’s or Sweden’s.

Hungary is the 16th most populous European country, accounting for about 1.4% of the total European population and about 2% of the total population of the European Union (EU).

It is divided into 7 regions and 19 counties, plus the capital, Budapest, which has a population of 1.7 million in its own right.

Regions of Hungary

Almost 84% of the population are ethnic Hungarians but there is a Roma minority estimated (some say underestimated) at 3.1% of the population.

Approximately 4 million Hungarians are aged below 35 and approximately 3.5 million are aged 5-34.

GDP per capita (purchasing power parity) is $19,497 (source: IMF), slightly over half the comparable UK figure.

The Hungarian Talent Support Model

The Hungarian model has grown bewilderingly complex and there is an array of material describing it, often in slightly different terms.

Some of the English language material is not well translated and there are gaps that can be filled only with recourse to documents in Hungarian (which I can only access through online translation tools).

Much of this documentation is devoted to publicising the model as an example of best practice, so it can be somewhat economical with the truth.

The basic framework is helpfully illustrated by this diagram, which appeared in a presentation dating from October 2012.

EU talent funding

 .

It shows how the overall Hungarian National Talent Programme (NTP) comprises a series of time-limited projects paid for by the EU Social Fund, but also a parallel set of activities supported by a National Talent Fund which is fed mainly by the Hungarian taxpayer.

The following sections begin by outlining the NTP, as described in a Parliamentary Resolution dating from 2008.

Secondly, they describe the supporting infrastructure for the NTP as it exists today.

Thirdly, they outline the key features of the time-limited projects: The Hungarian Genius Programme (HGP) (2009-13) and the Talent Bridges Programme (TBP) (2012-14).

Finally, they try to make sense of the incomplete and sometimes conflicting information about the funding allocated to different elements of the NTP.

Throughout this treatment my principal purpose is to show how the European Talent project fits into the overall Hungarian plan, as precursor to a closer analysis of the former in the second half of the post.

I also want to show how the direction of the NTP has shifted since its inception.

 .

The National Talent Programme (NTP) (2008-2028)

The subsections below describe the NTP as envisaged in the original 2008 Parliamentary Resolution. This remains the most thorough exposition of the broader direction of travel that I could find.

Governing principles

The framework set out in the Resolution is built on ten general principles that I can best summarise as follows:

  • Talent support covers the period from early childhood to age 35, so extends well beyond compulsory education.
  • The NTP must preserve the traditions of existing successful talent support initiatives.
  • Talent is complex and so requires a diversity of provision – standardised support is a false economy.
  • There must be equality of access to talent support by geographical area, ethnic and socio-economic background.
  • Continuity is necessary to support individual talents as they change and develop over time; special attention is required at key transition points.
  • In early childhood one must provide opportunities for talent to emerge, but selection on the basis of commitment and motivation becomes increasingly significant and older participants increasingly self-select.
  • Differentiated support is needed to support different levels of talent; there must be opportunities to progress and to step off the programme without loss of esteem.
  • In return for talent support, the talented individual has a social responsibility to support talent development in others.
  • Those engaged in talent support – here called talent coaches – need time and support.
  • Wider social support for talent development is essential to success and sustainability.

Hence the Hungarians are focused on a system-wide effort to promote talent development that extends well beyond compulsory education, but only up to the age of 35. As noted above, if 0-4 year-olds are excluded, this represents an eligible population of about 3.5 million people.

The choice of this age 35 cut-off seems rather arbitrary. Having decided to push beyond compulsory education into adult provision, it is not clear why the principle of lifelong learning is then set aside – or exactly what happens when participants reach their 36th birthdays.

Otherwise the principles above seem laudable and broadly reflect one tradition of effective practice in the field.

Goals

The NTP’s goals are illustrated by this diagram.

NTP goals

 .

The elements in the lower half of the diagram can be expanded thus:

  • Talent support traditions: support for existing provision; development of new provision to fill gaps; minimum standards and professional development for providers; applying models of best practice; co-operation with ethnic Hungarian programmes outside Hungary (‘cross border programmes’); and ‘systematic exploration and processing of the talent support experiences’ of EU and other countries which excel in this field. 
  • Integrated programmes: compiling and updating a map of the talent support opportunities available in Hungary as well as ‘cross border programmes’; action to support access to the talent map; a ‘detailed survey of the international talent support practice’; networking between providers with cooperation and collaboration managed through a set of talent support councils; monitoring of engagement to secure continuity and minimise drop-out. 
  • Social responsibility: promoting the self-organisation of talented youth;  developing their innovation and management skills; securing counselling; piloting  a ‘Talent Bonus – Talent Coin’ scheme to record in virtual units the monetary value of support received and provided, leading to consideration of a LETS-type scheme; support for ‘exceptionally talented youth’; improved social integration of talented youth and development of a talent-friendly society. 
  • Equal opportunities: providing targeted information about talent support opportunities; targeted programming for disadvantaged, Roma and disabled people and wider emphasis on integration; supporting the development of Roma talent coaches; and action to secure ‘the desirable gender distribution’. 
  • Enhanced recognition: improving financial support for talent coaches; reducing workload and providing counselling for coaches; improving recognition and celebrating the success of coaches and others engaged in talent support. 
  • Talent-friendly society: awareness-raising activity for parents, family and friends of talented youth; periodic talent days to mobilise support and ‘promote the local utilisation of talent’; promoting talent in the media, as well as international communication about the programme and ‘introduction in both the EU and other countries by exploiting the opportunities provided by Hungary’s EU Presidency in 2011’; ‘preparation for the foreign adaptation of the successful talent support initiatives’ and organisation of EU talent days. 

Hence the goals incorporate a process of learning from European and other international experience, but also one of feeding back to the international community information about the Hungarian talent support effort and extending the model into other European countries.

There is an obvious tension in these goals between preserving the traditions of existing successful initiatives and imposing a framework with minimum standards and built-in quality criteria. This applies equally to the European project discussed below.

The reference to a LETS-type scheme is intriguing but I could trace nothing about its subsequent development.

 .

Planned Infrastructure

In 2008 the infrastructure proposed to undertake the NTP comprised:

  • A National Talent Co-ordination Board, chaired at Ministerial level, to oversee the programme and to allocate a National Talent Fund (see below).
  • A National Talent Support Circle [I’m not sure whether this should be ‘Council’] consisting of individuals from Hungary and abroad who would promote talent support through professional opportunities, financial contribution or ‘social capital opportunities’.
  • A National Talent Fund comprising a Government contribution and voluntary contributions from elsewhere. The former would include the proceeds of a 1% voluntary income tax levy (being one of the good causes towards which Hungarian taxpayers could direct this contribution). Additional financial support would come from ‘the talent support-related programmes of the New Hungary Development Plan’.
  • A system of Talent Support Councils to co-ordinate activity at regional and local level.
  • A national network of Talent Points – providers of talent support activity.
  • A biennial review of the programme presented to Parliament, the first being in 2011.

Presumably there have been two of these biennial reviews to date. They would make interesting reading, but I could find no material in English that describes the outcomes.

The NTP Infrastructure Today

The supporting infrastructure as described today has grown considerably more complex and bureaucratic than the basic model above.

  • The National Talent Co-ordination Board continues to oversee the programme as a whole. Its membership is set out here.
  • The National Talent Support Council was established in 2006 and devised the NTP as set out above. Its functions are more substantial than originally described (assuming this is the ‘Circle’ mentioned in the Resolution), although it now seems to be devolving some of these. Until recently at least, the Council: oversaw the national database of talent support initiatives and monitored coverage, matching demand – via an electronic mailing list – with the supply of opportunities; initiated and promoted regional talent days; supported the network of talent points and promoted the development of new ones; invited tenders for niche programmes of various kinds; collected and analysed evidence of best practice and the research literature; and promoted international links paying ‘special attention to the reinforcement of the EU contacts’. The Council has a Chair and six Vice Presidents as well as a Secretary and Secretariat. It operates nine committees: Higher Education, Support for Socially Disadvantaged Gifted People, Innovations, Public Education, Foreign Relations, Public and Media Relations, Theory of Giftedness, Training and Education and Giftedness Network.
  • The National Talent Point has only recently been identified as an entity in its own right, distinct from the National Council. Its role is to maintain the Talent Map and manage the underpinning database. Essentially it seems to have acquired the Council’s responsibilities for delivery, leaving the Council to concentrate on policy. It recently acquired a new website.
  • The Association of Hungarian Talent Support Organizations (MATEHETZ) is also a new addition. Described as ‘a non-profit umbrella organization that legally represents its members and the National Talent Support Council’, it is funded by the National Council and through membership fees. The Articles of Association date from February 2010 and list 10 founding organisations. The Association provides ‘representation’ for the National Council (which I take to mean the membership). It manages the time-limited programmes (see below) as well as the National Talent Point and the European Talent Centre.
  • Talent Support Councils: Different numbers of these are reported. One source says 76; another 65, of which some 25% were newly-established through the programme. Their role seems broadly unchanged, involving local and regional co-ordination, support for professionals, assistance to develop new activities, helping match supply with demand and supporting the tracking of those with talent.
  • Talent Point Network: there were over 1,000 talent points by the end of 2013. (Assuming 3.5 million potential participants, that is a talent point for every 3,500 people.) Talent points are providers of talent support services – whether identification, provision or counselling. They are operated by education providers, the church and a range of other organisations and may have a local, regional or national reach. They join the network voluntarily but are accredited. In 2011 there were reportedly 400 talent points and 200 related initiatives, so there has been strong growth over the past two years.
  • Ambassadors of Talent: Another new addition, introduced by the National Talent Support Council in 2011. There is a separate Ambassador Electing Council which appoints three new ambassadors per year. The current list has thirteen entries and is markedly eclectic.
  • Friends of Talent Club: described in 2011 as ‘a voluntary organisation that holds together those, who are able and willing to support talents voluntarily and serve the issue of talent support…Among them, there are mentors, counsellors and educators, who voluntarily help talented people develop in their professional life. The members of the club can be patrons and/or supporters. “Patrons” are those, who voluntarily support talents with a considerable amount of service. “Supporters” are those, who voluntarily support the movement of talent support with a lesser amount of voluntary work, by mobilizing their contacts or in any other way.’ This sounds similar to the originally envisioned ‘National Talent Support Circle’ [sic]. I could find little more about the activities of this branch of the structure.
  • The European Talent Centre: The National Talent Point says that this:

‘…supports and coordinates European actions in the field of talent support in order to find gifted people and develop their talent in the interest of Europe as a whole and the member states.’

Altogether this is a substantial endeavour requiring large numbers of staff and volunteers and demanding a significant budgetary topslice.

I could find no reliable estimate of the ratio of the running cost to the direct investment in talent support, but there must be cause to question the overall efficiency of the system.

My hunch is that this level of bureaucracy must consume a significant proportion of the overall budget.

Clearly the Hungarian talent support network is a long, long way from being financially self-sustaining, if indeed it ever could be.

 .


Hungarian Parliament Building

.

The Hungarian Genius Programme (HGP) (2009-13)

Launched in June 2009, the HGP had two principal phases lasting from 2009 to 2011 and from 2011 to 2013. The fundamental purpose was to establish the framework and infrastructure set out in the National Talent Plan.

This English language brochure was published in 2011. It explains that the initial focus is on adults who support talents, establishing a professional network and training experts, as well as creating the network and map of providers.

It mentions that training courses lasting 10 to 30 hours have been developed and accredited in over 80 subjects to:

‘…bring concepts and methods of gifted and talented education into the mainstream and reinforce the professional talent support work… These involve the exchange of experience and knowledge expansion training, as well as programs for those who deal with talented people in developing communities, and awareness-raising courses aimed at the families and environment of young pupils, on the educational, emotional and social needs of children showing special interest and aptitude in one or more subject(s). The aims of the courses are not only the exchange of information but to produce and develop the professional methodology required for teaching talents.’

The brochure also describes an extensive talent survey undertaken in 2010, the publication of several good practice studies and the development of a Talent Loan modelled on the Hungarian student loan scheme.

It lists a seven-strong strategic management group including an expert adviser, project manager, programme co-ordinator and a finance manager. There are also five operational teams, each led by a named manager, one of which focused on ‘international relations: collecting and disseminating international best practices; international networking’.

A subsequent list of programme outputs says:

  • 24,000 new talents were identified
  • The Talent Map was drawn and the Talent Network created (including 867 talent points and 76 talent councils).
  • 23,500 young people took part in ‘subsidised talent support programmes’
  • 118 new ‘local educational talent programmes’ were established
  • 25 professional development publications were written and made freely available
  • 13,987 teachers (about 10% of the total in Hungary) took part in professional development.

Evidence in English of rigorous independent evaluation is, however, limited:

‘The efficiency of the Programme has been confirmed by public opinion polls (increased social acceptance of talent support) and impact assessments (training events: expansion of specialised knowledge and of the methodological tool kit).’

 .

The Talent Bridges Project (TBP) (2012-2014)

TBP began in November 2012 and is scheduled to last until ‘mid-2014’.

The initially parallel TBP is mentioned in the 2011 brochure referenced above:

‘In the strategic plan of the Talent Bridges Program to begin in 2012, we have identified three key areas for action: bridging the gaps in the Talent Point network, encouraging talents in taking part in social responsibility issues and increasing media reach. In order to become sustainable, much attention should be payed [sic] to maintaining and expanding the support structure of this system, but the focus will significantly shift towards direct talent care work with the youth.’

Later on it says:

‘Within the framework of the Talent Bridges Program the main objectives are: to further improve the contact system between the different levels of talent support organisations; to develop talent peer communities based on the initiatives coming from young people themselves; to engage talents in taking an active role in social responsibility; to increase media reach in order to enhance the recognition and social support for both high achievers and talent support; and last, but not least, to arrange the preliminary steps of setting up an EU Institute of Talent Support in Budapest.’

A list of objectives published subsequently contains the following items:

  • Creating a national talent registration and tracking system
  • Developing programmes for 3,000 talented young people from disadvantaged backgrounds and with special educational needs
  • Supporting the development of ‘outstanding talents’ in 500 young people
  • Supporting 500 enrichment programmes
  • Supporting ‘the peer age groups of talented young people’
  • Introducing programmes to strengthen interaction between parents, teachers and talented youth, benefiting 5,000 young people
  • Introducing ‘a Talent Marketplace’ to support ‘the direct social utilisation of talent’ involving ‘150 controlled co-operations’
  • Engaging 2,000 mentors in supporting talented young people and training 5,000 talent support facilitators and mentors
  • Launching a communication campaign to reach 100,000 young people and
  • Realising European Union-wide communication (involving 10 more EU Member States in the Hungarian initiatives, in addition to the current 10, in co-operation with the European Talent Centre in Budapest established in the summer of 2012).

Various sources describe how the TBP is carved up into a series of sub-projects. The 2013 Brochure ‘Towards a European Talent Support Network’ lists 14 of these, but none mention the European work.

However, what appears to be the bid for TBP (in Hungarian) calls the final sub-project ‘an EU Communications Programme’ (p29), which appears to involve:

  • Raising international awareness of Hungary’s talent support activities
  • Strengthening Hungary’s position in the EU talent network
  • Providing a foreign exchange experience for talented young Hungarians
  • Influencing policy makers.

Later on (p52) this document refers to an international campaign, undertaken with support from the European Talent Centre, targeting international organisations and the majority of EU states.

Work to be covered includes the preparation of promotional publications in foreign languages, the operation of a ‘multilingual online platform’, participation in international conferences (such as those of ECHA, the World Council, IRATDE and ICIE); and ‘establishing new professional collaborations with at least 10 new EU countries or international organisations’.

Funding

It is not a straightforward matter to reconcile the diverse and sometimes conflicting sources of information about the budgets allocated to the National Talent Fund, HGP and the TBP, but this is my best effort, with all figures converted into pounds sterling.

 .

|                   | 2009  | 2010            | 2011            | 2012     | 2013    | 2014    | Total    |
|-------------------|-------|-----------------|-----------------|----------|---------|---------|----------|
| NTF               | x     | £2.34m or £4.1m | £2.34m or £4.1m | £8.27m   | tbc     | tbc     | tbc      |
| Of which ETC      | x     | x               | x               | £80,000  | £37,500 | £21,350 | £138,850 |
| HGP               | £8.0m | £4.6m           | x               |          |         |         | £12.6m   |
| TBP               | x     | x               | x               | £5.3m    |         |         | £5.3m    |
| Of which EU comms | x     | x               | x               | £182,000 |         |         | £182,000 |

Several sources say that the Talent Fund is set to increase in size over the period.

‘This fund has an annual 5 million EUR support from the national budget and an additional amount from tax donations of the citizens of a total sum of 1.5 million EUR in the first year doubled to 3 million EUR and 6 million EUR in the second and third years respectively.’ (Csermely 2012)

That would translate into a budget of £5.4m/£6.7m/£9.2m over the three years in question, but it is not quite clear which three years are included.

Even if we assume that the NTF budget remains the same in 2013 and 2014 as in 2012, the total investment over the period 2009-2014 amounts to approximately £60m.

That works out at about £17 per eligible Hungarian. Unfortunately I could find no reliable estimate of the total number of Hungarians that have benefited directly from the initiative to date.
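For transparency, here is a minimal sketch of the back-of-envelope arithmetic behind these figures. The euro-to-sterling rate of 0.835 is my working assumption (implied by the converted figures rather than stated in any source), and the 3.5 million eligible population is the same working assumption used earlier in this post.

```python
# Back-of-envelope reproduction of the budget arithmetic above.
# Assumed exchange rate (mine, not from any source): EUR 1 = GBP 0.835.

EUR_TO_GBP = 0.835

# Csermely (2012): EUR 5m p.a. from the national budget, plus tax donations
# of EUR 1.5m, 3m and 6m in the three years in question.
annual_totals_eur = [5_000_000 + donation
                     for donation in (1_500_000, 3_000_000, 6_000_000)]
annual_totals_gbp_m = [round(x * EUR_TO_GBP / 1e6, 1) for x in annual_totals_eur]
print(annual_totals_gbp_m)      # -> [5.4, 6.7, 9.2], the £m figures quoted above

# Per-capita estimate: ~£60m total investment over 2009-2014,
# divided by an assumed eligible population of ~3.5 million.
total_investment_gbp = 60_000_000
eligible_population = 3_500_000
print(round(total_investment_gbp / eligible_population))  # -> 17, i.e. about £17 a head
```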

On the basis of the figures I have seen, my guesstimate is that the total will be below 10% of the total eligible population – so under 350,000. But I must stress that there is no evidence to support this.

Whether or not the intention is to reach 100% of the population, or whether there is an in-built assumption that only a proportion of the population are amenable to talent development, is a moot point. I found occasional references to a 25% assumption, but it was never clear whether this was official policy.

Even if this applies, there is clearly a significant scalability challenge even within Hungary’s national programme.

It is also evident that the Hungarians have received some £18m from the European Social Fund over the past five years and have invested at least twice as much of their own money. That is a very significant budget indeed for a country of this size.

Hungary’s reliance on EU funding is so heavy that it will find it very difficult to sustain the current effort if that largesse disappears.

One imagines that they will be seeking continued support from EU sources over the period 2014-2020. But, equally, one would expect the EU to demand robust evidence that continued heavy dependency on EU funding will not be required.

And of course a budget of this size also raises questions about scalability to Europe, given the conspicuous absence of any commensurate figure at European level. There is zero prospect of equivalent funding being available to extend the model across Europe. The total bill would run into billions of pounds!

A ‘Hungarian-lite’ model would not be as expensive, but it would require a considerable budget.

However, it is clear from the table that the present level of expenditure on the European network has been tiny by comparison with the domestic investment – probably not much more than £100,000 per year.

Initially this came from the National Talent Fund budget but it seems as though the bulk is now provided through the ESF, until mid-2014 at least.

This shift seems to have removed the necessity for the European Talent Centre to receive its funding in six-monthly tranches through a perpetual retendering process, since the sums expended from the NTF budget are apparently tied to periods of six months or less.

The European Talent Centre website currently bears the legend:

‘Operation of the European Talent Centre – Budapest between 15th December 2012 and 30th June 2013 is realised with the support of Grant Scheme No. NTP-EUT-M-12 announced by the Institute for Educational Research and Development and the Human Resources Support Manager on commission of the Ministry of Human Resources “To support international experience exchange serving the objectives of the National Talent Programme, and to promote the operation and strategic further development of the European Talent Centre – Budapest”.’

But when I wrote my 2012 review it said:

‘The operation of the European Talent Centre — Budapest is supported from 1 July 2012 through 30 November 2012 by the grant of the National Talent Fund. The grant is realised under Grant Scheme No. NTP-EU-M-12 announced by the Hungarian Institute for Educational Research and Development and the Sándor Wekerle Fund Manager of the Ministry of Administration and Justice on commission of the Ministry of Human Resources, from the Training Fund Segment of the Labour Market Fund.’

A press release confirmed the funding for this period as HUF 30m.

Presumably it will now need to be amended to reflect the arrival of £21.3K under Grant Scheme No. NTP-EU-M-13 – and possibly to reflect income from the ESF-supported TBP too.

A comparison between the Hungarian http://tehetseg.hu/ website and the European Talent Centre website is illustrative of the huge funding imbalance in favour of the former.


Danube Bend at Visegrad courtesy of Phillipp Weigell

.

Origins of the European Talent Project: Evolution to December 2012

Initial plans

Hungary identified talent support as a focus during its EU Presidency, in the first half of 2011, citing four objectives:

  • A talent support conference scheduled for April 2011
  • A first European Talent Day to coincide with the conference, initially ‘a Hungarian state initiative…expanding it into a public initiative by 2014’.
  • Talent support to feature in EU strategies and documents, as well as a Non-Legislative Act (NLA). It is not specified whether this should be a regulation, decision, recommendation or opinion. (Under EU legislation the two latter categories have no binding force.)
  • An OMC expert group on talent support – ie an international group run under the aegis of the Commission.

The Budapest Declaration

The Conference duly took place, producing a Budapest Declaration on Talent Support in which conference participants:

  • ‘Call the European Commission and the European Parliament to make every effort to officially declare the 25th of March the European Day of the Talented and Gifted.’
  • ‘Stress the importance of…benefits and best practices appearing in documents of the European Commission, the European Council and the European Parliament.’
  • ‘Propose to establish a European Talent Resource and Support Centre in Budapest’ to ‘coordinate joint European actions in the field’.
  • ‘Agree to invite stakeholders from every country of the European Union to convene annually to discuss the developments and current questions in talent support. Upon the invitation of the Government of Poland the next conference will take place in Warsaw.’

The possibility of siting a European Centre anywhere other than Budapest was not seriously debated.

 .

Evolution of a Written Declaration to the EU

Following the Conference an outline Draft Resolution of the European Parliament was circulated for comment.

This proposed that:

 ‘A Europe-wide talent support network should be formed and supported with an on-line and physical presence to support information-sharing, partnership and collaborations. This network should be open for co-operation with all European talent support efforts, use the expertise and networking experiences of existing multinational bodies such as the European Council of High Ability and support both national and multinational efforts to help talents not duplicating existing efforts but providing an added European value.’

Moreover, ‘A European Talent Support Centre should be established…in Budapest’. This:

‘…should have an Advisory Board having the representatives of interested EU member states, all-European talent support-related institutions as well as key figures of European talent support.’

The Centre’s functions are five-fold:

‘Using the minimum bureaucracy and maximising its use of online solutions the European Talent Support Centre should:

  • facilitate the development and dissemination of best curricular and extra-curricular talent support practices;
  • coordinate the trans-national cooperation of Talent Points forming an EU Talent Point network;
  • help  the spread of the know-how of successful organization of Talent Days;
  • organize annual EU talent support conferences in different EU member states overseeing the progress of cooperation in European talent support;
  • provide a continuously updated easy Internet access for all the above information.’

Note the references on the one hand to an inclusive approach, a substantial advisory group (though without the status of an EU-hosted OMC expert group) and a facilitating/co-ordinating role, but also – on the other hand – the direct organisation of annual EU-wide conferences and provision of a sophisticated supporting online environment.

MEPs were lined up to submit the Resolution in Autumn 2011 but, for whatever reason, this did not happen.

Instead a new draft Written Declaration was circulated in January 2012. This called on:

  • ‘Member States to consider measures helping curricular and extracurricular forms of talent support including the training of educational professionals to recognize and help talent;
  • The Commission to consider talent support as a priority of future European strategies, such as the European Research Area and the European Social Fund;
  • Member States and the Commission to support the development of a Europe-wide talent support network, formed by talent support communities, Talent Points and European Talent Centres facilitating cooperation, development and dissemination of best talent support practices;
  • Member States and the Commission to celebrate the European Day of the Talented and Gifted.’

The focus has shifted from the Budapest-centric network to EU-led activity amongst member states collectively. Indeed, no specific role for Hungary is mentioned.

There is a new emphasis on professional development and – critically – a reference to ‘European talent centres’. All mention of NLAs and OMC expert groups has disappeared.

There followed an unexplained 11-month delay before a Final Written Declaration was submitted by four MEPs in November 2012.

 .

The 2012 Written Declaration 

There are some subtle adjustments in the final version of WD 0034/2012. The second bullet point has become:

  • ‘The Commission to consider talent support as part of ‘non-formal learning’ and a priority in future European strategies, such as the strategies guiding the European Research Area and the European Social Fund’.

While the third now says:

  • ‘Member States and the Commission to support the development of a Europe-wide talent support network bringing together talent support communities, Talent Points and European Talent Centres in order to facilitate cooperation and the development and dissemination of the best talent support practices.’

And the fourth is revised to:

  • ‘Member States and the Commission to celebrate the European Day of Highly Able People.’

The introduction of a phrase that distinguishes between education and talent support is curious.

CEDEFOP – which operates a European Inventory on Validation of Non-formal and Informal Learning – defines the latter as:

‘…learning resulting from daily work-related, family or leisure activities. It is not organised or structured (in terms of objectives, time or learning support). Informal learning is in most cases unintentional from the learner’s perspective. It typically does not lead to certification.’

One assumes that a distinction is being attempted between learning organised by a school or other formal education setting and that which takes place elsewhere – presumably because EU member states are so fiercely protective of their independence when it comes to compulsory education.

But surely talent support encompasses formal and informal learning alike?

Moreover, the adoption of this terminology appears to rule out any provision that is ‘organised or structured’, excluding huge swathes of activity (including much of that featured in the Hungarian programme). Surely this cannot have been intentional.

Such a distinction is increasingly anachronistic, especially in the case of gifted learners, who might be expected to access their learning from a far richer blend of sources than simply in-school classroom teaching.

Their schools are no longer the sole providers of gifted education, but facilitators and co-ordinators of diverse learning streams.

The ‘gifted and talented’ terminology has also disappeared, presumably on the grounds that it would risk frightening the EU horses.

Both of these adjustments seem to have been a temporary aberration. One wonders who exactly they were designed to accommodate and whether they were really necessary.

 .

Establishment and early activity of the EU Talent Centre in Budapest

The Budapest centre was initially scheduled to launch in February 2012, but funding issues delayed this, first until May and then the end of June.

The press release marking the launch described the long-term goal of the Centre as:

‘…to contribute on the basis of the success of the Hungarian co-operation model to organising the European talent support actors into an open and flexible network overarching the countries of Europe.’

Its mission is to:

‘…offer the organisations and individuals active in an isolated, latent form or in a minor network a framework structure and an opportunity to work together to achieve the following:

  • to provide talent support an emphasis commensurate with its importance in every European country
  • to reduce talent loss to the minimum in Europe,
  • to give talent support a priority role in the transformation of the sector of education; to provide talented young persons access to the most adequate forms of education in every Member State,
  • to make Europe attractive for the talented youth,
  • to create talent-friendly societies in every European country.’

The text continues:

‘It is particularly important that network hubs setting targets similar to those of the European Talent Centre in Budapest should proliferate in the longer term.

The first six months represent the first phase of the work: we shall lay the bases [sic] for establishing the European Talent Support Network. The expected key result is to set up a team of voluntary experts from all over Europe who will contribute to that work and help draw the European talent map.’

But what exactly are these so-called network hubs? We had to wait some time for an explanation.

There was relatively little material on the website at this stage and this was also slow to change.

My December 2012 post summarised progress thus:

‘The Talent Map includes only a handful of links, none in the UK.

The page of useful links is extensive but basically just a very long list, hard to navigate and not very user-friendly. Conversely, ‘best practices’ contains only three resources, all of them produced in house.

The whole design is rather complex and cluttered, several of the pages are too text-heavy and occasionally the English leaves something to be desired.’

 

Here ends the first part of this post. Part Two explains the subsequent development of the ‘network hubs’ concept, charts the continuation of the advocacy effort and reviews progress in delivering the services for which the Budapest Centre is responsible.

It concludes with an overall assessment of the initiative highlighting some of its key weaknesses.

GP

March 2014

How Well Does Gifted Education Use Social Media?

.

This post reviews the scope and quality of gifted education coverage across selected social media.

It uses this evidence base to reflect on progress in the 18 months since I last visited this topic and to establish a benchmark against which to judge future progress.

tree-240470_640More specifically, it:

  • Proposes two sets of quality criteria – one for blogs and other websites, the other for effective use of social media;
  • Reviews gifted education-related social media activity:

By a sample of six key players  – the World Council (WCGTC) and the European Council for High Ability (ECHA), NAGC and SENG in the United States and NACE and Potential Plus UK over here

Across the Blogosphere and five of the most influential English language social media platforms – Facebook, Google+, LinkedIn, Twitter and You Tube and

Utilising four content curation tools particularly favoured by gifted educators, namely PaperLi, Pinterest, ScoopIt and Storify.

  • Considers the gap between current practice and the proposed quality criteria – and whether there has been an improvement in the application of social media across the five dimensions of gifted education identified in my previous post.

I should declare at the outset that I am a Trustee of Potential Plus UK and have been working with them to improve their online and social media presence. This post lies outside that project, but some of the underlying research is the same.

.

I have been this way before

This is my second excursion into this territory.

In September 2012 I published a two-part response to the question ‘Can Social Media Help Overcome the Problems We Face in Gifted Education?’

  • Part One outlined an analytical framework based on five dimensions of gifted education. Each dimension is stereotypically associated with a particular stakeholder group though, in reality, each group operates across more than one area. The dimensions (with their associated stakeholder groups in brackets) are: advocacy (parents); learning (learners); policy-making (policy makers); professional development (educators); and research (academics).
  • Part Two used this framework to review the challenges faced by gifted education, to what extent these were being addressed through social media and how social media could be applied more effectively to tackle them. It also outlined the limitations of a social media-driven approach and highlighted some barriers to progress.

The conclusions I reached might be summarised as follows:

  • Many of the problems associated with gifted education are longstanding and significant, but not insurmountable. Social media will not eradicate these problems but can make a valuable contribution towards that end by virtue of their unrivalled capacity to ‘only connect’.
  • Gifted education needs to adapt if it is to thrive in a globalised environment with an increasingly significant online dimension driven by a proliferation of social media. The transition from early adoption to mainstream practice has not yet been effected, but rapid acceleration is necessary otherwise gifted education will be left behind.
  • Gifted education is potentially well-placed to pioneer new developments in social media but there is limited awareness of this opportunity, or the benefits it could bring.

The post was intended to inform discussion at a Symposium at the ECHA Conference in Munster, Germany in September 2012. I published the participants’ presentations and a report on proceedings (which is embedded within a review of the Conference as a whole).

.

Defining quality

I have not previously attempted to pin down what constitutes a high quality website or blog and effective social media usage, not least because so many have gone before me.

But, on reviewing their efforts, I could find none that embodied every dimension I considered important, while several appeared unduly restrictive.

It seems virtually impossible to reconcile these two conflicting pressures: defining quality with brevity while preserving flexibility. Any effort to pin down quality risks reductionism while also fettering innovation and wilfully obstructing the pioneering spirit.

I am a strong advocate of quality standards in gifted education but, in this context, it seemed beyond my capacity to find or generate the ideal ‘flexible framework’, offering clear guidance without compromising innovation and capacity to respond to widely varying needs and circumstances.

But the project for Potential Plus UK required us to consult stakeholders on their understanding of quality provision, so that we could reconcile any difference between their perceptions and our own.

And, in order to consult effectively, we needed to make a decent stab at the task ourselves.

So I prepared some draft success criteria, drawing on previous efforts I could find online as well as my own experience over the last four years.

I have reproduced the draft criteria below, with slight amendment to make them more universally applicable. The first set – for a blog or website – are generic, while those relating to wider online and social media presence are made specific to gifted education.

.

Draft Quality Criteria for a Blog or Website

1. The site is inviting to regular and new readers alike; its purpose is up front and explicit; as much content as possible is accessible to all.

2. Readers are encouraged to interact with the content through a variety of routes – and to contribute their own (moderated) content.

3. The structure is logical and as simple as possible, supported by clear signposting and search.

4. The design is contemporary, visually attractive but not obtrusive, incorporating consistent branding and a complementary colour scheme. There is no external advertising.

5. The layout makes generous and judicious use of space and images – and employs other media where appropriate.

6. Text is presented in small blocks and large fonts to ensure readability on both tablet and PC.

7. Content is substantial, diverse and includes material relevant to all the site’s key audiences.

8. New content is added weekly; older material is frequently archived (but remains accessible).

9. The site links consistently to – and is linked to consistently by – all other online and social media outlets maintained by the authors.

10. Readers can access site content by multiple routes, including other social media, RSS and email.

.

Draft quality criteria for wider online/social media activity

1. A body’s online and social media presence should be integral to its wider communications strategy which should, in turn, support its purpose, objectives and priorities.

2. It should:

a. Support existing users – whether they are learners, parents/carers, educators, policy-makers or academics – and help to attract new users;

b. Raise the entity’s profile and build its reputation – both nationally and internationally – as a first-rate provider in one or more of the five areas of gifted education;

c. Raise the profile of gifted education as an issue and support campaigning for stronger provision;

d. Help to generate income to support the pursuit of these objectives and the body’s continued existence.

3. It should aim to:

a. Provide a consistently higher quality and more compelling service than its main competitors, generating maximum benefit for minimum cost.

b. Use social media to strengthen interaction with and between users and provide more effective ‘bottom-up’ collaborative support.

c. Balance diversity and reach against manageability and effectiveness, prioritising media favoured by users but resisting pressure to diversify without justification and resource.

d. Keep the body’s online presence coherent and uncomplicated, with clear and consistent signposting so users can navigate quickly and easily between different online locations.

e. Integrate all elements of the body’s online presence, ensuring they are mutually supportive.

4. It should monitor carefully the preferences of users, as well as the development of online and social media services, adjusting the approach only when there is a proven business case for doing so.

.


Perth Pelicans by Gifted Phoenix

.

Applying the Criteria

These draft criteria reflect the compromise I outlined above. They are not the final word. I hope that you will help us to refine them as part of the consultation process now underway and I cannot emphasise too much that they are intended as guidelines, to be applied with some discretion.

I continue to maintain my inalienable right – as well as yours – to break any rules imposed by self-appointed arbiters of quality.

To give an example, readers will know that I am particularly exercised by any suggestion that good blog posts are, by definition, brief!

I also maintain your inalienable right to impose your own personal tastes and preferences alongside (or in place of) these criteria. But you might prefer to do so having reflected on the criteria – and having dismissed them for logical reasons.

There are also some fairly obvious limitations to these criteria.

For example, bloggers like me who use hosted platforms are constrained to some extent by the restrictions imposed by the host, as well as by our preparedness to pay for premium features.

Moreover, the elements of effective online and social media practice have been developed with a not-for-profit charity in mind and some in particular may not apply – or may not apply so rigorously – to other kinds of organisations, or to individuals engaged in similar activity.

In short, these are not templates to be followed slavishly, but rather a basis for reviewing existing provision and prompting discussion about how it might be further improved.

It would be forward of me to attempt a rigorous scrutiny against each of the criteria of the six key players mentioned above, or of any of the host of smaller players, including the 36 active gifted education blogs now listed on my blogroll.

I will confine myself instead to reporting factually all that I can find in the public domain about the activity of the six bodies, comparing and contrasting their approaches with broad reference to the criteria and arriving at an overall impressionistic judgement.

As for the blogs, I will be even more tactful, pointing out that my own quick and dirty self-review of this one – allocating a score out of ten for each of the ten items in the first set of criteria – generated a not very impressive 62%.
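To show how that percentage was arrived at, here is a minimal sketch of the self-review method: a mark out of ten against each of the ten criteria, converted to a percentage. The individual scores below are illustrative placeholders (chosen only so that they reproduce the 62% total), not my actual marks.

```python
# Quick-and-dirty self-review: a mark out of 10 against each of the ten
# blog/website criteria, expressed as an overall percentage.
# The scores below are hypothetical placeholders, not the actual marks.

criteria_scores = {
    "inviting and explicit purpose": 7,
    "reader interaction":            5,
    "logical structure":             7,
    "contemporary design":           6,
    "use of space and images":       6,
    "readable text blocks":          5,
    "substantial, diverse content":  8,
    "fresh content and archiving":   7,
    "cross-linking of outlets":      5,
    "multiple access routes":        6,
}

percentage = 100 * sum(criteria_scores.values()) / (10 * len(criteria_scores))
print(f"{percentage:.0f}%")   # the same method produced the 62% mentioned above
```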

Of course I am biased. I still think my blog is better than yours, but now I have some useful pointers to how I might make it even better!

.

Comparing six major players

I wanted to compare the social media profile of the most prominent international organisations, the most active national organisations based in the US (which remains the dominant country in gifted education and in supporting gifted education online) and the two major national organisations in the UK.

I could have widened my reach to include many similar organisations around the world but that would have made this post still more inaccessible. It also struck me that I could evidence my key messages by analysis of this small sample alone – and that my conclusions would be equally applicable to others in the field, wherever they are located geographically.

My analysis focuses on these organisations’:

  • Principal websites, including any information they contain about their wider online and social media activity;
  • Profile across the five selected social media platforms and use of blogs plus the four featured curational tools.

I have confined myself to universally accessible material, since several of these organisations have additional material available only to their memberships.

I have included only what I understand to be official channels, tied explicitly to the main organisation. I have included accounts that are linked to franchised operations – typically conferences – but have excluded personal accounts that belong to individual employees or trustees of the organisations in question.

Table 1 below shows which of the six organisations are using which social media. The table includes hyperlinks to the principal accounts and I have also repeated these in the commentary that follows.

.

Table 1: The social media used by the sample of six organisations

|           | WCGTC | ECHA | SENG  | NAGC | PPUK | NACE |
|-----------|-------|------|-------|------|------|------|
| Blog      | No    | No   | [Yes] | No   | No   | No   |
| Facebook  | Yes   | Yes  | Yes   | Yes  | Yes  | No   |
| Google+   | Yes   | No   | Yes   | No   | Yes  | Yes  |
| LinkedIn  | Yes   | No   | Yes   | No   | Yes  | No   |
| Twitter   | Yes   | No   | Yes   | Yes  | Yes  | Yes  |
| You Tube  | Yes   | No   | Yes   | Yes  | No   | Yes  |
| PaperLi   | Yes   | No   | No    | No   | No   | No   |
| Pinterest | No    | No   | No    | Yes  | Yes  | No   |
| ScoopIt   | No    | No   | No    | No   | No   | No   |
| Storify   | No    | No   | No    | Yes  | No   | No   |

.

The table gives no information about the level or quality of activity on each account – that will be addressed in the commentary below – but it gives a broadly reliable indication of which organisations are comparatively active in social media and which are less so.

The analysis shows that Facebook and Twitter are somewhat more popular platforms than Google+, LinkedIn and You Tube, while Pinterest leads the way amongst the curational tools. This distribution of activity is broadly representative of the wider gifted education community.
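To make that comparison explicit, here is a minimal sketch that simply tallies Table 1, counting how many of the six organisations use each platform. The data are transcribed directly from the table above; everything else is illustrative.

```python
# Tally of Table 1: how many of the six organisations use each platform.
# "[Yes]" (SENG's Library of Articles) is counted as a qualified yes.

orgs = ["WCGTC", "ECHA", "SENG", "NAGC", "PPUK", "NACE"]
table = {
    "Blog":      ["No",  "No",  "[Yes]", "No",  "No",  "No"],
    "Facebook":  ["Yes", "Yes", "Yes",   "Yes", "Yes", "No"],
    "Google+":   ["Yes", "No",  "Yes",   "No",  "Yes", "Yes"],
    "LinkedIn":  ["Yes", "No",  "Yes",   "No",  "Yes", "No"],
    "Twitter":   ["Yes", "No",  "Yes",   "Yes", "Yes", "Yes"],
    "You Tube":  ["Yes", "No",  "Yes",   "Yes", "No",  "Yes"],
    "PaperLi":   ["Yes", "No",  "No",    "No",  "No",  "No"],
    "Pinterest": ["No",  "No",  "No",    "Yes", "Yes", "No"],
    "ScoopIt":   ["No",  "No",  "No",    "No",  "No",  "No"],
    "Storify":   ["No",  "No",  "No",    "Yes", "No",  "No"],
}

for platform, row in table.items():
    users = [org for org, cell in zip(orgs, row) if cell != "No"]
    print(f"{platform:<10} {len(users)} of {len(orgs)}: {', '.join(users) or '-'}")
```

Running this confirms the pattern described above: Facebook and Twitter on five counts each, Google+ and You Tube on four, LinkedIn on three, with Pinterest (two) ahead of the other curation tools.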

The next section takes a closer look at this wider activity on each of the ten platforms and tools.

.

Comparing gifted-related activity on the ten selected platforms and tools

 .

Blogs

As far as I can establish, none of the six organisations currently maintains a blog. SENG does have what it describes as a Library of Articles, which is a blog to all intents and purposes – and Potential Plus UK is currently planning a blog.

Earlier this year I noticed that my blogroll was extremely out of date and that several of the blogs it contained were no longer active. I reviewed all the blogs I could find in the field and sought recommendations from others.

I imposed a rule to distinguish live blogs from those that are dead or dormant – they had to have published three or more relevant posts in the previous six months.

I also applied a slightly more subjective rule, in an effort to sift out those that had little relevance to anyone beyond the author (being cathartic diaries of sorts) and those that are entirely devoted to servicing a small local advocacy group.

I ended up with a long shortlist of 36 blogs, which now constitutes the revised blogroll in the right hand column.  Most are written in English but I have also included a couple of particularly active blogs in other languages.

The overall number of active blogs is broadly comparable with what I remember in 2010 when I first began, but the number of posts has probably fallen.

I don’t know to what extent this reflects changes in the overall number of active blogs and posts, either generically or in the field of education. In England there has been a marked renaissance in edublogging over the last twelve months, yet only three bloggers venture regularly into the territory of gifted education.

.

Facebook

Alongside Twitter, Facebook has the most active gifted education community.

There are dozens of Facebook Groups focused on giftedness and high ability. At the time of writing, the largest and most active are:

The Facebook Pages with the most ‘likes’ have been established by bodies located in the United States. The most favoured include:

There is a Gifted Phoenix page, which is rigged up to my Twitter account so all my tweets are relayed there. Only those with a relevant hashtag – #gtchat or #gtvoice – will be relevant to gifted education.

.

Google+

To date there is comparatively little activity on Google+, though many have established an initial foothold there.

Part of the problem is lack of familiarity with the platform, but another obstacle is the limited capacity to connect other parts of one’s social media footprint with one’s Google+ presence.

There is only one Google+ Community to speak of: ‘Gifted and Talented’ currently with 134 members.

A search reveals a large number of people and pages ostensibly relevant to gifted education, but few are useful and many are dormant.

Amongst the early adopters are:

My own Google+ page is dormant. It should now be possible to have WordPress.com blogposts appear automatically on a Google+ page, but the service seems unreliable. There is no capacity to link Twitter and Google+ in this fashion. I am waiting on Google to improve the connectivity of their service.

.

LinkedIn

LinkedIn is also comparatively little used by the gifted education community. There are several groups:

But none is particularly active, despite the rather impressive numbers above. Similarly, a handful of organisations have company pages on LinkedIn, but only one or two are active.

The search purports to include a staggering 98,360 people who mention ‘gifted’ in their profiles, but basic account holders can only see 100 results at a time.

My own LinkedIn page is registered under my real name rather than my social media pseudonym and is focused principally on my consultancy activity. I often forget it exists.

 .

Twitter

By comparison, Twitter is much more lively.

My brief January post mentioned my Twitter list containing every user I could find who mentions gifted education (or a similar term, whether in English or a selection of other languages) in their profile.

The list currently contains 1,263 feeds. You are welcome to subscribe to it. If you want to see it in action first, it is embedded in the right-hand column of this Blog, just beneath the blogroll.

The majority of the gifted-related activity on Twitter takes place under the #gtchat hashtag, which tends to be busier than even the most popular Facebook pages.

This hashtag also accommodates an hour long real-time chat every Friday (at around midnight UK time) and at least once a month on Sundays, at a time more conducive to European participants.

Other hashtags carrying information about gifted education include: #gtvoice (UK-relevant), #gtie (Ireland-relevant), #hoogbegaafd (Dutch-speaking); #altascapacidades (Spanish-speaking), #nagc and #gifteded.

Chats also take place on the #gtie and #nagc hashtags, though the latter may now be discontinued.

Several feeds provide gifted-relevant news and updates from around the world. Amongst the most followed are:

  • NAGC (4,240 followers)
  • SENG (2,709 followers)

Not forgetting Gifted Phoenix (5,008 followers) who publishes gifted-relevant material under the #gtchat (globally relevant material) and #gtvoice (UK-relevant material) hashtags.

.


Map of Gifted Phoenix’s Twitter Followers March 2014

.

You Tube

You Tube is of course primarily an audio-visual channel, so it tends to be used to store public presentations and commercials.

A search on ‘gifted education’ generates some 318,000 results including 167,000 videos and 123,000 channels, but it is hard to see the wood for the trees.

The most viewed videos and the most used channels are an eclectic mix and vary tremendously in quality.

Honourable mention should be made of:

The most viewed video is called ‘Top 10 Myths in Gifted Education’, a dramatised presentation which was uploaded in March 2010 by the Gifted and Talented Association of Montgomery County. This has had almost 70,000 views.

Gifted Phoenix does not have a You Tube presence.

.

Paper.li

Paper.li describes itself as ‘a content curation service’ which ‘enables people to publish newspapers based on topics they like and treat their readers to fresh news, daily.’

It enables curators to draw on material from Facebook, Twitter, Google+, embeddable You Tube videos and websites via RSS feeds.
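For readers unfamiliar with the mechanics, here is a minimal sketch of the ingestion step that services like paper.li automate: pulling the latest items from an RSS feed. It uses the third-party Python feedparser library and a hypothetical feed URL; it illustrates the general mechanism, not paper.li’s own implementation.

```python
# Minimal illustration of RSS ingestion, the mechanism behind curation
# services such as paper.li. Requires the third-party 'feedparser' package.
# The feed URL below is a placeholder, not a real endpoint.

import feedparser

FEED_URL = "https://example.com/gifted-education/feed"  # hypothetical feed

feed = feedparser.parse(FEED_URL)

# Each entry typically exposes a title, link and published date, which a
# curation service would then filter, rank and lay out as a 'paper'.
for entry in feed.entries[:10]:
    print(entry.get("title", "(untitled)"), "-", entry.get("link", ""))
```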

In September 2013 it reported 3.7m users each month.

I found six gifted-relevant ‘papers’ with over 1,000 subscriptions:

There is, as yet, no Gifted Phoenix presence on paper.li, though I have been minded for some months to give it a try.

.

Pinterest

Pinterest is built around a pinboard concept.  Pins are illustrated bookmarks designating something found online or already on Pinterest, while Boards are used to organise a collection of pins. Users can follow each other and others’ boards.

Pinterest is said to have 70 million users, of whom 80% are female.

A search on ‘gifted education’ reveals hundreds of boards dedicated to the topic, but unfortunately there is no obvious way to rank them by number of followers or number of pins.

Since advanced search capability is conspicuous by its absence, the user apparently has little choice but to sift laboriously through each board. I have not undertaken this task so I can bring you no useful information about the most used and most popular boards.

Judging by the names attached to these boards, they are owned almost exclusively by women. It is interesting to hypothesise about what causes this gender imbalance – and whether Pinterest is actively pursuing female users at the expense of males.

There are, however, some organisations in the field making active use of Pinterest. A search of ‘pinners’ suggests that amongst the most popular are:

  • IAGC Gifted which has 26 boards, 734 pins and 400 followers.

Gifted Phoenix is male and does not have a presence on Pinterest…yet!

 .

Scoop.it

Scoop.it stores material on a page somewhere between a paper.li-style newspaper and a Pinterest-style board. It is reported to have almost seven million unique visitors each month.

‘Scoopable’ material is drawn together via URLs, a programmable ‘suggestions engine’ and other social media, including all the ‘big four’. However, the free version permits a user to link only two social media accounts, which significantly restricts Scoop.it’s curational capacity.

Scoop.it also has limited search engine capability. It is straightforward to conduct an elementary search like this one on ‘gifted’ which reveals 107 users.

There is no quick way of finding those pages that are most used or most followed, but one can hover over the search results for topics to find out which have most views:

Gifted Phoenix has a Scoop.it topic which is still very much a work in progress.

.

Storify

Storify is a slightly different animal to the other three tools. It describes itself as:

‘the leading social storytelling platform, enabling users to easily collect tweets, photos, videos and media from across the web to create stories that can be embedded on any website.  With Storify, anyone can curate stories from the social web to embed on their own site and share on the Storify platform.’

Estimates of user numbers vary but are typically from 850,000 to 1m.

Storify is a flexible tool whose free service permits one to collect material already located on the platform and from a range of other sources including Twitter, Facebook, You Tube, Flickr, Instagram, Google search, Tumblr – or via RSS or URL.

The downside is that there is no way to search within Storify for stories or users, so one cannot provide information about the level of activity or users that it might be helpful to follow.

However, a Google search reveals that users of Storify include:

  • IGGY with 9 followers

These tiny numbers show that Storify has not really taken off as a curational platform in its own right, though it is an excellent supporting tool, particularly for recording transcripts of Twitter chats.

Gifted Phoenix has a Storify profile and uses the service occasionally.

 .


The Cold Shoulder in Perth Zoo by Gifted Phoenix

.

Comparing the six organisations

So, having reviewed wider gifted education-related activity on these ten social media platforms and tools, it is time to revisit the online and social media profile of the six selected organisations.

.

World Council

The WCGTC website was revised in 2012 and has a clear and contemporary design.

The Council’s Mission Statement has a strong networking feel to it and elsewhere the website emphasises the networking benefits associated with membership:

‘…But while we’re known for our biennial conference the spirit of sharing actually goes on year round among our membership.

By joining the World Council you can become part of this vital network and have access to hundreds of other peers while learning about the latest developments in the field of gifted children.’

The home page includes direct links to the organisation’s Facebook Page and Twitter feed. There is also an RSS feed symbol but it is not active.

Both Twitter and Facebook are of course available to members and non-members alike.

At the time of writing, the Facebook page has 1,616 ‘likes’ and is relatively current, with five posts in the last month, though there is relatively little comment on these.

The Twitter feed typically manages a daily Tweet. Hashtags are infrequently if ever employed. At the time of writing the feed has 1,076 followers.

Almost all the Tweets are links to a daily paper.li production ‘WCGTC Daily’ which was first published in late July 2013, just before the last biennial conference. This has 376 subscribers at the present time, although the gifted education coverage is selective and limited.

However, the Council’s most recent biennial conference was unusual in making extensive use of social media. It placed photographs on Flickr, videos of keynotes on YouTube and podcasts of keynotes on Mixlr.

There was also a Blog – International Year of Giftedness and Creativity – which was busy in the weeks immediately preceding the Conference, but has not been active since.

There are early signs that the 2015 Conference will also make strong use of social media. In addition to its own website, it already has its own presence on Twitter and Facebook.

One of the strands of the 2015 Conference is:

‘Online collaboration

  • Setting the stage for future sharing of information
  • E-networking
  • E-learning options’

And one of the sponsors is a social media company.

As noted above, the World Council website provides links to two of its six strands of social media activity, but not the remaining four. It is not yet serving as an effective hub for the full range of this activity.

Some of the strands link together well – eg Twitter to paper.li – but there is considerable scope to improve the incidence and frequency of cross-referencing.

.

ECHA

Of the six organisations in this sample, ECHA is comfortably the least active in social media with only a Facebook page available to supplement its website.

The site itself is rather old-fashioned and could do with a refresh. It includes a section ‘Introducing ECHA’ which emphasises the organisation’s networking role:

‘The major goal of ECHA is to act as a communications network to promote the exchange of information among people interested in high ability – educators, researchers, psychologists, parents and the highly able themselves. As the ECHA network grows, provision for highly able people improves and these improvements are beneficial to all members of society.’

This is reinforced in a parallel Message from the President.

There is no reference on the website to the Facebook group which is closed, but not confined solely to ECHA members. There are currently 191 members. The group is fairly active, but does not rival those with far more members listed above.

There’s not much evidence of cross-reference between the Facebook group and the website, but that may be because the website is infrequently updated.

As with the World Council, ECHA conferences have their own social media profile.

At the 2012 Conference in Munster this was left largely to the delegates. Several of us live Tweeted the event.

I blogged about the Conference and my part in it, providing links to transcripts of the Twitter record. The post concluded with a series of learning points for this year’s ECHA Conference in Slovenia.

The Conference website explains that the theme of the 2014 event is ‘Rethinking Giftedness: Giftedness in the Digital Age’.

Six months ahead of the event, there is a Twitter feed with 29 followers that has been dormant for three months at the time of writing and a LinkedIn group with 47 members that has been quiet for five months.

A Forum was also established which has not been used for over a year. There is no information on the website about how the event will be supported by social media.

I sincerely hope that my low expectations will not be fulfilled!

.

SENG

SENG is far more active across social media. Its website carries a 2012 copyright notice and has a more contemporary feel than many of the others in this sample.

The bottom of the home page extends an invitation to ‘connect with the SENG community’ and carries links to Facebook, Twitter and LinkedIn (though not to Google+ or You Tube).

In addition, each page carries a set of buttons to support the sharing of this information across a wide range of social media.

The organisation’s Strategic Plan 2012-2017 makes only fleeting reference to social media, in relation to creating a ‘SENG Liaison Facebook page’ to support inter-state and international support.

It does, however, devote one of its nine goals to the further development of its webinar programme (each costs $40 to access or $40 to purchase a recording for non-participants).

SENG offers online parent support groups but does not state which platform is used to host these. It has a Technology/Social Media Committee but its proceedings are not openly available.

Reference has already been made above to the principal Facebook Page which is popular, featuring posts on most days and a fair amount of interaction from readers.

The parallel group for SENG Liaisons is also in place, but is closed to outsiders, which rather seems to defeat the object.

The SENG Twitter feed is relatively well followed and active on most days. The LinkedIn page is somewhat less active but can boast 142 followers while Google+ is clearly a new addition to the fold.

The You Tube channel, however, has 257 subscribers and carries 16 videos, most of them featuring presentations by James Webb. Rather strangely, these don’t seem to feature in the media library carried by the website.

SENG is largely a voluntary organisation with little staff resource, but it is successfully using social media to extend its footprint and global influence. There is, however, scope to improve coherence and co-ordination.

.

National Association for Gifted Children

The NAGC’s website is also in some need of refreshment. Its copyright notice dates from 2008, which was probably when it was designed.

There are no links to social media on the home page but ‘NAGC at a glance’ carries a direct link to the Facebook group and a Twitter logo without a link, while the page listing NAGC staff has working links to both Facebook and Twitter.

In the past, NAGC has been more active in this field.

There was for a time a Parenting High Potential Blog but the site is now marked private.

NAGC’s Storify account contains the transcripts of 6 Twitter chats conducted under the hashtag #nagcchat between June and August 2012. These were hosted by NAGC’s Parent Outreach Specialist.

But, by November 2012 I was tweeting:

.

.

And in February 2013:

.

.

This post was filled by July 2013. The postholder seems to have been concentrating primarily on editing the magazine edition of Parenting High Potential, which is confined to members only (but also has a Facebook presence – see below).

NAGC’s website carries a document called ‘NAGC leadership initiatives 2013-14’ which suggests further developments in the next few months.

The initiatives include:

‘Leverage content to intentionally connect NAGC resources, products and programs to targeted audiences through an organization-wide social media strategy.’

and

‘Implement a new website and membership database that integrates with social media and provides a state-of-the-art user interface.’

One might expect NAGC to build on its current social media profile which features:

  • A Facebook Group which currently has 2,420 members and is reasonably active, though not markedly so. Relatively few posts generate significant comments.
  • A Twitter feed boasting an impressive 4,287 followers. Tweets are published on a fairly regular basis.

There is additional activity associated with the Annual NAGC Convention. There was extensive live Tweeting from the 2013 Convention under the rival hashtags #NAGC2013 and #NAGC13. #NAGC14 looks the favourite for this year’s Convention, which has also established a Facebook presence.

NAGC also has its own networks. The website lists 15 of these but hardly any of their pages give details of their social media activity. A cursory review reveals that:

Overall, NAGC has a fairly impressive array of social media activity but demonstrates relatively little evidence of strategic coherence and co-ordination. This may be expected to improve in the next six months, however.

.

NACE

NACE is not quite the poorest performer in our sample but, like ECHA, it has so far made relatively little progress towards effective engagement with social media.

Its website dates from 2010 but looks older. Prominent links to Twitter and Facebook appear on the front page as well as – joy of joys – an RSS feed.

However, the Facebook link is not to a NACE-specific page or group and the RSS feed doesn’t work.

There are references on the website to the networking benefits of NACE membership, but not to any role for the organisation in wider networking activity via social media. Current efforts seem focused primarily on advertising NACE and its services to prospective members and purchasers.

The Twitter feed has a respectable 1,426 followers but Tweets tend to appear in blocks of three or four spaced a few days apart. Quality and relevance are variable.

The Google+ page and You Tube channel contain the same two resources, posted last November.

There is much room for improvement.

.

Potential Plus UK

All of which brings us back to Potential Plus and the work I have been supporting to strengthen its online and social media presence.

.

Current Profile

Potential Plus’s current social media profile is respectably diverse but somewhat lacking in coherence.

The website is old-fashioned. There is a working link to Facebook on the home page, but this takes readers to the old NAGC Britain page which is no longer used, rather than directing them to the new Potential Plus UK page.

Whereas the old Facebook page had reached 1,344 likes, the new one is currently at roughly half that level – 683 – but the level of activity is reasonably impressive.

There is a third Facebook page dedicated to the organisation’s ‘It’s Alright to Be Bright’ campaign, which is not quite dormant.

All website pages carry buttons supporting information-sharing via a wide range of social media outlets. But there is little reference in the website content to its wider social media activity.

The Twitter feed is fairly lively, boasting 1,093 followers. It currently has some 400 fewer followers than NACE but has published about 700 more Tweets. Both are publishing at about the same rate. Quality and relevance are similarly variable.

The LinkedIn page is little more than a marker and does not list the products offered.

The Google+ presence uses the former NAGC Britain name and is also no more than a marker.

But the level of activity on Pinterest is more significant. There are 14 boards containing a total of 271 pins and attracting 26 followers. This material has been uploaded during 2014.

There is at present no substantive blog activity, although the stub of an old wordpress.com site still exists and there is also a parallel stub of an old wordpress.com children’s area.

There are no links to any of these services from the website – nor do these services link clearly and prominently with each other.

.

Future Strategy

The new wordpress.com test site sets out our plans for Potential Plus UK, which have been shaped in accordance with the two sets of draft success criteria above.

The purpose of the project is to help the organisation to:

  • improve how it communicates and engages with its different audiences clearly and effectively
  • improve support for members and benefit all its stakeholder groups
  • provide a consistently higher quality and more compelling service than its main competitors that generates maximum benefit for minimum cost

Subject to consultation and if all goes well, the outcome will be:

  • A children’s website on wordpress.org
  • A members’ and stakeholders’ website on wordpress.com (which may transfer to wordpress.org in due course)
  • A new forum and a new ‘bottom-up’ approach to support that marries curation and collaboration and
  • A coherent social media strategy that integrates these elements and meets audiences’ needs while remaining manageable for PPUK staff.

You can help us to develop this strategy by responding to the consultation here by Friday 18 April.

.

La Palma Panorama by Gifted Phoenix

.

Conclusion

.

Gifted Phoenix

I shall begin by reflecting on Gifted Phoenix’s profile across the ten elements included in this analysis:

  • He has what he believes is a reasonable Blog.
  • He is one of the leading authorities on gifted education on Twitter (if not the leading authority).
  • His Facebook profile consists almost exclusively of ‘repeats’ from his Twitter feed.
  • His LinkedIn page reflects a different identity and is not connected properly to the rest of his profile.
  • His Google+ presence is embryonic.
  • He has used Scoop.it and Storify to some extent, but not Paper.li or Pinterest.

GP currently has a rather small social media footprint, since he is concentrating on doing only two things – blogging and microblogging – effectively.

He might be advised to extend his sphere of influence by distributing the limited available human resource more equitably across the range of available media.

On the other hand he is an individual with no organisational objectives to satisfy. Fundamentally he can follow his own preferences and inclinations.

Maybe he should experiment with this post, publishing it as widely as possible and monitoring the impact via his blog analytics…

.

The Six Organisations

There is a strong correlation between the size of each organisation’s social media footprint and the effectiveness with which it uses social media.

There are no obvious examples – in this sample at least – of organisations that have a small footprint because of a deliberate choice to specialise in a narrow range of media.

If we were to rank the six in order of effectiveness, the World Council, NAGC and SENG would be vying for top place, while ECHA and NACE would be competing for bottom place and Potential Plus UK would be somewhere in the middle.

But none of the six organisations would achieve more than a moderate assessment against the two sets of quality criteria. All of them have huge scope for improvement.

Their priorities will vary, according to what is set out in their underlying social media strategies. (If they have no social media strategy, the obvious priority is to develop one, or to revise it if it is outdated.)

.

The Overall Picture across the Five Aspects of Gifted Education

This analysis has been based on the activities of a small sample of six generalist organisations in the gifted education field, as well as wider activity involving a cross-section of tools and platforms.

It has not considered providers who specialise in one of the five aspects – advocacy, learning, professional development, policy-making and research – or the use being made of specialist social media, such as MOOCs and research tools.

So the judgements that follow are necessarily approximate. But nothing I have seen across the wider spectrum of social media over the past 18 months would seriously call into question the conclusions reached below.

  • Advocacy via social media is slightly stronger than it was in 2012 but there is still much insularity and too little progress has been made towards a joined up global movement. The international organisations remain fundamentally inward-looking and have been unable to offer the leadership and sense of direction required.  The grip of the old guard has been loosened and some of the cliquey atmosphere has dissipated, but academic research remains the dominant culture.
  • Learning via social media remains limited. There are still several niche providers but none has broken through in a global sense. The scope for fruitful partnership between gifted education interests and one or more of the emerging MOOC powerhouses remains unfulfilled. The potential for social media to support coherent and targeted blended learning solutions – and to support collaborative learning amongst gifted learners worldwide – is still largely unexploited.
  • Professional development via social media has been developed at a comparatively modest level by several providers, but the prevailing tendency seems to be to regard this as a ‘cash cow’ generating income to support other activities. There has been negligible progress towards securing the benefits that would accrue from systematic international collaboration.
  • Policy-making via social media is still the poor relation. The significance of policy-making (and of policy makers) within gifted education is little appreciated and little understood. What engagement there is seems focused disproportionately on lobbying politicians, rather than on developing at working level practical solutions to the policy problems that so many countries face in common.
  • Research via social media is negligible. The vast majority of academic researchers in the field are still caught in a 20th Century paradigm built around publication in paywalled journals and a perpetual round of face-to-face conferences. I have not seen any significant examples of collaboration between researchers. A few make a real effort to convey key research findings through social media but most do not. Some of NAGC’s networks are beginning to make progress and the 2013 World Conference went further than any of its predecessors in sharing proceedings with those who could not attend. Now the pressure is on the EU Talent Conference in Budapest and ECHA 2014 in Slovenia to push beyond this new standard.

Overall progress has been limited and rather disappointing. The three conclusions I drew in 2012 remain valid.

In September 2012 I concluded that ‘rapid acceleration is necessary otherwise gifted education will be left behind’. Eighteen months on, there are some indications of slowly gathering speed, but the gap between practice in gifted education and leading practice has widened meanwhile – and the chances of closing it seem increasingly remote.

Back in 2010 and 2011 several of my posts had an optimistic ring. It seemed then that there was an opportunity to ‘only connect’ globally, but also at European level via the EU Talent Centre and in the UK via GT Voice. But both those initiatives are faltering.

My 2012 post also finished on an optimistic note:

‘Moreover, social media can make a substantial and lasting contribution to the scope, value and quality of gifted education, to the benefit of all stakeholders, but ultimately for the collective good of gifted learners.

No, ‘can’ is too cautious, non-assertive, unambitious. Let’s go for WILL instead!’

Now in 2014 I am resigned to the fact that there will be no great leap forward. The very best we can hope for is disjointed incremental improvement achieved through competition rather than collaboration.

I will be doing my best for Potential Plus UK. Now what about you?

.

GP

March 2014

Challenging NAHT’s Commission on Assessment

.

This post reviews the Report of the NAHT’s National Commission on Assessment, published on 13 February 2014.

Since I previously subjected the Government’s consultation document on primary assessment and accountability to a forensic examination, I thought it only fair that I should apply the same high standards to this document.

I conclude that the Report is broadly helpful, but there are several internal inconsistencies and a few serious flaws.

Impatient readers may wish to skip the detailed analysis and jump straight to the summary at the end of the post which sets out my reservations in the form of 23 recommendations addressed to the Commission and the NAHT.

.

Other perspectives

Immediate reaction to the Report was almost entirely positive.

The TES included a brief Ministerial statement in its coverage, attributed to Michael Gove:

‘The NAHT’s report gives practical, helpful ideas to schools preparing for the removal of levels. It also encourages them to make the most of the freedom they now have to develop innovative approaches to assessment that meet the needs of pupils and give far more useful information to parents.’

ASCL and ATL both welcomed the Report, as did the National Governors’ Association, though there was no substantive comment from NASUWT or NUT.

The Blogosphere exhibited relatively little interest, although a smattering of posts began to expose some issues:

  • LKMco supported the key recommendations, but wondered whether the Commission might not be guilty of reinventing National Curriculum levels;
  • Mr Thomas Maths was more critical, identifying three key shortcomings, one being the proposed approach to differentiation within assessment;
  • Warwick Mansell, probably because he blogs for NAHT, confined himself largely to summarising the Report, which he found ‘impressive’, though he did raise two key points – the cost of implementing these proposals and how the recommendations relate to the as yet uncertain position of teacher assessment in the Government’s primary assessment and accountability reforms.

All of these points – and others – are fleshed out in the critique below.

.

Background

.

Remit, Membership and Evidence Base

The Commission was first announced in July 2013, when it was described as:

‘a commission of practitioners to shape the future of assessment in a system without levels.’

By September, Lord Sutherland had agreed to Chair the body and its broad remit had been established:

‘To:

  • establish a set of principles to underpin national approaches to assessment and create consistency;
  • identify and highlight examples of good practice; and
  • build confidence in the assessment system by securing the trust and support of officials and inspectors.’

Written evidence was requested by 16 October.

The first meeting took place on 21 October and five more were scheduled before the end of November.

Members’ names were not included at this stage (beyond the fact that NAHT’s President – a Staffordshire primary head – was involved) though membership was now described as ‘drawn from across education’.

Several members had in fact been named in an early October blog post from NAHT and a November press release from the Chartered Institute of Educational Assessors (CIEA) named all but one – NAHT’s Director of Education. This list was confirmed in the published Report.

The Commission had 14 members but only six of them – four primary heads, one primary deputy and one secondary deputy – could be described as practitioners.

The others included two NAHT officials in addition to the secretariat, one being General Secretary Russell Hobby, and one from ASCL; John Dunford, a consultant with several other strings to his bow, one of them being Chairmanship of the CIEA; Gordon Stobart, an academic specialist in assessment with a long pedigree in the field; Hilary Emery, the outgoing Chief Executive of the National Children’s Bureau; and Sam Freedman of Teach First.

There were also unnamed observers from DfE, Ofqual and Ofsted.

The Report says the Commission took oral evidence from a wide range of sources. A list of 25 sources is provided but it does not indicate how much of their evidence was written and how much oral.

Three of these sources are bodies represented on the Commission, two of them schools. Overall seven are from schools. One source is Tim Oates, the former Chair of the National Curriculum Review Expert Panel.

The written evidence is not published and I could find only a handful of responses online, from:

Overall, one has to say that the response to the call for evidence was rather limited. Nevertheless, it would be helpful for NAHT to publish all the evidence it received; it might also consult formally on key provisions in its Report.

 .

Structure of the Report and Further Stages Proposed

The main body of the Report is sandwiched between a foreword by the Chair and a series of Annexes containing case studies and historical and international background. This analysis concentrates almost entirely on the main body.

The 21 Recommendations are presented twice, first as a list within the Executive Summary and subsequently interspersed within a thematic commentary that summarises the evidence received and also conveys the Commission’s views.

The Executive Summary also sets out a series of Underpinning Principles for Assessment and a Design Checklist for assessment in schools, the latter accompanied by a set of five explanatory notes.

It offers a slightly different version of the Commission’s Remit:

‘In carrying out its task, the Commission was asked to achieve three distinct elements:

  • A set of agreed principles for good assessment
  • Examples of current best practice in assessment that meet these principles
  • Buy-in to the principles by those who hold schools to account.’

These are markedly less ambitious than their predecessors, having dropped the reference to ‘national approaches’ and any aspiration to secure support from officials and inspectors for anything beyond the Principles.

Significantly, the Report is presented as only the first stage in a longer process, an urgent response to schools’ need for guidance in the short term.

It recommends that further work should comprise:

  • ‘A set of model assessment criteria based on the new National Curriculum.’ (NAHT is called upon to develop and promote these. The text says that a model document is being commissioned but doesn’t reveal the timescale or who is preparing it.)
  • ‘A full model assessment policy and procedures, backed by appropriate professional development’ that would expand upon the Principles and Design Checklist. (NAHT is called upon to take the lead in this, but there is no indication that they plan to do so. No timescale is attached.)
  • ‘A system-wide review of assessment’ covering ages 2-19. It is not explicitly stated, but one assumes that this recommendation is directed towards the Government. Again no timescale is attached.

The analysis below looks first at the assessment Principles, then the Design Checklist and finally the recommendations plus associated commentary. It concludes with an overall assessment of the Report as a whole.

.

Assessment Principles

As noted above, it seems that national level commitment is only sought in respect of these Principles, but there is no indication in the Report – or elsewhere for that matter – that DfE, Ofsted and Ofqual have indeed signed up to them.

Certainly the Ministerial statement quoted above stops well short of doing so.

The consultation document on primary assessment and accountability also sought comments on a set of core principles to underpin schools’ curriculum and assessment frameworks. It remains to be seen whether the version set out in the consultation response will match those advanced by the Commission.

The Report recommends that schools should review their own assessment practice against the Principles and Checklist together, and that all schools should have their own clear assessment principles, presumably derived or adjusted in the light of this process.

Many of the principles are unexceptionable, but there are a few interesting features that are directly relevant to the commentary below.

For it is of course critical to the internal coherence of the Report that the Design Checklist and recommendations are entirely consistent with these Principles.

I want to highlight three in particular:

  • ‘Assessment is inclusive of all abilities…Assessment embodies, through objective criteria, a pathway of progress and development for every child…Assessment objectives set high expectations for learners’.

One assumes that ‘abilities’ is intended to stand proxy for both attainment and potential, so that there should be ‘high expectations’ and a ‘pathway of progress and development’ for the lowest and highest attainers alike.

  • ‘Assessment places achievement in context against nationally standardised criteria and expected standards’.

This raises the question of whether the ‘model document’ containing assessment criteria commissioned by NAHT will be ‘nationally standardised’ and, if so, what standardisation process will be applied.

  • ‘Assessment is consistent…The results are readily understandable by third parties…A school’s results are capable of comparison with other schools, both locally and nationally’.

The implication behind these statements must be that results of assessment in each school are transparent and comparable through the accountability regime, presumably by means of the performance tables (and the data portal that we expect to be introduced to support them).

This cannot be taken as confined to statutory tests, since the text later points out that:

‘The remit did not extend to KS2 tests, floor standards and other related issues of formal accountability.’

It isn’t clear, from the Principles at least, whether the Commission believes that teacher assessment outcomes should also be comparable. Here, as elsewhere, the Report does a poor job of distinguishing between statutory teacher assessment and assessment internal to the school.

.

Design Checklist

 

Approach to Assessment and Use of Assessment

The Design Checklist is described as:

‘an evaluation checklist for schools seeking to develop or acquire an assessment system. They could also form the seed of a revised assessment policy.’

It is addressed explicitly to schools and comprises three sections covering, respectively, a school’s approach to assessment, method of assessment and use of assessment.

The middle section is by far the most significant and also the most complex, requiring five explanatory notes.

I deal with the more straightforward first and third sections before turning to the second.

‘Our approach to assessment’ simply makes the point that assessment is integral to teaching and learning, while also setting expectations for regular, universal professional development and ‘a senior leader who is responsible for assessment’.

It is not clear whether this individual is the same as, or additional to, the ‘trained assessment lead’ mentioned in the Report’s recommendations.

I can find no justification in the Report for the requirement that this person must be a senior leader.

A more flexible approach would be preferable, in which the functions to be undertaken are outlined and schools are given flexibility over how those are distributed between staff. There is more on this below.

The final section ‘Our use of assessment’ refers to staff:

  • Summarising and analysing attainment and progress;
  • Planning pupils’ learning to ensure every pupil meets or exceeds expectations (Either this is a counsel of perfection, or expectations for some learners are pitched below the level required to satisfy the assessment criteria for the subject and year in question. The latter is much more likely, but this is confusing since satisfying the assessment criteria is also described in the Checklist in terms of ‘meeting…expectations’.)
  • Analysing data across the school to ensure all pupils are stretched while the vulnerable and those at risk make appropriate progress (‘appropriate’ is not defined within the Checklist itself but an explanatory note appended to the central section  – see below – glosses this phrase);
  • Communicating assessment information each term to pupils and parents through ‘a structured conversation’ and the provision of ‘rich, qualitative profiles of what has been achieved and indications of what they [ie parents as well as pupils] need to do next’; and
  • Celebrating a broad range of achievements, extending across the full school curriculum and encompassing social, emotional and behavioural development.

.

Method of Assessment: Purposes

‘Our method of assessment’ is by far the longest section, containing 11 separate bullet points. It could be further subdivided for clarity’s sake.

The first three bullets are devoted principally to some purposes of assessment. Some of this material might be placed more logically in the ‘Our Use of Assessment’ section, so that the central section is shortened and restricted to methodology.

The main purpose is stipulated as ‘to help teachers, parents and pupils plan their next steps in learning’.

So the phrasing suggests that assessment should help to drive forward the learning of parents and teachers, as well as the learning of pupils. I’m not sure whether this is deliberate or accidental.

Two subsidiary purposes are mentioned: providing a check on teaching standards and support for their improvement; and providing a comparator with other schools via collaboration and the use of ‘external tests and assessments’.

It is not clear why these three purposes are singled out. There is some overlap with the Principles but also a degree of inconsistency between the two pieces of documentation. It might have been better to cross-reference them more carefully.

In short, the internal logic of the Checklist and its relationship with the Principles could both do with some attention.

The real meat of the section is incorporated in the eight remaining bullet points. The first four are about what pupils are assessed against and when that assessment takes place. The last four explain how assessment judgements are differentiated, evidenced and moderated.

.

Method of Assessment: What Learners Are Assessed Against – and When

The next four bullets specify that learners are to be assessed against ‘assessment criteria which are short, discrete, qualitative and concrete descriptions of what a pupil is expected to know and be able to do.’

These are derived from the school curriculum ‘which is composed of the National Curriculum and our own local design’. (Of course that is not strictly the position in academies, as another section of the Report subsequently points out.)

The criteria ‘for periodic assessment are arranged into a hierarchy setting out what children are normally expected to have mastered by the end of each year’.

Each learner’s achievement ‘is assessed against all the relevant criteria at appropriate times of the school year’.
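
Taken together, these four bullets imply a fairly simple data model: a bank of short, concrete criteria organised by subject and year, with each pupil assessed against all the criteria relevant to their year group at fixed points in the school year. The sketch below is purely illustrative – the subjects, criteria wording and termly schedule are my own inventions, not anything specified in the Report:

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    """A short, discrete, concrete description of what a pupil is expected
    to know and be able to do. Wording here is invented for illustration."""
    subject: str
    year: int          # the year by the end of which it is normally mastered
    description: str

# A hypothetical fragment of a school's criteria bank.
CRITERIA_BANK = [
    Criterion("maths", 3, "Recall multiplication facts for the 3, 4 and 8 tables"),
    Criterion("maths", 4, "Recall multiplication facts up to 12 x 12"),
    Criterion("english", 3, "Use conjunctions to express time, place and cause"),
]

def criteria_for(subject, year):
    """All the criteria a pupil in this year group is assessed against."""
    return [c for c in CRITERIA_BANK if c.subject == subject and c.year == year]

# 'At appropriate times of the school year' is assumed here to mean termly.
ASSESSMENT_POINTS = ["autumn", "spring", "summer"]

for term in ASSESSMENT_POINTS:
    for criterion in criteria_for("maths", 3):
        print(f"{term}: {criterion.description}")
```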

.

The Span of the Assessment Criteria

The first explanatory note (A) clarifies that the assessment criteria are ‘discrete, tangible descriptive statements of attainment’ derived from ‘the National Curriculum (and any school curricula)’.

There is no repetition of the provision in the Principles that they should be ‘nationally standardised’, but the note does assert that ‘there is little room for meaningful variety’, even though academies are not obliged to follow the National Curriculum and schools have complete flexibility over the remainder of the school curriculum.

The Recommendations have a different emphasis, saying that NAHT’s model criteria should be ‘based on the new National Curriculum’ (Recommendation 6), but the clear impression here is that they will encompass the National Curriculum ‘and any school curricula’ alike.

This inconsistency needs to be resolved. NAHT might be better off confining its model criteria to the National Curriculum only – and making it clear that even these may not be relevant to academies.

.

The Hierarchy of Assessment Criteria

The second explanatory note (B) relates to the arrangement of the assessment criteria

‘…into a hierarchy, setting out what children are normally expected to have mastered by the end of each year’.

This note is rather muddled.

It begins by suggesting that a hierarchy divided chronologically by school year is the most natural choice, because:

‘The curriculum is usually organised into years and terms for planned delivery’

That may be true, but only the Programmes of Study for the three core subjects are organised by year, and each clearly states that:

‘Schools are…only required to teach the relevant programme of study by the end of the key stage. Within each key stage, schools therefore have the flexibility to introduce content earlier or later than set out in the programme of study. In addition, schools can introduce key stage content during an earlier key stage if appropriate.’

All schools – academies and non-academies alike – therefore enjoy considerable flexibility over the distribution of the Programmes of Study between academic years.

(Later in the Report – in the commentary preceding the first six recommendations – the text mistakenly suggests that the entirety of ‘the revised curriculum is presented in a model of year-by-year progress’ (page 14). It does not mention the provision above.)

The note goes on to suggest that the Commission has chosen a different route, not because of this flexibility, but because ‘children’s progress may not fit neatly into school years’:

‘…we have chosen the language of a hierarchy of expectations to avoid misunderstandings. Children may be working above or below their school year…’

But this is not an absolute hierarchy of expectations – in the sense that learners are free to progress entirely according to ability (or, more accurately, their prior attainment) rather than in age-related lock steps.

In a true hierarchy of expectations, learners would be able to progress as fast or as slowly as they are able to, within the boundaries set by:

  • On one hand, high expectations, commensurate challenge and progression;
  • On the other hand, protection against excessive pressure and hot-housing and a judicious blending of faster pace with more breadth and depth (of which more below).

This is no more than a hierarchy by school year with some limited flexibility at the margins.

.

The timing of assessment against the criteria

The third explanatory note (C) confirms the Commission’s assumption that formal assessments will be conducted at least termly – and possibly more frequently than that.

It adds:

‘It will take time before schools develop a sense of how many criteria from each year’s expectations are normally met in the autumn, spring and summer terms, and this will also vary by subject’.

This is again unclear. It could mean that a future aspiration is to judge progress termly, by breaking down the assessment criteria still further – so that a learner who met the assessment criteria for, say, the autumn term is deemed to be meeting the criteria for the year as a whole at that point.

Without this additional layer of lock-stepping, presumably the default position for the assessments conducted in the autumn and spring terms is that learners will still be working towards the assessment criteria for the year in question.

The note also mentions in passing that:

‘For some years to come, it will be hard to make predictions from outcomes of these assessments to the results in KS2 tests. Such data may emerge over time, although there are question marks over how reliable predictions may be if schools are using incompatible approaches and applying differing standards of performance and therefore cannot pool data to form large samples.’

This is one of very few places where the Report picks up on the problems that are likely to emerge from the dissonance between internal and external statutory assessment.

But it avoids the central issue, this being that the approach to internal assessment it advocates may not be entirely compatible with predicting future achievement in the KS2 tests. If so, its value is seriously diminished, both for parents and teachers, let alone the learners themselves.  This issue also reappears below.

.

Method of Assessment: How Assessment Judgements are Differentiated, Evidenced and Moderated

The four final bullet points in this section of the Design Checklist explain that all learners will be assessed as ‘developing’, ‘meeting’ or ‘exceeding’ each relevant criterion for that year.

Learners deemed to be exceeding the relevant criteria in a subject for a given year ‘will also be assessed against the criteria in that subject for the next year.’

Assessment judgements are supported by evidence comprising observations, records of work and test outcomes and are subject to moderation by teachers in the same school and in other schools to ensure they are fair, reliable and valid.

I will set moderation to one side until later in the post, since that too lies outside the scope of methodology.

.

Differentiation against the hierarchy of assessment criteria

The fourth explanatory note (D) addresses the vexed question of differentiation.

As readers may recall, the Report by the National Curriculum Review Expert Panel failed abjectly to explain how they would provide stretch and challenge in a system that focused exclusively on universal mastery and ‘readiness to progress’, saying only that further work was required to address the issue.

Paragraph 8.21 implied that they favoured what might be termed an ‘enrichment and extension’ model:

‘There are issues regarding ‘stretch and challenge’ for those pupils who, for a particular body of content, grasp material more swiftly than others. There are different responses to this in different national settings, but frequently there is a focus on additional activities that allow greater application and practice, additional topic study within the same area of content, and engagement in demonstration and discussion with others…These systems achieve comparatively low spread at the end of primary education, a factor vital in a high proportion of pupils being well positioned to make good use of more intensive subject-based provision in secondary schooling.’

Meanwhile, something akin to the P Scales might come into play for those children with learning difficulties.

On this latter point, the primary assessment and accountability consultation document said DfE would:

‘…explore whether P-scales should be reviewed so that they align with the revised national curriculum and provide a clear route to progress to higher attainment.’

We do not yet know whether this will happen, but Explanatory Note B to the Design Checklist conveys the clear message that the P-Scales need to be retained:

‘…must ensure we value the progress of children with special needs as much as any other group. The use of P scales here is important to ensure appropriate challenge and progression for pupils with SEN.’

By contrast, for high attainers, the Commission favours what might be called a ‘mildly accelerative’ model whereby learners who ‘exceed’ the assessment criteria applying to a subject for their year group may be given work that enables them to demonstrate progress against the criteria for the year above.

I describe it as mildly accelerative because there is no provision for learners to be assessed more than one year ahead of their chronological year group. This is a fairly low ceiling to impose on such accelerative progress.

It is also unclear whether the NAHT’s model assessment criteria will cover Year 7, the first year of the KS3 Programmes of Study, to enable this provision to extend into Year 6.

The optimal approach for high attainers would combine the ‘enrichment and extension’ approach apparently favoured by the Expert Panel with an accelerative approach that provides a higher ceiling, to accommodate those learners furthest ahead of their peers.

High attaining learners could then access a customised blend of enrichment (more breadth), extension (greater depth) and acceleration (faster pace) according to their needs.

This is good curricular practice and it should be reflected in assessment practice too, otherwise the risk is that a mildly accelerative assessment process will have an undesirable wash-back effect on teaching and learning.

Elsewhere, the Report advocates the important principle that curriculum, assessment and pedagogy should be developed in parallel, otherwise there is a risk that one – typically assessment – has an undesirable effect on the others. This would be an excellent exemplar of that statement.

The judgement whether a learner is exceeding the assessment criteria for their chronological year would be evidenced by enrichment and extension activity as well as by pre-empting the assessment criteria for the year ahead. Exceeding the criteria in terms of greater breadth or more depth should be equally valued.

This more rounded approach, incorporating a higher ceiling, should also be supported by the addition of a fourth ‘far exceeded’ judgement, otherwise the ‘exceeded’ judgement has to cover far too wide a span of attainment, from those who are marginally beyond their peers to those who are streets ahead.

These concerns need urgently to be addressed, before NAHT gets much further with its model criteria.

.

The aggregation of criteria

In order to make the overall judgement for each subject, learners’ performance against individual assessment criteria has to be combined to give an aggregate measure.

The note says:

‘The criteria themselves can be combined to provide the qualitative statement of a pupil’s achievements, although teachers and schools may need a quantitative summary. Few schools appear to favour a pure binary approach of yes/no. The most popular choice seems to be a three phase judgement of working towards (or emerging, developing), meeting (or mastered, confident, secure, expected) and exceeded. Where a student has exceeded a criterion, it may make sense to assess them also against the criteria for the next year.’

This, too, begs some questions. The statement above is consistent with one of the Report’s central recommendations:

‘Pupil progress and achievement should be communicated in terms of descriptive profiles rather than condensed to numerical summaries (although schools may wish to use numerical data for internal purposes).’

Frankly it seems unlikely that such ‘condensed numerical summaries’ can be kept hidden from parents. Indeed, one might argue that they have a reasonable right to know them.

These aggregations – whether qualitative or quantitative – will be differentiated at three levels, according to whether the learner best fits a ‘working towards’, ‘meeting’ or ‘exceeding’ judgement for the criteria relating to the appropriate year in each programme of study.
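
To make the aggregation step concrete, here is a minimal sketch. The Report names the three-phase scale but does not prescribe an aggregation rule, so the ‘best fit’ thresholds and the numerical summary below are my own assumptions:

```python
from collections import Counter

SCALE = ["working towards", "meeting", "exceeding"]             # the three-phase judgement
NUMERIC = {"working towards": 1, "meeting": 2, "exceeding": 3}  # internal summary only

def overall_judgement(criterion_judgements):
    """Collapse the per-criterion judgements for one subject and year into a
    single 'best fit' judgement. The thresholds are assumptions, not the
    Commission's: 'exceeding' requires a clear majority of criteria exceeded."""
    counts = Counter(criterion_judgements)
    total = len(criterion_judgements)
    if counts["exceeding"] > total / 2:
        return "exceeding"
    if counts["exceeding"] + counts["meeting"] > total / 2:
        return "meeting"
    return "working towards"

judgements = ["meeting", "exceeding", "meeting", "working towards", "meeting"]
summary = overall_judgement(judgements)
print(summary, "->", NUMERIC[summary])

# Note the ambiguity discussed in the text: it is unclear whether 'exceeding'
# presupposes already meeting some of next year's criteria, or merely triggers
# assessment against them.
```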

I have just recommended that there needs to be an additional level at the top end, to remove undesirable ceiling effects that lower expectations and are inconsistent with the Principles set out in the Report. I leave it to others to judge whether, if this was accepted, a fifth level is also required at the lower end to preserve the symmetry of the scale.

There is also a ‘chicken and egg’ issue here. It is not clear whether a learner must already be meeting some of the criteria for the succeeding year in order to show they are exceeding the criteria for their own year – or whether assessment against the criteria for the succeeding year is one potential consequence of a judgement that they are exceeding the criteria for their own year.

This confusion is reinforced by a difference of emphasis between the checklist – which says clearly that learners will be assessed against the criteria for the succeeding year if they exceeded the criteria for their own – and the explanatory note, which says only that this may happen.

Moreover, the note suggests that this applies criterion by criterion – ‘where a student has exceeded a criterion’ – rather than after the criteria have been aggregated, which is the logical assumption from the wording in the checklist – ‘exceeded the relevant criteria’.

This too needs clarifying.

.

Recommendations and Commentary

I will try not to repeat in this section material already covered above.

I found that the recommendations did not always sit logically with the preceding commentary, so I have departed from the subsections used in the Report, grouping the material into four broad sections: further methodological issues; in-school and school-to school support; national support; and phased implementation.

Each section leads with the relevant Recommendations and folds in additional relevant material from different sections of the commentary. I have repeated recommendations where they are relevant to more than one section.

.

Further methodological issues

Recommendation 4: Pupils should be assessed against objective criteria rather than ranked against each other

Recommendation 5: Pupil progress and achievements should be communicated in terms of descriptive profiles rather than condensed to numerical summaries (although schools may wish to use numerical data for internal purposes).

Recommendation 6: In respect of the National Curriculum, we believe it is valuable – to aid communication and comparison – for schools to be using consistent criteria for assessment. To this end, we call upon NAHT to develop and promote a set of model assessment criteria based on the new National Curriculum.

The commentary discusses the evolution of National Curriculum levels, including the use of sub-levels and their application to progress as well as achievement. In doing so, it summarises the arguments for and against the retention of levels.

In favour of retention:

  • The system of levels provides a common language used by schools to summarise attainment and progress;
  • It is argued (by some professionals) that parents have grown up with levels and have an adequate grasp of what they mean;
  • The numerical basis of levels was useful to schools in analysing and tracking the performance of large numbers of pupils;
  • The decision to remove levels was unexpected and caused concern within the profession, especially as it was also announced that being ‘secondary ready’ was to be associated with the achievement of Level 4B;
  • If levels are removed, they must be replaced by a different common language, or at least ‘an element of compatibility or common understanding’ should several different assessment systems emerge.

In favour of removal:

  • It is argued (by the Government) that levels are not understood by parents and other stakeholders;
  • The numerical basis of levels does not have the richness of a more rounded description of achievement. The important narrative behind the headline number is often lost through over-simplification.
  • There are adverse effects from labelling learners with levels.

The Commission is also clear that the Government places too great a reliance on tests, particularly for accountability purposes. This has narrowed the curriculum and resulted in ‘teaching to the test’.

It also creates other perverse incentives, including the inflation of assessment outcomes for performance management purposes or, conversely, the deflation of assessment outcomes to increase the rate of progress during the subsequent key stage.

Moreover, curriculum, assessment and pedagogy must be mutually supportive. Although the Government has not allowed the assessment tail to wag the curricular dog:

‘…curriculum and assessment should be developed in tandem.’

Self-evidently, this has not happened, since the National Curriculum was finalised way ahead of the associated assessment arrangements which, in the primary sector, are still unconfirmed.

There is a strong argument that such assessment criteria should have been developed by the Government and made integral to the National Curriculum.

Indeed, in Chapter 7 of its Report on ‘The Framework for the National Curriculum’, the National Curriculum Expert Panel proposed that attainment targets should be retained, not in the form of level descriptors but as ‘statements of specific learning outcomes related to essential knowledge’ that would be ‘both detailed and precise’. They might be presented alongside the Programmes of Study.

The Government ignored this, opting for a very broad single, standard attainment target in each programme of study:

‘By the end of each key stage, pupils are expected to know, apply and understand the matters, skills and processes specified in the relevant programme of study.’

As I pointed out in a previous post, one particularly glaring omission from the Consultation Document on Primary Assessment and Accountability was any explanation of how Key Stage Two tests and statutory teacher assessments would be developed from these singleton ‘lowest common denominator’ attainment targets, especially in a context where academies, while not obliged to follow the National Curriculum, would undertake the associated tests.

We must await the long-delayed response to the consultation to see if it throws any light on this matter.

Will it commit the Government to producing a framework, at least for statutory tests in the core subjects, or will it throw its weight behind the NAHT’s model criteria instead?

I have summarised this section of the Report in some detail as it is the nearest it gets to providing a rational justification for the approach set out in the recommendations above.

The model criteria appear confined to the National Curriculum at this point, though we have already noted that is not the case elsewhere in the Report.

I have also discussed briefly the inconsistency in permitting the translation of descriptive profiles into numerical data ‘for internal purposes’, but undertook to develop that further, for there is a wider case that the Report does not entertain.

We know that there will be scores attached to KS2 tests, since those are needed to inform parents and for accountability purposes.

The Primary Assessment and Accountability consultation document proposed a tripartite approach:

  • Scaled scores to show attainment, built around a new ‘secondary-ready’ standard, broadly comparable with the current Level 4B;
  • Allocation to a decile within the range of scaled scores achieved nationally, showing attainment compared with one’s peers; and
  • Comparison with the average scaled score of those nationally with the same prior attainment at the baseline, to show relative progress.

Crudely speaking, the first of these measures is criterion-referenced while the second and third are norm-referenced.
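
For illustration only – the consultation document does not set out the mechanics, and every number below is invented – the three measures might be derived along these lines:

```python
import statistics

# Hypothetical KS2 scaled scores for a (tiny) national cohort - all numbers invented.
national_scores = [88, 92, 95, 97, 100, 101, 103, 104, 106, 110]
SECONDARY_READY = 100   # assumed threshold, broadly comparable with the current Level 4B

def decile(score, cohort):
    """Allocate a score to a decile (1 = lowest 10%, 10 = highest) by percentile rank."""
    below = sum(1 for s in cohort if s < score)
    return min(below * 10 // len(cohort) + 1, 10)

def progress(score, same_prior_attainment):
    """Difference from the national average scaled score of pupils with the same
    prior attainment at the baseline (positive = better progress than expected)."""
    return score - statistics.mean(same_prior_attainment)

pupil = 103
print("Secondary ready:", pupil >= SECONDARY_READY)
print("Decile:", decile(pupil, national_scores))
print("Progress vs similar pupils:", progress(pupil, [99, 101, 102, 104]))
```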

We do not yet know whether these proposals will proceed – there has been some suggestion that deciles at least will be dropped – but parents will undoubtedly want schools to be able to tell them what scaled scores their children are on target to achieve, and how those compare with the average for those with similar prior attainment.

It will be exceptionally difficult for schools to convey that information within the descriptive profiles, insofar as they relate to English and maths, without adopting the same numerical measures.

It might be more helpful to schools if the NAHT’s recommendations recognised that fact. For the brutal truth is that, if schools’ internal assessment processes do not respond to this need, they will have to set up parallel processes that do so.

In order to derive descriptive profiles, there must be objective assessment criteria that supply the building blocks, hence the first part of Recommendation 4. But I can find nothing in the Report that explains explicitly why pupils cannot also be ranked against each other. This can only be a veiled and unsubstantiated objection to deciles.

Of course it would be quite possible to rank pupils at school level and, in effect, that is what schools will do when they condense the descriptive profiles into numerical summaries.

The real position here is that such rankings would exist, but would not be communicated to parents, for fear of ‘labelling’. But the labelling has already occurred, so the resistance is attributable solely to communicating these numerical outcomes to parents. That is not a sustainable position.

.

In-school and school-to-school support

Recommendation 1: Schools should review their assessment practice against the principles and checklist set out in this report. Staff should be involved in the evaluation of existing practice and the development of a new, rigorous assessment system and procedures to enable the school to promote high quality teaching and learning.

Recommendation 2: All schools should have clear assessment principles and practices to which all staff are committed and which are implemented. These principles should be supported by school governors and accessible to parents, other stakeholders and the wider school community.

Recommendation 3: Assessment should be part of all school development plans and should be reviewed regularly. This review process should involve every school identifying its own learning and development needs for assessment. Schools should allocate specific time and resources for professional development in this area and should monitor how the identified needs are being met.

Recommendation 7 (part): Schools should work in collaboration, for example in clusters, to ensure a consistent approach to assessment. Furthermore, excellent practice in assessment should be identified and publicised…

Recommendation 9: Schools should identify a trained assessment lead, who will work with other local leads and nationally accredited assessment experts on moderation activities.

Recommendation 16: All those responsible for children’s learning should undertake rigorous training in formative, diagnostic and summative assessment, which covers how assessment can be used to support teaching and learning for all pupils, including those with special educational needs. The government should provide support and resources for accredited training for school assessment leads and schools should make assessment training a priority.

Recommendation 20: Schools should be asked to publish their principles of assessment from September 2014, rather than being required to publish a detailed assessment framework, which instead should be published by 2016. The development of the full framework should be outlined in the school development plan with appropriate milestones that allow the school sufficient time to develop an effective model.

All these recommendations are perfectly reasonable in themselves, but it is worth reflecting for a while on the likely cost and workload implications, particularly for smaller primary schools:

Each school must have a ‘trained assessment lead’ who may or may not be the same as the ‘senior leader who is responsible for assessment’ mentioned in the Design Checklist. There is no list of responsibilities for that person, but it would presumably include:

  • Leading the review of assessment practice and developing a new assessment system;
  • Leading the definition of the school’s assessment principles and practices and communicating these to governors, parents, stakeholders and the wider community;
  • Lead responsibility for the coverage of assessment within the school’s development plan and the regular review of that coverage;
  • Leading the identification and monitoring of the school’s learning and development needs for assessment;
  • Ensuring that all staff receive appropriate professional development – including ‘rigorous training in formative diagnostic and summative assessment’;
  • Leading the provision of in-school and school-to-school professional development relating to assessment;
  • Allocating time and resources for all assessment-related professional development and monitoring its impact;
  • Leading collaborative work with other schools to ensure a consistent approach to assessment;
  • Dissemination of effective practice;
  • Working with other local assessment leads and external assessment experts on moderation activities.

And, on top of this, there is a range of unspecified additional responsibilities associated with the statutory tests.

It is highly unlikely that this range of responsibilities could be undertaken effectively by a single person in less than half a day a week, as a bare minimum. There will also be periods of more intense pressure when a substantially larger time allocation is essential.

The corresponding salary cost for a ‘senior leader’ might be £3,000-£4,000 per year, not to mention the cost of undertaking the other responsibilities displaced.
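
A rough check on that figure, using my own salary and time assumptions rather than anything in the Report:

```python
# Half a day a week is roughly a tenth of a full-time post.
def annual_cost(full_time_salary, days_per_week, working_days_per_week=5):
    return full_time_salary * days_per_week / working_days_per_week

# Assumed 2014 salary range for a primary senior leader - my figures, not the Report's.
for salary in (30_000, 35_000, 40_000):
    cost = annual_cost(salary, days_per_week=0.5)
    print(f"£{salary:,} salary -> roughly £{cost:,.0f} a year for half a day a week")
```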

There will also need to be a sizeable school budget and time allocation for staff to undertake reviews, professional development and moderation activities.

Moderation itself will bear a significant cost. Internal moderation carries the bigger opportunity cost in staff time, while external moderation is the more expensive in cash terms.

Explanatory note (E), attached to the Design Checklist, says:

‘The exact form of moderation will vary from school to school and from subject to subject. The majority of moderation (in schools large enough to support it) will be internal but all schools should undertake a proportion of external moderation each year, working with partner schools and local agencies.’

Hence the cost of external moderation will fall disproportionately on smaller schools with smaller budgets.

It would be wrong to suggest that this workload is completely new. To some extent these various responsibilities will be undertaken already, but the Commission’s recommendations are effectively a ratcheting up of the demand on schools.

Rather than insisting on these responsibilities being allocated to a single individual with other senior management responsibilities, it might be preferable to set out the responsibilities in more detail and give schools greater flexibility over how they should be distributed between staff.

Some of these tasks might require senior management input, but others could be handled by other staff, including paraprofessionals.

.

National support

Recommendation 7 (part): Furthermore, excellent practice in assessment should be identified and publicised, with the Department for Education responsible for ensuring that this is undertaken.

Recommendation 8 (part): Schools should be prepared to submit their assessment to external moderators, who should have the right to provide a written report to the head teacher and governors setting out a judgement on the quality and reliability of assessment in the school, on which the school should act. The Commission is of the view that at least some external moderation should be undertaken by moderators with no vested interest in the outcomes of the school’s assessment. This will avoid any conflicts of interest and provide objective scrutiny and broader alignment of standards across schools.

Recommendation 9: Schools should identify a trained assessment lead, who will work with other local leads and nationally accredited assessment experts on moderation activities.

Recommendation 11: The Ofsted school inspection framework should explore whether schools have effective assessment systems in place and consider how effectively schools are using pupil assessment information and data to improve learning in the classroom and at key points of transition between key stages and schools.

Recommendation 14: Further work should be undertaken to improve training for assessment within initial teacher training (ITT), the newly qualified teacher (NQT) induction year and on-going professional development. This will help to build assessment capacity and support a process of continual strengthening of practice within the school system.

Recommendation 15: The Universities’ Council for the Education of Teachers (UCET) should build provision in initial teacher training for delivery of the essential assessment knowledge.

Recommendation 16: All those responsible for children’s learning should undertake rigorous training in formative, diagnostic and summative assessment, which covers how assessment can be used to support teaching and learning for all pupils, including those with special educational needs. The government should provide support and resources for accredited training for school assessment leads and schools should make assessment training a priority.

Recommendation 17: A number of pilot studies should be undertaken to look at the use of information technology (IT) to support and broaden understanding and application of assessment practice.

Recommendation 19: To assist schools in developing a robust framework and language for assessment, we call upon the NAHT to take the lead in expanding the principles and design checklist contained in this report into a full model assessment policy and procedures, backed by appropriate professional development.

There are also several additional proposals in the commentary that do not make it into the formal recommendations:

  • Schools should be held accountable for the quality of their assessment practice as well as their assessment results, with headteachers also appraising teachers on their use of assessment. (The first part of this formulation appears in Recommendation 11 but not the second.) (p17);
  • It could be useful for the teaching standards to reflect further assessment knowledge, skills and understanding (p17);
  • A national standard in assessment practice for teachers would be a useful addition (p18);
  • The Commission also favoured the approach of having a lead assessor to work with each school or possibly a group of schools, helping to embed good practice across the profession (p18).

We need to take stock of the sheer scale of the infrastructure that is being proposed and its likely cost.

In respect of moderation alone, the Report is calling for sufficient external moderators, ‘nationally accredited assessment experts’ and possibly lead assessors to service some 17,000 primary schools.

Even if we assume that these roles are combined in the same person and that each person can service, say, 25 schools, that still demands something approaching a cadre of 700 people who also need to be supported, managed and trained.

If they are serving teachers there is an obvious opportunity cost. Providing a service of this scale would cost tens of millions of pounds a year.
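
To make the arithmetic explicit – and to be clear that the £50,000 figure below is my own illustrative assumption for a fully loaded annual cost per moderator, not anything drawn from the Report – a rough back-of-envelope check runs as follows:

$$\frac{17{,}000 \text{ schools}}{25 \text{ schools per moderator}} = 680 \approx 700 \text{ moderators}$$

$$680 \text{ moderators} \times £50{,}000 \approx £34 \text{ million a year}$$

On those assumptions, moderation alone sits comfortably in the ‘tens of millions’ bracket, before any management, training or travel costs are added.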

Turning to training and professional development, the Commission is proposing:

  • Accredited training for some 17,000 school assessment leads (with an ongoing requirement to train new appointees and to refresh the training of those who undertook it some time ago);
  • ‘Rigorous training in formative, diagnostic and summative assessment, which covers how assessment can be used to support teaching and learning for all pupils, including those with special educational needs’ for everyone deemed responsible for children’s learning, not just teachers. This will include hundreds of thousands of people in the primary sector alone;
  • Revitalised coverage of assessment in ITT and induction, on top of the requisite professional development package.

The Report says nothing about the cost of developing, providing and managing this huge training programme, which would add further tens of millions of pounds a year.

I am plucking a figure out of the air, but it would be reasonable to suggest that moderation and training costs combined might require an annual budget of some £50 million – and quite possibly double that. 
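
For what it is worth, one can see how a figure of that order might arise. Every element below is assumed for illustration only: the £34 million for moderation from the sketch above, accredited training for 17,000 assessment leads at a notional £500 a head, and an allowance for training the wider primary workforce and refreshing ITT and induction coverage:

$$£34\text{m (moderation)} + £8.5\text{m (assessment leads)} + £7.5\text{m (wider workforce, ITT and induction)} \approx £50\text{m a year}$$

None of these unit costs is sourced from the Report; the point is simply that a £50 million annual budget is not an implausible order of magnitude.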

Unless one argues that the testing regime should be replaced by a national sampling process – and while the Report says some of the Commission’s members supported that, it stops short of recommending it – there are no obvious offsetting savings.

It is disappointing that the Commission made no effort at all to quantify the cost of its proposals.

These recommendations provide an excellent marketing opportunity for some of the bodies represented on the Commission.

For example, the CIEA press release welcoming the Report says:

‘One of the challenges, and one that schools will need to meet, is in working together, and with local and national assessment experts, to moderate their judgements and ensure they are working to common standards across the country. The CIEA has an important role to play in training these experts.’

Responsibility for undertaking pilot studies on the role of IT in assessment is not allocated, but one assumes it would be overseen by central government and also funded by the taxpayer.

Any rollout from the pilots would have additional costs attached and would more than likely create additional demand for professional development.

The recommendation that DfE should take responsibility for sharing excellent practice reflects a commitment already made in the consultation document:

‘…we will provide examples of good practice which schools may wish to follow. We will work with professional associations, subject experts, education publishers and external test developers to signpost schools to a range of potential approaches.’ (paragraph 3.8).

Revision of the School Inspection Framework will require schools to give due priority to the quality of their assessment practice, though Ofsted might reasonably argue that this requirement is already in place.

Paragraph 116 of the School Inspection Handbook says:

‘Evidence gathered by inspectors during the course of the inspection should include… the quality and rigour of assessment, particularly in nursery, reception and Key Stage 1.’

We do not yet know whether NAHT will respond positively to the recommendation that it should go beyond the model assessment criteria it has already commissioned by leading work to expand the Principles and Design Checklist into ‘a full model assessment policy and procedures backed by appropriate professional development’.

There was no reference to such plans in the press release accompanying the Report.

Maybe the decision could not be ratified in time by the Association’s decision-making machinery – but this did not prevent the immediate commissioning of the model criteria.

.

Phased Implementation

Recommendation 10: Ofsted should articulate clearly how inspectors will take account of assessment practice in making judgements and ensure both guidance and training for inspectors is consistent with this.

Recommendation 12: The Department for Education should make a clear and unambiguous statement on the teacher assessment data that schools will be required to report to parents and submit to the Department for Education. Local authorities and other employers should provide similar clarity about requirements in their area of accountability.

Recommendation 13: The education system is entering a period of significant change in curriculum and assessment, where schools will be creating, testing and revising their policies and procedures. The government should make clear how they will take this into consideration when reviewing the way they hold schools accountable as new national assessment arrangements are introduced during 2014/15. Conclusions about trends in performance may not be robust.

Recommendation 18: The use by schools of suitably modified National Curriculum levels as an interim measure in 2014 should be supported by the government. However, schools need to be clear that any use of levels in relation to the new curriculum can only be a temporary arrangement to enable them to develop, implement and embed a robust new framework for assessment. Schools need to be conscious that the new curriculum is not in alignment with the old National Curriculum levels.

Recommendation 20: Schools should be asked to publish their principles of assessment from September 2014, rather than being required to publish a detailed assessment framework, which instead should be published by 2016. The development of the full framework should be outlined in the school development plan with appropriate milestones that allow the school sufficient time to develop an effective model.

Recommendation 21: A system wide review of assessment should be undertaken. This would help to repair the disjointed nature of assessment through all ages, 2-19.

The Commission quite rightly identifies a number of issues caused by the implementation timetable, combined with continuing uncertainty over aspects of the Government’s plans.

At the time of writing, the response to the consultation document has still not been published (though it was due in autumn 2013), yet schools will be implementing the new National Curriculum from this September.

The Report says:

‘There was strong concern expressed about the requirement for schools to publish their detailed curriculum and assessment framework in September 2014.’

This is repeated in Recommendation 20, together with the suggestion that this timeline should be amended so that only a school’s principles for assessment need be published by this September.

I have been trying to pin down the source of this requirement.

Schedule 4 of The School Information (England) (Amendment) Regulations 2012 does not require the publication of a detailed assessment framework, referring only to

‘The following information about the school curriculum—

(a)  in relation to each academic year, the content of the curriculum followed by the school for each subject and details as to how additional information relating to the curriculum may be obtained;

(b)  in relation to key stage 1, the names of any phonics or reading schemes in operation; and

(c)  in relation to key stage 4—

(i)            a list of the courses provided which lead to a GCSE qualification,

(ii)          a list of other courses offered at key stage 4 and the qualifications that may be acquired.’

I could find no Government guidance stating unequivocally that this requires schools to carve up all the National Curriculum programmes of study into year-by-year chunks.  (Though there is no additional burden attached to publication if they have already undertaken this task for planning purposes.)

There are references to the publication of Key Stage 2 results (which will presumably need updating to reflect the removal of levels), but nothing on the assessment framework.

Moreover, the DfE mandatory timeline says that from the Spring Term of 2014:

‘All schools must publish their school curriculum by subject and academic year, including their provision of personal, social, health and economic education (PSHE).’

(The hyperlink returns one to the Regulations quoted above.)

There is no requirement for publication of further information in September.

I wonder therefore if this is a misunderstanding. I stand to be corrected if readers can point me to the source.

It may arise from the primary assessment and accountability consultation document, which discusses publication of curricular details and then proceeds immediately to discuss the relationship between curriculum and assessment:

‘Schools are required to publish this curriculum on their website…In turn schools will be free to design their approaches to assessment, to support pupil attainment and progression. The assessment framework must be built into the school curriculum, so that schools can check what pupils have learned and whether they are on track to meet expectations at the end of the key stage, and so that they can report regularly to parents.’ (paras 3.4-3.5)

But this conflation isn’t supported by the evidence above and, anyway, these are merely proposals.

That said, it must be assumed that the Commission consulted its DfE observer on this point before basing recommendations on this interpretation.

If the observer’s response was consistent with the Commission’s interpretation, then it is apparently inconsistent with all the material so far published by the Department!

It may be necessary for NAHT to obtain clarification of this point given the evidence cited above.

That aside, there are issues associated with the transition from the current system to the future system.

The DfE’s January 2014 ‘myths and facts’ publication says:

‘As part of our reforms to the national curriculum, the current system of “levels” used to report children’s attainment and progress will be removed from September 2014. Levels are not being banned, but will not be updated to reflect the new national curriculum and will not be used to report the results of national curriculum tests. Key Stage 1 and Key Stage KS2 [sic] tests taken in the 2014 to 2015 academic year will be against the previous national curriculum, and will continue to use levels for reporting purposes.

Schools will be expected to have in place approaches to formative assessment that support pupil attainment and progression. The assessment framework should be built into the school curriculum, so that schools can check what pupils have learned and whether they are on track to meet expectations at the end of the key stage, and so that they can report regularly to parents. Schools will have the flexibility to use approaches that work for their pupils and circumstances, without being constrained by a single national approach.’

The reference here to having approaches in place – rather than the publication of a ‘detailed curriculum and assessment framework’ – would not seem wildly inconsistent with the Commission’s idea that schools should establish their principles by September 2014, and develop their detailed assessment frameworks iteratively over the two succeeding years. However, the Government needs to clarify the position.

Since Key Stage 2 tests will not dispense with levels until May 2016 (and they will be published in the December 2015 Performance Tables), there will be an extended interregnum in which National Curriculum Levels will continue to have official currency.

Moreover, levels may still be used in schools – they are not being banned – though they will not be aligned to the new National Curriculum.

The Report says:

‘…it is important to recognise that, even if schools decide to continue with some form of levels, the new National Curriculum does not align to the existing levels and level descriptors and this alignment is a piece of work that needs to be undertaken now.’ (p19).

However, the undertaking of this work does not feature in the Recommendations, unless it is implicit in the production by NAHT of ‘a full model assessment policy and procedures’, which seems unlikely.

One suspects that the Government would be unwilling to endorse such a process, even as a temporary arrangement, since what is to stop schools from continuing to use this new improved levels structure more permanently?

The Commission would appear to be on stronger ground in asking Ofsted to make allowances during the interregnum (which is what I think Recommendation 10 is about), especially given that, as Recommendation 13 points out, conclusions about trends in performance ‘may not be robust’.

The point about clarity over teacher assessment is well made – and one hopes it will form part of the response to the primary assessment and accountability consultation document when that is eventually published.

The Report itself could have made progress in this direction by establishing and maintaining a clearer distinction between statutory and internal teacher assessment.

The consultation document itself made clear that KS2 writing would continue to be assessed via teacher assessment rather than a test, and, moreover:

‘At the end of each key stage schools are required to report teacher assessment judgements in all national curriculum subjects to parents. Teachers will judge whether each pupil has met the expectations set out in the new national curriculum. We propose to continue publishing this teacher assessment in English, mathematics and science, as Lord Bew recommended.’ (para 3.9)

But what it does not say is what requirements will be imposed to ensure consistency across this data. Aside from KS2 writing, will these teacher assessments also be subject to the new scaled scores, and potentially to deciles too?

Until schools have answers to that question, they cannot consider the overall shape of their assessment processes.

The final recommendation, for a system-wide review of assessment from 2-19, is whistling in the wind, especially given the level of disruption already caused by the decision to remove levels.

Neither this Government nor the next is likely to act upon it.

 

Conclusion

The Commission’s Report moves us forward in broadly the right direction.

The Principles, Design Checklist and wider recommendations help to fill some of the void created by the decision to remove National Curriculum levels, the limited nature of the primary assessment and accountability consultation document and the inordinate delay in the Government’s response to that consultation.

We are in a significantly better place as a consequence of this work being undertaken.

But there are some worrying inconsistencies in the Report as well as some significant shortcomings to the proposals it contains. There are also several unanswered questions.

Not to be outdone, I have bound these up into a series of recommendations directed at NAHT and its Commission. There are 23 in all and I have given mine letters rather than numerals, to distinguish them from the Commission’s own recommendations.

  • Recommendation A: The Commission should publish all the written evidence it received.
  • Recommendation B: The Commission should consult on key provisions within the Report, seeking explicit commitment to the Principles from DfE, Ofqual and Ofsted.
  • Recommendation C: The Commission should ensure that its Design Checklist is fully consistent with the Principles in all respects. It should also revisit the internal logic of the Design Checklist.
  • Recommendation D: So far as possible, ahead of the primary assessment and accountability consultation response, the Commission should distinguish clearly how its proposals relate to statutory teacher assessment, alongside schools’ internal assessment processes.
  • Recommendation E: NAHT should confirm whom it has commissioned to produce model assessment criteria and to what timetable. It should also explain how these criteria will be ‘nationally standardised’.
  • Recommendation F: The Commission should clarify whether the trained assessment lead mentioned in Recommendation 9 is the same as, or different from, the ‘senior leader who is responsible for assessment’ mentioned in the Design Checklist.
  • Recommendation G: The Commission should set out more fully the responsibilities allocated to this role or roles and clarify that schools have flexibility over how they distribute those responsibilities between staff.
  • Recommendation H: NAHT should clarify how the model criteria under development apply – if at all – to the wider school curriculum in all schools and to academies not following the National Curriculum.
  • Recommendation I: NAHT should clarify how the model criteria under development will allow for the fact that in all subjects all schools enjoy flexibility over the positioning of content in different years within the same key stage – and can also anticipate parts of the subsequent key stage.
  • Recommendation J: NAHT should clarify whether the intention is that the model criteria should reflect the allocation of content to specific terms as well as to specific years.
  • Recommendation K: The Commission should explain how its approach to internal assessment will help predict future performance in end of Key Stage tests.
  • Recommendation L: The Commission should shift from its narrow and ‘mildly accelerative’ view of high attainment to accommodate a richer concept that combines enrichment (breadth), extension (depth) and acceleration (faster pace) according to learners’ individual needs.
  • Recommendation M: The Commission should incorporate a fourth ‘far exceeded’ assessment judgement, since the ‘exceeded’ judgement covers too wide a span of attainment.
  • Recommendation N: NAHT should clarify whether its model criteria will extend into KS3, to accommodate assessment against the criteria for at least year 7, and ideally beyond.
  • Recommendation O: The Commission should clarify whether anticipating criteria for a subsequent year is a cause or a consequence of being judged to be ‘exceeding’ expectations in the learner’s own chronological year.
  • Recommendation P: The Commission should confirm that numerical summaries of assessment criteria – as well as any associated ranking positions – should be made available to parents who request them.
  • Recommendation Q: The Commission should explain why schools should be forbidden from ranking learners against each other (or allocating them to deciles).
  • Recommendation R: The Commission should assess the financial impact of its proposals on schools of different sizes.
  • Recommendation S: The Commission should cost its proposals for training and moderation, identifying the burden on the taxpayer and any offsetting savings.
  • Recommendation T: NAHT should clarify its response to Recommendation 19, that it should lead the development of a full model assessment policy and procedures.
  • Recommendation U: The Commission should clarify with DfE its understanding that schools are required to publish a detailed curriculum and assessment framework by September 2014.
  • Recommendation V: The Commission should clarify with DfE the expectation that schools should have in place ‘approaches to formative assessment’ and whether the proposed assessment principles satisfy this requirement.
  • Recommendation W: The Commission should clarify whether it is proposing that work be undertaken to align National Curriculum levels with the new National Curriculum and, if so, who it proposes should undertake this.

So – good overall – subject to these 23 reservations!

Some are more significant than others. Given my area of specialism, I feel particularly strongly about those that relate directly to high attainers, especially L and M above.

Those are the two I would nail to the door of 1 Heath Square.

.

GP

March 2014