A Summer of Love for English Gifted Education? Episode One: KS2 Level 6 Tests

.

summer of love 1967 by 0 fairy 0


This post is the first in a short series, scheduled to coincide with three publications – two yet to be published – that focus directly on provision for gifted learners in England.

Each Episode will foreground one of the publications, set within the emerging overall narrative. Each will assess the likely impact of the target publication and the broader narrative as it unfolds while also reflecting associated developments in educational policy anticipated during the next few months.

Episode One:

  • Analyses the first publication, an Investigation of Level 6 Key Stage 2 Tests, already published in February 2013, exploring its findings in the context of current uncertainty about future arrangements for assessment in primary schools.
  • Reviews the outcomes of the most recent Ofsted survey of gifted and talented education, conducted in December 2009, so establishing a benchmark for consideration of a new Ofsted survey of how schools educate their most able pupils, due for publication in May 2013.
  • Sets out what we know about the third document, an Investigation of School and College-level strategies to raise the Aspirations of High-Achieving Disadvantaged Pupils to Pursue Higher Education, due for publication by mid-September 2013.

Future Episodes will scrutinise the new Ofsted Survey and the second Investigation respectively, linking them with other developments over the summer period, not all of which may yet be in the public domain.

By this means I plan to provide a kind of iterative stocktake of current issues and future prospects for their resolution. I am curious to learn whether I will be more or less positive at the end of the series than at the beginning.

For I enter the fray in a spirit of some world-weariness and pessimism over the continuing inability of the gifted education community to act collaboratively, to reform itself and to improve practice. This is seemingly a global malaise, though some countries stand out as bucking the trend. Many have featured in previous posts.

Will the Summer of Love provide the spur for trend-bucking reform here in England, or will the groundswell of energy it generates be dissipated in the long, languorous, lazy sunshine days ahead?

.

Publications in the Last Two Years and Associated Developments

Following a lengthy period in the doldrums, we may be on the verge of a rather livelier season in the evolving history of English gifted education.

It would be wrong to suggest that we have been entirely becalmed. Over the past two years we have digested a trio of key publications, all of which have been reviewed on this Blog:

  • The Sutton Trust’s ‘Educating the Highly Able’ (July 2012), which I took seriously to task for its over-emphasis on excellence at the expense of equity and almost entire failure to address the needs of underachieving gifted learners, especially those from disadvantaged backgrounds. Given the sponsoring organisation’s raison d’etre (improving social mobility) that seemed, frankly, bizarre.

These documents may have had some limited positive impact, by maintaining gifted education’s profile within wider education policy, but I can find no evidence to suggest that they have reformed our collective thinking about effective gifted education, let alone improved the learning experience and life chances of English gifted learners.

Indeed, it is conceivable that the two latter publications have set back the cause of gifted education by taking us down two successive blind alleys.

I have made my own small efforts to refocus attention on a more productive direction of travel through The Gifted Phoenix Manifesto for Gifted Education.

I do not claim any great status or significance for the Manifesto, though there are encouraging early signs that it is stimulating productive debate amongst others in the field, at least amongst those who are not firmly wedded to the status quo.

The Sutton Trust promises further work, however:

‘Helping the highly able

Piloting programmes that support and stretch bright students from non-privileged backgrounds in state schools, and opening up selective state schools to bright children from low and middle income homes.’

This presumably includes the outcome of the call for proposals that it issued as long ago as July 2012, ‘with a view to developing the first project by the end of the year’ – i.e. 31 December 2012 (see attachment at the bottom of the linked page).

The call for proposals sought:

‘Cost-effective, scalable projects which support highly able pupils in non-selective maintained schools.  The Trust is particularly interested in initiatives which are based on sound evidence and / or which draw on proven models of intervention.’

It expressed interest in:

  • ‘proposals that focus on those pupils capable of excellence in core academic school subjects’;
  • ‘various methods of defining this group – for example those attaining at the 90th percentile and above, the 95th percentile, or the new Level 6’ or ‘on the basis of school performance and local context’;
  • Support for ‘“exceptionally able” pupils’ especially ‘imaginative ways of bringing them together’;
  • Provision that is ‘integral to schools and not simply a “bolt-on” to mainstream provision’;
  • Programmes that start ‘in key stage three or four, but which may continue to support the students through their transition to FE and HE’.

There is some reasonable hope therefore that the Trust might still contribute in a positive way to the Summer of Love! If there is an announcement during the timeframe of this series I will of course feature the details in a future Episode.

But I plan to build the series around a second trio of documents which have the capacity to be somewhat more influential than those published from 2011 to 2012.

.

Kew once more 1 by giftedphoenix


.

Key Stage 2 Level 6

One is already with us: an ‘Investigation of Key Stage 2 Level 6 Tests’ commissioned by the Department for Education and published in late February 2013. (OK, so I’m stretching a point by extending Summer back into the Winter, but this study has so far escaped serious in-depth attention.)

The authors are Mike Coldwell, Ben Willis and Colin McCaig from the Centre for Education and Inclusion Research (CEIR) at Sheffield Hallam University.

Before engaging directly with their findings, it is necessary to sketch in a fair amount of contextual background, since that will be critical to the broader narrative we expect to evolve over the coming months.

.

Background: Level 6 Tests

Level 6 Tests are by no means the first example of efforts to raise the assessment ceiling for high-attaining learners at the end of Key Stage 2 (KS2) (typically the final year of primary school when children are aged 11), but there is insufficient space here to trace the history of their predecessors.

The current iteration, optional Level 6 tests, was introduced in 2011 in reading, writing and maths. The tests were not externally marked, nor were results published.

QCDA was still in place. Its website said:

‘The tests provide the opportunity to stretch high attaining pupils and also provide a useful tool for measuring the ability and progression of gifted and talented pupils. You are advised to view the tests to make a judgement on how appropriate they are for your pupils.’

In June 2011, the Bew Report into KS2 testing, assessment and accountability reflected this experience:

‘We recognise that the current system of National Curriculum tests can appear to place a ceiling on attainment for the most able pupils. This has important implications for measures of progress, since a pupil who achieves level 3 at the end of Key Stage 1 can currently only achieve level 5 in the end of Key Stage 2 tests, and can therefore only make two levels of progress (currently the expected rate of progress).

Allowing pupils to attain level 6 at the end of Key Stage 2 would enable pupils with high Key Stage 1 attainment to make better than expected progress. Secondary schools receiving pupils who had attained level 6 would understand that these pupils would need to be particularly challenged and stretched from the start of Year 7…

It is important to challenge the most able pupils. We welcome the Government’s decision to make level 6 tests available to schools on an optional basis this year. We believe that these optional tests could allow particularly able pupils an opportunity to develop and fully demonstrate their knowledge and understanding.

However, we do have some concerns, in particular over the extent to which it will be possible for primary schools to cover enough of the Key Stage 3 curriculum to allow pupils to attain level 6. NFER, one of the few respondents who commented on this issue, suggested that it would be more appropriate to award a ‘high 5’ than a level 6.’

So Bew concluded:

‘We believe that the Government should continue to provide level 6 National Curriculum Tests for schools to use on an optional basis, whose results should be reported to parents and secondary schools.’

But there was also a rider:

‘If, following the review of the National Curriculum, any changes are made to the current system of levels, alternative arrangements should be put in place to ensure the most able pupils are challenged.’

More about that anon.

In the light of this, externally marked KS2 Level 6 tests were offered in 2012 in Reading and Maths. There was also an option to undertake internally marked Level 6 teacher assessment in Writing.

The 2012 KS2 Assessment and Reporting Arrangements Booklet offered a brief commentary:

‘These tests are optional and are aimed at high attaining children. Headteachers should take into account a child’s expected attainment prior to entering them for these tests as they should already be demonstrating attainment above level 5…

To be awarded an overall level 6 in a subject, a child must achieve both a level 5 in the end of Key Stage 2 test and pass the level 6 test for that subject. Schools can refer to the 2011 level 6 test papers in order to inform their assessment of whether to enter children for the test.’

The Investigation examines this 2012 experience, but is confined to the two externally marked tests.

Meanwhile – and skipping ahead for a moment – in 2013, the optional Reading and Maths tests are once again available, alongside a new optional test of Grammar, Punctuation and Spelling, in place of the teacher assessment of writing.

Reporting of Level 6 results in School Performance Tables has also changed. In 2012, Level 6 outcomes were used only in the ‘calculation of progress measures, Value Added,  percentage achieving level 5+ and average point scores’.

When it comes to the 2013 Performance Tables:

‘…the percentage of the number of children at the end of Key Stage 2 achieving level 6 in a school will also be shown in performance tables. The Department will not publish any information at school level about the numbers of children entered for the level 6 tests, or the percentage achieving level 6 of those entered for level 6.’

This change may have been significant in driving increased interest in the tests, though not necessarily for all the right reasons, as the discussion below will reveal.

Although the 2012 Performance Tables made limited use of Level 6 results, some aggregated performance data was published, as my post on the outcomes noted:

‘900 pupils achieved Level 6 in the KS2 reading test and 19,000 did so in the maths test. While the former is significantly lower than 1% of total entries, the latter is equivalent to 3%, so roughly one pupil per class is now achieving Level 6 in maths. (About 700 pupils also achieved Level 6 in science teacher assessment). Almost all learners achieving a Level 6 will have demonstrated three levels of progress. We know from other provisional data that some 2,500 of those securing Level 6 in maths achieved either Level 2A or even Level 2B in maths alone at KS1, so managing four levels of progress in crude whole-level terms.’

Incidentally, we now know from DfE’s website that:

‘There will not be a Key Stage 2 science sampling test in 2013; a new, biennial (every other year), pupil-level sampling system will be introduced in 2014.’

And slightly more accurate performance data was supplied in an Appendix to the Investigation itself. It tells us that, across all schools (including independent schools that opted to take the tests):

  • 55,212 learners were entered for Level 6 Maths and 18,953 of them (34.3%) achieved it; and
  • 46,810 pupils were entered for Level 6 Reading and 942 (2.0%) achieved it.

That gives a total of 102,022 entries, though we do not know how many came from independent schools or, indeed, how many learners were entered for Level 6 tests in both Maths and Reading.
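As a quick sanity check, the quoted percentages and the combined entry figure follow directly from the Appendix numbers. A minimal sketch (the counts are those quoted above; remember the overlap between the two entry cohorts is unknown):

```python
# Entry and pass figures quoted from the Appendix to the Investigation.
entries = {
    "Maths": (55212, 18953),
    "Reading": (46810, 942),
}

for subject, (entered, achieved) in entries.items():
    rate = 100 * achieved / entered
    print(f"{subject}: {entered} entered, {achieved} achieved ({rate:.1f}%)")

total = sum(entered for entered, _ in entries.values())
print(f"Total entries across both tests: {total}")  # 102022
```

Note that the 102,022 figure counts entries rather than pupils, since a learner entered for both tests appears in both cohorts.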

.

Background: The Future of National Curriculum Assessment

We have known since June 2012 that National Curriculum levels will be phased out and were informed, through a kind of policy aside in March 2013, that this would happen ‘from 2016’.

The new National Curriculum will be introduced from September 2014, so will be assessed through the existing assessment framework during its first year of implementation, despite the apparently strong case for keeping it and the associated assessment reforms fully synchronised.

It may be that this decision is associated with recent difficulties over the procurement of a contractor to undertake external marking of the KS2 tests from 2014-2016, or else progress on determining the new arrangements was insufficiently advanced by the time that contract came to be negotiated.

At the time of writing we still await a promised consultation document on primary assessment and accountability, some 10 months after the removal of levels was first communicated.

The issues discussed below will need revisiting once the Government’s proposals are safely in the public domain: the spectre of assessment reform hangs over this post as well as the Investigation it is supposed to be reviewing.

There are few clues to the direction of travel, apart from some suggestion that the Government has been influenced by Bew’s deliberations, even though his clarity on this point left something to be desired.

I quote the relevant sections fully below, to ensure that I haven’t missed any vital inflection or hint of what Bew intended. The emphases are mine:

‘In the short term, we believe we need to retain levels as a means of measuring pupils’ progress and attainment… However, in the long term, we believe the introduction of a new National Curriculum provides an opportunity to improve how we report from statutory assessment. We believe it is for the National Curriculum Review to determine the most appropriate way of defining the national standards which are used to categorise pupils’ attainment.

We realise that, in order to measure progress, it is necessary to have an appropriate scale against which attainment and progress can be measured at various points. For example in Australia, a ‘vertical scale’ (where a movement along the scale between any two equally spaced points must reflect similar levels of progress) is created by testing several year-groups, using some common questions to link scores on each test together. A particular question might be considered difficult for a Year 3 pupil, but much easier for a Year 5 pupil. Although this is technically defensible, it does require tests at more regular intervals than we currently have in England.

In England, we currently use National Curriculum levels as a scale against which to measure progress. However, as stated later in this chapter, concerns have been raised as to whether the levels, as they currently exist, are appropriate as a true vertical scale. We recommend that, as part of the review of the National Curriculum, consideration is given to creating a more appropriate ‘vertical scale’ with which to measure progress.’

And, a little later in the Report:

‘In the longer term, we feel it may be helpful for statutory assessment to divide into two parts. All pupils could be expected to master a ‘core’ of essential knowledge by the end of Key Stage 2, concentrating on the basic literacy and numeracy which all pupils require if they are to access the secondary curriculum. This ‘core’ could be assessed through a ‘mastery’ test which all pupils should be expected to pass (only excepting cases of profound Special Educational Needs), providing a high minimum standard of literacy and numeracy at the end of primary education.

We recognise the risk that this approach may lead to ‘teaching to the test’, may set an unhelpfully low ceiling on attainment and would not reflect pupils’ progress. We would suggest two solutions. Firstly, it might be helpful to allow pupils to take ‘core’ tests in Years 4, 5 or 6 to ensure that able pupils are challenged. Secondly, we feel there could also be a separate assessment at the end of Key Stage 2 to allow pupils to demonstrate the extent of their knowledge and therefore to measure pupils’ progress during the Key Stage. This assessment could be designed to identify the extent of pupils’ attainment and understanding at the end of Year 6, spreading them out on a ‘vertical scale’ rather than being a pass/fail mastery test. Such an assessment should be as useful as possible to pupils, parents and teachers. It may be helpful for the results to report in greater detail than is currently provided by National Curriculum Test data, so they can identify more effectively the pupil’s attainment in key broad aspects of a subject.

We feel the combination of these statutory assessments could ensure that all pupils reach a minimum standard of attainment while also allowing pupils to demonstrate the progress they have made – which would indicate the quality of the school’s contribution to their education. It could provide a safety net in that all pupils should achieve a basic minimum, but would not impose a low ceiling on the able.’

And then finally:

‘A key criticism of the current Key Stage 2 tests is that pupils’ knowledge and skills over a four-year Key Stage is assessed via tests in a single specified week in May. Some critics have raised concerns that this approach causes stress for pupils, particularly those working at the lower end of a spectrum, and may have unfair implications for schools, whose overall results may be affected if for example a highly-performing pupil is absent on test day. In addition, criticism suggests there is little incentive to challenge the more able children, who may well be working at level 5 at an earlier point in the Key Stage or year.

We believe that our earlier recommendations address these issues. However, we also recognise the benefits of a system based on the principle of ‘testing when ready’. The proponents of such an approach argue that it would allow each pupil to be entered for statutory tests when he/she is ready, and then able to move on to more advanced learning. We believe that it would be possible for a statutory ‘testing when ready’ system to meet the statutory assessment purposes we have specified.

However, we are not convinced that moving to a ‘testing when ready’ approach is the best way of achieving the purposes of statutory assessment under the current National Curriculum. We suggest that the principle of ‘testing when ready’ should be considered in the future following the National Curriculum Review. We believe that the principle of ‘testing when ready’ may fit well if computer administered testing is introduced, making it easier for each pupil to sit his/her own personalised test at any point in time when teachers deem him/her to be ready.’

In summary then, Bew appears to suggest:

  • Assessment of mastery of an essential core of knowledge that all should pass but which might be undertaken as early as Year 4, two years before the end of KS2;
  • A separate end of KS2 assessment of the extent of learners’ knowledge and their progress against a new ‘vertical scale’ that will judge their progress over time, this potentially incorporating reporting on attainment in ‘key broad aspects of a subject’;
  • Consideration of transition to a universal ‘testing when ready’ approach at some indeterminate future point (which may or may not be contemporaneous with and complementary to the changes above).

Quite what learners will do after they have successfully completed the mastery test – and its relationship to the draft Programmes of Study that have now been published – is not explained, or even explored.

Are learners expected to begin anticipating the Key Stage 3 programme of study, or to confine themselves to pursuing the KS2 programme in greater breadth and depth, or a combination of the above?

In short, Bew raises more questions than he answers (and so effectively reinforces the argument for keeping curricular and assessment reforms fully synchronised).

At this point we simply do not know whether the Government is ready to unveil plans for the introduction of a radically new ‘test when ready’ assessment regime from 2016, or whether some sort of intermediate position will be adopted.

The former decision would be a very bold reform given the ‘high stakes’ nature of these tests and the current state of cutting edge assessment practice. Given the difficult history of National Curriculum assessment, the risk of catastrophic error might well be too great to contemplate at this stage.

Awash in all this uncertainty, one might be forgiven for assuming that an analysis of the impact of the introduction of Level 6 tests has been overtaken – or almost overtaken – by events.

But that would be unjustified since the Investigation addresses some important issues about gifted education in the upper primary years, effective management of the transition between primary and secondary schools and the role of assessment in that process.

.

Kew once more 2 by giftedphoenix


.

The Investigation: Key Points

The Report is structured around the sequence of events leading from a school’s decision to enter learners for the tests, proceeding from there to consider the identification and selection of participants, the support provided to them in the run up to taking the test, and the outcomes for participants, other pupils, the host school and receiving secondary schools.

It addresses five research questions:

  • How have the tests affected school behaviour towards the most able pupils?
  • What is the difference in behaviours between schools that do well in the tests and those which do not?
  • What are the positive and negative effects of the tests, on schools and pupils respectively?
  • Why did some schools enter pupils for the tests whereas others did not?
  • How are schools identifying pupils to enter the tests?

It does so by means of a tripartite methodology, drawing on 20 case studies of schools undertaking the tests, 40 telephone interviews with schools that decided not to take part and 20 telephone interviews with secondary schools.

.

The Decision to Enter Learners

Schools that decided to enter pupils for the tests did so because:

  • They wanted to provide additional challenge for able pupils and/or remove an unhelpful ceiling on their attainment. There was a perceived motivational benefit, for staff as well as learners, while some primary schools ‘hoped that an externally validated exam might make secondary schools more secure in their views about primaries’ judgements’, as well as protecting learners from expectations that they would repeat work at their receiving secondary schools.
  • They wanted to evidence positive performance by the school, by demonstrating additional progress by learners and confirming teacher assessment outcomes. Entry was assumed to assert their high expectations of able pupils. Some were anxious that failure to take part would be perceived negatively by Ofsted.
  • Some were encouraged by the ‘low stakes’ nature of the assessment, identified entry as consistent with the school’s existing priorities, saw a positive marketing opportunity, or wanted to attract or retain staff ‘with sufficient confidence and expertise to teach level 6 content’.

Conversely, schools deciding against participation most often did so because they judged that they had no pupils for whom the tests would be suitable (though there was recognition that this was a cohort-specific issue).

Many said they had received insufficient guidance, about the test itself and about the need to teach the Key Stage 3 programme of study, and there was related concern about the absence of dedicated teaching materials.

Some objected to the tests in principle, preferring an alternative approach to assessing these learners or expressing concern at a disproportionate focus on the core subjects. ‘Quite a number’ took the reverse, negative position on secondary schools’ anticipated response, assuming that receiving schools would re-test and repeat the work pupils had undertaken.

.

Identification and Selection of Participants

Concern about lack of guidance extended to advice on selection of participants. There was widespread worry at the limited availability of past papers. Lack of confidence led to schools adopting very different approaches, some rather liberal and others much more conservative.

Some entered only those learners they believed had a very good chance of passing. Others extended entry to all those they believed had some chance of success, sometimes including even those they felt probably would not pass.

On average, case study schools nominated 41% of the subset of learners who achieved Level 5 in Maths, though some entered 20% or fewer and others 81% or more. Most fell between these two extremes. (The national figure is given as 26%.)

But, in Reading, case study schools nominated on average only 25% of learners who had achieved Level 5. Only a minority of schools nominated over 41%. (The national figure is given as 18%.)

Timing of selection varied considerably. Identifying potential entrants relatively early in Year 6 and confirming selection nearer the April deadline was a common strategy.

Decisions typically took into account several factors, foremost of which were learners’ own preferences. Few schools consulted parents systematically. There was generally less clarity and confidence in respect of Reading.

Schools typically utilised a mix of objective, quantifiable and subjective, value-driven measures, but ‘many schools struggled to convey coherently a specific selection strategy’ and it is clear that the probability of a learner being entered varied considerably according to which school they attended.

Objective evidence included formative assessment, tracking data, cross-moderation of work between partner schools and the outcomes of practice tests. Though schools felt secure in their levelling, only a handful stated explicitly that they had learners working at Level 6, either at the point of selection for the tests or subsequently. In reality, most made their judgements on the basis of performance at Level 5.

Subjective considerations – eg learners’ ‘wellbeing’ – were significant:

‘In certain instances possessing the raw ingredients of academic ability and a track record of high academic performance in isolation were not necessarily seen to be sufficient grounds for selection. Instead a number of schools also attached considerable importance to the particular pupils’ maturity, personality and, in some cases, behaviour.’

Many schools expected to tighten their selection criteria in response to low pass rates, especially in Reading. There was marked dissatisfaction with ‘the increased threshold marks (compared with those from the pilot tests)’ and a feeling that this had led schools to underestimate the difficulty of the tests.

The Executive Summary argues that ‘schools were largely effective in ensuring that the very top ability pupils were identified and put forward’, but the substantive text is not quite so bullish.

There was clear evidence of reticence on teachers’ parts in outlining the characteristics of learners working at Level 6. Reference was made to independence, tenacity and motivation and ‘an innate flare or capability to excel at a particular subject’.

Some schools struggled to pin down these traits, especially for Reading. Teachers mentioned ‘excellent inferential skills and capacity to access authorial intent’.

Maturity was also a key consideration:

‘The parameters of the Level 6 Reading test are just not compatible with the vast majority of pupils aged 11 (even the very brightest ones) – they simply do not possess the experiences and emotional maturity to be able to access what is required of them within the level 6 test.’

.

Support Provided to Participants

Limited guidance was a prominent issue, leading schools to use ‘an array of ad hoc means of support’ derived from their own research and experience.

Many adopted aspects of the KS3 Programme of Study, despite concern at the attitude of receiving secondary schools. Materials and support were much more evident in Maths than in Reading.

Lack of clarity over the relationship between Level 6 tests and the KS3 programmes of study was a significant issue. Most schools drew on the KS3 curriculum but a few preferred to emphasise breadth and depth at KS2 instead.

Schools were generally more confident in their support for Maths because ‘there appeared to be more internal and external expertise available’ and they found selection of participants less problematic.

Two aspects of support were prominent:

  • Classroom differentiation, focused on specific aspects of the curriculum – though the tests themselves were not widely perceived to have had a material impact on such practice. Some form of ability grouping was in place in all schools in respect of maths and most schools in respect of reading (as part of literacy).
  • Test preparation, mostly undertaken in additional booster sessions combining teaching with test-taking practice and the wider use of practice papers.

The Report characterises three broad approaches adopted by schools: outcome focused (heavily emphasising test preparation); teaching and learning focused (with markedly less emphasis on booster sessions and test practice); and a composite approach marking the continuum between these two extremes.

Several schools reported an intention ‘to focus more on teaching and learning’ in the coming year.

.

Outcomes of the Tests

In Maths it was possible ‘to identify a small number of schools that performed particularly well and others that performed relatively poorly’.

The analysis focuses on the simple pass rate, the Level 5 to 6 conversion rate and a ‘top Level 5’ to Level 6 conversion rate across the 20 case study schools.

The simple pass rate was 40% (34% nationally), though this masked significant variation – indeed from 0% to 100%.

These outcomes correlated broadly with the Level 5 to 6 conversion rates, for which the case study school average was 17%, with variance from 0% to 50%.

However, when it came to the ‘top Level 5’ to Level 6 conversion rate, the Report can only admit that, while there was some degree of correlation with the other two measures:

‘On this measure there was polarity: most schools either found that all of their ‘top level 5s’ achieved level 6 or that none of them achieved it. This is difficult to interpret, and the qualitative data does not shed a light on this.’
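For clarity, the three measures are simple ratios, each with a different denominator. A minimal sketch for a single hypothetical school – the counts below are invented for illustration and are not taken from the Report:

```python
# Illustrative counts for one hypothetical school (not from the Report).
entered = 10           # pupils entered for the Level 6 maths test
passed = 4             # of those, pupils awarded Level 6
level5_pupils = 24     # pupils in the cohort achieving Level 5
top_level5 = 3         # highest-scoring Level 5 pupils ('top Level 5s')
top_level5_passed = 3  # of those, pupils awarded Level 6

pass_rate = 100 * passed / entered                     # simple pass rate
conversion = 100 * passed / level5_pupils              # Level 5 to 6 conversion
top_conversion = 100 * top_level5_passed / top_level5  # 'top Level 5' to 6

print(f"{pass_rate:.0f}% / {conversion:.0f}% / {top_conversion:.0f}%")
# prints "40% / 17% / 100%"
```

On this reading, the ‘polarity’ the Report observed simply means that most schools’ third ratio came out at either 100% or 0%.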

Even more problematically, only one learner in the entire sample was successful in achieving Level 6 in the Reading test – equivalent to a 1% success rate (the national pass rate was 2%).

The Report offers some rather approximate findings, wrapped around with health warnings, suggesting that better results were more typically found in schools with a combined approach featuring learning and outcomes (see above), as opposed to either of those two extremes.

Positive outcomes for schools have already been outlined above.

Benefits for learners, identified by teachers and learners alike, included the scope provided by the tests for learners to demonstrate (even fulfil) their potential. Wider personal outcomes were also mentioned including a positive impact on motivation (though there were also corresponding concerns about overloading and over-pressurising learners).

Secondary schools tended rather to reinforce the negative expectations of some primary schools:

  • They were ‘generally ambivalent about primary schools’ use of L6 test and aspects of the KS3 curriculum…due to the fact that secondary schools in general felt that measures of KS2 outcomes were not accurate… Consequently, they preferred to test the children pre-entry or at the beginning of Year 7’.
  • ‘Many of the secondary schools were concerned about primary schools ‘teaching to the test’ and thus producing L6 pupils with little breadth and depth of understanding of L6 working…Generally secondaries viewed such results as unreliable, albeit useful for baseline assessment, as they help to identify ‘high fliers’’.
  • While most noted the benefits for learners ‘some felt that inaccurate test outcomes made the transition more difficult’. The usual range of concerns was expressed.

.

The Investigation’s own Conclusions

The Investigation offers four main conclusions:

  • ‘It is abundantly clear…that greater guidance on pupil selection and support and more practice materials are key issues’. This needs to incorporate guidance on coverage, or otherwise, of the KS3 curriculum. The main text (but not the executive summary) identifies this as a responsibility of ‘DfE with the STA’. It remains to be seen whether the Government will take on this task or will look instead to the market to respond.
  • Schools adopting a strongly outcome-focussed approach were less likely to produce successful results than those adopting a mixed learning and outcome approach. Some schools seemed too heavily driven by pressure to secure positive inspection results, and

‘responded to the direction from inspectors and policymakers to support the most able by a narrowing of the curriculum and overemphasising test preparation, which is not in the best interests of pupil, teachers or schools’

There is a ‘need for policy to aim to drive home the vital importance of pedagogy and learning to counteract the tendency’.

  • Secondary schools confirm primary schools’ scepticism, in that they do not ‘judge the tests as an accurate reflection of levels’. There is therefore ‘a strong need to engage secondaries much more with primaries in, for example, curriculum, assessment and moderation’. This is presumably a process that is most easily undertaken through local collaboration.
  • The very low pass rate in Reading, selection issues (including maturity as a key component) and secondary scepticism point to a need ‘to review whether the L6 Reading test in its current form is the most appropriate test to use to identify a range of higher performing pupils, for example the top 10%’. The full commentary also notes that:

‘The cost of supporting and administering a test for such a small proportion of the school population appears to outweigh the benefits’.

.

My Conclusions

There is relatively little here that would be unusual or surprising to a seasoned observer of how gifted education is currently practised and of wider educational issues such as the impact of Ofsted on school practice and transfer and transition issues.

The study is rather narrow in its conceptualisation, in that it fails to address the interface between the Level 6 tests and other relevant aspects of Government thinking, not least emerging policy on the curriculum and (of course) assessment.

It entirely ignores the fact that a decision to abandon National Curriculum Levels was announced eight months prior to publication.

There is no attempt to analyse the national data in any depth, or to look at any issues concerning the gender, ethnic and socio-economic profile of learners entered for the tests and successful in them, even though there will have been some heavy biases, especially in favour of those from comparatively advantaged backgrounds.

It would have been particularly helpful to see how much bigger the FSM gap at Level 6 is, compared with Level 5, whether schools had focused on this issue and, if so, what action they had taken to address it. Was there any evidence of the positive use of Pupil Premium funding for this purpose?

The Investigation’s general point about the negative impact of Ofsted on schools’ practice may also be rather misleading, in that the negative influence of overly outcomes-focussed thinking is at least partly attributable to School Performance Tables rather than Ofsted’s school inspection framework.

In that guise it will probably also feature in Ofsted’s own upcoming publication (see below). Whether there is any reference in Ofsted’s report to the case for rebalancing schools towards pedagogy and learning, so they are more in equilibrium with the pursuit of assessment outcomes, is rather more doubtful. Quite how that might be undertaken is ducked by the Level 6 Investigation and so likely to be sidelined.

The issues relating to transition and transfer are longstanding and a heavy drag on the efficiency of our school system, both for gifted learners and the wider population. If the upcoming consultation affects the timing of Key Stage 2 assessment, that may provide the impetus for renewed efforts to address the generic problem. Otherwise this seems unlikely to be a priority for the Government.

The response to date to the call for additional guidance has been rather limited.

Certainly, a range of sample material has been posted to assist schools interested in taking up the new test of grammar, punctuation and spelling. But the information available to support the Maths and Reading tests remains relatively thin. I have found nothing that addresses substantively the issues about pre-empting elements of Key Stage 3.

Despite the limited support available, evidence has recently emerged that Level 6 test entries are significantly higher for 2013 than for 2012. A total of 113,600 pupils have been entered, equivalent to 21% of the relevant pupil population.

This is said to be an increase of 55% compared with the 73,300 entered in 2012 (though that figure does not seem to agree with those quoted in the Investigation and reproduced above).

Moreover, some 11,300 schools have registered for the tests, up 41% on the 2012 figure of 8,300 schools.
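As a quick sanity check on the headline growth figure, the percentage increase can be recomputed directly from the raw pupil numbers given above (a throwaway sketch, nothing more):

```python
# Checking the quoted 55% increase in Level 6 test entries from the raw figures above.

def percent_increase(old, new):
    """Percentage increase from old to new, rounded to the nearest whole percent."""
    return round(100 * (new - old) / old)

print(percent_increase(73_300, 113_600))  # 55
```

So the 55% figure is internally consistent with the 2012 and 2013 entry numbers quoted here, whatever the discrepancy with the Investigation's own figures.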

Given the issues associated with the Reading test set out in the Report, one might hazard a reasonable guess that the increase will be attributable largely to the Maths test and perhaps to schools experimenting with the new grammar, punctuation and spelling test (though the figures are not broken down by test).

Increased emphasis in the 2013 Performance Tables (see above) will also be a significant factor. Does this suggest that schools are increasingly slaves to the outcomes-driven mentality that the Investigation strives so hard to discourage?

.

.

The key point here is that it is unlikely to be wise or appropriate to enter over one fifth of all end KS2 learners for tests in which so few are likely to be successful.

One might reasonably hope that, incorporated within the design principles for whatever assessment instruments will replace Level 6 tests, there is explicit recognition that a basic pass/fail distinction, combined with an exceptionally high threshold for a pass, is not the optimal solution.

It is important to retain a high threshold for those with the capacity to achieve it, but other relatively strong candidates also need opportunities to demonstrate a positive outcome at a slightly lower level. A new approach might look to recognise positively the performance of the top 10%, top 5% and top 1% respectively.
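One hypothetical way of operationalising that suggestion is to derive recognition bands from a pupil's position in the score distribution, rather than from a single pass mark. The scores, band labels and cut-offs below are entirely invented for illustration:

```python
# Hypothetical banding of test scores by percentile rank, in place of a single
# pass/fail threshold. All numbers here are invented for illustration.

def band(score, scores):
    """Return a recognition band based on where `score` sits in the distribution."""
    rank = sum(s <= score for s in scores) / len(scores)
    if rank >= 0.99:
        return "top 1%"
    if rank >= 0.95:
        return "top 5%"
    if rank >= 0.90:
        return "top 10%"
    return "unclassified"

scores = list(range(1, 101))  # 100 pupils with distinct scores from 1 to 100
print(band(100, scores))  # top 1%
print(band(96, scores))   # top 5%
print(band(91, scores))   # top 10%
print(band(50, scores))   # unclassified
```

In practice any such banding would need far more careful standard-setting than a raw percentile rank, but it illustrates the shape of a multi-threshold outcome: the highest threshold is retained, while strong candidates just below it still register a positive result.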

It will also be critical to ensure an orderly transition from the current arrangements to those in place from 2016. There is a valuable window of opportunity to pilot new approaches thoroughly alongside the existing models. The reform need not be rushed – that is the silver lining to the cloud associated with decoupling curriculum and assessment reforms.

So, what is my overall judgement of the contribution made by this first publication to my wished-for ‘Summer of Love’?

A curate’s egg really. Positive and useful in a small way, not least in reminding us that primary-secondary transition for gifted learners remains problematic, but also a missed opportunity to flag up some other critical issues – and of course heavily overshadowed by the primary assessment consultation on the immediate horizon.

Still, one hopes that its recommendations will be revisited as part of a holistic response to all three publications, and that those to follow will take full account of its findings, otherwise the overall narrative will be somewhat impoverished and will almost certainly fail to give due prominence to the critically important upper primary phase.

.

Kew once more 3 by giftedphoenix


 

The Ofsted Survey

.

Background

Next in line for publication is an Ofsted Survey, conducted using the Inspectorate’s rapid response methodology, which will examine ‘how state schools teach the most able children’.

Unusually, this was announced in January 2013 through a press briefing with a national newspaper. Given the political leanings of the paper in question, the contents of the story may be a somewhat biased version of reality.

There is no information whatsoever on Ofsted’s own website, with the sole (and recently added) exception of a publication schedule confirming that the survey will be published in May.

The newspaper report explains that:

  • Despite being a rapid response exercise, this publication ‘will be the most extensive investigation of gifted and talented provision undertaken’ by Ofsted.
  • It will focus predominantly – if not exclusively – on secondary schools where ‘children who get top marks in primary school are being let down by some secondary school teachers who leave them to coast rather than stretch them to achieve the best exam results’.
  • It will examine ‘concerns that bright pupils who are taught in mixed ability classes are failing to be stretched and that schools are entering clever children too early for GCSE exams so that they gain only the C grades that count in league tables and are not pushed to the full extent of their abilities’.
  • Ofsted will interrogate existing inspection data on educational provision for gifted and talented learners, as well as pupil progress data. They will also survey provision afresh, through visits to a representative sample of over 50 secondary schools.

HMCI Sir Michael Wilshaw is quoted extensively:

‘I am concerned that our most able pupils are not doing as well as they should be…Are schools pushing them in the way they should be pushed and are pushed in the independent sector and in the selective system?

The statistic that four independent schools and a very prestigious six [sic] form college are sending more youngsters to Oxbridge than 2,000 state secondary schools is a nonsense. When the history of comprehensive education is written people need to say that they did as well by the most able pupils as they did by the least able…

I am passionate about this, it will be a landmark report…I am as concerned as the next person on the issue of social mobility. Are our children and our children from the poorest backgrounds who are naturally bright doing as well as they should?

…I would like to see GCSE league tables reformed…The anxiety to get as many through those C boundaries have sometimes meant that schools haven’t pushed children beyond that.

We need sophisticated league tables which shows [sic] progress. Youngsters leaving primary school with level 5 should be getting A*, A or B at GCSE.’

It is arguable that the Government has already responded to the final specific point via its proposal – in the consultation on secondary accountability released alongside the draft National Curriculum – to publish an ‘average point score 8’ measure based on each pupil’s achievement across eight qualifications at the end of KS4 (though whether it has done enough to counterbalance other pressures in the system to prioritise the C/D borderline is open to question).

Otherwise there are several familiar themes here:

  • whether gifted learners are insufficiently challenged, particularly in secondary comprehensive schools;
  • whether they are making sufficient progress between the end of Key Stage 2 and the end of Key Stage 4;
  • whether they are held back by poor differentiation, including a preponderance of mixed ability teaching;
  • to what extent they are supported by schools’ policies on early entry to examinations, particularly GCSEs;
  • whether more can be done to support progression by state school students to the most competitive universities, especially by those from disadvantaged backgrounds; and
  • whether there are perverse incentives in the accountability system that result in gifted learners being short-changed.

Given the puff generated by Sir Michael, expectations are high that this will be a substantial and influential piece of work. It follows that, if it turns out to be a comparative damp squib, the sense of disappointment and frustration will be so much greater.

The Report will be judged by what new and fresh light it can bring to bear on these issues and, critically, by the strength of the recommendations it directs towards stakeholders at national, local and school level.

Just how interventionist will Ofsted show itself in backing up its leader’s passion? Will it take responsibility for co-ordinating a response from central government to any recommendations that it points in that direction – and what exactly will Ofsted commit itself to doing to help bring about real and lasting change?

Not to labour the point (though I fear I may be doing so) a limp effort that repackages familiar findings and appeals rather weakly to stakeholders’ better judgement will not display the landmark qualities of which HMCI has boasted.

A future Episode in this series will be dedicated to assessing whether or not these inflated expectations have been satisfied, and what the consequences are for the Summer of Love.

.

Benchmarking the New Report

In the meantime, it is instructive to look back at the most recent inspection report on gifted education, thus supplying a benchmark of sorts against which to judge the findings in this new publication.

This will help to establish whether the new report is simply bearing out what we know already about long-standing shortcomings in gifted education, or whether it has important messages to convey about the impact – positive or negative – of the predominantly ‘school led’ approach adopted by successive Governments over the past three years.

The most recent report was published in December 2009, in the latter days of the previous government.

‘Gifted and Talented Pupils in Schools’ is based on a rapid response survey of 26 primary and secondary schools, selected because their most recent school-wide inspections had identified gifted and talented education as ‘an improvement point’.

The survey was undertaken shortly after the previous government had, in the Report’s words:

‘Reviewed its national programme for gifted and talented pupils and concluded that it was not having sufficient impact on schools. As a result, provision is being scaled back to align it more closely with wider developments in personalising learning. Schools will be expected to do more themselves for these pupils.’

Eight of the 26 schools (31%) were judged to be well-placed to respond to this new environment, 14 (54%) displayed adequate capacity for improvement and the remaining four (15%) had ‘poorly developed’ capacity to sustain improvement.

The schools that were well-placed to build their own capacity could demonstrate that their improved provision was having a positive impact on outcomes for all pupils, were making use of available national resources – including the critically important Quality Standards – and were making sure that all pupils were suitably challenged in lessons.

The majority of schools in the middle group could demonstrate some improvement in pupil outcomes since their last inspection, but ‘many of the developments in these schools were fragile and the changes had had limited success in helping gifted and talented pupils to make appropriate and sustained progress’.

Gifted education was not a priority and:

‘To build their capacity to improve provision, they would benefit from better guidance, support and resources from outside agencies and organisations.’

In the four schools with inadequate capacity to improve, lead staff had insufficient status to influence strategic planning, teachers had not received appropriate training and schools:

‘Did not sufficiently recognise their own responsibilities to meet the needs of their gifted and talented pupils’.

The Report’s Key Findings identify a series of specific issues:

  • Many schools’ gifted education policies were ‘generic versions from other schools or the local authority’, so insufficiently effective.
  • In the large majority of schools (77%) pupils said their views were not adequately reflected in curriculum planning and they experienced an inconsistent level of challenge.
  • None of the schools had engaged fully with the parents of gifted learners to understand their needs and discuss effective support.
  • The better-placed schools were characterised by strong senior leadership in this field and lead staff with sufficient status to influence and implement policy. Conversely, in the poorer schools, senior staff demonstrated insufficient drive or commitment to this issue in the face of competing priorities.
  • In schools judged to have adequate capacity to improve, subject leaders had too much flexibility to interpret school policy, resulting in inconsistency and lack of coherence across the curriculum.
  • Most schools ‘needed further support to identify the most appropriate regional and national resources and training to meet their particular needs’. Lead staff were seeking practical subject-specific training for classroom teachers.
  • All schools ‘felt they needed more support and guidance about how to judge what gifted and talented pupils at different ages should be achieving and how well they were making progress towards attaining their challenging targets across key stages’.
  • Just over half the schools had established collaborative partnerships with other schools in their localities. Lack of such support was evident in the schools with limited capacity to improve. There was comparatively little scrutiny through local accountability arrangements.
  • All the schools had developed out-of-hours provision though the link with school-based provision was not always clear and schools were not consistently evaluating the impact of such provision.
  • There was little analysis of progression by different groups of gifted learners.

The Report offers the customary series of recommendations, directed at central and local government and schools, designed to help schools build the necessary capacity to improve their performance in these areas. It will be telling to see whether the new Report assesses progress in implementing them.

Rather oddly, they fail to endorse or propose arrangements for the ongoing application of the Quality Standards in a ‘school-led’ environment, although the Standards incorporate all these elements of effective practice and provide a clear framework for continuous improvement.

With the benefit of hindsight, one might argue that many of the problems Ofsted cited in 2009 would have been rather less pronounced had the Inspectorate fully embraced the Standards as their official criteria for judging the effectiveness of gifted education when they were first introduced.

The Standards are now growing significantly out of date and require an urgent refresh if they are to remain a valuable resource for schools as they continue to pursue improvement.

Ideally Ofsted might lead that process and subsequently endorse the revised Standards as the universal measure for judging the quality of English schools’ gifted education. I can think of nothing that would have a more significant impact on the overall quality of provision.

But I suspect that will be an idea too interventionist for even the most passionate HMCI to entertain.

It will be fascinating, nevertheless, to map the shortcomings identified in the upcoming Report against the existing Standards, as well as against those flagged in the predecessor Report. But that’s a topic for another day.

.

Kew once more 4 by giftedphoenix


.

Raising the Aspirations of High-Achieving Disadvantaged Pupils

Thirdly and finally, DfE has commissioned an ‘Investigation of School and College-level Strategies to Raise the Aspirations of High-achieving Disadvantaged Pupils to Pursue Higher Education’.

This is still some way from publication, but the contract – including the specification – is available for public scrutiny (see documents section on this link).

The contract was awarded to TNS-BMRB (where the Project Lead is Mark Peters) working with the Institute for Policy Studies in Education (IPSE) based at London Metropolitan University (where the lead is Carole Leathwood).

IPSE is undertaking the qualitative element of the research and carries this outline of the project on its website.

According to the contract, the contractors must deliver their final report by 28 June and the Department must publish it within 12 weeks of this date, so by 20 September 2013 at the latest. The project is costing £114,113 plus VAT.

Its aims, as set down in the contract, are to discover:

  • ‘What strategies are being used by schools across years 7-11 and in school sixth forms (years 12-13) to support high-achieving disadvantaged pupils in to [sic] pursue HE.
  • If the pupil premium is being used in schools to fund aspiration raising activities for high-achieving disadvantaged pupils.
  • What strategies are being used by colleges to support high-achieving disadvantaged pupils pursue HE and
  • To identify assess [sic] any areas of potential good practice.

‘High-achieving’ is defined for these purposes as ‘pupils who achieve a Level 5 or higher in English and Maths at KS2’.

As reported in a previous post, some 27% of pupils achieved this outcome in 2012, up from 21% in 2011, so the focus is on the top quartile, or perhaps the top two deciles of pupils on this measure.

‘Disadvantaged’ is defined as ‘pupils eligible for free school meals’ (and, in the case of post-16 students, those who were eligible for FSM in Year 11). This is of course a somewhat narrower definition than eligibility for the Pupil Premium, even though the Premium is pivotal to the study.

The national proportion of FSM-eligible pupils achieving Level 5 in KS2 English and maths in 2012 is, I believe, 14%, compared with 32% of non-FSM pupils, giving a gap of 18 percentage points on this measure.

This data is not provided in School Performance Tables, nor is it easily sourceable from published national statistics, though it does appear in schools’ RAISEonline reports. (Incidentally, the comparable gap at Level 4 is somewhat lower, at 16 percentage points.)

The full set of objectives for the project is as follows (my emphases, but not my punctuation):

‘For Schools:

  • To identify to what extent schools are supporting high-achieving disadvantaged pupils to raise their aspiration to go on to HE?
  • To identify what activities take place in Years 7 -11 for high-achieving disadvantaged pupils to raise their aspiration to go on to HE and the Russell Group universities?
  • To identify whether the Pupil Premium being used [sic] to fund specific activities to help pupils pursue HE?
  • To identify what good practice looks like for supporting high-achieving disadvantaged pupils to pursue HE? (Focusing particularly on schools that have a high percentage of FSM pupils who go on to HE).

For FE colleges, sixth forms colleges and school sixth forms:

  • To identify to what extent are colleges supporting high-achieving disadvantaged learners post-16 to pursue HE?
  • To identify what strategies, if any, do high-achieving disadvantaged learners receive post-16 to pursue HE and more specifically Russell Group Universities?
  • To identify what good practice looks like for supporting high-achieving disadvantaged learners to pursue HE? (Focusing in particular on the strategies used by colleges that have a high percentage of disadvantaged learners who go on to HE).

For schools and colleges

  • To establish how schools and colleges are identifying ‘high-achieving, disadvantaged’ pupils/learners?
  • To identify which particular groups (if any) are being identified as requiring specific support and why?
  • To identify what extent schools/colleges engage in aspiration raising activities specifically designed to increase participation in Russell Group Institutions (rather than HE in general)?
  • To identify what good practice look like in relation to different groups of pupils/learners?’

It is evident from this that there is some confusion between aspiration-raising activities and wider support strategies. But there is clearly interest in comparing strategies in the school and post-16 sectors respectively (and perhaps in different parts of the post-16 sector too). The primary sector does not feature.

There is also interest in establishing approaches to identifying the beneficiaries of such support; how such provision is differentiated between progression to HE and progression to ‘Russell Group universities’ respectively; the nature of good practice in each sector, drawn particularly from institutions where a significant proportion of students progress to HE; and distinguishing practice for different (but non-defined) groups of learners.

Finally, there is some interest – though perhaps a little underplayed – in exploring the extent to which the Pupil Premium is used to fund this activity in schools. (Funding sources in post-16 environments are not mentioned.)

The study comprises six phases: pre-survey scoping; survey piloting; a national school survey (a sample of 500 schools, including 100 that send a high proportion of FSM-eligible pupils to HE); a national FE and sixth form college survey (a sample of 100 institutions); case studies (eight schools and two colleges); and results analysis.

The latter will incorporate:

  • ‘To what extent schools and colleges are providing aspiration raising activities to high achieving disadvantaged pupils.’
  • ‘What activities take place across different year groups.’
  • ‘Analysis by school characteristics including region, school size, distance to the nearest Russell group university, proportion of FSM eligible pupils’
  • Comparison of the 400 schools with the 100 sending a high proportion of their FSM pupils on to higher education.
  • Whether ‘activities are associated with higher numbers of pupils progressing to HE and trends in what works for different pupil groups’
  • Triangulation of data from different strands
  • Analysis of ‘best practice’, incorporating ‘comparisons between schools and colleges’.

There is no overt reference to other Government policies and initiatives that might be expected to impact on institutions’ practice, such as the Destination Measures (which will be presented separately for FSM-eligible learners in 2013, as well as being incorporated in School and College Performance Tables) or the Dux Scheme. Nor is there any explicit reference to the outreach activities of universities.

One assumes, however, that the outcomes will help inform Government decisions as to the effectiveness of existing school- and college-level policy interventions that contribute towards the achievement of its Social Mobility Indicators.

The Report is likely to result in arrangements of some sort for disseminating effective practice between institutions, even if that amounts only to a few brief case studies.

It may even help to inform decisions about whether additional interventions are required and, if so, the nature of those interventions.

Previous posts on this Blog have made the case for a nationally co-ordinated and targeted intervention provided through a ‘flexible framework’ which would synergise the currently separate ‘push’ strategies from schools/colleges with the ‘pull’ strategies from higher education in support of the ‘most disadvantaged, most able’.

This would be a subset of the 14% achieving KS2 Level 5 in English and maths, defined by their capacity to enter the most competitive universities. It might incorporate a specific focus on substantively increasing progression to particular ‘elite’ targets, whether expressed in terms of courses (eg medicine, veterinary, law) or institutions (notably Oxbridge).

At the moment all the running is being made on the ‘pull’ side, spearheaded by joint OFFA/HEFCE efforts to develop a ‘National Strategy for Access and Student Success’.

A joint effort would:

  • Passport funding to individual learners and support them through transition at 16 and 18, probably topslicing the Pupil Premium for the purpose.
  • Enable learners and facilitators to draw on provision offered via the (currently fragmented) supply side, drawing in third party providers as well as schools/colleges and universities.
  • Provide for a menu of such provision from various sources to be synthesised into a personalised programme based on needs assessment and subject to regular monitoring and updating.

Although there is presently some ideological inhibition hindering the adoption of such scaffolded programmes, an intervention of this nature – targeted exclusively at a select cohort of ‘high ability, high need’ students – would be likely to result in much more significant improvements against these indicators, and do so much more quickly than generic system-wide reform.

In ‘holding the Government’s feet to the fire’ over social mobility issues, perhaps the recently-established Social Mobility and Child Poverty Commission might see its way to making that case when it reports on Government progress in the Autumn.

.

Kew once more 5 by giftedphoenix


 

Drawing These Strands Together

So, as things stand at the end of Episode One:

  • There is a decent, if relatively narrow report on the table which draws attention to longstanding transition and transfer problems and an outcomes-obsessed mentality at the top end of Key Stage 2, as well as a range of narrower issues associated with the effective delivery of Level 6 tests.
  • We impatiently await a consultation document on primary accountability that should provide some clarity over the future assessment of high-attaining learners within Key Stage 2, so enabling us to complete the bigger picture of National Curriculum and associated assessment reforms across Key Stages 1-4.
  • We also await a much-vaunted Ofsted survey report which – if it satisfies our high expectations – might provide the spur for real action at national, local and school levels, perhaps even inspiring the Sutton Trust to announce the outcomes of its 2012 call for proposals.
  • Then in September the third report (the second Investigation) will ideally be sufficiently strategic and influential to cause some important joining up to be undertaken across that part of the agenda focused on progression to higher education by high-attaining learners from disadvantaged backgrounds, potentially at the behest of the Social Mobility and Child Poverty Commission.

I am hopeful that this series of posts will support the process of distilling and synthesising these different elements to provide a composite picture of national strengths and weaknesses in gifted education throughout the continuum from upper Key Stage 2 to university entry. Some kind of audit, if you will.

But the question this ‘joining up’ process raises is how to respond to the state of affairs it reveals.

As matters stand, at the end of this first post in the series, I have proffered unto the melting pot a cautiously provisional wishlist comprising three main items: a Manifesto setting out principles and arguments for a genuinely collaborative response; revised Quality Standards integrated within the accountability machinery; and a targeted intervention for ‘high ability, high need’ learners designed to eliminate the fragmentation that bedevils current efforts.

This menu may well grow and change as the ‘Summer of Love’ progresses, not least to reflect planned and unplanned discussion of the issues. I would be delighted if some of that discussion were to take place in the comments facility below.

I believe one of the Manifesto principles must be to pursue an optimal middle way that is neither top-down nor bottom-up but a ‘strategy of all the talents’. That is reflected in my own version. Your comments are ever welcome about that, too.

But that principle presupposes a national gifted education community with the capacity and wherewithal to build on strengths and tackle weaknesses in a strategic, collaborative, inclusive and universal fashion.

For, if the next stage of reform is once more to be school-led, it is abundantly clear from the evidence presented above that schools will need our support to bring about real and lasting improvements in gifted education practice, for the benefit of all English gifted learners.

I was once optimistic about the prospects, but now I’m not so sure. Perhaps the Summer of Love is a once-in-a-generation chance – maybe the last – to galvanise the putative community into a real community and so make that happen.

.

GP

May 2013
