A Primary Assessment Progress Report

.

This post tracks progress towards implementation of the primary assessment and accountability reforms introduced by England’s Coalition Government.

It reviews developments since the Government’s consultation response was published, as well as the further action required to ensure full and timely implementation.

It considers the possibility of delay as a consequence of the May 2015 General Election and the potential impact of a new government with a different political complexion.

An introductory section outlines the timeline for reform. This is followed by seven thematic sections, each dealing with a different strand of the reforms.

There are page jumps to each of these sections, should readers wish to refer to them directly.

Each section summarises briefly the changes and commitments set out in the consultation response (and in the original consultation document where these appear not to have been superseded).

Each then reviews in more detail the progress made to date, itemising the tasks that remain outstanding.

I have included deadlines for all outstanding tasks. Where these are unknown I have made a ‘best guess’ (indicated by a question mark after the date).

I have done my best to steer a consistent path through the variety of material associated with these reforms, pointing out apparent conflicts between sources wherever these exist.

A final section considers progress across the reform programme as a whole – and how much remains to be done.

It discusses the likely impact of Election Purdah and the prospects for changes in direction consequent upon the outcome of the Election.

I have devoted previous posts to ‘Analysis of the Primary Assessment and Accountability Consultation Document’ (July 2013) and to the response in ‘Unpacking the Primary Assessment and Accountability Reforms’ (April 2014) so there is inevitably some repetition here, for which I apologise.

This is a long and complex post, even by my standards. I have tried to construct the big picture from a variety of different sources, to itemise all the jigsaw pieces already in place and all those that are still missing.

If you spot any errors or omissions, do let me know and I will do my best to correct them.

.

[Postscript: Please note that I have added several further postscripts to this document since the original date of publication. If you are revisiting, do pause at the new emboldened paragraphs below.]

Timeline for Reform

The consultation document ‘Primary assessment and accountability under the new national curriculum’ was published on 7 July 2013.

It contained a commitment to publish a response in ‘autumn 2013’, but ‘Reforming assessment and accountability for primary schools’ did not appear until March 2014.

The implementation timetable has to be inferred from a variety of sources but seems to be as shown in the table below. (I have set aside interim milestones until the thematic sections below.)

Month/year – Action
Sept 2014 – Schools no longer expected to use levels for non-statutory assessment.
May 2015 – End of KS1 and KS2 national curriculum tests and statutory teacher assessment reported through levels for the final time.
Summer term 2015 – Final 2016 KS1 and KS2 test frameworks, sample materials and mark schemes published; guidance published on the reporting of test results.
Sept 2015 – Schools can use approved reception baseline assessments (or a KS1 baseline).
Sept/Autumn term 2015 – New performance descriptors for statutory teacher assessment published.
Dec 2015 – Primary Performance Tables use levels for the final time.
May 2016 – New KS1 and KS2 tests introduced, reported through new attainment and progress measures.
June 2016 – Statutory teacher assessment reported through new performance descriptors.
Sept 2016 – Reception baseline assessment the only baseline option for all-through primaries; schools must publish new headline measures on their websites; new floor standards come into effect (with the progress element still derived from the KS1 baseline).
Dec 2016 – New attainment and progress measures published in Primary Performance Tables.

The General Election takes place on 7 May 2015, but pre-Election Purdah will commence on 30 March, almost exactly a year on from publication of the consultation response.

At the time of writing, some 40 weeks have elapsed since the response was published – and there are some 10 weeks before Purdah descends.

Assuming that the next Government is formed within a week of the Election (which might be optimistic), there is a second working period of roughly 10 weeks between that and the end of the AY 2014/15 summer term.

The convention is that all significant assessment and accountability reforms are notified to schools a full academic year before implementation, allowing them sufficient time to prepare.

A full year’s lead time is no longer sacrosanct (and has already been set aside in some instances below) but any shorter notification period may have significant implications for teacher workload – something that the Government is committed to tackling.

.

[Postscript: On 6 February the Government published its response to the Workload Challenge, which contained a commitment to introduce, from ‘Spring 2015’, a:

‘DfE Protocol setting out minimum lead-in times for significant curriculum, qualifications and accountability changes…’

Elsewhere the text says that the minimum lead time will be a year, thus reinforcing the convention described above.

The term ‘significant’ allows some wriggle room, but one might reasonably expect it to be applied to some of the outstanding actions below.

The Protocol was published on 23 March. The first numbered paragraph implicitly defines a significant change as one having ‘a significant workload impact on schools’, though what constitutes significance (and who determines it) is left unanswered.

There is provision for override ‘in cases where change is urgently required’ but criteria for introducing an override are not supplied.]

.

.

[Postscript: We now know that a minimum lead time will not be applied to the introduction of new performance descriptors for statutory teacher assessment (see below). The original timetable did not allow a full year’s notice and it has not been adjusted in the light of consultation.]

.

Announcements made during the long summer holiday are much disliked by schools, so the end of summer term 2015 becomes the de facto target for any reforms requiring implementation from September 2016.

One might therefore conclude that:

  • We are about two-thirds of the way through the main implementation period.
  • There is a period of some 100 working days in which to complete the reforms expected to be notified to schools before the end of the AY2014/15 summer term. This is divided into two windows of some 50 working days on either side of Purdah.
  • There is some scope to extend deadlines into the summer break and autumn 2015, but the costs of doing so – including loss of professional goodwill – might outweigh the benefits.

Purdah will act as a brake on progress across the piece. It will delay announcements that might otherwise have been made in April and early May, such as those related to new tests scheduled for May 2016.

The implications of Purdah are discussed further in the final section of this post.

.

Reception Baseline Assessment

Consultation response

A new Reception Baseline will be introduced from September 2015. This will be undertaken by children within their first few weeks of school (so not necessarily during the first half of the autumn term).

Teachers will be able to select from a range of assessments ‘but most are likely to be administered by the reception teaching staff’.  Assessments will be ‘short’ and ‘sit within teachers’ broader assessments of children’s development’.

They will be:

‘…strong predictors of key stage 1 and key stage 2 attainment whilst reflecting the age and abilities of children in reception’

Schools that use an approved baseline assessment ‘in September 2015’ (and presumably later during the 2015/16 academic year) will have their progress measured in 2022 against that or a KS1 baseline, whichever gives the best result.

However, only the reception baseline will be available from September 2016 and, from this point, the Early Years Foundation Stage (EYFS) profile will no longer be compulsory.

The reception baseline will not be compulsory either, since:

‘Schools that choose not to use an approved baseline assessment from 2016 will be judged on an attainment floor standard alone.’

But, since the attainment floor standard is so demanding (see below), this apparent choice may prove illusory for most schools.

Further work includes:

  • Engaging experts to develop criteria for the baselines.
  • A study in autumn 2014 of schools that already use such assessments, to inform decisions on moderation and the reporting of results to parents.
  • Communicating those decisions about moderation and reporting results – to Ofsted as well as to parents – ensuring they are ‘contextualised by teachers’ broader assessments’.
  • Publishing a list of assessments that meet the prescribed criteria.

.

Developments to date

Baseline criteria were published by the STA in May 2014.

The purpose of the assessments is described thus:

‘…to support the accountability framework and help assess school effectiveness by providing a score for each child at the start of reception which reflects their attainment against a pre-determined content domain and which will be used as the basis for an accountability measure of the relative progress of a cohort of children through primary school.’

This emphasis on the relevance of the baseline to floor targets is in marked contrast with the emphasis on reporting progress to parents in the original consultation document.

Towards the end of the document there is a request for ‘supporting information in addition to the criteria’:

‘What guidance will suppliers provide to schools in order to enable them to interpret the results and report them to parents in a contextualised way, for example alongside teacher observation?’

This seems to refer to the immediate reporting of baseline outcomes rather than of subsequent progress measures. Suitability for this purpose does not appear within the criteria themselves.

Interestingly, the criteria specify that the content domain:

‘…must demonstrate a clear progression towards the key stage 1 national curriculum in English and mathematics’,

but there is no reference to progression to KS2, and nothing about assessments being ‘strong predictors’ of future attainment, whether at KS1 or KS2.

Have expectations been lowered, perhaps because of concerns about the predictive validity of the assessments currently available?

A research study was commissioned in June 2014 (so earlier than anticipated) with broader parameters than originally envisaged.

The Government awarded a 9-month contract to NFER worth £49.7K, to undertake surveys of teachers’, school leaders’ and parents’ views on baseline assessment.

The documentation reveals that CEM is also involved in a parallel quantitative study which will ‘simulate an accountability environment’ for a group of schools, to judge changes in their behaviour.

Both of these organisations are also in the running for concession contracts to deliver the assessments from September 2015 (see below).

The aims of the project are to identify:

  • The impact of the introduction of baseline assessments in an accountability context.
  • Challenges to the smooth introduction of baseline assessments as a means to constructing an accountability measure.
  • Potential needs for monitoring and moderation approaches.
  • What reporting mechanisms and formats stakeholders find most useful.

Objectives are set out for an accountability strand and a reporting strand respectively. The former refer explicitly to identification of ‘gaming’ and the exploration of ‘perverse incentives’.

It is not entirely clear from the latter whether researchers are focused solely on initial contextualised reporting of reception baseline outcomes, or are also exploring the subsequent reporting of progress.

The full objectives are reproduced below.

.

[Image: the full objectives of the reception baseline research study]

.

The final ‘publishable’ report is to be delivered by March 2015. It will be touch and go whether this can be released before Purdah descends. Confirmation of policy decisions based on the research will likely be delayed until after the Election.

.

The process of identifying and publishing a list of assessments that meet the criteria has begun.

A tender appeared on Contracts Finder in September 2014 and has been updated several times subsequently, the most recent version appearing in early December.

The purpose is to award several concession contracts, giving holders the right to compete with each other to deliver baseline assessments.

Contracts were scheduled to be awarded on 26 January 2015, but there was no announcement. Each will last 19 months (to August 2016), with an option to extend for a further year. The total value of the contracts, including extensions, is calculated at £4.2m.

There is no limit to the number of concessions to be awarded, but providers must meet specified (and complex) school recruitment and delivery targets which essentially translate into a 10% sample of all eligible schools.

Under-recruiting providers can be included if fewer than four meet the 10% target, as long as they have recruited at least 1,000 eligible schools.

Moreover:

‘The minimum volume requirement may be waived if the number of schools choosing to administer the reception baseline is fewer than 8,887 [50% of the total number of schools with a reception class].’

Hence the number of suppliers in the market is likely to be limited to 10 or so: since each must recruit roughly a 10% share of eligible schools, and each school signs up with a single supplier, no more than about ten providers can meet the target. There will be some choice, but not too much.

My online researches unearthed four obvious candidates, together with suggestions that this might constitute the entire field.

.

.

The initial deadline for recruiting the target number of schools is 30 April 2015, slap-bang in the middle of Purdah. This may prove problematic.

.

[Postscript: The award of six concession contracts was quietly confirmed on Wednesday 4 February, via new guidance on DfE’s website. The two contractors not among the four candidates identified above are Early Excellence and Hodder Education.

The guidance confirms that schools must sign up with their preferred supplier. They can do so after the initial deadline of 30 April but, on 3 June, schools will be told if they have chosen a provider that has been suspended for failing to recruit sufficient schools.  They will then need to choose an alternative provider.

It adds that, in AY2015/16, LA-maintained schools, academies and free schools will be reimbursed for the ‘basic cost’ of approved reception baselines. Thereafter, school budgets will include the necessary funding.

In the event, the Government has barely contributed to publicity for the assessment, leaving it to suppliers to make the running. The initial low-key approach (including links to the contractors’ home pages rather than to details of their baseline offers) has been maintained.

The only addition to the guidance has been the inclusion, from 20 March, of the criteria used to evaluate the original bids. This seems unlikely to help schools select their preferred solution since, by definition, all the successful bids must have satisfied these criteria!

Purdah will now prevent any further Government publicity.]

.

It seems likely that the decision to allow a range of baseline assessments – as opposed to a single national measure – will create significant comparability issues.

One of the ‘clarification questions’ posed by potential suppliers is:

‘We can find no reference to providing a comparability score between provider assessments. Therefore, can we assume that each battery of assessments will be independent, stand-alone and with no need to cross reference to other suppliers?’

The answer given is:

‘The assumption is correct at this stage. However, STA will be conducting a comparability study with successful suppliers in September 2015 to determine whether concordance tables can be constructed between assessments.’

This implies that progress measures will need to be calculated separately for users of each baseline assessment – and that these will be comparable only through additional ‘concordance tables’, should these prove feasible.

There are associated administrative and workload issues for schools, particularly those with high mobility rates, which may find themselves needing to engage with several different baseline assessment products.

One answer to a supplier’s question reveals that:

‘As currently, children will be included in performance measures for the school in which they take their final assessment (i.e. key stage 2 tests) regardless of which school they were at for the input measure (i.e. reception baseline or key stage 1). We are currently reviewing how long a child needs to have attended a school in order for their progress outcome to be included in the measure.’

The issue of comparability also raises questions about the aggregation of progress measures for floor target purposes. Will targets based on several different baseline assessments be comparable with those based on only one? Will schools with high mobility rates be disadvantaged?

Schools will pay for the assessments. The supporting documentation says that:

‘The amount of funding that schools will be provided with is still to be determined. This will not be determined until after bids have been submitted to avoid accusations of price fixing.’

One of the answers to a clarification question says:

‘The funding will be available to schools from October 2015 to cover the reception baseline for the academic year 2015/16.’

Another says this funding is unlikely to be ringfenced.

There is some confusion over the payment mechanism. One answer says:

‘…the mechanism for this is still to be determined. In the longer term, money will be provided to schools through the Dedicated Schools Grant (DSG) to purchase the reception baseline. However, the Department is still considering options for the first year and may pay suppliers directly depending on the amount of data provided.’

But yet another is confident that:

‘Suppliers will be paid directly by schools. The Department will reimburse schools separately.’

The documentation also reveals that there has as yet been no decision on how to measure progress between the baseline and the end of KS2:

‘The Department is still considering how to measure this and is keen for suppliers to provide their thoughts.’

The ‘Statement of requirements’ once again foregrounds the use of the baseline for floor targets rather than reporting individual learners’ progress.

‘On 27 March 2014, the Department for Education (DfE) announced plans to introduce a new floor standard from September 2016. This will be based on the progress made by pupils from reception to the end of primary school.  The DfE will use a new Reception Baseline Assessment to capture the starting point from which the progress that schools make with their pupils will be measured.  The content of the Reception Baseline will reflect the knowledge and understanding of children at the start of reception, and will be clearly linked to the learning and development requirements of the Early Years Foundation Stage and key stage 1 national curriculum in English and mathematics.  The Reception Baseline will be administered within the first half term of a pupil’s entry to a reception class.’

In relation to reporting to parents, one of the answers to suppliers’ questions states:

‘Some parents will be aware of the reception baseline from the national media coverage of the policy announcement. We anticipate that awareness of the reception baseline will develop over time. As with other assessments carried out by a school, we would expect schools to share information with parents if asked, though there will be no requirement to report the outcome of the reception baseline to parents.’

So it appears that, regardless of the outcomes of the research above, initial short term reporting of reception baseline outcomes will be optional.

.

[Postscript: This position is still more vigorously stated in a letter dated November 2014 from Ministers to a primary group formed by two maths associations. It says (my emphasis):

‘Let me be clear that we do not intend the baseline assessment to be used to monitor the progress of individual children. You rightly point out that any assessment that was designed to be reliable at individual child level would need to take into account the different ages at which children start reception and be sufficiently detailed to account for the variation in performance one expects from young children day-to-day. Rather, the baseline assessment is about capturing the starting point for the cohort which can then be used to assess the progress of that cohort at the end of primary school,’

This distinction has not been made sufficiently explicit in material published elsewhere.]

.

The overall picture is of a process in which procurement is running in parallel with research and development work intended to help resolve several significant and outstanding issues. This is a consequence of the September 2015 deadline for introduction, which seems increasingly problematic.

Particularly so given that many professionals are yet to be convinced of the case for reception baseline assessment, expressing reservations on several fundamental grounds, extending well beyond the issues highlighted above.

A January 2015 Report from the Centre Forum – Progress matters in Primary too – defends the plan against its detractors, citing six key points of concern. Some of the counter-arguments summarised below are rather more convincing than others:

  • Validity: The contention that reception level assessments are accurate predictors of attainment at the end of KS2 is justified by reference to CEM’s PIPS assessment, which was judged in 2001 to give a correlation of 0.7. But of course KS2 tests were very different in those days.
  • Reliability: The notion that attainment can be reliably determined in reception is again justified with reference to PIPS data from 2001 (showing a 0.98 correlation on retesting). The authors argue that the potentially negative effects of test conditions on young children and the risks of bias should be ‘mitigated’ (but not eliminated) through the development and selection process.
  • Contextualisation: The risk of over-simplification through reporting a single numerical score, independent of factors such as age, needs to be set against the arguments in favour of a relatively simple and transparent methodology. Schools are free to add such context when communicating with parents.
  • Labelling: The argument that baseline outcomes will tend to undermine universally high expectations is countered by the view that assessment may actually challenge labelling attributable to other causes, and can in any case be managed in reporting to parents by providing additional contextual information.
  • Pupil mobility: Concern that the assessment will be unfair on schools with high levels of mobility is met by reference to planned guidance on ‘how long a pupil needs to have attended a school in order to be included in the progress measure’. However, the broader problems associated with a choice of assessments are acknowledged.
  • Gaming: The risk that schools will artificially depress baseline outcomes will be managed through effective moderation and monitoring.

The overall conclusion is that:

‘…the legitimate concerns raised by stakeholders around the reliability and fairness of a baseline assessment do not present fundamental impediments to implementing the progress measure. Overall, a well-designed assessment and appropriate moderation could address these concerns to the extent that a baseline assessment could provide a reasonable basis for constructing a progress measure.

That said, the Department for Education and baseline assessment providers need to address, and, where indicated, mitigate the concerns. However, in principle, there is nothing to prevent a well-designed baseline test being used to create a progress-based accountability measure.’

The report adds:

‘However, this argument still needs to be won and teachers’ concerns assuaged….

.. Since the majority of schools will be reliant on the progress measure under the new system, they need to be better informed about the validity, reliability and purpose of the baseline assessment. To win the support of school leaders and teachers, the Department for Education must release clear, defensible evidence that the baseline assessment is indeed valid, fair and reliable.’

.

[Postscript: On 25 March the STA tendered for a supplier to ‘determine appropriate models for assuring the national data from the reception baseline’. The notice continues:

‘Once models have been determined, STA will agree up to three approaches to be implemented by the supplier in small scale pilots during September/October 2015. The supplier will also be responsible for evaluating the approaches using evidence from the pilots with the aim of recommending an approach to be implemented from September 2016.’

The need for quality assurance is compounded by the fact that there are six different assessment models. The documentation makes clear that monitoring, moderation and other quality assurance methods will be considered.

The contract runs from 1 July 2015 to 31 January 2016 with the possibility of extension for a further 12 months. It will be let by 19 June.]

 .

Outstanding tasks

  • Publish list of contracts for approved baseline assessments (26 January 2015) COMPLETED
  • Explain funding arrangements for baseline assessments and how FY2015-16 funding will be distributed (January 2015?) COMPLETED
  • Publish research on baseline assessment (March/April 2015) 
  • Confirm monitoring and moderation arrangements (March/April 2015?) 
  • Deadline for contractors recruiting schools for initial baseline assessments (30 April 2015) 
  • Publish guidance on the reporting of baseline assessment results (May 2015?) 
  • Award quality assurance tender (June 2015)
  • Undertake comparability study with successful suppliers to determine whether concordance tables can be constructed (Autumn 2015) 
  • Determine funding required for AY2015/16 assessment and distribute to schools (or suppliers?) (October 2015?)
  • Pilot quality assurance models (October 2015)

KS1 and KS2 tests

.

Consultation response

The new tests will comprise:

  • At KS1 – externally set and internally marked tests of maths and reading and an externally set test of grammar, punctuation and spelling (GPS). It is unclear from the text whether the GPS test will be externally marked.
  • At KS2 – externally set and externally marked tests of maths, reading and GPS, plus a sampling test in science.

Outcomes of both KS1 and KS2 tests (other than the science sampling test) will be expressed as scaled scores. A footnote makes it clear that, in both cases, a score of ‘100 will represent the new expected standard for that stage’.

The consultation document says of the scaled scores:

‘Because it is not possible to create tests of precisely the same difficulty every year, the number of marks needed to meet the secondary readiness standard will fluctuate slightly from one year to another. To ensure that results are comparable over time, we propose to convert raw test marks into a scaled score, where the secondary readiness standard will remain the same from year to year. Scaled scores are used in all international surveys and ensure that test outcomes are comparable over time.’

It adds that the Standards and Testing Agency (STA) will develop the scale.

Otherwise very little detail is provided about next steps. The consultation response is silent on the issue. The original consultation document says only that:

‘The Standards and Testing Agency will develop new national curriculum tests, to reflect the new national curriculum programmes of study.’

Adding, in relation to the science sampling test:

‘We will continue with national sample tests in science, designed to monitor national standards over time. A nationally-representative sample of pupils will sit a range of tests, designed to produce detailed information on the cohort’s performance across the whole science curriculum. The design of the tests will mean that results cannot be used to hold individual schools or pupils accountable.’

.

Developments to date

On 31 March 2014, the STA published draft test frameworks for the seven KS1 and KS2 tests to be introduced from 2016:

  • KS1 GPS: a short written task (20 mins); short answer questions (20 mins) and a spelling task (15 mins)
  • KS1 reading: two reading tests, one with texts and questions together, the other with a separate answer booklet (2 x 20 mins)
  • KS1 maths: an arithmetic test (15 mins) and a test of fluency, problem-solving and reasoning (35 mins)
  • KS2 GPS: a grammar and punctuation test (45 mins) and a spelling task (15 mins)
  • KS2 reading: a single test (60 mins)
  • KS2 maths: an arithmetic test (30 mins) and two tests of fluency, problem-solving and reasoning (2 x 40 mins)
  • KS2 science (sampling): tests in physics, chemistry and biology contexts (3 x 25 mins).

Each test will be designed for the full range of prior attainment and questions will typically be posed in order of difficulty.

Each framework explains that all eligible children at state-funded schools will be required to take the tests, but some learners will be exempt.

For further details of which learners will be exempted, readers are referred to the current Assessment and Reporting Arrangements (ARA) booklets.

According to these, the KS1 tests should be taken by all learners working at level 1 or above and the KS2 tests by all learners working at level 3 and above. Teacher assessment data must be submitted for pupils working below the level of the tests.

But of course levels will no longer exist – and no scaled score equivalents have yet been defined – so the draft frameworks do not clearly define the lower bound of the range of prior attainment the tests are intended to accommodate.

It will not be straightforward to design workable tests for such broad spans of prior attainment.

Each framework has a common section on the derivation of scaled scores:

‘The raw score on the test…will be converted into a scaled score. Translating raw scores into scaled scores ensures performance can be reported on a consistent scale for all children. Scaled scores retain the same meaning from one year to the next. Therefore, a particular scaled score reflects the same level of attainment in one year as in the previous year, having been adjusted for any differences in difficulty of the test.

Additionally, each child will receive an overall result indicating whether or not he or she has achieved the required standard on the test. A standard-setting exercise will be conducted on the first live test in 2016 in order to determine the scaled score needed for a child to be considered to have met the standard. This process will be facilitated by the performance descriptor… which defines the performance level required to meet the standard. In subsequent years, the standard will be maintained using appropriate statistical methods to translate raw scores on a new test into scaled scores with an additional judgemental exercise at the expected standard. The scaled score required to achieve the expected level on the test will always remain the same.

The exact scale for the scaled scores will be determined following further analysis of trialling data. This will include a full review of the reporting of confidence intervals for scaled scores.’
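To make this mechanism more concrete, here is a minimal illustrative sketch (in Python) of how raw marks on tests of differing difficulty might be converted to a common scale on which 100 always represents the expected standard. The conversion tables, years, marks and function names are all invented for illustration: STA has yet to publish the actual scale, thresholds or conversion method.

```python
# Purely hypothetical raw-to-scaled conversion tables: in the (harder)
# 2017 test, fewer raw marks are needed to reach any given scaled score
# than in the (easier) 2016 test.
CONVERSION = {
    2016: {30: 95, 35: 100, 40: 105, 45: 110},
    2017: {28: 95, 33: 100, 38: 105, 43: 110},
}

def scaled_score(year: int, raw_mark: int) -> int:
    """Return the scaled score for the highest tabulated raw mark <= raw_mark."""
    table = CONVERSION[year]
    eligible = [raw for raw in table if raw <= raw_mark]
    if not eligible:
        return min(table.values())  # below the lowest tabulated point
    return table[max(eligible)]

def meets_expected_standard(year: int, raw_mark: int) -> bool:
    # The threshold is fixed at 100 on the scale, whatever raw mark it maps from.
    return scaled_score(year, raw_mark) >= 100

# The same raw mark of 34 misses the standard on the easier 2016 test
# but meets it on the harder 2017 test.
print(meets_expected_standard(2016, 34))  # False
print(meets_expected_standard(2017, 34))  # True
```

The point is simply that the raw mark required to meet the standard can move from year to year with test difficulty, while the scaled score of 100 – and hence the standard itself – stays fixed.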

In July 2014 STA also published sample questions, mark schemes and associated commentaries for each test.

.

Outstanding tasks

I have been unable to trace any details of the timetable for test development and trialling.

As far as I can establish, STA has not published an equivalent to QCDA’s ‘Test development, level setting and maintaining standards’ (March 2010) which describes in some detail the different stages of the test development process.

This old QCA web-page describes a 22-month cycle, from the initial stages of test development to the administration of the tests.

This aligns reasonably well with the 25-month period between publication of the draft test frameworks on 31 March 2014 and the administration of the tests in early May 2016.

Applying the same timetable to the 2016 tests – using publication of the draft frameworks as the starting point – suggests that:

  • The first pre-test should have been completed by November 2014
  • The second pre-test should take place by February 2015 
  • Mark schemes and tests should be finalised by July 2015

STA commits to publishing the final test frameworks and a full set of sample tests and mark schemes for each of the national curriculum tests at key stages 1 and 2 ‘during the 2015 summer term’.

Given Purdah, these seem most likely to appear towards the end of the summer term rather than a full year ahead of the tests.

In relation to the test frameworks, STA says:

‘We may make small changes as a result of this work; however, we do not expect the main elements of the frameworks to change.’

They will also produce, to the same deadline, guidance on how the results of national curriculum tests will be reported, including an explanation of scaled scores.

So we have three further outstanding tasks:

  • Publishing the final test frameworks (summer term 2015) 
  • Finalising the scale to be used for the tests (summer term 2015) 
  • Publishing guidance explaining the use and reporting of scaled scores (summer term 2015)

.

[Postscript: Since publishing this post, I have found various related STA contracts on Contracts Finder.

How these square with the timetable above is, as yet, unclear. If there is a possibility that final test frameworks cannot be finalised until Autumn 2015, the Workload Challenge Protocol may well bite here too.]

.

Statutory teacher assessment

.

Consultation response

The response confirms statutory teacher assessment of:

  • KS1 maths, reading, writing, speaking and listening and science
  • KS2 maths, reading, writing and science.

There are to be performance descriptors for each statutory teacher assessment:

  • a single descriptor for KS1 science and KS2 science, reading and maths
  • several descriptors for KS1 maths, reading, writing and speaking and listening, and also for KS2 writing.

There is a commitment to improve KS1 moderation, given concerns expressed by Ofsted and the NAHT Commission.

In respect of low attaining pupils the response says:

‘All pupils who are not able to access the relevant end of key stage test will continue to have their attainment assessed by teachers. We will retain P-scales for reporting teachers’ judgements. The content of the P-scales will remain unchanged. Where pupils are working above the P-scales but below the level of the test, we will provide further information to enable teachers to assess attainment at the end of the relevant key stage in the context of the new national curriculum.’

And there is to be further consideration of whether to move to external moderation of P-scale teacher assessment.

So, to summarise, the further work involves:

  • Developing new performance descriptors – to be drafted by an expert group. According to the response, the KS1 descriptors would be introduced in ‘autumn 2014’. No date is given for the KS2 descriptors.
  • Improving moderation of KS1 teacher assessment, working closely with schools and Ofsted.
  • Providing guidance to support teacher assessment of those working above the P-scales but below the level of the tests.
  • Deciding whether to move to external moderation of P-scale teacher assessment.

.

Developments to date

Updated statutory guidance on the P-Scale attainment targets for pupils with SEN was released in July 2014, but neither it nor the existing guidance on when to use the P-Scales relates them to the new scaled scores, or discusses the issue of moderation.

.

In September 2014, a guidance note, ‘National curriculum and assessment from September 2014: Information for schools’, revised the timeline for the development of performance descriptors:

‘New performance descriptors will be published (in draft) in autumn 2014 which will inform statutory teacher assessment at the end of key stage 1 and 2 in summer 2016. Final versions will be published by September 2015.’

.

A consultation document on performance descriptors: ‘Performance descriptors for use in key stage 1 and 2 statutory teacher assessment for 2015 to 2016’ was published on 23 October 2014.

The descriptors were:

‘… drafted with experts, including teachers, representatives from Local Authorities, curriculum and subject experts. Also Ofsted and Ofqual have observed and supported the drafting process’

A November 2014 FoI response revealed the names of the experts involved and brief biographies were provided in the media.

A further FoI has been submitted requesting details of their remit but, at the time of writing, this has not been answered.

.

[Postscript: The FoI response setting out the remit was published on 5 February.]

.

The consultation document revealed for the first time the complex structure of the performance descriptor framework.

It prescribes four descriptors for KS1 reading, writing and maths but five for KS2 writing.

The singleton descriptors reflect ‘working at the national standard’.

Where four descriptors are required these are termed (from the top down): ‘mastery’, ‘national’, ‘working towards national’ and ‘below national’ standard.

In the case of KS2 writing ‘above national standard’ is sandwiched between ‘mastery’ and ‘national’.

.

[Images: the performance descriptor framework (captures 1 and 2)]

The document explains how these different levels cross-reference to the assessment of learners exempted from the tests.

In the case of assessments with only a single descriptor, it becomes clear that a further distinction is needed:

‘In subjects with only one performance descriptor, all pupils not assessed against the P-scales will be marked in the same way – meeting, or not meeting, the ‘national standard’.

So ‘not meeting the national standard’ should also be included in the table above. The relation between ‘not meeting’ and ‘below’ national standard is not explained.

But still further complexity is added since:

‘There will be some pupils who are not assessed against the P-scales (because they are working above P8 or because they do not have special educational needs), but who have not yet achieved the contents of the ‘below national standard’ performance descriptor (in subjects with several descriptors). In such cases, pupils will be given a code (which will be determined) to ensure that their attainment is still captured.’

This produces a hierarchy as follows (from the bottom up):

  • The P-scales
  • In the case of assessments with several descriptors, an attainment code yet to be determined
  • In the case of assessments with a single descriptor, an undeclared ‘not meeting the national standard’ descriptor
  • The single descriptor or the four/five descriptors listed above.

However, the document says:

‘The performance descriptors do not include any aspects of performance from the programme of study for the following key stage. Any pupils considered to have attained the ‘Mastery standard’ are expected to explore the curriculum in greater depth and build on the breadth of their knowledge and skills within that key stage.’

This places an inappropriate brake on the progress of the highest attainers because the assessment ceiling is pitched too low to accommodate them.

The document acknowledges that some high attainers will be performing above the level of the highest descriptors but, regardless of whether or not they move into the programme for the next key stage, there is no mechanism to record their performance.

This raises the further question of whether the mastery standard is pitched at the equivalent of level 6, or below it. It will be interesting to see whether this is addressed in the consultation response.

The consultation document says that the draft descriptors will be trialled during summer term 2015 in a representative sample of schools.

These trials and the consultation feedback will together inform the development of the final descriptors, but also:

  • ‘statutory arrangements for teacher assessment using the performance descriptors;
  • final guidance for schools (and those responsible for external moderation arrangements) on how the performance descriptors should be used;
  • an updated national model for the external moderation of teacher assessment; and
  • nationally developed exemplification of the work of pupils for each performance descriptor at the end of each key stage.’

Published comments on the draft descriptors have been almost entirely negative, which might suggest that the response could be delayed. The consultation document said it should appear ‘around 26 February 2015’.

According to the document, the final descriptors will be published either ‘in September 2015’ or ‘in the autumn term 2015’, depending on whether you rely on the section headed ‘Purpose’ or the one called ‘Next Steps’. The latter would allow them to appear as late as December 2015.

A recent newspaper report suggested that the negative reception had resulted in an ‘amber/red’ assessment of primary assessment reform as a whole. The leaked commentary said that any decision to review the approach would increase the risk that the descriptors could not be finalised ‘by September as planned’.

However, the story concludes:

‘The DfE says: “We do not comment on leaks,” but there are indications from the department that the guidance will be finalised by September. Perhaps ministers chose, in the end, not to “review their approach”, despite the concerns.’

Hence it would appear that delay until after the beginning of AY2015/16 will not be countenanced.

Note that the descriptors are for use in academic year 2015/16, so even publication in September is problematic, since teachers will begin the year not knowing which descriptors to apply.

The consultation document refers only to descriptors for AY2015/16, which might imply that they will be further refined for subsequent years. Essentially therefore, the arrangements proposed here would be an imperfect interim solution.

.

[Postscript: On 26 February 2015 the Consultation Response was published – so on the date committed to in the consultation document.

As expected, it revealed significant opposition to the original proposals:

  • 74% of respondents were concerned about nomenclature
  • 76% considered that the descriptors were not spaced effectively across the range of pupils’ performance
  • 69% of respondents considered them not clear or easy to understand

The response acknowledges that the issues raised:

‘….amount to a request for greater simplicity, clarity and consistency to support teachers in applying performance descriptors and to help parents understand their meaning.’

But goes on to allege that: 

‘…there are some stakeholders who valued the levels system and would like performance descriptors to function in a similar way across the key stages, which is not their intention.’

Even so, although the Descriptors are not intended to inform formative assessment, respondents have raised concerns that they could be applied in this manner.

There is also the issue of comparability between formative and summative assessment measures, but this is not addressed.

The response does not entirely acknowledge that opposition to the original proposals is sending it back to the drawing board but:

‘As a result of some of the conflicting responses to the consultation, we will work with relevant experts to determine the most appropriate course of action to address the concerns raised and will inform schools of the agreed approach according to the timetable set out in the consultation document – i.e. by September 2015.’

The new assessment commission (see below) will have an as yet undefined role in this process:

‘In the meantime, and to help with this [ie determining the most appropriate course of action] the Government is establishing a Commission on Assessment Without Levels….’

Unfortunately, this role has not been clarified in the Commission’s Statement of Intended Outputs.

There is no reference to the trials in schools, which may or may not continue. A DfE Memorandum to the Education Select Committee on its 2014-15 Supplementary Estimates reveals that £0.3m has been reallocated to pay for them, but this is no guarantee that they will take place.

Implementation will not be delayed by a year, despite the commitment to allow a full year’s notice for significant reforms announced in the response to the Workload Challenge.

This part of the timetable is now seriously concertina’d and there must be serious doubt whether the timescale is feasible, especially if proper trialling is to be accommodated.]

.

Outstanding tasks 

  • Publish response to performance descriptors consultation document (26 February 2015) COMPLETED
  • Trial (revised?) draft performance descriptors (summer term 2015) 
  • Publish adjusted descriptors, revised in the light of consultation with experts and input from the commission (summer term 2015)
  • Experts and commission on assessment produce response to concerns raised and inform schools of outcomes (September 2015)
  • Confirm statutory arrangements for use of the performance descriptors (September/autumn term 2015) 
  • Publish final performance descriptors for AY2015/16 (September/autumn term 2015) 
  • Publish final guidance on the use of performance descriptors (September/autumn term 2015) 
  • Publish exemplification of each performance descriptor at each key stage (September/autumn term 2015)
  • Publish an updated model for the external moderation of teacher assessment (September/autumn term 2015?) 
  • Confirm plans for the moderation of KS1 teacher assessment and use of the P-scales (September/autumn term 2015?) 
  • Publish guidance on assessment of those working above the P-scales but below the level of the tests (September/autumn term 2015?) 
  • Decide whether performance descriptors require adjustment for AY2016/17 onwards (summer term 2016)

.

Schools’ internal assessment and tracking systems

.

Consultation response

The consultation document outlined some of the Government’s justification for the removal of national curriculum levels. The statement that:

‘Schools will be able to focus their teaching, assessment and reporting not on a set of opaque level descriptions, but on the essential knowledge that all pupils should learn’

may be somewhat called into question by the preceding discussion of performance descriptors.

The consultation document continues:

‘There will be a clear separation between ongoing, formative assessment (wholly owned by schools) and the statutory summative assessment which the government will prescribe to provide robust external accountability and national benchmarking. Ofsted will expect to see evidence of pupils’ progress, with inspections informed by the school’s chosen pupil tracking data.’

A subsequent section adds:

‘We will not prescribe a national system for schools’ ongoing assessment….

…. We expect schools to have a curriculum and assessment framework that meets a set of core principles…

 … Although schools will be free to devise their own curriculum and assessment system, we will provide examples of good practice which schools may wish to follow. We will work with professional associations, subject experts, education publishers and external test developers to signpost schools to a range of potential approaches.’

The consultation response does not cover this familiar territory again, saying only:

‘Since we launched the consultation, we have had conversations with our expert group on assessment about how to support schools to make best use of the new assessment freedoms. We have launched an Assessment Innovation Fund to enable assessment methods developed by schools and expert organisations to be scaled up into easy-to-use packages for other schools to use.’

Further work is therefore confined to the promulgation of core principles, the application of the Assessment Innovation Fund and possibly further work to ‘signpost schools to a range of potential approaches’.

.

Developments to date

The Assessment Innovation Fund was originally announced in December 2013.

A factsheet released at that time explains that many schools are developing new curriculum and assessment systems and that the Fund is intended to enable schools to share these.

Funding of up to £10K per school is made available to help up to 10 schools to prepare simple, easy-to-use packages that can be made freely available to other schools.

They must commit to:

‘…make their approach available on an open licence basis. This means that anyone who wishes to use the package (and any trade-marked name) must be granted a non-revocable, perpetual, royalty-free licence to do so with the right to sub-licence. The intellectual property rights to the system will remain with the school/group which devised it.’

Successful applicants were to be confirmed ‘in the week commencing 21 April 2014’.

In the event, nine successful applications were announced on 1 May, although one subsequently withdrew, apparently over the licensing terms.

The packages developed with this funding are stored – in a rather user-unfriendly fashion – on this TES Community Blog, along with other material supportive of the decision to dispense with levels.

Much other useful material has been published online which has not been collected into this repository and it is not clear to what extent it will develop beyond its present limits, since the most recent addition was in early November 2014.

A recent survey by Capita Sims (itself a provider of assessment support), conducted between June and September 2014, suggested that:

  • 25% of primary and secondary schools were unprepared for replacing levels and 53% had not yet finalised their plans for doing so.
  • 28% were planning to keep the existing system of levels, 21% intended to introduce a new system and 28% had not yet made a decision.
  • 50% of those introducing an alternative expected to do so by September 2015, while 23% intended to do so by September 2016.
  • Schools’ biggest concern (53% of respondents) is measuring progress and setting targets for learners.

Although the survey is four months old and has clear limitations (there were only 126 respondents) this would suggest further support may be necessary, ideally targeted towards the least confident schools.

.

In April 2014 the Government published a set of Assessment Principles, building on earlier material in the primary consultation document. These had been developed by an ‘independent expert panel’.

It is not entirely clear whether the principles apply solely to primary schools and to schools’ own assessment processes (as opposed to statutory assessment).

The introductory statement says:

‘The principles are designed to help all schools as they implement arrangements for assessing pupils’ progress against their school curriculum; Government will not impose a single system for ongoing assessment.

Schools will be expected to demonstrate (with evidence) their assessment of pupils’ progress, to keep parents informed, to enable governors to make judgements about the school’s effectiveness, and to inform Ofsted inspections.’

This might suggest they are not intended to cover statutory assessment and testing but are relevant to secondary schools.

There are nine principles in all, divided into three groups:

.

[Image: the nine assessment principles, in three groups]

.

The last of these seems particularly demanding.

 .

In July 2014, Ofsted published guidance in the form of a ‘Note for inspectors: use of assessment information during inspections in 2014/15’. This says that:

‘In 2014/15, most schools, academies and free schools will have historic performance data expressed in national curriculum levels, except for those pupils in Year 1. Inspectors may find that schools are tracking attainment and progress using a mixture of measures for some, or all, year groups and subjects.

As now, inspectors will use a range of evidence to make judgements, including by looking at test results, pupils’ work and pupils’ own perceptions of their learning. Inspectors will not expect to see a particular assessment system in place and will recognise that schools are still working towards full implementation of their preferred approach.’

It goes on to itemise the ways in which inspectors will check that these systems are effective: not by judging the systems themselves, but by gathering evidence of effective implementation through leadership and management, the accuracy of assessment, effectiveness in securing progress and the quality of reporting to parents.

. 

In September 2014, NCTL published a research report, ‘Beyond Levels: alternative assessment approaches developed by teaching schools’.

The report summarises the outcomes of small-scale research conducted in 34 teaching school alliances. It offers six rather prolix recommendations for schools and DfE to consider, which can be summarised as follows:

  • A culture shift is necessary in recognition of the new opportunities provided by the new national curriculum and the removal of levels.
  • Schools need access to conferences and seminars to help develop their assessment expertise.
  • Schools would benefit from access to peer reviewed commercial tracking systems relating to the new national curriculum. Clarification is needed about what data will be collected centrally.
  • Teaching school alliances and schools need financial support to further develop assessment practice, especially practical classroom tools, which should be made freely available online.
  • Financial support is needed for teachers to undertake postgraduate research and courses in this field.
  • It is essential to develop professional knowledge about emerging effective assessment practice.

I can find no government response to these recommendations and so have not addressed them in the list of outstanding tasks below.

.

[Postscript: On 25 February 2015, the Government announced the establishment of a ‘Commission on Assessment Without Levels’:

‘To help schools as they develop effective and valuable assessment schemes, and to help us to identify model approaches we are today announcing the formation of a commission on assessment without levels. This commission will continue the evidence-based approach to assessment which we have put in place, and will support primary and secondary schools with the transition to assessment without levels, identifying and sharing good practice in assessment.’

This appears to suggest belated recognition that the steps outlined above have provided schools with insufficient support for the transition to levels-free internal assessment. It is also a response to the possibility that Labour might revisit the decision to remove them (see below).

The Consultation Response on Performance Descriptors released on 26 February (see above) says that the Commission will help to determine the most appropriate response to concerns raised about the Descriptors, while also suggesting that this task will not be devolved exclusively to them.

It adds that the Commission will:

‘…collate, quality assure, publish and share best practice in assessment with schools across the country…and will help to foster innovation and success in assessment practice more widely.’

The membership of the Commission was announced on 9 March.

.

.

The Commission met on 10 March and 23 March 2015 and will meet four more times – in April, May, June and July.

Its Terms of Reference have been published. The Statement of Intended Outputs mentioned in the consultation response on Performance Descriptors appeared without any publicity on 27 March.

It seemed that the Commission, together with the further consultation of experts, supplied a convenient mechanism for ‘parking’ some difficult issues until the other side of the Election.

However, neither the terms of reference nor the statement of outputs mentions the Performance Descriptors, so the Commission’s role in relation to them remains shrouded in mystery.

.

.

The authors of the Statement of Outputs feel it necessary to mention in passing that it:

‘…supports the decision to remove levels, but appreciates that the reasons for removing levels are not widely understood’.

It sets out a 10-point list of outputs comprising:

  • Another statement of the purposes of assessment and another set of principles to support schools in developing effective assessment systems, presumably different to those published by the previous expert group in April 2014. (It will be interesting to compare the two sets of principles, to establish whether Government policy on what constitutes effective assessment has changed over the last 12 months. It will also be worthwhile monitoring the gap between the principles and the views of Alison Peacock, one of the Commission’s members. She also sat on the expert panel that developed the original principles, some of which seem rather at odds with her own practice and preferences. Meanwhile, another member – Sam Freedman – has stated…)

.

.

  • An explanation of ‘how assessment without levels can better serve the needs of pupils and teachers’.
  • Guidance to ‘help schools create assessment policies which reflect the principles of effective assessment without levels’.
  • Clear information about ‘the legal and regulatory assessment requirements’, intended to clarify what they are now, how they will change and when. (The fact that the Commission concludes that such information is not already available is a searing indictment of the Government’s communications efforts to date.)
  • Clarification with Ofsted of ‘the role that assessment without levels will play in the inspection process’ so schools can demonstrate effectiveness without adding to teacher workload. (So again they must believe that Ofsted has not sufficiently clarified this already.)
  • Dissemination of good practice, obtained through engagement with ‘a wide group of stakeholders including schools, local authorities, teachers and teaching unions’. (This is a tacit admission that the strategy described above is not working.)
  • Advice to the Government on how ITT and CPD can support assessment without levels and guidance to schools on the use of CPD for this purpose. (There is no reference to the resource implications of introducing additional training and development.)
  • Advice to the Government on ensuring ‘appropriate provision is made for pupils with SEN in the development of assessment policy’. (Their judgement that this is not yet accounted for is a worrying indictment of Government policy to date. They see this as not simply a lapse of communication but a lacuna in the policy-making process.)
  • ‘Careful consideration’ of commitments to tackling teacher workload – which they expect to alleviate by providing information, advice and support. (There is no hint that the introduction of Performance Descriptors will be delayed in line with the Workload Challenge.)
  • A final report before the end of the summer term, though it may publish some outputs sooner. (It will not be able to do so until the outcome of the Election is decided.)

Although there is some implicit criticism of Government policy and communications to date, the failure to make any reference to the Performance Descriptors is unlikely to instil confidence in the capacity of the Commission to provide the necessary challenge to the original proposals, or support to the profession in identifying a workable alternative.]

.

Outstanding tasks

  • Further dissemination of good practice through the existing mechanisms (ongoing) 
  • Further ‘work with professional associations, subject experts, education publishers and external test developers to signpost schools to a range of potential approaches.’ (ongoing)
  • Additional work (via the Commission) to ‘collate, quality assure, publish and share’ best practice (Report by July 2015 with other outputs possible from May 2015)

Reporting to parents

.

Consultation response

The consultation document envisaged three outcomes for each test:

  • A scaled score
  • The learner’s position in the national cohort, expressed as a decile
  • The rate of progress from a baseline, derived by comparing a learner’s scaled score with that of other learners with the same level of prior attainment.

Deciles did not survive the consultation.

The consultation response confirms that, for each test, parents will receive:

  • Their own child’s scaled score; and
  • The average scaled score for the school, ‘the local area’ (presumably the geographical area covered by the authority in which the school is situated) and the country as a whole.

They must also receive information about progress, but the response only discusses how this might be published on school websites and for the purposes of the floor targets (see sections below), rather than how it should be reported directly to parents.

We have already addressed the available information about the calculation of the scaled scores.

The original consultation document also outlined the broad methodology underpinning the progress measures:

‘In order to report pupils’ progress through the primary curriculum, the scaled score for each pupil at key stage 2 would be compared to the scores of other pupils with the same prior attainment. This will identify whether an individual made more or less progress than pupils with similar prior attainment…

…. Using this approach, a school might report pupils’ national curriculum test results to parents as follows:

In the end of key stage 2 reading test, Sally received a scaled score of 126 (the secondary ready standard is 100), placing her in the top 10% of pupils nationally. The average scaled score for pupils with the same prior attainment was 114, so she has made more progress in reading than pupils with a similar starting-point.’

.

Developments to date

On this web page, first published in April 2014, STA commits to publishing guidance during summer term 2015 on how the results of national curriculum tests will be reported, including an explanation of scaled scores.

In September 2014, a further guidance note ‘National curriculum and assessment from September 2014: Information for schools’ shed a little further light on the calculation of the progress measures:

‘Pupil progress will be determined in relation to the average progress made by pupils with the same baseline (i.e. the same KS1 average point score). For example, if a pupil had an APS of 19 at KS1, we will calculate the average scaled score in the KS2 tests for all pupils with an APS of 19 and see whether the pupil in question achieved a higher or lower scaled score than that average. The exact methodology of how this will be reported is still to be determined.’
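By way of illustration only, the mechanics described here – and in the Sally example earlier – can be sketched in a few lines of code. The data, the column names and the output format below are invented for demonstration (the figures are chosen to echo the Sally example); the published methodology and reporting format are still to be determined.

```python
# Minimal sketch of the progress comparison described above, using invented data.
# Column names (ks1_aps, ks2_reading_scaled) are illustrative assumptions only.
import pandas as pd

pupils = pd.DataFrame({
    "pupil":              ["Sally", "Ben", "Amira", "Tom", "Jo"],
    "ks1_aps":            [21, 21, 21, 15, 15],      # KS1 average point score (the baseline)
    "ks2_reading_scaled": [126, 110, 106, 98, 104],  # KS2 scaled score (100 = expected standard)
})

# Average KS2 scaled score of pupils with the same prior attainment
pupils["peer_average"] = pupils.groupby("ks1_aps")["ks2_reading_scaled"].transform("mean")

# A positive figure indicates more progress than pupils with a similar starting point
pupils["progress_vs_peers"] = pupils["ks2_reading_scaled"] - pupils["peer_average"]

print(pupils)
```

On this invented data, Sally’s scaled score of 126 is compared with a peer average of 114, giving a progress figure of +12 – the comparison underlying the example quoted from the consultation document.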

It is hard to get a clear sense of the full range of assessment information that parents will receive.

I have been unable to find any comprehensive description, which would suggest that this is being held back until the methodology for calculating the various measures is finalised.

The various sections above suggest that they will receive details of:

  • Reception baseline assessment outcomes.
  • Attainment in end of KS1 and end of KS2 tests, now expressed as scaled scores (or via teacher assessment, code or P-scales if working below the level of the tests). This will be supplemented by a series of average scaled scores for each test.
  • Progress between the baseline assessment (reception baseline from 2022; KS1 baseline beforehand) and end of KS2 tests, relative to learners with similar prior attainment at the baseline.
  • Attainment in statutory teacher assessments, normally expressed through performance descriptors, but with different arrangements for low attainers.
  • Attainment and progress between reception baseline, KS1 and KS2 tests, provided through schools’ own internal assessment and tracking systems.

We have seen that reporting mechanisms for the first and fourth are not yet finalised.

The fifth is now for schools to determine, taking account of Ofsted’s guidance and, if they wish, the Assessment Principles.

The scales necessary to report the second are not yet published, and these also form the basis of the remaining progress measures.

Parents will be receiving this information in a variety of different formats: scaled scores, average scaled scores, baseline scores, performance descriptors, progress scores and internal tracking measures.

Moreover, the performance descriptor scales will vary according to the assessment and internal tracking will vary from school to school.

This is certainly much more complex than the current unified system of reporting based on levels. Parents will require extensive support to understand what they are receiving.

Outstanding tasks

Previous sections have already referenced expected guidance on reporting baseline assessments, scaled scores and the use of performance descriptors (which presumably includes parental reporting).

One assumes that there will also need to be unified guidance on all aspects of reporting to parents, intended for parental consumption.

So, avoiding duplication of previous sections, the remaining outstanding tasks are to:

  • Finalise the methodology for reporting on pupil progress (summer term 2015) 
  • Provide comprehensive guidance to parents on all aspects of reporting (summer term 2015?)

Publication of outcomes

.

Consultation response

This section covers publication of material for public consumption, within and alongside the Primary School Performance Tables and on schools’ websites.

The initial consultation document has much to say about the first of these, while the consultation response barely mentions the Tables, focusing almost exclusively on school websites.

The original document suggests that the Performance Tables will include a variety of measures, including:

  • The percentage of pupils meeting the secondary readiness standard
  • The average scaled score
  • Where the school’s pupils fit in the national cohort
  • Pupils’ rate of progress
  • How many of the school’s pupils are among the highest-attaining nationally, through a measure showing the percentage of pupils attaining a high scaled score in each subject.
  • Teacher assessment outcomes in English, maths and science
  • Comparisons of each school’s performance with that of schools with similar intake
  • Data about the progress of those with very low prior attainment.

All the headline measures will be published separately for pupils in receipt of the pupil premium.

All measures will be published as three year rolling averages in addition to annual results.

There is also a commitment to publish a wide range of test and teacher assessment data, relating to both attainment and progress, through a Data Portal:

‘The department is currently procuring a new data portal or “data warehouse” to store the school performance data that we hold and provide access to it in the most flexible way. This will allow schools, governors and parents to find and analyse the data about schools in which they are most interested, for example focusing on the progress of low attainers in mathematics in different schools or the attainment of certain pupil groups.’

The consultation response acknowledges as a guiding principle:

‘…a broad range of information should be published to help parents and the wider public know how well schools are performing.’

The accountability system will:

‘…require schools to publish information on their websites so that parents can understand both the progress pupils make and the standards they achieve.’

Data on low attainers’ attainment and progress will not be published since the diversity of this group demands extensive contextual information.

But when it comes to Performance Tables, the consultation response says only:

‘As now, performance tables will present a wide range of information about primary school performance.’

By implication, they will include progress measures since the text adds:

‘In 2022 performance tables, we will judge schools on whichever is better: their progress from the reception baseline to key stage 2; or their progress from key stage 1 to key stage 2.’

However, schools will be required to publish a suite of indicators in standard format on their websites, including:

  • The average progress made by pupils in reading, writing and maths
  • The percentage of pupils achieving the expected standard at the end of KS2 in reading, writing and maths
  • The average score of pupils in their end of KS2 assessments and
  • The ‘percentage of pupils who achieve a high score in all areas’ at the end of KS2.

The precise form of the last of these indicators is not explained. This is not quite the same as the ‘measure showing the percentage of pupils attaining a high scaled score in each subject’ mentioned in the original consultation document.

Does ‘all areas’ mean reading, writing and maths? Must learners achieve a minimum score in each assessment, or a single aggregate score above a certain threshold?

In addition:

‘So that parents can make comparisons between schools, we would like to show each school’s position in the country on these measures and present these results in a manner that is clear for all audiences to understand. We will discuss how best to do so with stakeholders, to ensure that the presentation of the data is clear, fair and statistically robust.’

.

Developments to date

In June 2014, a consultation document was issued ‘Accountability: publishing headline performance measures on school and college websites’. This was accompanied by a press release.

The consultation document explains the intended relationship between the Performance Tables, Data Portal and material published on schools’ websites:

‘Performance tables will continue to provide information about individual schools and colleges and be the central source of school and college performance information.’

Moreover:

‘Future changes to the website, through the school and college performance data portal, will improve accessibility to a wide range of information, including the headline performance measures. It will enable interested parents, students, schools, colleges and researchers to interrogate educational data held by the Department for Education to best meet their requirements.’

But:

‘Nevertheless, the first place many parents and students look for information about a school or college is the institution’s own website’

Schools are already required to publish such information, but there is inconsistency in where and how it is presented. The document expresses the intention that consistent information should be placed ‘on the front page of every school and college website’.

The content proposed for primary schools’ websites covers the four headline measures set out in the consultation response.

A footnote says:

‘These measures will apply to all-through primary, junior and middle schools. Variants of these measures will apply for infant and first schools.’

But the variants are not set out.

There is no reference to the plan to show ‘each school’s position in the country on these measures’ as mentioned in the consultation response.

The consultation proposes a standard visual presentation which, for primary schools, looks like this:

.

[Image: proposed standard presentation of the headline measures on primary school websites]

.

The response to this consultation ‘Publishing performance measures on school and college websites’ appeared in December 2014 (the consultation document had said ‘Autumn 2014’).

The summary of responses says:

‘The majority of respondents to the consultation welcomed the proposals to present headline performance measures in a standard format. There was also strong backing for the proposed visual presentation of data to aid understanding of performance. However, many respondents suggested that without some sense of scale or spread to provide some context to the visual presentation, the data could be misleading. Others said that the language used alongside the charts should be clearer…

…Whilst most respondents favoured a data application tool that would remove the burden of annually updating performance data on school and college websites, they also highlighted the difficulties of developing a data application that would be compatible with a wide range of school and college websites.’

It is clear that some respondents had questioned why school websites should not simply carry a link on their homepage to the School Performance Tables.

In the light of this reaction, further research will be undertaken to:

  • develop a clear and simple visual representation of the data, but with added contextual information.
  • establish how performance tables data can be presented ‘in a way that reaches more parents’.

The timeline suggests that this will result in ‘proposals for redevelopment of performance tables’ by May 2015, so we can no longer assume that the Tables will cover the list of material suggested in the original consultation document.

The timeline indicates that if initial user research concludes that a data application is required, that will be developed and tested between June and October 2015, for roll out between September 2016 and January 2017.

Schools will be informed by autumn 2015 whether they should carry a link to the Tables, download a data application or pursue a third option.

But, nevertheless:

‘All schools and colleges, including academies, free schools and university technical colleges, will be required to publish the new headline performance measures in a consistent, standard format on their websites from 2016.’

So, if an application is not introduced, it seems that schools will still have to publish the measures on their websites: they will not be able to rely solely on a link to the Performance Tables.

Middle schools will only be required to publish the primary measures. No mention is made of infant or first schools.

.

There is no further reference to the data portal, since this project was quietly shelved in September 2014, following unexplained delays in delivery.

.

.

There has been no subsequent explanation of the implications of this decision. Will the material intended for inclusion in the Portal be included in the Performance Tables, or published by another route, or will it no longer be published?

.

Finally, some limited information has emerged about accountability arrangements for infant schools.

This appears on a web page – New accountability arrangements for infant schools from 2016 – published in June 2014.

It explains that the reception baseline will permit the measurement of progress alongside attainment. The progress of infant school pupils will be published for the first time in the 2019 Performance Tables.

This might mean a further addition to the list of information reported to parents set out in the previous section.

There is also a passing reference to moderation:

‘To help increase confidence and consistency in our moderation of infant schools, we will be increasing the proportion of schools where KS1 assessments are moderated externally. From summer 2015, half of all infant schools will have their KS1 assessments externally moderated.’

But no further information is forthcoming about the nature of other headline measures and how they will be reported.

.

Outstanding tasks

  • Complete user research and publish proposals for redevelopment of Performance Tables (May 2015) 
  • Confirm what data will be published in the 2016 Performance Tables (summer term 2015?)
  • Confirm how material originally intended for inclusion in Data Portal will be published (summer term 2015?)
  • Confirm the format and publication route for data showing each school’s position in the country on the headline measures (summer term 2015?) 
  • Confirm headline performance measures for infant and first schools (summer term 2015?) 
  • If necessary, further develop and test a prototype data application for schools’ websites (October 2015) 
  • Inform schools whether a data application will be introduced (autumn 2015) 
  • Amend School Information Regulations to require publication of headline measures in standard format (April 2016) 
  • If proceeding, complete development and testing of a data application (May 2016) 
  • If proceeding, complete roll out of data application (February 2017)

.

Floor standards

.

Consultation response

Minimum expectations of schools will continue to be embodied in floor standards. Schools falling below the floor will attract ‘additional scrutiny through inspection’ and ‘intervention may be required’.

Although the new standard:

‘holds schools to account both on the progress they make and on how well their pupils achieve.’

in practice schools need only satisfy one element or the other.

An all-through primary school will be above the floor standards if:

  • Pupils make sufficient progress between the reception baseline and the end of KS2 in all of reading, writing and maths or
  • 85% or more of pupils meet the new expected standard at the end of KS2 (similar to Level 4b under the current system).

A junior or middle school will be above the floor standard if:

  • pupils make sufficient progress at key stage 2 from their starting point at key stage 1; or
  • 85% or more of pupils meet the new expected standard at the end of key stage 2

At this stage arrangements for measuring the progress of pupils in infant or first schools are still to be considered.

Since the reception baseline will be introduced in 2015, progress in all-through primary schools will continue to be measured from the end of KS1 until 2022.

This should mean that, prior to 2022, the standard would be achieved by ensuring that the progress made by pupils in a school – in reading, writing and maths – equals or exceeds the national average progress made by pupils with similar prior attainment at the end of KS1.

Exactly how individual progress will be aggregated to create a whole school measure is not yet clear. The original consultation document holds out the possibility that slightly below average progress will be acceptable:

‘…we expect the value-added score required to be above the floor to be between 98.5 and 99 (a value-added score of 100 represents average progress).’

The consultation response says the amount of progress required will be determined in 2016:

‘The proposed progress measure will be based on value-added in each of reading, writing and mathematics. Each pupil’s scaled scores in each area at key stage 2 will be compared with the scores of pupils who had the same results in their assessments at key stage 1.

For a school to be above the progress floor, pupils will have to make sufficient progress in all of reading, writing and mathematics. For 2016, we will set the precise extent of progress required once key stage 2 tests have been sat for the first time. Once pupils take a reception baseline, progress will continue to be measured using a similar value added methodology.’
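The aggregation from pupil-level value added to a whole-school figure has not been set out. The sketch below shows one plausible reading of the passages above – pupil differences averaged per school and rebased so that 100 represents average progress, then tested against an assumed floor of 98.5. All figures, and the rebasing itself, are my assumptions rather than the published method.

```python
# Illustrative sketch only: aggregating pupil-level value added into a school-level
# score rebased around 100 and testing it against an assumed floor of 98.5.
# The actual methodology and presentation have not been published.
import pandas as pd

# Pupil-level value added: actual KS2 scaled score minus the average score of pupils
# nationally with the same KS1 starting point (as in the earlier sketch)
pupils = pd.DataFrame({
    "school":      ["A", "A", "A", "B", "B", "B"],
    "value_added": [-4.0, 1.5, 2.5, -2.5, -1.0, -2.0],
})

school_scores = 100 + pupils.groupby("school")["value_added"].mean()

FLOOR = 98.5  # lower end of the 98.5 to 99 range quoted above
print(school_scores)
print(school_scores >= FLOOR)  # True = above the progress element of the floor
```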

In 2022 schools will be assessed against either the reception or KS1 baseline, whichever gives the best result. From 2023 only the reception baseline will be in play.

The attainment standard will be based on achievement of ‘a scaled score of 100 or more’ in each of the reading and maths tests and achievement, via teacher assessment, of the new expected standard in writing (presumably the middle of the five described above).

The attainment standard is significantly more demanding: the present requirement is for 65% of learners to meet the expected standard, the new threshold is 85%, and the standard itself will now be pitched higher, at the equivalent of Level 4B.
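For completeness, a minimal sketch of how the attainment element might be checked at school level follows. The pupil data are invented, and the assumption that each pupil must meet all three conditions – reading and maths scaled scores of 100 or more plus the expected standard in teacher-assessed writing – is my reading of the response rather than a published specification.

```python
# Illustrative sketch of the attainment element of the proposed floor standard:
# at least 85% of pupils reaching 100+ in the reading and maths tests and the
# expected standard in teacher-assessed writing. All pupil data are invented.
import pandas as pd

pupils = pd.DataFrame({
    "reading_scaled":   [103, 99, 110, 101, 97, 108],
    "maths_scaled":     [105, 102, 100, 98, 101, 111],
    "writing_expected": [True, True, True, False, True, True],  # TA at the expected standard
})

meets_standard = (
    (pupils["reading_scaled"] >= 100)
    & (pupils["maths_scaled"] >= 100)
    & pupils["writing_expected"]
)

pct = 100 * meets_standard.mean()
print(f"{pct:.0f}% meet the expected standard; attainment element met: {pct >= 85}")
```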

The original consultation document says:

‘Our modelling suggests that a progress measure set at this level, combined with the 85% threshold attainment measure, would result in a similar number of schools falling below the floor as at present. Over time we will consider whether schools should make at least average progress as part of floor standards.’

The consultation response does not confirm this judgement.

.

Developments

The only significant development since the publication of the consultation response is the detail provided on the June 2014 webpage New accountability arrangements for infant schools from 2016.

In addition to the points in the previous section, this also confirms that:

‘…there will not be a floor standard for infant schools’

But this statement has been called into question, since the table from the performance descriptors consultation, reproduced above, appears to suggest that KS1 teacher assessments in reading, writing and maths do contribute to a floor standard – whether for infant or all-through primary schools is unclear.

.

The aforementioned Centre Forum Report ‘Progress matters in Primary too’ (January 2015) also appears to call into question the results of the modelling reported in the initial consultation document.

It says:

‘…the likelihood is that, based on current performance, progress will be the measure used for the vast majority of schools, at least in the short to medium term. Even those schools which achieve the attainment floor target will only do so by ensuring at least average progress is made by their pupils. As a result, progress will in practice be the dominant accountability metric.’

It undertakes modelling based on 2013 attainment data – ie simulating the effect of the new standards had they been in place in 2013, using selected learning areas within the EYFSP as a proxy for the reception baseline – which suggests that just 10% of schools in 2013 would have met the new attainment floor.

It concludes that:

‘For the vast majority of schools, progress will be their only option for avoiding intervention when the reforms come into effect.’

Unfortunately though, it does not provide an estimate of the proportion of schools likely to achieve the progress floor standard, with either the current KS1 baseline or its proxy for a reception baseline.

Outstanding Tasks

  • Confirm the detailed methodology for deriving both the attainment and progress elements of the floor standards, in relation to both the new reception baseline and the interim KS1 baseline (summer 2015?)
  • Set the amount of progress required to achieve the progress element of the floor standards (summer 2016)
  • (In the consultation document) Consider whether schools should make at least average progress as part of floor standards and ‘move to three year rolling averages for floor standard measures’ (long term)

.

Overall progress, Purdah and General Election outcomes

Progress to date and actions outstanding

The lists of outstanding actions above record some 40 tasks necessary to the successful implementation of the primary assessment and accountability reforms.

If the ‘advance notice’ conventions are observed, roughly half of these require completion by the end of the summer term in July 2015, within the two windows of 50 working days on either side of Purdah.

These conventions have already been set aside in some cases, most obviously in respect of reception baseline assessment and the performance descriptors for statutory teacher assessment.

Unsurprisingly, the commentary above suggests that these two strands of the reform programme are the most complex and potentially the most problematic.

The sheer number of outstanding tasks and the limited time in which to complete them could pose problems.

It is important to remember that there are similar reforms in the secondary and post-16 sectors that need to be managed in parallel.

The leaked amber/red rating was attributed solely to the negative reaction to the draft performance descriptors, but it could also reflect a wider concern that all the necessary steps may not be completed in time to give schools the optimal period for planning and preparation.

Schools may be able to cope with shorter notice in a few instances, where the stakes are relatively low, but if too substantial a proportion of the overall reform programme is delayed into next academic year, they will find the cumulative impact much harder to manage.

In a worst case scenario, implementation of some elements might need to be delayed by a year, although the corollary would be an extended transition period for schools that would be less than ideal. It may also be difficult to disentangle the different strands given the degree of interdependency between them.

Given the proximity of a General Election, it may not be politic to confirm such delays before Purdah intervenes: the path of least resistance is probably to postpone any difficult decisions for consideration by the incoming government.

.

The implications of Purdah

As noted above, if the General Election result is clear-cut, Purdah will last some five-and-a-half weeks and will occur at a critical point in the implementation timetable.

The impact of Purdah should not be under-estimated.

From the point at which Parliament is dissolved on Monday 30 March, the Government must abstain from major policy decisions and announcements.

The Election is typically announced a few days before the dissolution of Parliament. This ‘wash up’ period between announcement and dissolution is used to complete essential unfinished business.

The Cabinet Office issues guidance on conduct during Purdah shortly before it begins.

The 2015 guidance has not yet been issued, so the 2010 guidance is the best source of information about what to expect.

.

[Postscript: 2015 Guidance was posted on 30 March 2015 and is substantively the same as the 2010 edition.]

.

Key points include:

  • ‘Decisions on matters of policy on which a new Government might be expected to want the opportunity to take a different view from the present Government should be postponed until after the Election, provided that such postponement would not be detrimental to the national interest or wasteful of public money.’
  • ‘Officials should not… be asked to devise new policies or arguments…’
  • ‘Departmental communications staff may…properly continue to discharge during the Election period their normal function only to the extent of providing factual explanation of current Government policy, statements and decisions.’
  • ‘There would normally be no objection to issuing routine factual publications, for example, health and safety advice but these will have to be decided on a case by case basis taking account of the subject matter and the intended audience.’
  • ‘Regular statistical releases and research reports (e.g. press notices, bulletins, publications or electronic releases) will continue to be issued and published on dates which have been pre-announced. Ad hoc statistical releases or research reports should be released only where a precise release date has been published prior to the Election period. Where a pre-announcement has specified that the information would be released during a specified period (e.g. a week, or longer time period), but did not specify a precise day, releases should not be published within the Election period.’
  • ‘Research: Fieldwork involving interviews with the public or sections of it will be postponed or abandoned although regular, continuous and on-going statistical surveys may continue.’
  • ‘Official websites…the release of new online services and publication of reworked content should not occur until after the General Election… Content may be updated for factual accuracy but no substantial revisions should be made and distributed.’
  • The general principles and conventions set out in this guidance apply to NDPBs and similar public bodies.

Assuming similar provisions in 2015, most if not all of the assessment and accountability work programme would grind to a halt.

To take an example, it is conceivable that those awarded baseline assessment contracts would be able to recruit schools after 30 March, but they will receive little or no help from the DfE during the Purdah period. Given that the recruitment deadline is 30 April, this may be expected to depress recruitment significantly.

.

The impact of different General Election outcomes

Forming a Government in the case of a Hung Parliament may also take some time, further delaying the process.

The six days taken in 2010 may not be a guide to what will happen in 2015.

The Cabinet Manual (2011) says:

‘Where an election does not result in an overall majority for a single party, the incumbent government remains in office unless and until the Prime Minister tenders his or her resignation and the Government’s resignation to the Sovereign. An incumbent government is entitled to wait until the new Parliament has met to see if it can command the confidence of the House of Commons, but is expected to resign if it becomes clear that it is unlikely to be able to command that confidence and there is a clear alternative…

…The nature of the government formed will be dependent on discussions between political parties and any resulting agreement. Where there is no overall majority, there are essentially three broad types of government that could be formed:

  • single-party, minority government, where the party may (although not necessarily) be supported by a series of ad hoc agreements based on common interests;
  • formal inter-party agreement, for example the Liberal–Labour pact from 1977 to 1978; or
  • formal coalition government, which generally consists of ministers from more than one political party, and typically commands a majority in the House of Commons’.

If one or more of the parties forming the next government has a different policy on assessment and accountability, this could result in pressure to amend or withdraw parts of the reform programme.

If a single party is involved, pre-Election contact with civil servants may have clarified its intentions, enabling work to resume as soon as the new government is in place but, if more than one party is involved, it may take longer to agree the preferred way forward.

Under a worst case scenario, planners might need to allow for Purdah and post-Election negotiations to consume eight weeks or longer.

The impact of the Election on the shape and scope of the primary assessment and accountability reforms will also depend on which party or parties enter government.

If the same Coalition partners are returned, one might expect uninterrupted implementation, unless the minority Lib Dems seek to negotiate different arrangements, which seems unlikely.

But if a different party or a differently constituted Coalition forms the Government, one might expect decisions to abandon or delay some aspects of the programme.

If Labour forms the Government, or is the major party in a Coalition, some unravelling will be necessary.

They are broadly committed to the status quo:

‘Yet when it comes to many of the technical day-to-day aspects of school leadership – child protection, curriculum reform, assessment and accountability – we believe that a period of stability could prove beneficial for raising pupil achievement. This may not be an exciting rallying cry, but it is crucial that the incoming government takes account of the classroom realities.’

Hunt has also declared:

‘Do not mistake me: I am a zealot for minimum standards, rigorous assessment and intelligent accountability.

But if we choose to focus upon exam results and league tables to the detriment of everything else, then we are simply not preparing our young people for the demands of the 21st century.’

And, thus far, Labour has made few specific commitments in this territory.

  • They support reception baseline assessment but whether that extends to sustaining a market of providers is unknown. Might they be inclined to replace this with a single national assessment?
  • There is very little about floor targets – a Labour invention – although the Blunkett Review appears to suggest that Directors of School Standards will enjoy some discretion in respect of their enforcement.

Reading between the lines, it seems likely that they would delay some of the strands described above – and potentially simplify others.

.

Conclusion

The primary assessment reform programme is both extensive and highly complex, comprising several strands and many interdependencies.

Progress to date can best be described as halting.

There are still many steps to be taken and difficult issues to resolve, about half of which should be completed by the end of this academic year. Pre-Election Purdah will cut significantly into the time available.

More announcements may be delayed into the summer holidays or the following autumn term, but this reduces the planning and preparation time available to schools and has potentially significant workload implications.

Alternatively, implementation of some elements or strands may be delayed by a year, but this extends the transition period between old and new arrangements. Any such rationalisation seems likely to be delayed until after the Election and decisions will be influenced by its outcome.

.

[Postscript: The commitment in the Government’s Workload Challenge response to a one-year lead time, now encapsulated in the Protocol published on 23 March, has not resulted in any specific commitments to delay ahead of the descent of Purdah.

At the onset of Purdah on 30 March some 18 actions appear to be outstanding and requiring completion by the end of the summer term. This will be a tall order for a new Government, especially one of a different complexion.]

.

If Labour is the dominant party, they may be more inclined to simplify some strands, especially baseline assessment and statutory teacher assessment, while also providing much more intensive support for schools wrestling with the removal of levels.

Given the evidence set out above, ‘amber/red’ seems an appropriate rating for the programme as a whole.

It seems increasingly likely that some significant adjustments will be essential, regardless of the Election outcome.

.

GP

January 2015

What Becomes of Schools That Fail Their High Attainers?*

.

This post reviews the performance and subsequent history of schools with particularly poor results for high attainers in the Secondary School Performance Tables over the last three years.


Seahorse in Perth Aquarium by Gifted Phoenix

It establishes a high attainer ‘floor target’ so as to draw a manageable sample of poor performers and, having done so:

  • Analyses the characteristics of this sample;
  • Explores whether these schools typically record poor performance in subsequent years or manage to rectify matters;
  • Examines the impact of various interventions, including falling below the official floor targets, being placed in special measures or deemed to have serious weaknesses following inspection, becoming an academy and receiving a pre-warning and/or warning notice;
  • Considers whether the most recent Ofsted reports on these schools do full justice to this issue, including those undertaken after September 2013 when new emphasis was placed on the performance of the ‘most able’.

The post builds on my previous analysis of high attainment in the 2013 School Performance Tables (January 2014). It applies the broad definition of high attainers used in the Tables, which I discussed in that post and have not repeated here.

I must emphasise at the outset that factors other than poor performance may partially explain particularly low scores in the Tables.

There may be several extenuating circumstances that are not reflected in the results. Sometimes these may surface in Ofsted inspection reports, but the accountability and school improvement regime typically imposes a degree of rough justice, and I have followed its lead.

It is also worth noting that the Performance Tables do not provide data for schools where the number of high attainers is five or fewer, because of the risk that individuals may be identifiable even though the data is anonymised.

This is unfortunate since the chances are that schools with very few high attainers will find it more difficult than others to address their needs. We may never know, but there is more on the impact of cohort size below.

Finally please accept my customary apology for any transcription errors. Do let me know if you notice any and I will correct them.

.

Drawing the Sample

The obvious solution would be to apply the existing floor targets to high attainers.

So it would include all schools recording:

  • Fewer than 35% (2011) or 40% (2012 and 2013) of high attainers achieving five or more GCSEs at grades A*-C or equivalent including GCSEs in English and mathematics and
  • Below median scores for the percentage of high attainers making at least the expected three levels of progress between Key Stages 2 and 4 in English and maths respectively.

But the first element is far too undemanding a threshold to apply for high attaining learners and the overall target generates a tiny sample.

The only school failing to achieve it in 2013 was Ark Kings Academy in Birmingham, which recorded just six high attainers, forming 9% of the cohort (so only just above the level at which results would have been suppressed).

In 2012 two schools were in the same boat:

  • The Rushden Community College in Northamptonshire, with 35 high attainers (26% of the cohort), which became a sponsored academy with the same name on 1 December 2012; and
  • Culverhay School in Bath and North East Somerset, with 10 high attainers (19% of the cohort), which became Bath Community Academy on 1 September 2012.

No schools at all performed at this level in 2011.

A sample of just three schools is rather too unrepresentative, so it is necessary to set a more demanding benchmark which combines the same threshold and progress elements.

The tiny sample does not stem from the progress measure: far too many schools fail to meet the median level of performance – around 70% each year in both English and maths – even with their cadres of high attainers. Hence I need to lower the pitch of this element to create a manageable sample.

I plumped for 60% or fewer high attainers making at least the expected progress between KS2 and KS4 in both English and maths. This captured 22 state-funded schools in 2013, 31 in 2012 and 38 in 2011. (It also enabled Ark Kings Academy to escape, by virtue of the fact that 67% of its high attaining learners achieved the requisite progress in English.)

For the threshold element I opted for 70% or fewer high attainers achieving five or more GCSEs at grades A*-C or equivalent including GCSEs in English and maths. This captured 19 state-funded schools in 2013, 29 in 2012 and 13 in 2011.

.

[Venn diagram: overlap between schools below the progress element and schools below the threshold element, 2011 to 2013]

The numbers of state-funded schools caught by both criteria were seven in 2013, eight in 2012 and five in 2011 – 20 in all.
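For anyone wishing to reproduce this filtering, a minimal sketch is below. The input file and column names are hypothetical stand-ins for an extract of the high attainer columns from the Secondary School Performance Tables, with one row per state-funded school.

```python
# Minimal sketch of the illustrative high attainer 'floor target' filter.
# The CSV and its column names are hypothetical assumptions, not a real dataset.
import pandas as pd

schools = pd.read_csv("high_attainers_2013.csv")

below_progress = (schools["ha_3lop_english_pct"] <= 60) & (schools["ha_3lop_maths_pct"] <= 60)
below_threshold = schools["ha_5ac_em_pct"] <= 70

print("Below progress element:  ", below_progress.sum())
print("Below threshold element: ", below_threshold.sum())
print("Below both (core sample):", (below_progress & below_threshold).sum())

core_sample = schools[below_progress & below_threshold]
```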

I decided to feature this small group of schools in the present post while also keeping in mind the schools occupying each side of the Venn Diagram. I particularly wanted to see whether schools which emerged from the central sample in subsequent years continued to fall short on one or other of the constituent elements.

The 20 schools in the main sample are listed in Table 1 below, which provides more detail about each of them.

.

Table 1: Schools Falling Below Illustrative High Attainer Floor Targets 2011-2013

Name | Type | LA | Status/Sponsor | Subsequent History
2011
Carter Community School | 12-16 mixed modern | Poole | Community | Sponsored academy (ULT) 1/4/13
Hadden Park High School | 11-16 mixed comp | Nottingham | Foundation | Sponsored Academy (Bluecoat School) 1/1/14
Merchants Academy | 11-18 mixed comp | Bristol | Sponsored Academy (Merchant Venturers/University of Bristol) | –
The Robert Napier School | 11-18 mixed modern | Medway | Foundation | Sponsored Academy (Fort Pitt Grammar School) 1/9/12
Bishop of Rochester Academy | 11-18 mixed comp | Kent | Sponsored Academy (Medway Council/Canterbury Christ Church University/Diocese of Rochester) | –
2012
The Rushden Community College | 11-18 mixed comp | Northants | Community | Sponsored Academy (The Education Fellowship) 12/12
Culverhay School | 11-18 boys comp | Bath and NE Somerset | Community | Bath Community Academy – mixed (Cabot Learning) 1/9/12
Raincliffe School | 11-16 mixed comp | N Yorks | Community | Closed 8/12 (merged with Graham School)
The Coseley School | 11-16 mixed comp | Dudley | Foundation | –
Fleetwood High School | 11-18 mixed comp | Lancs | Foundation | –
John Spendluffe Foundation Technology College | 11-16 mixed modern | Lincs | Academy converter | –
Parklands High School | 11-18 mixed | Liverpool | Foundation | Discussing academy sponsorship (Bright Tribe)
Frank F Harrison Engineering College | 11-18 mixed comp | Walsall | Foundation | Mirus Academy (sponsored by Walsall College) 1/1/12
2013
Gloucester Academy | 11-19 mixed comp | Glos | Sponsored Academy (Prospect Education/Gloucestershire College) | –
Christ the King Catholic and Church of England VA School | 11-16 mixed comp | Knowsley | VA | Closed 31/8/13
Aireville School | 11-16 mixed modern | N Yorks | Community | –
Manchester Creative and Media Academy for Boys | 11-19 boys comp | Manchester | Sponsored Academy (Manchester College/Manchester Council/Microsoft) | –
Fearns Community Sports College | 11-16 mixed comp | Lancs | Community | –
Unity College Blackpool | 5-16 mixed comp | Blackpool | Community | Unity Academy Blackpool (sponsored by Fylde Coast Academies)
The Mirus Academy | 3-19 mixed comp | Walsall | Sponsored Academy (Walsall College) | –

 .

Only one school appears twice over the three-year period, albeit in two separate guises – Frank F Harrison/Mirus.

Of the 20 in the sample, seven were recorded in the relevant year’s Performance Tables as community schools, six as foundation schools, one was VA, one was an academy converter and the five remaining were sponsored academies.

Of the 14 that were not originally academies, seven have since become sponsored academies and one is discussing the prospect. Two more have closed, so just five – 25% of the sample – remain outside the academies sector.

All but two of the schools are mixed (the other two are boys’ schools). Four are modern schools and the remainder comprehensive.

Geographically they are concentrated in the Midlands and the North, with a few in the South-West and the extreme South-East. There are no representatives from London, the East or the North-East.

.

Performance of the Core Sample

Table 2 below looks at key Performance Table results for these schools. I have retained the separation by year and the order in which the schools appear, which reflects their performance on the GCSE threshold measure, with the poorest performing at the top of each section.

.

Table 2: Performance of schools falling below proposed high attainer floor targets 2011-2013

Name No of HA % HA 5+ A*-C incl E+M 3+ LoP En 3+ LoP Ma APS (GCSE)
2011
Carter Community School 9 13 56 56 44 304.9
Hadden Park High School 15 13 60 40 20 144.3
Merchants Academy 19 19 68 58 42 251.6
The Robert Napier School 28 12 68 39 46 292.8
Bishop of Rochester Academy 10 5 70 50 60 298.8
2012
The Rushden Community College 35 26 3 0 54 326.5
Culverhay School 10 19 30 40 20 199.3
Raincliffe School 6 11 50 50 33 211.5
The Coseley School 35 20 60 51 60 262.7
Fleetwood High School 34 22 62 38 24 272.9
John Spendluffe Foundation Technology College 14 12 64 50 43 283.6
Parklands High School 13 18 69 23 8 143.7
Frank F Harrison Engineering College 20 12 70 35 60 188.3
2013
Gloucester Academy 18 13 44 28 50 226.8
Christ the King Catholic and Church of England VA School 22 22 55 32 41 256.5
Aireville School 23 23 61 35 57 267.9
Manchester Creative and Media Academy for Boys 16 19 63 50 50 244.9
Fearns Community Sports College 22 13 64 36 59 306.0
Unity College Blackpool 21 18 67 57 52 277.1
The Mirus Academy 23 13 70 57 52 201.4

.

The size of the high attainer population in these schools varies between 6 (the minimum for which statistics are published) and 35, with an average of just under 20.

The percentage of high attainers within each school’s cohort ranges from 5% to 26% with an average of slightly over 16%.

This compares with a national average in 2013 for all state-funded schools of 32.4%, almost twice the size of the average cohort in this sample. All 20 schools here record a high attainer population significantly below this national average.

This correlation may be significant – tending to support the case that high attainers are more likely to struggle in schools where they are less strongly concentrated – but it does not prove the relationship.

Achievement against the GCSE threshold measure falls as low as 3% (Rushden in 2012) but this was reportedly attributable to the school selecting ineligible English specifications.

Otherwise the poorest result is 30% at Culverhay, also in 2012, followed by Gloucester Academy (44% in 2013) and Raincliffe (50% in 2012). Only these four schools have recorded performance at or below 50%.

Indeed there is a very wide span of performance even amongst these small samples, especially in 2012 when it reaches an amazing 67 percentage points (40 percentage points excluding Rushden). In 2013 there was a span of 26 percentage points and in 2011 a span of 14 percentage points.

The overall average amongst the 20 schools is almost 58%. This varies by year. In 2011 it was 64%, in 2012 it was significantly lower at 51% (but rose to 58% if Rushden is excluded) and in 2013 it was 61%.

This compares with a national average for high attainers in state-funded schools of 94.7% in 2013. The extent to which some of these outlier schools are undershooting the national average is truly eye-watering.

Turning to the progress measures, one might expect even greater variance, given that so many more schools fail to clear this element of the official floor targets with their high attainers.

The overall average across these 20 schools is 41% in English and 44% in maths, suggesting that performance is slightly stronger in maths than English.

But in 2011 the averages were 49% in English and 42% in maths, reversing this general pattern and producing a much wider gap in favour of English.

In 2012 they were 36% in English and 38% in maths, but the English average improves to 41% if Rushden’s result is excluded. This again bucks the overall trend.

The overall pattern is cemented by the 2013 figures, when the average for maths stood at 53% compared with 42% for English.

Hence, over the three years, we can see that the sharp drop in English in 2012 – most probably attributable to the notorious marking issue – was barely recovered in 2013. Conversely, a drop in maths in 2012 was followed by a sharp recovery in 2013.

The small sample size calls into question the significance of these patterns, but they are interesting nevertheless.

The comparable national averages among all state-funded schools in 2013 were 86.2% in English and 87.8% in maths. So the schools in this sample are typically operating at around half the national average levels. This is indeed worse than the comparable record on the threshold measure.

That said, the variation in these results is again huge – 35 percentage points in English (excluding Rushden) and as much as 52 percentage points in maths.

There is no obvious pattern in these schools’ comparative performance in English and maths. Ten schools scored more highly in English and nine in maths, with one school recording equally in both. English was in the ascendancy in 2011 and 2012, but maths supplanted it in 2013.

The final column in Table 2 shows the average point score (APS) for high attainers’ best eight GCSE results. There is once more a very big range, from 144.3 to 326.5 – over 180 points – compared with a 2013 national average for high attainers in state-funded schools of 377.6.

The schools at the bottom of the distribution are almost certainly relying heavily on GCSE-equivalent qualifications, rather than pushing their high attainers towards GCSEs.

Those schools that record relatively high APS alongside relatively low progress scores are most probably taking their high attaining learners with L5 at KS2 to GCSE grade C, but no further.

.

Changes in Performance from 2011 to 2013

Table 3, below, shows how the performance of the 2011 sample changed in 2012 and 2013, while Table 4 shows how the 2012 sample performed in 2013.

The numbers in green show improvements compared with the schools’ 2011 baselines and those in bold are above my illustrative high attainer floor target. The numbers in red are those which are lower than the schools’ 2011 baselines.

.

Table 3: Performance of the 2011 Sample in 2012 and 2013

Name | % HA (2011/12/13) | 5+ A*-C incl E+M (2011/12/13) | 3+ LOP E (2011/12/13) | 3+ LOP M (2011/12/13)
Carter Community School | 13 / 14 / 13 | 56 / 100 / 92 | 56 / 80 / 75 | 44 / 80 / 33
Hadden Park High School | 13 / 15 / 8 | 60 / 87 / 75 | 40 / 80 / 75 | 20 / 53 / 50
Merchants Academy | 19 / 16 / 20 | 68 / 79 / 96 | 58 / 79 / 88 | 42 / 47 / 71
The Robert Napier School | 12 / 12 / 11 | 68 / 83 / 96 | 39 / 59 / 92 | 46 / 62 / 80
Bishop of Rochester Academy | 5 / 7 / 8 | 70 / 83 / 73 | 50 / 67 / 47 | 60 / 75 / 53

.

All but one of the five schools showed little variation in the relative size of their high attainer populations over the three years in question.

More importantly, all five schools made radical improvements in 2012.

Indeed, all five exceeded the 5+ GCSE threshold element of my illustrative floor target in both 2012 and 2013 though, more worryingly, three of the five fell back somewhat in 2013 compared with 2012, which might suggest that short term improvement is not being fully sustained.

Four of the five exceeded the English progress element of the illustrative floor target in 2012 while the fifth – Robert Napier – missed by only 1%.

Four of the five also exceeded the floor in 2013, including Robert Napier which made a 33 percentage point improvement compared with 2012. On this occasion, Bishop of Rochester was the exception, having fallen back even below its 2011 level.

In the maths progress element, all five schools made an improvement in 2012, three of the five exceeding the floor target, the exceptions being Hadden Park and Merchants Academy.

But by 2013, only three schools remained above their 2011 baseline and only two – Merchants and Robert Napier – remained above the floor target.

None of the five schools would have remained below my floor target in either 2012 or 2013, by virtue of their improved performance on the 5+ GCSE threshold element, but there was significantly greater insecurity in the progress elements, especially in maths.

There is also evidence of huge swings in performance on the progress measures. Hadden Park improved progression in English by 40 percentage points between 2011 and 2012. Carter Community School almost matched this in maths, improving by 36 percentage points, only to fall back by a huge 47 percentage points in the following year.

Overall this would appear to suggest that this small sample of schools made every effort to improve against the threshold and progress measures in 2012 but, while most were able to sustain improvement – or at least control their decline – on the threshold measure into 2013, this was not always possible with the progress elements.

There is more than a hint of two markedly different trajectories, with one group of schools managing to sustain initial improvements from a very low base and the other group falling back after an initial drive.

Is the same pattern emerging amongst the group of schools that fell below my high attainer floor target in 2012?

.

Table 4: Performance of the 2012 Sample in 2013

Name | % HA (2012/13) | 5+ A*-C incl E+M (2012/13) | 3+ LOP E (2012/13) | 3+ LOP M (2012/13)
The Rushden Community College | 26 / 23 | 3 / 90 | 0 / 74 | 54 / 87
Culverhay School | 19 / 12 | 30 / 67 | 40 / 67 | 20 / 67
Raincliffe School | 11 / – | 50 / – | 50 / – | 33 / –
The Coseley School | 20 / 26 | 60 / 88 | 51 / 82 | 60 / 78
Fleetwood High School | 22 / 24 | 62 / 84 | 38 / 36 | 24 / 67
John Spendluffe Foundation Technology College | 12 / 15 | 64 / 100 | 50 / 61 | 43 / 83
Parklands High School | 18 / 11 | 69 / 78 | 23 / 56 | 8 / 56
Frank F Harrison Engineering College | 12 / 13 | 70 / 70 | 35 / 57 | 60 / 52

.

We must rule out Raincliffe, which closed, leaving seven schools under consideration.

Some of these schools experienced slightly more fluctuation in the size of their high attainer populations – and over the shorter period of two years rather than three.

Six of the seven managed significant improvements in the 5+ GCSE threshold with the remaining school – Frank F Harrison – maintaining its 2012 performance.

Two schools – Frank F Harrison and Culverhay – did not exceed the illustrative floor on this element. Meanwhile John Spendluffe achieved a highly creditable perfect score, comfortably exceeding the national average for state-funded schools. Rushden was not too far behind.

There was greater variability with the progress measures. In English, three schools remained below the illustrative floor in 2013 with one – Fleetwood High – falling back compared with its 2012 performance.

Conversely, Coseley improved by 31 percentage points to not far below the national average for state-funded schools.

In maths two schools failed to make it over the floor. Parklands made a 48 percentage point improvement but still fell short, while Frank F Harrison fell back eight percentage points compared with its 2012 performance.

On the other hand, Rushden and John Spendluffe are closing in on national average performance for state-funded schools. Both have made improvements of over 30 percentage points.

Of the seven, only Frank F Harrison would remain below my overall illustrative floor target on the basis of its 2013 performance.

Taking the two samples together, the good news is that many struggling schools are capable of making radical improvements in their performance with high attainers.

But question marks remain over the capacity of some schools to sustain initial improvements over subsequent years.

 .

What Interventions Have Impacted on these Schools?

Table 5 below reveals how different accountability and school improvement interventions have been brought to bear on this sample of 20 schools since 2011.

.

Table 5: Interventions Impacting on Sample Schools 2011-2014

| Name | Floor Targets | Most recent Inspection | Ofsted Rating | (Pre-) warning notice | Academised |
|---|---|---|---|---|---|
| 2011 sample | | | | | |
| Carter Community School | FT 2011, FT 2013 | 29/11/12; NYI as academy | 2 | | Sponsored |
| Hadden Park High School | FT 2011, FT 2012, FT 2013 | 13/11/13; NYI as academy | SM | | Sponsored |
| Merchants Academy | FT 2011, FT 2012 | 9/6/11 | 2 | | |
| The Robert Napier School | FT 2011, FT 2012 | 17/09/09; NYI as academy | 3 | | Sponsored |
| Bishop of Rochester Academy | FT 2011, FT 2013 | 28/6/13 | 3 | PWN 3/1/12 | |
| 2012 sample | | | | | |
| The Rushden Community College | FT 2012 | 10/11/10; NYI as academy | 3 | | Sponsored |
| Culverhay School | FT 2011, FT 2012, (FT 2013) | 11/1/12; NYI as academy | SM | | Sponsored |
| Raincliffe School | FT 2012 | 19/10/10 | 3 | | Closed |
| The Coseley School | FT 2012 | 13/9/12 | SM | | |
| Fleetwood High School | FT 2012, FT 2013 | 20/3/13 | SWK | | |
| John Spendluffe Foundation Technology College | FT 2012 | 3/3/10; as academy 18/9/13 | 1, 2 | | Academy converter 9/11 |
| Parklands High School | FT 2011, FT 2012, FT 2013 | 5/12/13 | SM | | Discussing sponsorship |
| Frank F Harrison Engineering College | FT 2011, FT 2012, (FT 2013) | 5/7/11; see Mirus Academy below | 3 | | Now Mirus Academy (see below) |
| 2013 sample | | | | | |
| Gloucester Academy | FT 2011, FT 2012, FT 2013 | 4/10/12 | SWK | PWN 16/9/13; WN 16/12/13 | |
| Christ the King RC and CofE VA School | FT 2011, FT 2012, FT 2013 | 18/9/12 | SM | | Closed |
| Aireville School | FT 2012, FT 2013 | 15/5/13 | SM | | |
| Manchester Creative and Media Academy for Boys | FT 2011, FT 2012, FT 2013 | 13/6/13 | SWK | PWN 3/1/12 | |
| Fearns Community Sports College | FT 2011, FT 2013 | 28/6/12 | 3 | | |
| Unity College Blackpool | FT 2011, FT 2012, FT 2013 | 9/11/11; NYI as academy | 3 | | Sponsored |
| The Mirus Academy | FT 2013 | 7/11/13 | SM | | |

Key: FT = below the official floor target in that year; NYI = not yet inspected (in its new guise as an academy); SM = special measures; SWK = serious weaknesses; PWN = pre-warning notice; WN = warning notice.

 .

Floor Targets

The first and obvious point to note is that every single school in this list fell below the official floor targets in the year in which it also undershot my illustrative high attainers’ target.

It is extremely reassuring that none of the schools returning particularly poor outcomes with high attainers are deemed acceptable performers in generic terms. I had feared that a few schools at least would achieve this feat.

In fact, three-quarters of these schools have fallen below the floor targets in at least two of the three years in question, while eight have done so in all three years, two having changed their status by becoming academies in the final year (which, strictly speaking, prevents them from scoring the hat-trick). One has since closed.

Some schools appear to have been spared intervention by receiving a relatively positive Ofsted inspection grade despite their floor target records. For example, Carter Community School had a ‘good’ rating sandwiched between two floor target appearances, while Merchants Academy presumably received its good rating before subsequently dropping below the floor.

John Spendluffe managed an outstanding rating two years before it dropped below the floor target and was rated good – in its new guise as an academy – a year afterwards.

The consequences of falling below the floor targets are surprisingly unclear, as indeed are the complex rules governing the wider business of intervention in underperforming schools.

DfE press notices typically say something like:

‘Schools below the floor and with a history of underperformance face being taken over by a sponsor with a track record of improving weak schools.’

But of course that can only apply to schools that are not already academies.

Moreover, LA-maintained schools may appeal to Ofsted against standards and performance warning notices issued by their local authorities; and schools and LAs may also challenge forced academisation in the courts, arguing that they have sufficient capacity to drive improvement.

As far as I can establish, it is nowhere clearly explained what exactly constitutes a ‘history of underperformance’, so there is inevitably a degree of subjectivity in the application of this criterion.

Advice elsewhere suggests that a school’s inspection outcomes and ‘the local authority’s position in terms of securing improvement as a maintained school’ should also be taken into account alongside achievement against the floor targets.

We do not know what weighting is given to these different sources of evidence, nor can we rule out the possibility that other factors – tangible or intangible – are also weighed in the balance.

Some might argue that this gives politicians the necessary flexibility to decide each case on its merits, taking careful account of the unique circumstances that apply rather than imposing a standard set of cookie-cutter judgements.

Others might counter that the absence of standard criteria, applied rigorously but with flexibility to take special circumstances into account, leaves such decisions unnecessarily open to dispute and is likely to generate costly and time-consuming legal challenge.

.

Academy Warning Notices

When it comes to academies:

‘In cases of sustained poor academic performance at an academy, ministers may issue a pre-warning notice to the relevant trust, demanding urgent action to bring about substantial improvements, or they will receive a warning notice. If improvement does not follow after that, further action – which could ultimately lead to a change of sponsor – can be taken. In cases where there are concerns about the performance of a number of a trust’s schools, the trust has been stopped from taking on new projects.’

‘Sustained poor academic performance’ may or may not be different from a ‘history of underperformance’ and it too escapes definition.

One cannot but conclude that it would be very helpful indeed to have some authoritative guidance, so that there is much greater transparency in the processes through which these various provisions are being applied, to academies and LA-maintained schools alike.

In the absence of such guidance, it seems rather surprising that only three of the academies in this sample – Bishop of Rochester, Gloucester and Manchester Creative and Media – have received pre-warning letters to date, while only Gloucester’s has been superseded by a full-blown warning notice. None of these mention specifically the underperformance of high attainers.

  • Bishop of Rochester received its notice in January 2012, but subsequently fell below the floor targets in both 2012 and 2013 and – betweentimes – received an Ofsted inspection rating of 3 (‘requires improvement’).
  • Manchester Creative and Media also received its pre-warning notice in January 2012. It too has been below the floor targets in both 2012 and 2013 and was deemed to have serious weaknesses in a June 2013 inspection.
  • Gloucester received its pre-warning notice much more recently, in September 2013, followed by a full warning notice just three months later.

These pre-warning letters invite the relevant Trusts to set out within 15 days what action they will take to improve matters, whereas the warning notices demand a series of specific improvements with a tight deadline. (In the case of Gloucester Academy, the notice issued on 16 December 2013 imposed a deadline of 15 January 2014. We do not yet know the outcome.)

Other schools in my sample have presumably been spared a pre-warning letter because of their relatively recent acquisition of academy status, although several other 2012 openers have already received them. One anticipates that more will attract such attention in due course.

 .

Ofsted Inspection

The relevant columns of Table 5 reveal that, of the 12 schools that are now academies (taking care to count Harrison/Mirus as one rather than two), half have not yet been inspected in their new guise.

As noted above, it is strictly the case that, when schools become academies – whether sponsored or via conversion – they are formally closed and replaced by successor schools, so the old inspection reports no longer apply to the new school.

However, this does not prevent many academies from referring to such reports on their websites – and they do have a certain currency when one wishes to see whether or not a recently converted academy has been making progress.

But, if we accept the orthodox position, there are only six academies with bona fide inspection reports: Merchants, Bishop of Rochester, John Spendluffe, Gloucester, Manchester Creative and Media and Mirus.

All five of the LA-maintained schools still open have been inspected fairly recently: Coseley, Fleetwood, Parklands, Aireville and Fearns.

This gives us a sample of 11 schools with valid inspection reports:

  • Two academies are rated ‘good’ (2) – Merchants and John Spendluffe;
  • One academy – Bishop of Rochester – and one LA-maintained school – Fearns – ‘require improvement’ (3);
  • Two academies – Gloucester and Manchester – and one LA-maintained school – Fleetwood – are inadequate (4), having serious weaknesses; and
  • One academy – Mirus – and three LA-maintained schools – Parklands, Coseley and Aireville – are inadequate (4) and in Special Measures.

The School Inspection Handbook explains the distinction between these two variants of ‘inadequate’:

‘A school is judged to require significant improvement where it has serious weaknesses because one or more of the key areas is ‘inadequate’ (grade 4) and/or there are important weaknesses in the provision for pupils’ spiritual, moral, social and cultural development. However, leaders, managers and governors have been assessed as having the capacity to secure improvement.

…A school requires special measures if:

  • it is failing to give its pupils an acceptable standard of education and
  • the persons responsible for leading, managing or governing are not demonstrating the capacity to secure the necessary improvement in the school.’

Schools in each of these categories are subject to more frequent monitoring reports. Those with serious weaknesses are typically re-inspected within 18 months, while, for those in special measures, the timing of re-inspection depends on the school’s rate of improvement.

It may be a surprise to some that only seven of the 11 are currently deemed inadequate given the weight of evidence stacked against them.

There is some support for the contention that Ofsted inspection ratings, floor target assessments and pre-warning notices do not always link together as seamlessly as one might imagine, although apparent inconsistencies may sometimes arise from the chronological sequence of these different judgements.

But what do these 11 reports say, if anything, about the performance of high attainers? Is there substantive evidence of a stronger focus on ‘the most able’ in those reports that have been issued since September 2013?

.

The Content of Ofsted Inspection Reports

Table 6, below, sets out what each report contains on this topic, presenting the schools in the order of their most recent inspection.

One might therefore expect the judgements to be more specific and explicit in the three reports at the foot of the table, which should reflect the new guidance introduced last September. I discussed that guidance at length in this October 2013 post.

.

Table 6: Specific references to high attainers/more able/most able in inspection reports

| Name | Date | Outcome | Comments |
|---|---|---|---|
| Merchants Academy | 29/6/11 | Good (2) | In Year 9… an impressive proportion of higher-attaining students… have been entered early for the GCSE examinations in mathematics and science. Given their exceptionally low starting points on entry into the academy, this indicates that these students are making outstanding progress in their learning and their achievement is exceptional. More-able students are fast-tracked to early GCSE entry and prepared well to follow the International Baccalaureate route. |
| Fearns Community Sports College | 28/6/12 | Requires improvement (3) | Setting has been introduced across all year groups to ensure that students are appropriately challenged and supported, especially more-able students. This is now beginning to increase the number of students achieving higher levels earlier in Key Stage 3. |
| The Coseley School | 13/9/12 | Special Measures (4) | Teaching is inadequate because it does not always extend students, particularly the more able. What does the school need to do to improve further? Raise achievement, particularly for the most able, by ensuring that: work consistently challenges and engages all students so that they make good progress in lessons; challenging targets are set as a minimum expectation; students do not end studies in English language and mathematics early without having the chance to achieve the best possible grade; GCSE results in all subjects are at least in line with national expectations. Target setting is not challenging enough for all ability groups, particularly for the more-able students who do not make sufficient progress by the end of Key Stage 4. |
| Gloucester Academy | 4/10/12 | Serious Weaknesses (4) | No specific reference |
| Fleetwood High School | 20/3/13 | Serious Weaknesses (4) | No specific reference |
| Aireville School | 15/5/13 | Special Measures (4) | Teachers tend to give the same task to all students despite a wide range of ability within the class. Consequently, many students will complete their work and wait politely until the teacher has ensured the weaker students complete at least part of the task. This limits the achievement of the more-able students and undermines the confidence of the least-able. There is now a good range of subjects and qualifications that meet the diverse needs and aspirations of the students, particularly the more-able students. |
| Manchester Creative and Media Academy for Boys | 13/6/13 | Serious Weaknesses (4) | The most-able boys are not consistently challenged to attain at the highest levels. In some lessons they work independently and make rapid progress, whereas on other occasions their work is undemanding. What does the academy need to do to improve further? Improve the quality of teaching in Key Stages 3 and 4 so that it is at least good, leading to rapid progress and raised attainment for all groups of boys, especially in English, mathematics and science, by… ensuring that tasks are engaging and challenge all students, including the most-able. The most-able boys receive insufficient challenge to enable them to excel. Too many lessons do not require them to solve problems or link their learning to real-life contexts. In some lessons teachers’ planning indicates that they intend different students to achieve different outcomes, but they provide them all with the same tasks and do not adjust the pace or nature of work for higher- or lower-attaining students. This results in a slow pace of learning and some boys becoming frustrated. |
| Bishop of Rochester Academy | 28/6/13 | Requires improvement (3) | No specific reference |
| John Spendluffe Foundation Technology College | 18/9/13 | Good (2) | Not enough lessons are outstanding in providing a strong pace, challenge and opportunities for independent learning, particularly for the most able. The 2013 results show a leap forward in attainment and progress, although the most able could still make better progress. Leadership and management are not outstanding because the achievement of pupils, though improving quickly, has not been maintained at a high level over a period of time, and a small number of more-able students are still not achieving their full potential. |
| The Mirus Academy | 7/11/13 | Special Measures (4) | The academy’s early entry policy for GCSE has made no discernible difference to pupils’ achievement, including that of more able pupils. |
| Parklands High School | 5/12/13 | Special Measures (4) | The achievement of students supported by the pupil premium generally lags behind that of their classmates. All groups, including the most able students and those who have special educational needs, achieve poorly. Students who join the school having achieved Level 5 in national Key Stage 2 tests in primary school fare less well than middle attainers, in part due to early GCSE entry. They did a little better in 2013 than in 2012. |

.

There is inconsistency within both parts of the sample – the first eight reports that pre-date the new guidance and the three produced subsequently.

Three of the eleven reports make no specific reference to high attainers/most able learners, all of them undertaken before the new guidance came into effect.

In three more cases the references are confined to early entry or setting, one of those published since September 2013.

Only four of the eleven make what I judge to be substantive comments:

  • The Coseley School (special measures) – where the needs of the most able are explicitly marked out as an area requiring improvement;
  • The Manchester Creative and Media Academy for Boys (serious weaknesses) – where attention is paid to the most able throughout the report;
  • John Spendluffe Foundation Technology College (good) – which includes some commentary on the performance of the most able; and
  • Parklands High School (special measures) – which also provides little more than the essential minimum coverage.

The first two predate the new emphasis on the most able, but they are comfortably the most thorough. It is worrying that not all reports published since September are taking the needs of the most able as seriously as they might.

One might expect that, unconsciously or otherwise, inspectors are less ready to single out the performance of the most able when a school is inadequate across the board, but the small sample above does not support this hypothesis. Some of the most substantive comments relate to inadequate schools.

It therefore seems more likely that the variance is attributable to the differing capacity of inspection teams to respond to the new emphases in their inspection guidance. This would support the case made in my previous post for inspectors to receive additional guidance on how they should interpret the new requirement.

.

Conclusion

This post established an illustrative floor target to identify a small sample of 20 schools that have demonstrated particularly poor performance with high attainers in the Performance Tables for 2011, 2012 or 2013.

It:

  • Compared the performance of these schools in the year in which they fell below the floor, noting significant variance by year and between institutions, but also highlighting the fact that the proportion of high attainers attending these schools is significantly lower than the national average for state-funded schools.
  • Examined the subsequent performance of schools below the illustrative floor in 2011 and 2012, finding that almost all made significant improvements in the year immediately following, but that some of the 2011 cohort experienced difficulty in sustaining this improvement across all elements into a second year. It seems that progress in English, maths or both is more vulnerable to slippage than the 5+ A*-C GCSE threshold measure.
  • Confirmed – most reassuringly – that every school in the sample fell below the official, generic floor targets in the year in which they also undershot my illustrative high attainer floor targets.
  • Reviewed the combination of assessments and interventions applied to the sample of schools since 2011, specifically the interaction between academisation, floor targets, Ofsted inspection and (pre)warning notices for academies. These do not always point in the same direction, although chronology can be an extenuating factor. New guidance about how these and other provisions apply and interact would radically improve transparency in a complex and politically charged field.
  • Analysed the coverage of high attainers/most able students in recent inspection reports on 11 schools from amongst the sample of 20, including three published after September 2013 when new emphasis on the most able came into effect. This exposed grave inconsistency in the scope and quality of the coverage, both before and after September 2013, which did not correlate with the grade of the inspection. Inspectors would benefit from succinct additional guidance.

In the process of determining which schools fell below my high attainers floor target, I also identified the schools that undershot one or other of the elements but not both. This wider group included 46 schools in 2011, 52 schools in 2012 and 34 schools in 2013.

Several of these schools reappear in two or more of the three years, either in their existing form or following conversion to academy status.

Together they constitute a ‘watch list’ of more than 100 institutions, the substantial majority of which remain vulnerable to continued underperformance with their high attainers for the duration of the current accountability regime.
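Continuing the hedged sketch introduced earlier, the watch-list idea can be expressed as a simple classification: a record that misses both elements is below the floor, one that misses exactly one element joins the watch list, and one that misses neither is clear. Again, the threshold values and field names are illustrative assumptions rather than the post’s actual figures.

```python
# Continuation of the earlier illustrative sketch: classify each school-year
# record as 'below floor' (misses both elements), 'watch list' (misses exactly
# one element), or 'clear'. Threshold values remain assumptions.

THRESHOLD_5AC_EM = 70  # assumed thresholds, as before
FLOOR_3LOP_EN = 70
FLOOR_3LOP_MA = 70

def classify(record: dict) -> str:
    """record holds high-attainer percentages under the (hypothetical) keys
    '5ac_em', '3lop_en' and '3lop_ma'."""
    misses_threshold = record["5ac_em"] < THRESHOLD_5AC_EM
    misses_progress = (record["3lop_en"] < FLOOR_3LOP_EN) and (record["3lop_ma"] < FLOOR_3LOP_MA)
    if misses_threshold and misses_progress:
        return "below floor"
    if misses_threshold or misses_progress:
        return "watch list"
    return "clear"

# Example with invented figures: misses the progress elements but not the threshold
print(classify({"5ac_em": 75, "3lop_en": 60, "3lop_ma": 55}))  # -> 'watch list'
```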

The chances are that many will also continue to struggle following the introduction of the new ‘Progress 8’ floor measure from 2015.

Perhaps unsurprisingly, the significant majority are now sponsored academies.

I plan to monitor their progress.

.

*Apologies for this rather tabloid title!

.

GP

February 2014