What Happened to the Level 6 Reading Results?

 

Provisional 2014 key stage 2 results were published on 28 August.

This brief supplementary post considers the Level 6 test results – in reading, in maths and in grammar, punctuation and spelling (GPS) – and how they compare with Level 6 outcomes in 2012 and 2013.

An earlier post, A Closer Look at Level 6, published in May 2014, provides a fuller analysis of these earlier results.

Those not familiar with the 2014 L6 test materials can consult the published papers, mark schemes and level thresholds.

 

Number of Entries

Entry figures for the 2014 Level 6 tests were reported in the media in May 2014. Chart 1 below shows the number of entries for each test since 2012 (since 2013 in the case of GPS). These figures are for all schools, independent as well as state-funded.

 


Chart 1: Entry rates for Level 6 tests 2012 to 2014 – all schools

 

In 2014, reading entries were up 36%, GPS entries up 52% and maths entries up 36%. There is as yet no indication of a backlash from the decision to withdraw Level 6 tests after 2015, though this may have an impact next year.

The postscript to A Closer Look estimated that, if entries continued to increase at these rates, we might expect 2015 entries approaching 120,000 in reading, 130,000 in GPS and 140,000 in maths.
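
As a rough check on that extrapolation, here is a minimal sketch of the arithmetic. The 2014 baseline entry counts below are hypothetical placeholders (the actual counts sit in the SFR), chosen only so that the stated growth rates reproduce the projections above.

```python
# Illustrative only: project 2015 entries by applying the 2013-to-2014 growth
# rates quoted above. The 2014 baselines are hypothetical placeholders, not
# figures taken from the SFR.
growth = {"reading": 1.36, "gps": 1.52, "maths": 1.36}  # +36%, +52%, +36%
entries_2014 = {"reading": 88_000, "gps": 85_000, "maths": 103_000}  # placeholders

for test, n in entries_2014.items():
    projected = round(n * growth[test])
    print(f"{test}: {n:,} in 2014 -> roughly {projected:,} projected for 2015")
```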

Chart 2 shows the percentage of all eligible learners entered for Level 6 tests, again for all schools. Nationally, between one in six and one in five eligible learners are now entered for Level 6 tests. Entry rates for reading and maths have almost doubled since 2012.

 


Chart 2: Percentage of eligible learners entered for Level 6 tests 2012 to 2014, all schools

 

Success Rates

The headline percentages in the SFR (Statistical First Release) show:

  • 0% achieving L6 reading (unchanged from 2013)
  • 4% achieving L6 GPS (up from 2% in 2013) and
  • 9% achieving L6 maths (up from 7% in 2013).

Local authority and regional percentages are also supplied.

  • Only in Richmond did the L6 pass rate in reading register above 0% (at 1%); consequently, every region rounds to 0%.
  • For GPS the highest percentages are 14% in Richmond, 10% in Kensington and Chelsea and Kingston, 9% in Sutton and 8% in Barnet, Harrow and Trafford. Regional rates vary between 2% in Yorkshire and Humberside and 6% in Outer London.
  • In maths, Richmond recorded 22%, Kingston 19%, Trafford, Harrow and Sutton were at 18% and Kensington and Chelsea at 17%. Regional rates range from 7% in Yorkshire and Humberside and the East Midlands to 13% in Outer London.

Further insight into the national figures can be obtained by analysing the raw numbers supplied in the SFR.

Chart 3 shows the percentage of those entered for each test who were successful in each year. Here there is something of a surprise.

 


Chart 3: Percentage of learners entered achieving Level 6, 2012 to 2014, all schools

 

Nearly half of all entrants are now successful in L6 maths, though the improvement in the success rate has slowed markedly compared with the nine percentage point jump in 2013.

In GPS, the success rate improved by nine percentage points between 2013 and 2014, and almost one in four entrants is now successful – roughly half the rate for maths. This may be attributable in part to the test's shorter history, although the 2014 GPS success rate is also significantly below the 2013 rate for maths.

But in reading, an already very low success rate has declined markedly, following a solid improvement in 2013 from a very low base in 2012. The 2014 success rate is now less than half what it was in 2012: fewer than one in a hundred of those entered passed the test.

Chart 4 shows the percentage of entrants achieving Level 6 in the reading test in 2014 compared with previous years, with results for boys and girls given separately.

 


Chart 4: Percentage of learners entered achieving Level 6 in reading, 2012 to 2014, by gender

 

The total number of successful learners in 2014 is over 5% lower than in 2012, when the reading test was introduced, and down 62% on the number successful in 2013.

Girls appear to have suffered slightly more from the decline in 2014 success rates: the success rate for girls is down 63%, while the decline for boys is marginally smaller, at 61%. The success rate for boys remains above where it was in 2012 but, for girls, it is about 12% down on where it was in 2012.

In 2012, only 22% of successful candidates were boys. This rose to 26% in 2013 and has again increased slightly, to 28% in 2014. The gap between girls’ and boys’ performance remains substantially bigger than the corresponding gaps in GPS and maths.
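
For a sense of scale, a back-of-envelope sketch deriving approximate 2014 counts by gender from two figures quoted in this post (the 851 successful entrants reported below, and the 28% boys' share):

```python
# Rough derivation only, from figures quoted in the post: 851 successful
# reading entrants in 2014, of whom ~28% were boys.
total_2014 = 851
boys = round(total_2014 * 0.28)   # ~238
girls = total_2014 - boys         # ~613
print(boys, girls)
```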

Charts 5 and 6 give the comparable figures for GPS and maths respectively.

In GPS, the total number of successful entries has increased by almost 140% compared with 2013. Girls form a slightly lower proportion of this group than in 2013, their share falling from 62% to 60%. Boys are therefore beginning to close what remains a substantial performance gap.

 


Chart 5: Percentage of learners entered achieving Level 6 in GPS, 2012 to 2014, by gender

 

In maths, the total number of successful entries is up by about 40% on 2013, demonstrating rapid improvement over the three-year period.

Compared with 2013, the success rate for girls has increased by 43%, whereas the corresponding increase for boys is closer to 41%. Boys formed 65% of the successful cohort in 2012, 61% in 2013 and 60% in 2014, so girls’ progress in narrowing this substantial performance gap is slowing.

 


Chart 6: Percentage of learners entered achieving Level 6 in maths, 2012 to 2014, by gender

 

Progress

The SFR also provides a table, this time for state-funded schools only, showing the KS1 outcomes of those who achieved Level 6. (For maths and reading, this data includes learners with a non-numerical test grade who were awarded L6 via teacher assessment. The data for writing is derived solely from teacher assessment.)

Not surprisingly, over 94% of those achieving Level 6 in reading had achieved Level 3 in KS1, but 4.8% were at L2A and a single learner was recorded at Level 1. The proportion with KS1 Level 3 in 2013 was higher, at almost 96%.

In maths, however, only some 78% of those achieving Level 6 were at Level 3 in KS1. A further 18% were at 2A and almost 3% were at 2B. Another 165 learners were recorded at 2C or Level 1. In 2013, over 82% had KS1 L3 while almost 15% had 2A.

It seems, therefore, that KS1 performance was a slightly weaker indicator of KS2 Level 6 success in 2014 than in the previous year; this trend was apparent in both reading and maths, and KS1 performance remains a significantly weaker indicator in maths than in reading.

 

Why did the L6 reading results decline so drastically?

Given that the number of entries for the Level 6 reading test increased dramatically, the declining pass rate suggests either a problematic test or that schools entered a higher proportion of learners who had relatively little chance of success. A third possibility is that the test was deliberately made more difficult.

The level threshold for the 2014 Level 6 reading test was 24 marks, compared with 22 marks in 2013, but there are supposed to be sophisticated procedures in place to ensure that standards are maintained. We should therefore be able to discount the third possibility.

The second possibility is also unlikely to be significant, since schools are strongly advised only to enter learners who are already demonstrating attainment beyond KS2 Level 5. There is no benefit to learners or schools from entering pupils for tests that they are almost certain to fail.

The existing pass rate was very low, but it was on an upward trajectory. Increasing familiarity with the test ought to have improved schools’ capacity to enter the right learners and to prepare them to pass it.

That leaves only the first possibility – something must have been wrong with the test.

Press coverage from May 2014, immediately after the test was administered, reported that the paper gave learners and invigilators conflicting instructions about the time available for answering the questions.

The paper gave learners one hour for completion, while invigilators were told pupils had 10 minutes’ reading time followed by 50 minutes in which to answer the questions. Schools interpreted this contradiction differently and several reported disruption to the examination as a consequence.

The NAHT was reported to have written to the Standards and Testing Agency:

‘…asking for a swift review into this error and to seek assurance that no child will be disadvantaged after having possibly been given incorrect advice on how to manage their time and answers’.

The STA statement says:

‘We apologise for this error. All children had the same amount of time to complete the test and were able to consult the reading booklet at any time. We expect it will have taken pupils around 10 minutes to read the booklet, so this discrepancy should not have led to any significant advantage for those pupils where reading time was not correctly allotted.’

NAHT has now posted the reply it received from STA on 16 May. It says:

‘Ofqual, our regulator, is aware of the error and of the information set out below and will, of course, have to independently assure itself that the test remains valid. We would not expect this to occur until marking and level setting processes are complete, in line with their normal timescales.’

It then sets out the reasons why it believes the test remains valid. These suggest that any advantage to the learners given the incorrect instructions was minimal, since:

  • few would need less than 10 minutes’ reading time;
  • pre-testing showed 90% of learners completed the test within 50 minutes;
  • in 2013 only 3.5% of learners were within 1 or 2 marks of the threshold;
  • a comparability study in which the timing of the Levels 3-5 test was altered found little difference in item difficulty.

NAHT says it will now review the test results in the light of this response.


Who is responsible?

According to its most recent business plan, STA:

‘is responsible for setting and maintaining test standards’ (p3)

but it publishes little or nothing about the process involved, or how it handles representations such as that from NAHT.

Meanwhile, Ofqual says its role is:

‘to make sure the assessments are valid and fit for purpose, that the assessments are fair and manageable, that the standards are properly set and maintained and the results are used appropriately.

We have two specific objectives as set out by law:

  • to promote assessment arrangements which are valid, reliable and comparable
  • to promote public confidence in the arrangements.

We keep national assessments under review at all times. If we think at any point there might be a significant problem with the system, then we notify the Secretary of State for Education.’

Ofqual’s Chair has confirmed via Twitter that Ofqual was:

‘made aware at the time, considered the issues and observed level setting’.

Ofqual was content that the level-setting was properly undertaken.


I asked whether, in the light of that, Ofqual saw a role for itself in investigating the atypical results. I envisaged that this might take place under the Regulatory Framework for National Curriculum Assessments (2011).

This commits Ofqual to publishing annually its ‘programme for reviewing National Assessment arrangements’ (p14) as well as ‘an annual report on the outcomes of the review programme’ (p18).

However, the most recent of these relates to 2011/12 and appeared in November of that year.


I infer from this that we may see some reaction from Ofqual, if and when it finally produces an annual report on National Curriculum Assessments in 2014, but that’s not going to appear before 2015 at the earliest.

I can’t help but feel that this is not quite satisfactory – that atypical test performance of this magnitude ought to trigger an automatic and transparent review, even if the overall number of learners affected is comparatively small.

If I were part of the system I would want to understand promptly exactly what happened, for fear that it might happen again.

If you are in any doubt about quite how out of kilter the reading test outcomes were, consider the parallel results for Level 6 teacher assessment.

In 2013, 5,698 learners were assessed at Level 6 in reading through teacher assessment – almost exactly two-and-a-half times as many as achieved Level 6 in the test.

In 2014, a whopping 17,582 learners were assessed at Level 6 through teacher assessment, around 20 times as many as secured a Level 6 in the reading test.

If the ratio between test and teacher assessment results in 2014 had been the same as in 2013, the number successful on the test would have been over 7,000 – more than eight times the reported 851.
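
For transparency, the arithmetic behind that counterfactual, using only the counts quoted above:

```python
# Counterfactual check using the counts quoted in this post.
ta_2013 = 5_698          # 2013 teacher assessment awards: ~2.5x the test passes
ratio_2013 = 2.5
ta_2014 = 17_582         # 2014 teacher assessment awards
test_2014 = 851          # 2014 test passes

implied_test_2014 = ta_2014 / ratio_2013
print(round(implied_test_2014))                 # 7033: over 7,000
print(round(implied_test_2014 / test_2014, 1))  # 8.3: more than eight times 851
```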

I rest my case.

 

The new regime

In February 2013, a DfE-commissioned report Investigation of Key Stage 2 Level 6 Tests recommended that:

‘There is a need to review whether the L6 test in Reading is the most appropriate test to use to discriminate between the highest ability pupils and others given:

a) that only around 0.3 per cent of the pupils that achieved at least a level 5 went on to achieve a level 6 in Reading compared to 9 per cent for Mathematics

b) there was a particular lack of guidance and school expertise in this area

c) pupil maturity was seen to be an issue

d) the cost of supporting and administering a test for such a small proportion of the school population appears to outweigh the benefits.’

This has been overtaken by the decision to withdraw all three Level 6 tests and to rely on single tests of reading, GPS and maths for all learners when the new assessment regime is introduced from 2016.

Draft test frameworks were published in March 2014, supplemented in July by sample questions, mark schemes and commentary.

Given the imminent introduction of this new regime, together with schools’ experience in 2014, it seems increasingly unlikely that 2015 Level 6 test entries in reading will approach the 120,000 figure suggested by the trend.

Perhaps more importantly, schools and assessment experts alike seem remarkably sanguine about the prospect of single tests for pupils demonstrating the full range of prior attainment, apart from those assessed via the P-Scales. (The draft test frameworks are worryingly vague about whether those operating at the equivalent of Levels 1 and 2 will be included.)

I could wish to be equally sanguine, on behalf of all those learners capable of achieving at least the equivalent of Level 6 after 2015. But, as things stand, the evidence to support that position is seemingly non-existent.

In October 2013, Ofqual commented that:

‘There are also some significant technical challenges in designing assessments which can discriminate effectively and consistently across the attainment range so they can be reported at this level of precision.’

A year on, we still have no inkling whether those challenges have been overcome.

 

GP

September 2014


11 thoughts on “What Happened to the Level 6 Reading Results?”

  1. We still overlook what I think is the main reason the L6 results are terminally low in KS2: the Level 5 threshold has consistently been set too low for years, implying that many average readers are above average. That in turn has led schools to focus less on reading (because their results were fine).
    As a middle school teacher I have long argued that secondary schools can trust all KS2 test results, except reading.

  2. But the data suggests that schools made major strides in 2013 to improve the percentage of children achieving L6 in the reading test, only to fall back in 2014 below the extremely low success rate achieved in 2012. Meanwhile, they have continued to achieve year-on-year improvement in GPS and maths.

    I still can’t find anyone who can offer me a convincing explanation of why that happened – nor is anyone clarifying the outcome of NAHT complaints about the administration of the test. Maybe everybody is just too busy with other priorities at the start of term, but this does demand further analysis, both at school and national levels.

  3. My kids’ primary has requested that a few L6 reading tests be re-marked. They clearly felt the marking was inconsistent with classroom observations.

  4. That’s interesting. It sounds as though some children they judged to be secure Level 6s through teacher assessment didn’t demonstrate comparable performance on the test. I wonder whether the number of requested re-marks is significantly higher this year nationally. Do let me know if you hear the outcome.

  5. Our school has decided to stop entering children for the reading test. When both we and the secondary school agree that children have reached the level, have moderated together, and have given extra lessons and personalised programmes – and still no joy, yet the children are achieving L6 in writing and maths – it’s painful to keep hammering at it.

  6. I understand how you feel – and I am worried that this year’s experience will incline many more schools to take the same view, especially since the days of Level 6 tests are numbered. It may be that the highest-attaining readers will be better able to demonstrate their skills on the new universal reading test to be introduced from 2016 but, as my post indicates, I feel there are still plenty of unanswered questions on that score. It would be more helpful than not, in my view, if the technical difficulties associated with designing such tests were openly debated. It will be a shame if the new tests end up placing an artificial ceiling on KS2 attainment.

  7. Is it just that we have all lost the fight against some nonsense measures and calculations? Also, we have learnt not to channel our energies into something that won’t get listened to or see an acknowledgement of a possible error. That last sentence is not a criticism of you taking the time to identify and pursue this; it’s just my view after getting responses from DfE staff that simply avoid answering questions or providing reasoning.

  8. That may be so, but my experience has been that if we don’t stand up for the interests of high attainers, because there are relatively few of them, those interests are often overlooked.

    The temptation is to say that the numbers are small, the Level 6 tests are going anyway, and in percentage terms this test is already a lost cause. But that position cannot be in these learners’ best interests, nor does it contribute to the smooth and efficient operation of our national assessment system.

    But I also wanted to use the post to raise unanswered questions about the new universal tests to be introduced from 2016. It’s really important that we see, understand and accept evidence from the test development agencies demonstrating how the tests work for all children, including those at both ends of the attainment distribution. If we don’t, there is a strong risk that we will be presented with an irreversible fait accompli when it is too late to do anything about it.
