I have been researching the relationship between STEM and gifted education in Ireland, Australia and elsewhere, with a view to comparing and contrasting their approaches to those in England, the US and the ASEAN States, all three of which have been reviewed in previous posts.
But a new publication has led me to revisit the English situation, just ahead of the publication of a new Schools White Paper which is likely to set out Government plans for the future development of school-based STEM education here.
In that context, I also want to take a closer look at the impact of investment in STEM education on achievement, specifically high achievement in the core STEM subjects in GCSE and A level examinations, which are taken predominantly in Year 11 (age 16) and Year 13 (age 18) respectively.
Regular readers will know my premiss that the English STEM programme has been too little focussed to date on the highest achievers – the gifted and talented – and that this strategy will undermine efforts to strengthen our international competitiveness.
There is a risk that England is falling into the same trap as the US, whose situation is illustrated starkly in the recent Hanushek study of high level maths achievement relative to other competing countries, as reviewed in my last post.
Educating the Next Generation of Scientists
The National Audit Office (NAO) is a public body, independent of Government, which employs some 900 staff to scrutinise public expenditure on behalf of Parliament. The Head of the NAO is known as the Comptroller and Auditor General, and he has a statutory responsibility to report to Parliament on the economy, efficiency and effectiveness with which Government departments have deployed their resources.
The purpose of this Report is to:
‘evaluate progress by the Department for Education in increasing take-up and achievement in maths and science up to age 18 and the extent to which specific programmes to raise the quality of school science facilities, recruit and retain science and maths teachers, and improve the appeal of science to young people have contributed to any increase’.
This describes the schools dimension of England’s national STEM programme, the development of which was traced in my earlier post.
The Report notes that, net of the enormous costs of teaching STEM subjects in schools, the Government has spent around £100 million annually on STEM support, almost half of it within the Department for Education’s budget. So this is a sizeable programme by English standards.
Progress on improving achievement and take-up
In my view, the analysis of take-up and achievement in the Report is partial and rather cursory. Although it traces the development of the national STEM strategy through the various documents summarised in my earlier post, the Report fails to pick up any emphasis on our highest achievers at age 16, relying exclusively on the standard achievement benchmark of GCSE grade C and above.
This is despite noting in passing the emphasis within the Science and Investment Framework 2004-14 on the achievement of A*-B grades (which is in itself insufficiently demanding but is nevertheless a move in the right direction).
The analysis of A level achievement (our main post-16 qualification and the gateway to university entry) is slightly more relevant, since it concentrates on the achievement of grades A-C, so lower-grade A level passes are excluded. But that benchmark is still pitched too low in my view.
We learn that:
- Take-up of the separate sciences – primarily physics, chemistry and biology – increased by almost 150% between 2004-05 and 2009-10. (Promoting study of separate sciences at GCSE has been a core focus of the programme because of evidence that this tends to correlate with greater success at A level and in undergraduate study.)
- There are ‘generally rising trends in GCSE achievement in maths and the separate sciences’ over the period, against the standard A*-C grade benchmark. For example, 55% of students achieved grade C or above in maths in 2004-05, increasing to 65% in 2009-10. (I am not sure how these figures are derived, since the dataset I have seen gives the percentage achieving GCSE maths at A*-C in 2010 as 58.4%.) In the separate subjects of physics, chemistry and biology, the percentage achieving C and above has increased from 90-91% to 94% over the same period (these figures are broadly in line with my dataset).
- Take-up has increased at A level, markedly so in maths and more steadily in biology and chemistry. However, entries for A level physics have increased only very slightly. While the other subjects are on track to meet the targets set for 2014, physics is at grave risk of undershooting.
- In terms of A level achievement at grades A-C (there is no reference to the introduction of the A* grade in 2010) there have been steady increases since 2001-02, of 11% in biology, 8% in maths, 8% in physics and 7% in chemistry. In all cases, at least 72% of A level entries in these subjects are awarded a C grade or above, with maths reaching 82%.
But are these the right benchmarks?
My previous posts have sought to demonstrate the important relationship between success in STEM education and the performance of a country’s gifted high achievers in STEM subjects. Those Asian countries that achieve most highly in maths and science in PISA and other international comparison studies were aware of this from the beginning – and recent publications should begin to raise awareness in the US. Even Finland now acknowledges the need to provide extra challenge and support for its gifted learners.
Given this context, I do not believe that achieving a C grade or above at GCSE can be the right benchmark for a national STEM programme. In physics, chemistry and biology, this benchmark is now within reach of almost 100% of those entered for the examinations!
Surely the STEM programme must be focused significantly on those who progress to A level and on to a degree – the so-called ‘STEM pipeline’. But we all know that a GCSE grade C is inadequate preparation for achieving a C grade or above at A level (the benchmark monitored in the NAO Report), and that a C grade at A level is itself no longer sufficient to secure a place on most STEM-related first degree courses.
This report illustrates the correlation between GCSE grades and A level grades in different subjects. To take an example, we can see that over 50% of those achieving a C grade in maths GCSE went on to achieve a U (ungraded) in A level maths and only 11% achieved grades A-C. In physics, chemistry and biology, only 14-16% of those with a C grade at GCSE went on to achieve at least a C grade at A level, while 32-36% were ungraded.
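To make those progression figures concrete, here is a minimal Python sketch tabulating them. Note the assumptions: the per-subject split of the quoted 14-16% and 32-36% ranges across the three sciences is illustrative, and the maths ‘over 50%’ figure is rounded to 50 – this is not the underlying dataset itself.

```python
# A level outcomes for students who achieved a grade C at GCSE in each
# subject, per the figures quoted above. The science subjects' exact
# split within the quoted ranges is an illustrative assumption.
progression = {
    "maths":     {"A-C": 11, "U": 50},  # ">50% ungraded" rounded to 50
    "physics":   {"A-C": 14, "U": 36},
    "chemistry": {"A-C": 15, "U": 34},
    "biology":   {"A-C": 16, "U": 32},
}

for subject, outcomes in progression.items():
    # Whatever is neither A-C nor ungraded must be the middling grades.
    middling = 100 - outcomes["A-C"] - outcomes["U"]
    print(f"{subject:9s}: {outcomes['A-C']}% reach A-C, "
          f"{outcomes['U']}% ungraded, ~{middling}% in between")
```

Laid out this way, the point is hard to miss: a GCSE grade C leaves the median student closer to failing A level than to clearing the A-C benchmark.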
My earlier posts have shown that the international comparison studies are considering a much higher level of high achiever, though admittedly as part of a holistic focus on science and maths achievement.
We have seen that the advanced level in PISA 2006 was achieved by just 9.0% of UK 15 year-olds in maths and only 5.7% in science. The countries that top these comparative studies are getting 28% of their students to the advanced standard in maths and 8% to the advanced standard in science. Even though the UK performs relatively well in science – much less so in maths – there is still a lot of ground to make up on some of our main international competitors.
Given that a GCSE grade of C or above can be achieved by well over 90% of our students in the sciences, one can see just how irrelevant such a benchmark has become. Almost half of all entries to the separate science GCSEs now achieve an A*/A (16% in maths) which suggests that even a focus on that subset of high achievers would be significantly out of kilter with the PISA benchmark for advanced performance. However, it would be much better than grades A*-C.
Time series data on the performance of our highest achievers
In my earlier post, I looked at the change in GCSE performance at A*/A grades between 2009 and 2010, pointing out that top grade GCSE performance in maths and the sciences is behind the trend in all GCSE subjects and behind the trend in A*-C performance in each of the four subjects.
I have now reviewed changes in GCSE performance at A*/A grades in STEM subjects since 2005 and changes in A level performance at A*/A grades (remember that an A* grade was only introduced in 2010) in STEM subjects, also since 2005.
The data shows that:
- Between 2005 and 2010, achievement of A*-C grades in all GCSE subjects increased by 7.9%, whereas achievement of A*/A grades increased by only 4.2%. This suggests that schools are concentrating too much on helping students achieve a C grade rather than challenging and supporting all learners to improve.
- In maths, physics, chemistry and biology, the rate of improvement in the percentage achieving A*/A grades significantly undershoots the rate of improvement in those achieving A*-C grades: STEM subjects are not exempt from the generic criticism above.
- In maths, physics, chemistry and biology, the improvement in the percentage achieving A*/A grades is significantly lower than the improvement in the percentage achieving A*/A grades in all subjects. So, despite the STEM programme, the highest achievers are relatively more underserved in STEM subjects.
- Between 2005 and 2010, achievement of A*-C grades in all A level subjects increased by 5.5%, whereas achievement of A*/A grades increased by only 4.2%. Although in post-16 settings the needs of the highest achievers are relatively well catered for, they are still not improving at the same rate.
- In further maths and biology, the rate of improvement in the percentage achieving A*/A grades significantly undershoots the rate of improvement in those achieving A*-C grades; there is a slight undershoot in physics, broad parity in maths and a significant overshoot in favour of A*/A in chemistry. It would be interesting to research what has happened in chemistry and seek to replicate that in the other subjects.
- In maths and further maths, the improvement in the percentage achieving A*/A grades is lower than the improvement in the percentage achieving A*/A grades in all subjects (very significantly so in the case of further maths), whereas there is broad parity in the case of physics, while chemistry and biology are running ahead of the improvement for all subjects.
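The pattern across these bullets reduces to comparing two improvement figures per stage. A minimal sketch using the aggregate figures quoted above (interpreting them, as an assumption, as percentage-point changes; the per-subject figures are omitted):

```python
# Improvement between 2005 and 2010 at each benchmark, using the
# aggregate figures quoted above (treated here as percentage points).
improvements = {
    "GCSE":    {"A*-C": 7.9, "A*/A": 4.2},
    "A level": {"A*-C": 5.5, "A*/A": 4.2},
}

for stage, change in improvements.items():
    # The gap is how far top-grade improvement lags the C-grade benchmark.
    gap = change["A*-C"] - change["A*/A"]
    print(f"{stage}: A*-C up {change['A*-C']}, "
          f"A*/A up {change['A*/A']} (gap {gap:.1f})")
```

The gap is roughly three times wider at GCSE than at A level, which is the sense in which the A level evidence is ‘more mixed’ while the GCSE evidence is unambiguous.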
In short, although the evidence at A level is much more mixed, the GCSE evidence demonstrates unambiguously that we are not investing enough of our effort in the education of our highest achievers in STEM subjects.
This is likely to be reflected in PISA 2009 and, given that other countries clearly are investing in their highest achievers, we can expect to drop significantly down the tables marking out the countries with the highest proportion of their students achieving the advanced level.
It is also worth noting that these results include independent schools as well as state schools, and that the highest achievers are to be found disproportionately in the independent sector. I do not have access to the statistics for state-maintained schools only, but they are highly likely to tell an even more worrying story.
And the same point applies in spades to students from disadvantaged backgrounds in state schools. There is an urgent need to respond to the excellence gap in STEM-related achievement as part of the wider effort to concentrate on high achievers.
Other key findings in the NAO report
The Report identifies five stated ‘critical success factors’ in improving take-up and achievement in STEM subjects, plus one hidden factor, and reports progress against each:
- Careers information and guidance has been patchy and requires improvement. More general efforts already under way to strengthen the scope and quality of information, advice and guidance in schools will hopefully begin to address this issue;
- The quality and quantity of school science facilities. There is limited data but what is available suggests that progress has been relatively slow. The Report does not say so but the abolition of BSF, the previous Government’s colossal capital building programme, is unlikely to speed up progress;
- The quality and quantity of science and maths teachers. Targets for increasing the numbers of specialist chemistry teachers will be met, but those for increasing the numbers of specialists in maths and physics will not. This situation may be eased by the recession. The Report does not recognise that the case for focusing on the achievement of higher GCSE and A level grades is of course strengthened by the need to increase the supply of high quality specialist teachers: a C grade at GCSE or at A level is not really sufficient;
- Young people’s attitudes towards science and maths. The Report uses TIMSS and PISA data to show that, although the UK generally compares favourably with other countries on these measures, it has lost some ground in recent years. (It is strange that the Report makes use of the international comparative data on these ‘soft’ measures but not in its analysis of improvements in achievement.)
- Availability of GCSE Triple Science – ie the separate subjects of physics, chemistry and biology. The Report notes that take-up has increased by almost 150% in the past five years, but almost half of secondary schools do not yet offer triple science and it is less widely available in areas of higher deprivation, thus creating an obstacle to narrowing the excellence gap.
The hidden critical factor is school specialism. The Report says that:
‘Schools with a specialism in science, technology, engineering or maths and computing are effective in bringing together the programmes and resources that support good take-up and achievement in science and maths.’
The Report uses regression analysis to isolate interventions associated with statistically significant increases in numbers achieving A*-C grades at GCSE in the sciences, concluding that STEM specialist schools account for almost all of the increase attributable to interventions per se (though enrichment activities, the role of STEM Ambassadors and professional development support through the National Science Learning Centres also register positive effects). Similar results are derived when A level achievement is examined.
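The Report’s approach is, in essence, a regression of school-level achievement on indicators for the interventions each school uses. The sketch below illustrates the mechanics in the simplest possible case – a single binary ‘specialism’ indicator, where the OLS coefficient reduces to a difference in group means. All data here is randomly generated purely for illustration; it is not the Report’s model or dataset.

```python
import random

random.seed(0)
n_schools = 500

# Hypothetical data: each school either has a STEM specialism or not,
# and we observe its % achieving A*-C in science GCSEs. A true effect
# of 5 points is built in so the estimator has something to recover.
schools = []
for _ in range(n_schools):
    specialist = random.random() < 0.3
    attainment = 60 + (5 if specialist else 0) + random.gauss(0, 3)
    schools.append((specialist, attainment))

# With one binary regressor, the OLS coefficient is simply the
# difference in group means: the attainment gap linked to specialism.
spec = [a for s, a in schools if s]
non = [a for s, a in schools if not s]
effect = sum(spec) / len(spec) - sum(non) / len(non)
print(f"estimated specialism effect: {effect:.1f} percentage points")
```

The Report’s actual analysis adds further regressors (enrichment, Ambassadors, Learning Centres, pupil intake and so on) so that each intervention’s contribution can be isolated from the others, but the underlying logic is the same.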
This may be deliberately underplayed given the Government’s recent decision to dismantle the specialist schools programme, devolving the funding direct to schools and leaving them to decide whether to continue spending it on their identified specialisms. One can only conclude that such a change could have potentially dire consequences for the national STEM programme unless an alternative system is created.
However, it is important to keep this in proportion: the Report notes that well over 90% of the increase in take-up and performance is attributable to other external factors such as pupil intake.
It would be interesting and worthwhile to run the same regression analysis to isolate the impact of these different activities on achievement at A*/A grades at GCSE and A level.
The Report confirms that there has been some progress in securing coherence across a wide range of small-scale programmes (over 470 separate initiatives were under way in 2004) but adds that the newly-rationalised set of interventions could be further rationalised and provided to schools in a more systematic way.
Regression analysis demonstrates a positive association between higher take-up and achievement and the number of programmes under way in any school, although diminishing returns can set in if schools undertake large numbers of activities with similar objectives. There is currently considerable regional and local authority variation in the take-up of different activities and the ‘offer’ has not yet reached all state secondary schools.
The overall conclusion is that:
‘Increased take-up and achievement in school science and maths is, as this report shows, dependent on a number of key factors. These need to be brought together in coherent pathways to maximise successful results and efficient use of public resources in pursuit of this objective. The Department has made progress in doing so, for example by rationalizing the previous plethora of initiatives within a national programme. However, gaps and inconsistencies in availability and uptake remain, creating a shortfall in value for money which the Department could and should address in developing its future programme for science and maths in schools.’
And the Report recommends that, in taking forward the policy priorities of the new Government, the Department for Education should:
‘develop an overarching programme with a clear logic, based on evidence of cause and effect. The programme should provide a framework with clear priorities, a well-defined critical path and appropriate measures of progress. It should provide a basis for engaging with local authorities, schools and colleges on the actions required in the following key areas:
- a systematic approach which gives assurance that there will be sufficient teachers with a specialism in maths, chemistry or physics;
- more even take-up of continuous professional development opportunities for teachers, particularly in local authority areas where fewer schools are currently using Science Learning Centres;
- a realistic assessment of what progress can be made to bring school laboratories up to a good or excellent standard, since the previous target was neither informed by robust data nor achieved within the specified timeframe;
- actions at local level to give all young people access to a curriculum that includes the study of separate sciences; and a school or college that performs well in science and maths, whether through a relevant specialism or by other effective means; [my emphasis]
- further development of the analysis presented in this report with a view to: evaluating more coherently and consistently the efficacy and cost-effectiveness of individual programmes in increasing take-up and achievement; and providing information on local use of programmes to support reviews of whether take-up is sufficient and appropriate.’
One can reasonably expect that the plans set out in the imminent White Paper will reflect those recommendations.
It remains to be seen whether the need to target more support on our highest achievers is also addressed. If it is not, then analysis of the PISA 2009 results published just two weeks later will almost certainly help to make the case.