The 2012 PISA results came out yesterday, showing that the UK's place in the world rankings has remained almost unchanged over the last three years.
It struck me, listening to the Today programme in a half-asleep stupor (as is my clichéd, lefty-academic morning routine) that I didn't really know what PISA measured, how it measured it, or whether the 'banana republic' rhetoric was at all justified.
It's a tricky line to tread: I'm 100% behind the idea that the more good mathematicians we have, the better it is for everyone; however, I'm not sure that the PISA study is a) a reliable way to measure whether we're succeeding, or b) even measuring the right thing.
This post is to outline my initial, unstructured concerns; I'd be very interested to hear other people's opinions and responses, so do comment away.
PISA is the Programme for International Student Assessment, a test taken every three years by a sample of around 5,000 15-year-old students in each country. There are three tests (Maths, Reading and Science); the results are collated and scaled so that the overall mean is 500 and the standard deviation is 100 (a little bit of S1 for you, there).
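For the curious, that mean-500, SD-100 scaling is easy to sketch. A big caveat: the real PISA analysis uses item response theory rather than a simple standardisation, so this is purely an illustrative toy, with made-up raw scores.

```python
# Toy illustration of scaling scores to mean 500, SD 100.
# (PISA's actual methodology uses item response theory; this is
# just the simple z-score version, with invented raw marks.)
def scale_scores(raw, target_mean=500.0, target_sd=100.0):
    n = len(raw)
    mu = sum(raw) / n
    sigma = (sum((x - mu) ** 2 for x in raw) / n) ** 0.5
    return [target_mean + target_sd * (x - mu) / sigma for x in raw]

scaled = scale_scores([12, 15, 20, 23, 30])
# The scaled list now has mean 500 and standard deviation 100.
```

The point of the rescaling is that a score only means anything relative to everyone else who sat the test that year, which matters for the comparisons below.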
There's a good piece by David Spiegelhalter outlining some criticisms of how the results are analysed.
Here's the 2006 paper (with answers). My initial instinct is that the questions look pretty similar to those at the top end of a higher GCSE paper - mixed with a few of the detailed 'explain' questions we've seen more of recently.
I'm not sure whether you're allowed a calculator for the PISA test; I suspect not, given that the numbers involved aren't especially challenging, and arithmetic doesn't play a large role.
Instead, the questions seem much more weighted towards interpreting data, spatial and logical reasoning, a good deal of estimation and a little algebra.
I think these are skills you'd expect a good mathematician to have; however, they're absolutely nothing to do with the back-to-basics chant-the-times-tables ideas I generally see touted as solutions to the Maths Problem.
My thinking is that the PISA tests the maths required if you want to go into a technical career. It's not necessarily the maths you need for everyday life if you're, say, an administrator or a teacher.
I'm pretty sure there are two distinct threads in maths: numeracy (which you do need every day - decent number sense, ability to read graphs, dealing with percentages and money, familiarity with spreadsheets, among other things), and technical maths (the more advanced algebra, volumes of unusual shapes, trickier geometry, for instance). I think PISA leans - like the GCSE - towards the technical end, which isn't necessarily what everyone needs.
(By analogy: I think it's important to have a basic grasp of a foreign language and an idea of how to cook. But not everyone needs to be able to watch Amélie without subtitles or knock up a perfect soufflé.)
Well, it's hard to tell. We're not comparing against a fixed yardstick, only against other education systems. It's like asking if the 2009 England rugby team is better than the 2012 one - their performance relative to the other teams of their era doesn't tell you which would win a match between them.
In 2009, the UK was... pretty much bang-on average. The UK mean score was 492, statistically indistinguishable from the overall mean of 500. The spread was typical as well - the 5th to 95th percentile in England (for example) was 349 to 634, compared with 343 to 643 overall; again, you'd be hard-pushed to say that English students were anything other than typical in maths.
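Putting those quoted percentile figures side by side makes the 'typical spread' point concrete (the numbers are the ones from the 2009 report cited above; the comparison itself is just subtraction):

```python
# 5th and 95th percentile scores from the 2009 PISA results,
# as quoted in the post: England vs the OECD overall.
england = (349, 634)
overall = (343, 643)

def spread(p5, p95):
    """Width of the 5th-to-95th percentile band."""
    return p95 - p5

print(spread(*england))  # 285
print(spread(*overall))  # 300
```

A 285-point band against a 300-point band: England's spread is, if anything, marginally narrower than the overall one, which is the 'nothing other than typical' conclusion in different clothes.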
This year, the UK mean has improved slightly, although you wouldn't know it from the reporting; the report itself says "The United Kingdom performs around the average in mathematics and reading and above average in science."
I don't immediately see percentile data, but I do note that 11.8% of UK students were 'top performers' in maths, in both 2009 and 2012. That's a little lower than the OECD average of 12.8%; the gap is about the same size as the one between the UK's 21.8% of low performers and the 23.1% OECD-wide.
All of that suggests we're an average country at getting 15-year-olds through maths tests, and perhaps slightly more consistent than other OECD countries.
Nobody sets out to be average. Obviously, I'd like the UK to be higher in the rankings than it is. However, to predicate an education policy on this test would seem pretty ridiculous.
Grown-ups - with the benefit of many years of doing everyday maths - have complained about how little school-leavers know in maths for as long as there have been schools, and Education Secretary after Education Secretary has tried to shake up the system to solve the perceived problem. That's understandable, if misguided. They've got to get re-elected and/or promoted, after all, so they need to look like they're doing something.
I intensely dislike the "kids know nothing and schools are failing" attitude that prevails in politics, the media and on the ground; we can make improvements, certainly, but I don't think "we're turning into a banana republic and it's all [X]'s fault" is at all helpful.
I reckon we should instead be focussing on how to engage more students with maths, showing that it's not all about numbers, and encouraging good mathematical habits such as collaboration, persistence and research, rather than simply setting tougher exams.
I'd love to hear your thoughts - leave a comment below!