
Making sense of election statistics


[Image: a polling station]

Tonight, in the final televised debate ahead of the election, the three main party leaders will talk about the economy, the recession, public sector debt, spending or cuts, and more. All will use statistics to back up their points or to pull apart their opponents' arguments. But how can we work out whether to believe the figures, and what they really mean?

Statisticians, journalists and scientists today launched Making Sense of Statistics, a guide that provides a few questions you can ask and outlines the pitfalls to look out for when weighing up claims that use statistics.

And with just one week to go before the General Election, the groups Sense About Science and Straight Statistics have released a companion guide, Making Sense of Statistics in an election.

To help you judge the politicians' performances tonight and in the final days before the polls open, the guide's authors (including Plus contributors Nigel Hawkes, David Spiegelhalter and Michael Blastland) suggest the following points to keep in mind when working out who wins your vote.

It sounds like a lot of money; is it?

The national debt, public spending, and the need for cuts are all quoted in millions and billions of pounds. The sums are so large that they are hard to comprehend. But big figures can be made more manageable by reducing them to the domestic scale. For example, the NHS costs £110 billion a year. But divide this by the population of the UK, around 60 million, and it comes to an annual cost of almost £2,000 per head. Divide by the number of weeks in the year, and the NHS costs each of us around £35 a week.

With any claim that quotes figures, we need to know the context before deciding if we agree or disagree. Is a cut of £6 billion in public spending big, or small? It helps to know that total public spending will be nearly £650 billion in 2010, and that the gross domestic product is close to £1,500 billion.
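A quick way to check this kind of back-of-the-envelope arithmetic is to run it through a few lines of code. The sketch below uses the rounded figures quoted above; they are illustrative, not official statistics.

```python
# Back-of-the-envelope arithmetic for the figures quoted above.
# All numbers are the rounded values from the text, not official statistics.

nhs_budget = 110e9          # NHS annual cost, in pounds
population = 60e6           # approximate UK population
per_head_per_year = nhs_budget / population
per_head_per_week = per_head_per_year / 52
print(f"NHS: about £{per_head_per_year:,.0f} per person per year, "
      f"or £{per_head_per_week:,.0f} per person per week")

proposed_cut = 6e9          # proposed cut in public spending, in pounds
total_spending = 650e9      # total public spending, 2010
gdp = 1500e9                # gross domestic product
print(f"A £6 billion cut is {proposed_cut / total_spending:.1%} of public spending "
      f"and {proposed_cut / gdp:.1%} of GDP")
```

Run as written, this gives a little over £1,800 a year (about £35 a week) per person for the NHS, and shows the £6 billion cut to be just under 1% of total public spending.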

When talking about change, we need to know both the absolute and the relative amount. Changes in tax or National Insurance contributions, increases in the minimum wage, promises to increase or cut spending: pledges like these are common, but to work out how important they are we need to know the absolute change as well as the relative one.

If an election promise were to increase spending on training for the unemployed by a third (the relative change), how much money would actually be spent? If the spend on training were £60 million to start with, the new spend would be £80 million, a gain of £20 million (the absolute change). Or suppose a local authority promises to spend another £50,000 on a recycling scheme. That may sound like quite a lot, but if it is only 2% of what is already being spent, that might change your view of it. After inflation it wouldn't be a real increase at all.

To understand changes like these, we need to know both the relative change (a 5% increase, or a 2% cut, for example) and the absolute change (£10 million more, or £4 million less).
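The same conversion between relative and absolute change can be checked in a couple of lines. This sketch uses the illustrative figures from the two examples above:

```python
# Relative vs absolute change for the two examples above (figures from the text).

def absolute_change(current, relative_change):
    """Return the absolute change implied by a relative (fractional) change."""
    return current * relative_change

# Training for the unemployed: a one-third increase on £60 million.
training_now = 60e6
extra_training = absolute_change(training_now, 1 / 3)
print(f"Training: +£{extra_training / 1e6:.0f} million "
      f"(new total £{(training_now + extra_training) / 1e6:.0f} million)")

# Recycling scheme: an extra £50,000 said to be 2% of current spending.
extra_recycling = 50_000
current_recycling = extra_recycling / 0.02
print(f"Recycling: the £50,000 is 2% of roughly "
      f"£{current_recycling / 1e6:.1f} million already being spent")
```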

What's the margin of error?

A well-conducted poll (opinion polls typically ask between 1,000 and 2,000 people) can tell us how the electorate might vote. Polls report the expected percentage of votes for each party, usually with a margin of error of plus or minus three percentage points.

So if a poll found that Labour was expected to get 27% of the votes, the Conservatives 33% and the Liberal Democrats 31% (as one did on April 21st 2010), then the real share of the vote could range from as low as 30% to as high as 36% for the Conservatives, from 28% to 34% for the Liberal Democrats, and so on. Margins of error are seldom quoted, but they can be greater than the gap between the parties. So polls often mean less than they appear to; following the trend over time may give a truer impression.
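Where does "plus or minus three percentage points" come from? For a simple random sample of about 1,000 people, the standard 95% margin of error on a share p is roughly 1.96 × √(p(1−p)/n), which comes out at about three points when p is somewhere between 25% and 50%. The sketch below applies that textbook formula to the shares quoted above; real polls use weighting and quotas, so treat it as an approximation rather than how any particular pollster does it.

```python
# A rough sketch of where "plus or minus three percentage points" comes from,
# assuming a simple random sample of n people (real polls are weighted, so
# this is only an approximation).
from math import sqrt

n = 1000                      # people asked
shares = {"Conservatives": 0.33, "Liberal Democrats": 0.31, "Labour": 0.27}

for party, p in shares.items():
    margin = 1.96 * sqrt(p * (1 - p) / n)   # half-width of a 95% confidence interval
    print(f"{party}: {p:.0%} (roughly {p - margin:.0%} to {p + margin:.0%})")
```

For these shares the margin works out at just under three points, giving ranges of about 30% to 36% for the Conservatives, 28% to 34% for the Liberal Democrats and 24% to 30% for Labour, exactly the overlapping intervals described above.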

What’s actually been counted?

With claims that crime is going down but violent crime is rising, and that unemployment is rising while the number of people claiming unemployment benefit is falling sharply, how do we know what is really going on?

The first thing to check is whether what's been counted is the same. Earlier this year, the UK Statistics Authority criticised claims that there had been an increase in violent crime because the statistics used were not a fair comparison. The problem? The claims used crime statistics from the late 1990s to 2008/2009 without acknowledging that in 2002 the definition of a violent crime changed. This change in definition led to a sharp increase in violent crimes recorded.

The second thing to check is whether all the appropriate statistics are being quoted, not just a selected few. For crime, that means looking at both police-recorded crime and the British Crime Survey, which often reach different conclusions.

"Taking numbers for granted is naïve — you become a sucker for spin. But treating them all as so many lies, turning away in cynicism, is to give up on every political, economic or social argument you follow, every public cause you love or hate," says one of the authors, Michael Blastland. "The middle way is the only way: to learn how numbers work." Let's hope both the electorate and the politicians check their sums before the big day next Thursday.


Further reading

Find out more about election maths in the following Plus articles:

David Spiegelhalter runs the Understanding Uncertainty website and has a column in Plus. Michael Blastland is co-author of the Plus article The tiger that isn't: numbers in the media, and you can hear Nigel Hawkes in the Plus podcast Evaluating a medical treatment.
