Are pollsters exaggerating the Yes vote in Scotland?

“Why do opinion polls in Scotland vary so much?” asks Peter Kellner. It is an important question. An answer would give us a clue as to what will happen in the independence referendum. The president of YouGov is puzzled by the results from his and other polling companies. He produces this table by way of argument:

Mr Kellner has taken the past five or six polls from five of the main pollsters and calculated the average for the Yes vote. Next to that column he shows the range in the Yes vote across the five or six polls. Aside from ICM, these ranges are narrow.

He wants us to take two things from this table. First, that there has been very little movement in any of the individual polls since Christmas 2013. Second, that there is a curious difference between the results of YouGov and TNS, which place the Yes vote between 39 and 42 per cent, and Panelbase and Survation, which report higher totals.

On Kellner’s first point, polls haven’t been quite as flat as he suggests. “Every single poll was higher [for Yes] in March than it was in November”, according to John Curtice, professor of politics at Strathclyde University, whose blog serves as something of a peace and reconciliation commission for squabbling psephologists.

The chart below, from Prof Curtice’s blog, tracks a rolling average of the six most recent polls. (The FT’s version, which separates out don’t know voters, is here.) There was a small movement towards the Yes side through April (highlighted on the chart) but some of those gains have since been reversed, as recent polls confirm.
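The rolling average Prof Curtice tracks is simple enough to sketch in a few lines. The poll figures below are invented for illustration; only the six-poll window matches the description above.

```python
# Rolling average of the six most recent polls (illustrative figures only).
yes_shares = [39, 40, 38, 41, 42, 40, 39, 43, 41, 40]  # hypothetical Yes %

window = 6
rolling = [
    sum(yes_shares[i - window:i]) / window
    for i in range(window, len(yes_shares) + 1)
]
print([round(r, 1) for r in rolling])  # one averaged point per new poll
```

Averaging over six polls smooths out house effects and sampling noise, which is why a small, sustained movement (like the April drift to Yes) shows up more clearly than in any single poll.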

Nevertheless, there hasn’t been that much movement in individual polls. I think this can be explained by asking which other views are highly correlated with a Yes or a No vote, i.e. what predicts Scots saying they will vote one way or the other. Doing this helps to separate the signal from the noise surrounding the long campaign. One such view is the degree to which someone feels Scottish and not British, which, contrary to what many people believe, hasn’t changed much since 1999. The other is their view of the economic consequences of independence. A person’s position on either of these issues is not easily swayed by the vicissitudes of a campaign.

So, if we say that Mr Kellner is half-right about his first conclusion, what should we make of his second point about the relatively big differences among the pollsters?

Remember that the art of polling is in large part taking raw data and weighting it so that the sample is representative of the population you are interested in. As well as weighting for demographic characteristics, pollsters adjust their samples to take into account how people voted in the past and whether they will vote next time. Different pollsters have different ways of sampling and different ways of weighting.
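The mechanics of that weighting step can be sketched with a toy post-stratification on a single variable. Every number here is invented; real pollsters weight on several variables at once (age, class, past vote and so on).

```python
# Minimal post-stratification sketch (hypothetical numbers): reweight a raw
# sample so its age mix matches the population, then recompute the Yes share.

# (age_group, says_yes) for each respondent -- invented data
sample = [("16-34", True)] * 20 + [("16-34", False)] * 10 + \
         [("55+", True)] * 20 + [("55+", False)] * 50

population_share = {"16-34": 0.5, "55+": 0.5}   # assumed target mix

counts = {}
for group, _ in sample:
    counts[group] = counts.get(group, 0) + 1

n = len(sample)
# Weight = population share / sample share for each group
weights = {g: population_share[g] / (counts[g] / n) for g in counts}

raw_yes = sum(yes for _, yes in sample) / n
weighted_yes = sum(weights[g] for g, yes in sample if yes) / sum(
    weights[g] for g, _ in sample
)
print(round(raw_yes, 3), round(weighted_yes, 3))
```

In this toy sample the older group is over-represented, so weighting it down moves the headline Yes figure, which is exactly why different weighting schemes produce different published numbers from similar raw data.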

Kellner’s argument – and remember that he is the president of YouGov – is that other pollsters have a dodgy sample and they are not weighting it properly.

He alleges that his rival Survation, for example, has too many SNP-supporters in its sample, and therefore it inflates the Yes vote in its final results. In the table below, he compares the shares of a recent Survation sample who say they voted for each party in the 2010 UK general election, with the actual result at the polls.

A sample in which 36 per cent say they voted for the SNP in 2010, when in fact only 20 per cent did, is liable to suggest the Yes camp is doing better than it is, he says. There is more here than can be explained by recall bias, Mr Kellner implies.
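The past-vote adjustment at issue is a single piece of arithmetic: each over-represented group of past voters is down-weighted by its target share over its sample share. The 36/20 split comes from the table above; the Yes rates per group are assumptions for illustration.

```python
# Past-vote weighting sketch: down-weight over-represented 2010 SNP voters
# (36% in the sample vs a 20% target) and watch the Yes tally move.
# The Yes rates per group are invented for illustration.

sample_share = {"SNP": 0.36, "other": 0.64}
target_share = {"SNP": 0.20, "other": 0.80}
yes_rate     = {"SNP": 0.80, "other": 0.30}   # assumed Yes rate per group

raw_yes = sum(sample_share[g] * yes_rate[g] for g in sample_share)

# Weight = target share / sample share for each past-vote group
weights = {g: target_share[g] / sample_share[g] for g in sample_share}
weighted_yes = sum(sample_share[g] * weights[g] * yes_rate[g]
                   for g in sample_share)

print(round(raw_yes, 3), round(weighted_yes, 3))
```

With these assumed Yes rates the unadjusted sample overstates Yes by several points, which is the mechanism behind Kellner’s complaint: if the weighting doesn’t fully correct the skew, the published figure stays inflated.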

Is Kellner right that errors such as these flatter the Yes tally in some polls? We will only know come September 18. But Prof Curtice believes Mr Kellner “has slightly missed the point”. Almost all of the raw samples used by pollsters in the Scottish referendum are weighted in ways that increase the size of the Yes vote. There are no shenanigans here; older people in higher social classes are more likely to complete polls, especially via internet panels. They are also more likely to vote No. Pollsters have to make a call, then, about how much to adjust tallies upwards.

YouGov does this. So too does Survation. And Survation’s adjustment for past vote actually makes little difference to the gap between its unweighted and weighted tallies – once other adjustments have been made, e.g. for age, class, income and gender. If other pollsters are wrong, they are wrong in the round, not just for neglecting recall bias.

In April, Martin Boon, the director of ICM Research, wrote an honest and fascinating op-ed for the Scotsman on the difficulties of polling for the Scottish referendum. He mentions the uniqueness of the vote; unlike in local or general elections, there are no historical precedents that pollsters can use. And he describes how the Yes vote tends to increase from unweighted to weighted samples.

Mr Boon also reflects on something that I spend too much time thinking about. People who vote are more likely to complete opinion polls – he mentions how 77 per cent of people in a recent ICM poll said they voted in the previous Scottish parliament election, when in fact only about 50 per cent voted. In typical elections, there are well known ways of adjusting for that difference. But in the referendum, turnout really is going to be high (77 per cent is about the “over-under”). The results will in part be determined by voters who didn’t turn up in 2011 but who will vote this time – and yet there is no precedent for what adjustments to make.
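The standard adjustment Mr Boon alludes to can be sketched from the two figures he gives: if 77 per cent of a sample claims to have voted when actual turnout was about 50 per cent, past voters are over-represented by 77/50 and would normally be down-weighted. As the paragraph notes, this fix breaks down for the referendum, where real turnout is expected to be near 77 per cent anyway.

```python
# Turnout-skew sketch: 77% of an ICM sample said they voted in 2011, but
# actual turnout was about 50%. Taking the claims at face value, past
# voters are over-represented and would be down-weighted as follows.

claimed_voter_share = 0.77
actual_turnout = 0.50

overrepresentation = claimed_voter_share / actual_turnout
voter_weight = actual_turnout / claimed_voter_share                  # ~0.65
nonvoter_weight = (1 - actual_turnout) / (1 - claimed_voter_share)   # ~2.17

print(round(overrepresentation, 2),
      round(voter_weight, 2),
      round(nonvoter_weight, 2))
```

The awkward part is that 2.17x weight on the few admitted non-voters: in a normal election that is how pollsters recover the stay-at-homes, but in a referendum where most of them will actually vote, there is no past behaviour to calibrate the adjustment against.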

What polling has been done on these non-voters-turned-referendum-voters suggests they are marginally more likely to be No voters than the average. Perhaps this is surprising: we may think that voters are being roused from their apathy by the promise of independence. But non-voters in Scotland are quite like non-voters anywhere else: poorer, less educated, and turned off by traditional politics. If they were going to be swayed by the romantic appeal of nationalism, it would have happened by now. They are pragmatic. A high turnout, then, may help the No campaign rather than the nationalists. If true, this would suggest that Mr Kellner is right to be relaxed about a comfortable win for the unionists.


Update: I’ve followed this post with one on the suggestion that unionists are scared to tell pollsters they’re voting No, and one on the myth that the SNP surge in 2011 was missed by election watchers.