Polls published on the eve of a general election do not have a good pedigree. It’s not just their infamous performance in 1992 that counts against them, though that remains the benchmark for inaccuracy.
Their record at every election in modern times has been patchy at best, and there has only ever been one instance in recent history of a pollster correctly forecasting vote share for each of the three main parties.
Some have come close, however, and this year we’re likely to see a record number of companies publishing eve-of-election polls. Which should we treat with the most credibility?
No pollster emerged from the 2010 election with a sparkling record. Nobody got all three shares correct; nobody even got two of them right. The Tories’ share of 37 per cent was correctly forecast by a pair of pollsters: Populus for the Times and ComRes for the Independent/ITV. But nobody correctly put Labour on 30 per cent or the Lib Dems on 24. (Note: all polling figures are for Great Britain, not the UK.)
Looking at the nine polls published on the eve of the 2010 election, the spread in projected shares of the vote was fairly narrow: the Tories were on 33-37 per cent, Labour on 24-29 per cent and the Lib Dems on 26-29 per cent.
The pollsters performed worst on the Lib Dem share, but were actually pretty good on the gap between Labour and the Tories. They put the swing from the former to the latter at between 4.5 and 6 per cent. The actual swing turned out to be just over 5.
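The swing figure quoted here is presumably the conventional two-party (“Butler”) swing: the average of one party’s gain and the other’s loss. A minimal sketch of the calculation, taking the 2010 GB shares from this article and assuming approximate 2005 baseline figures of Con 33.2 and Lab 36.2 per cent:

```python
def butler_swing(con_old, con_new, lab_old, lab_new):
    """Butler swing from Labour to the Conservatives: the average of
    the Conservative gain and the Labour loss, in percentage points."""
    return ((con_new - con_old) - (lab_new - lab_old)) / 2

# 2010 GB shares (Con 37, Lab 30) are from the article; the 2005
# baseline figures (Con 33.2, Lab 36.2) are approximate assumptions.
print(round(butler_swing(33.2, 37.0, 36.2, 30.0), 1))  # → 5.0
```

On those assumed baselines the swing comes out at almost exactly 5 points, consistent with the “just over 5” reported above.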
The mean error of the nine polls was Conservatives -1.4, Labour -2.4 and the Liberal Democrats +3.4. This represented a major break from history. In all bar one of the eve-of-election polls over the previous 20 years, Labour’s share had been overestimated.
In 2010 this changed. Nobody overstated Labour. Indeed, Labour was underestimated more than the Tories, while it was the Lib Dems who were markedly overestimated: a fault pollsters attributed both to a late swing away from Nick Clegg’s party and to an overestimate of the number of Lib Dem supporters likely to vote.
Here are the nine polls, arranged alphabetically by company:
To assess who was closest overall, we can use each poll’s mean error to compile the following league table. The figure after each poll is its mean deviation in percentage points.
1. Populus/Times — 1.7
2. Ipsos MORI/Evening Standard — 1.7
3. ICM/Guardian — 1.7
4. ComRes/ITV-Independent — 2.0
5. Harris/Daily Mail — 2.0
6. Opinium/Express — 2.3
7. YouGov/Sun — 2.7
8. Angus Reid/Political Betting — 4.0
9. TNS BMRB — 4.0
Where two or more pollsters have the same deviation, I’ve ranked them according to how close they came to hitting one or more of the actual results. Hence, while Populus, Ipsos MORI and ICM all have the same deviation, Populus comes top as it was the only one of the three to get any element of the forecast spot on (the Tories’ share of 37 per cent).
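For the curious, the “mean error” behind this table is presumably the mean absolute deviation across the three party shares. A minimal sketch, using the actual 2010 GB result from the article and an illustrative (hypothetical) forecast:

```python
def mean_error(forecast, actual):
    """Mean absolute deviation across the party shares, in points."""
    return sum(abs(forecast[p] - actual[p]) for p in actual) / len(actual)

actual_2010 = {"Con": 37, "Lab": 30, "LD": 24}  # GB result, from the article
# Hypothetical forecast for illustration, off by 0, 2 and 3 points:
forecast = {"Con": 37, "Lab": 28, "LD": 27}
print(round(mean_error(forecast, actual_2010), 1))  # → 1.7
```

A forecast off by 0, 2 and 3 points thus averages out at 1.7, the same deviation as the three table-toppers above.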
The two pollsters at the bottom of the table, Angus Reid and TNS, are also the only ones who suggested the Lib Dems would come second in share of the vote ahead of Labour. The reverse proved to be the case. Unsurprisingly, both of these companies also underestimated the Tories – TNS by four percentage points.
2010 was the first election in which more pollsters carried out fieldwork online than by phone or face-to-face. Yet all four of the positions at the top of our league table are occupied by organisations which collected their 2010 data by phone: Populus, Ipsos MORI, ICM and ComRes (all bar Populus still poll by phone).
The online firms were much lower down, coming sixth (Opinium), seventh (YouGov) and ninth/last (TNS). Differences between polls conducted online or by phone have provoked comment in this campaign, particularly on the subject of trends towards this or that party.
It’s fair to say, then, that the phone pollsters tended to do better in the eve-of-election forecasts in 2010: the only organisations to get any of the three shares exactly right, Populus and ComRes, both collected their data by phone.
Almost all of the pollsters who put out eve-of-election polls last time round will do so again this year. Only two have not been active in this election campaign: Harris and Angus Reid. Out of the other seven, three have retained the same associations as in 2010: MORI is still with the Standard, ICM with the Guardian and YouGov with the Sun/Sunday Times.
We should end up with a grand total of 11 eve-of-election polls.
Opinium is now with the Observer, while ComRes supplies polls for both the Independent on Sunday/Sunday Mirror and the Mail. Populus has joined TNS in not being associated with one particular publication.
Add in the four newcomers in 2015 – Lord Ashcroft, Survation, Panelbase and BMG – and we should end up with a grand total of 11 eve-of-election polls, all of which will be reported and analysed here on May2015. (Two have been released so far – Populus’, which showed a tie, and Ashcroft’s, which put the Tories ahead by two points.)
If any of these polling companies manages to get more than one share of the vote correct, it will be a massive boost not just to their own reputation but to that of eve-of-election polls in general. If that company turns out to be one that uses online fieldwork, the impact will be even greater.
Just 10 years ago only one online pollster was in the field: YouGov. It didn’t forecast any of the shares correctly, ‘predicting’ a 37-32-24 Labour-Tory-Lib Dem split in 2005. By contrast, NOP, a now-defunct phone pollster, was the only one to get it exactly right: 36-33-23.
Four other eve-of-election polls put Labour ahead by between three and six percentage points, all within the margin of error but all overestimating Labour’s lead. This was also true of the one poll outside the margin of error, Communicate for the Independent on Sunday, which put Labour on 39 per cent and the Tories on 31 per cent.
NOP remains the sole example of a hat-trick in any poll published on the eve of an election in the past 40 years. It’s a record that should caution us against investing this year’s forecasts with too much authority. History suggests that trends can be more useful than actual numbers, but even these can prove misleading, most notoriously in 1992.
Five polls were published on the eve of that election. All underestimated the Tories and dramatically overestimated Labour. Gallup was closest, but was still 7.1 points out on the Tories’ lead over Labour. NOP – the same company that would get the result spot on in 2005 – was furthest out, by a whopping 10.6 points.
While the Tories’ lead over Labour turned out to be 7.6 points, Gallup had forecast 0.5; ICM 0; Mori -1; Harris -2; and NOP -3.
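The size of those misses is easy to verify from the figures above; a quick check, using only the numbers as given in the article:

```python
# Actual Tory lead over Labour in 1992, and each firm's forecast lead
# (negative values mean a forecast Labour lead), all from the article.
actual_lead = 7.6
forecast_leads = {"Gallup": 0.5, "ICM": 0.0, "MORI": -1.0,
                  "Harris": -2.0, "NOP": -3.0}

# Error on the lead, in percentage points, for each pollster.
errors = {firm: round(actual_lead - lead, 1)
          for firm, lead in forecast_leads.items()}
print(errors)  # Gallup is 7.1 points out, NOP 10.6
```

The computed errors match the article’s figures: Gallup 7.1 points adrift, NOP 10.6.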
A number of explanations were offered afterwards by way of apology – unrepresentative samples of respondents; a late swing; a so-called “spiral of silence” on the part of Tory voters (‘shy Tories’); selective participation in polls – but because all the forecasts had leaned the same way, it had been easy to believe they were broadly right.
1997 and 2001
In 1997 five eve-of-election polls were released. Things improved slightly, but a third of estimates were outside the 3 per cent margin of error. Four of the five polls overestimated Labour’s vote.
Only ICM underestimated it, by one percentage point; it also came closest on the size of Labour’s lead over the Tories. Nobody got Labour’s or the Lib Dems’ share exactly right, and only Harris got anything spot on, correctly forecasting the Tories’ share of 31 per cent.
John Curtice was right to note at the time that these polls “were nothing like as successful as was suggested in the [pundits’] initial commentary.”
Labour was overestimated again in 2001. All five eve-of-election polls exaggerated the party’s lead, by an average of five points; Gallup and NOP overestimated it by eight. This was not a good year for polling companies, although, as David Butler and Dennis Kavanagh revealed in their study of the 2001 election, a private study by Labour a few days before polling day explored what could happen if turnout fell to 55 per cent, and came up with a prediction that exactly matched the 42-33-19 per cent outcome. (The actual turnout was 59 per cent: a post-war low.)
Before the 1990s, Margaret Thatcher’s landslide wins were just as expected as those of Tony Blair, yet the polls were similarly wayward in their final forecasts. In 1983 every organisation overestimated the Tory vote and the party’s lead over Labour, the latter by as much as 6.5 points (NOP).
Conversely, in 1987, five of the six eve-of-election polls underestimated the Conservative lead and overestimated Labour. The only pollster to come anywhere close in both elections was MORI, who in 1987 even managed to get Labour spot on.
The pattern, in short, is that there is no pattern. The pollsters have always got it more wrong than right, but in an inconsistent fashion that precludes any sensible extrapolation as to what may or may not happen this year. MORI fared best in the 1980s; Harris came closest in 1997; NOP was spot on in 2005; and Populus topped the table for 2010.
Two of those four are no longer providing polls at British general elections (NOP and Harris), but MORI came second behind Populus in 2010’s rankings, so you could argue those two are the ones to watch most closely this time, while ICM has always fared fairly well.
History shows it’s best to be prepared for a bit of a surprise, however. I doubt we’ll see anything quite as bad as NOP’s opinion poll from almost 50 years ago on the eve of the Leyton by-election, which forecast a Labour lead of 20 percentage points only to see the party defeated by half a point: the greatest error in prediction ever made by a serious British survey organisation.
All the same, today’s pollsters might be in for a few rounds of humble pie this weekend.
Ian Jones is a contributing editor to May2015. He blogs daily at ukgeneralelection.com.
An earlier version of this piece incorrectly stated that Populus, now an online pollster, conducted their 2010 polls online.