UK Election: Polls vs Reality

As the polls clearly predicted, Keir Starmer’s Labour Party won a convincing landslide victory in the recent UK General Election.

Labour had held what looked like an unassailable lead for a long time before the 4th July election, and so it turned out.

While the pre-election polls largely got it right, and the exit poll on the day was even more accurate, there were a few discrepancies, and some trends were only picked up late in the campaign.

After 14 years of Conservative rule, however, and given the instability of the previous government, the fallout from Brexit and the state of the UK economy, it was probably one of the easiest elections to call in recent memory.

But let’s dive into the numbers.

 

The Exit Poll

As is tradition in the UK, election night kicks off at 10pm with the exit poll conducted by Ipsos for the BBC, ITV News and Sky News. This poll is usually an extremely accurate predictor of the result, as it is based on voters’ responses as they left polling stations earlier in the day.

Voters at 130 polling stations are approached at random and asked to fill out a replica ballot indicating their vote. These ballots are then used to model and predict constituency outcomes by comparing the data against previous election results at the same polling stations.
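The real exit poll model is a far more sophisticated regression-based forecast built by academics for the broadcasters, but the core idea described above – estimating the *change* in vote share at matched polling stations and applying that swing elsewhere – can be sketched roughly as follows. The station data here is invented purely for illustration:

```python
# Simplified sketch of the exit poll's matched-station approach.
# All figures are hypothetical; the broadcasters' actual model is
# a multivariate regression, not a simple uniform swing.

def estimate_swing(stations):
    """Average change in vote share per party across matched stations.

    stations: list of dicts, each with 'prev' (previous election shares)
    and 'now' (replica-ballot shares from today's exit poll).
    """
    parties = stations[0]["prev"].keys()
    return {
        p: sum(s["now"][p] - s["prev"][p] for s in stations) / len(stations)
        for p in parties
    }

def project(prev_result, swing):
    """Apply the estimated swing to a constituency's previous result."""
    return {p: prev_result[p] + swing[p] for p in prev_result}

# Two hypothetical matched stations (vote shares in %)
stations = [
    {"prev": {"Lab": 32, "Con": 45}, "now": {"Lab": 38, "Con": 33}},
    {"prev": {"Lab": 30, "Con": 47}, "now": {"Lab": 34, "Con": 37}},
]
swing = estimate_swing(stations)  # Lab +5, Con -11
print(project({"Lab": 35, "Con": 48}, swing))
```

In practice the model also weights for station type, referendum voting patterns and boundary changes, as described below.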

The forecast also considers various factors such as voting patterns in the 2016 EU referendum and geographical voting shifts. Despite the challenges of predicting election outcomes in contentious years and the adjustments required due to boundary changes from the 2023 review, confidence remains high in the exit poll’s reliability.

Despite its failure to predict the Conservative victory in 2015 (it projected another hung parliament), the exit poll is generally extremely accurate, as you can see in this graph of the previous four elections:

This time the exit poll projected Labour to secure a large majority with 410 seats, with the Conservatives trailing on 131 seats.

How close was that? Pretty close, as it turns out. The final tally saw Labour clinching 412 seats, while the Conservatives ended up with 121. That’s a difference of just two seats for Labour and ten for the Conservatives.
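Using the projected and final seat counts quoted above, the exit poll's error works out as:

```python
# Exit poll projection vs final result (seats), figures from this article
exit_poll = {"Lab": 410, "Con": 131}
actual    = {"Lab": 412, "Con": 121}

errors = {party: abs(exit_poll[party] - actual[party]) for party in exit_poll}
print(errors)  # {'Lab': 2, 'Con': 10}
```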

 

The Pre-Election Polls

Most polls leading up to the election consistently showed Labour with a substantial lead. This trend was confirmed in the actual results.

The BBC’s poll tracker noted a slight convergence in the polls just before the election, with Conservatives experiencing a minor increase in support and Labour a slight decline. However, this minor shift did not significantly impact the overall outcome.

Reform UK, which had been polling in third place, underperformed in seat terms: the exit poll projected 13 seats for the party, fewer than some pre-election polls had suggested.

The Liberal Democrats, by contrast, outperformed pre-election expectations, with the exit poll projecting 61 seats for them.

The BBC’s final poll tracker settled on 39% for Labour, 21% for the Conservatives, 17% for Reform and 11% for the Lib Dems. Comparing these final numbers with the actual results shows the polls did indeed over-estimate the swing from the Conservatives to Labour and Reform.
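Putting the final tracker figures next to the actual vote shares (detailed in the results section below) makes the per-party polling error explicit:

```python
# BBC final poll tracker vs actual vote share (%), figures from this article
final_polls = {"Lab": 39, "Con": 21, "Reform": 17, "LibDem": 11}
actual      = {"Lab": 35, "Con": 24, "Reform": 14, "LibDem": 12}

# positive = polls over-estimated the party, negative = under-estimated
error = {p: final_polls[p] - actual[p] for p in final_polls}
print(error)  # {'Lab': 4, 'Con': -3, 'Reform': 3, 'LibDem': -1}
```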

Some polling models came impressively close to predicting the scale of Labour’s victory. Britain Predicts/New Statesman projected 418 seats for Labour, while The Economist went a bit overboard with 429 seats.

As everyone knows, translating a national polling average into seats won is a tricky game under the UK’s first-past-the-post voting system. Slight regional variations, or a vote spread too thinly across the country (as was the case with Reform and the Greens), can lead to huge discrepancies between the eventual number of seats won and the votes cast.

 

The Actual Results

Let’s break down the results to see how close both the exit poll and pre-election polls came.

  • Labour won 35% of the vote share, slightly lower than polls predicted, but secured a staggering 412 seats (63.4% of total seats).
  • The Conservatives won 24% of the vote share, slightly higher than polls predicted, but only managed to secure 121 seats (18.6% of total seats).
  • The Liberal Democrats were the dark horse, outperforming poll predictions by winning 12% of the vote share and 71 seats (10.9% of total seats).
  • Reform UK, on the other hand, won 14% of the vote share but only 5 seats (0.6% of total seats).
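One standard way to quantify this votes-versus-seats gap is the Gallagher (least squares) index of disproportionality. A minimal sketch using just the four parties listed above (the full index would include every party, so this slightly understates the true figure):

```python
from math import sqrt

# Vote share vs seat share (%), figures from the bullet points above
votes = {"Lab": 35, "Con": 24, "Reform": 14, "LibDem": 12}
seats = {"Lab": 63.4, "Con": 18.6, "Reform": 0.6, "LibDem": 10.9}

# Gallagher index: sqrt of half the sum of squared (vote% - seat%) gaps
gallagher = sqrt(0.5 * sum((votes[p] - seats[p]) ** 2 for p in votes))
print(round(gallagher, 1))  # ≈ 22.5 on these four parties alone
```

An index this large is unusually high by international standards, driven mostly by Labour's seat bonus and Reform's near-total seat shut-out.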

So, while both the exit poll and the pre-election polls slightly over-represented Labour and Reform, and under-estimated the Conservative vote, both correctly predicted the broad trends. The polling averages were also tracking in the right direction, correctly capturing the slight decline in Labour support and the large increase in Reform’s vote share over the course of the campaign. Overall, then, an excellent result and a great election for the pollsters.

 

The Lessons Learned

So, what can we take away from the performance of the UK election polls?

  1. Exit poll models are extremely good and only getting more accurate.
  2. Pre-election polls are generally reliable for broad trends, but can miss nuances, especially when it comes to smaller parties.
  3. Regional variations matter. National polls might miss important local shifts.
  4. The UK’s voting system can create results that look wildly different from raw vote shares. Pollsters and pundits know this, of course, but their models potentially need on-the-fly tweaking to respond to events on the ground.
  5. Polls are a snapshot, not a crystal ball. They’re incredibly useful tools, but they’re not infallible.

 

The Future of Polling

Polling methodologies continue to evolve. The rise of online polling, the challenges of reaching a truly representative sample in an increasingly diverse and digitally connected society, and the need to account for complex regional variations all present challenges and opportunities for pollsters.

In the end, perhaps the real value of polls isn’t in their ability to predict the future, but in their power to engage us in the democratic process. They spark conversations, fuel debates, and remind us that in a democracy, every voice – and every vote – counts.

 

We’ve put together a Case Study to show some of the results and cost reductions we’ve achieved when conducting political polling offshore – download it here.

For more information about TKW Research’s Live Phone Polling solutions – and to find out how we’ve made it cost-competitive again – visit this page.