The downturn in the accuracy of polling seems to have picked up steam in 2012, after major pollsters Rasmussen (which is right-leaning) and Gallup (which is not) showed that Republican challenger Mitt Romney would best incumbent President Barack Obama.
Obviously, Romney did not win, and Gallup ended up holding a news conference about the errors in its polling and correcting its methodology. Gallup listed four reasons why its polls were so wrong, including the way it weighted white respondents and categorized likely voters.
Not every pollster was wrong about the 2012 election. The Real Clear Politics average ended up being accurate, and Public Policy Polling, a liberal firm, was actually the most accurate pollster of the election.
But 2012 was just the beginning of bad polling. In 2014, Democrats claimed that the polls were once again skewed against them. In reality, the polls were skewed in their favor and led many to believe Republicans would not win as big as they did in the midterms. Republicans ended up taking back the Senate that year, leaving pollsters scratching their heads and re-evaluating their methods once again.
One pollster in particular received grief in 2014: McLaughlin & Associates. In a poll conducted at the end of May, just before the Virginia primary, McLaughlin found then-House Majority Leader Eric Cantor with a 34-point lead over his challenger, the little-known professor Dave Brat. Cantor went on to lose the primary by 12 points.
John McLaughlin, president of the polling firm, told reporters that Democrats had voted in the primary and skewed the results. The explanation was hard to defend and seemed like an excuse, considering the same pollster had been wrong about Romney in 2012 and about the Virginia Senate election that same year.
After the defeat, Republicans privately told each other to stop using McLaughlin.
America isn’t the only country where polls have gone wrong, either. Polls confirmed the popular (well, popular with the media) opinion that Israeli Prime Minister Benjamin Netanyahu would lose re-election in 2015. Early exit polls also showed a close race between Bibi and his challenger, Isaac Herzog. But just three hours later, Bibi won in a landslide.
At the time, the Observer’s Rabbi Shmuley Boteach explained that security issues played a large role in Bibi’s re-election.
“With genocidal Hamas to the West, Hezbollah to the North, and Iran to the East, how could security issues not have been the primary issue?” Boteach wrote. “For all the talk about how hated Bibi has made Israel, it was just as hated under Rabin, Peres, and Olmert. I know because I was hosting Prime Ministers of Israel at Oxford at the time and saw the huge demonstrations against them.”
Later that year, polls showed that the U.K.’s Labour Party would win big and that Prime Minister David Cameron would be ousted. Here again, the polls were wrong and the Tories won. Cameron remained prime minister until…
Pollsters in 2016 got the Brexit referendum wrong. Prior to the vote, polls showed Britain would overwhelmingly vote to remain in the European Union. When all the votes were counted, 52 percent had voted to exit the E.U. The polls here weren’t as egregiously wrong as they were in 2015—their final polls were within the margin of error—but they were wrong nonetheless.
Just this week, pollsters got it wrong again. Three surveys last week showed that voters in Colombia would approve a government-backed peace deal with rebels. It was predicted to pass with 60 percent of the vote. In the end, the referendum failed.
So what’s going on with polling? It’s complicated, and each failure has its own unique elements. In most of the polling failures, including those in America and the Tory victory, systematic bias was the problem. Oversampling of young voters made it appear as though the U.K.’s Labour Party would win. Oversampling of Democrats made it appear as though they would win in 2014.
For Brexit, bias didn’t play as much of a role, but underestimating voter turnout did, just as with the Colombia referendum. Pollsters there again overestimated young people’s participation in the vote.
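The mechanics of oversampling bias can be shown with a toy calculation (all numbers here are hypothetical, chosen only to illustrate the arithmetic, not drawn from any real poll): if a group that breaks heavily for one side is overrepresented in the sample, the raw average overstates that side’s support, and reweighting each group to its true share of the electorate corrects it.

```python
# Toy illustration of oversampling bias and post-stratification weighting.
# All figures are hypothetical and exist only to show the mechanics.

# Support for "Party A" by age group (hypothetical):
support = {"young": 0.60, "old": 0.40}

# The sample overrepresents young voters (60% of respondents),
# but they make up only 30% of the actual electorate.
sample_share = {"young": 0.60, "old": 0.40}
population_share = {"young": 0.30, "old": 0.70}

# Unweighted estimate: biased toward the oversampled group.
raw = sum(support[g] * sample_share[g] for g in support)

# Post-stratified estimate: reweight each group to its population share.
weighted = sum(support[g] * population_share[g] for g in support)

print(f"raw estimate:      {raw:.2f}")       # 0.52 -- Party A looks ahead
print(f"weighted estimate: {weighted:.2f}")  # 0.46 -- Party A is actually behind
```

The same arithmetic explains the turnout failures: if the population shares used for weighting assume young voters will show up in larger numbers than they actually do, even a correctly weighted poll lands on the wrong answer.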
“We’re only as good as our assumptions about who’s going to show up,” Cliff Young, president of U.S. public affairs at a U.K. polling firm, told Bloomberg News.
Another pollster told Bloomberg that question wording plays a role in skewed results, and did so in Colombia.
“It’s a classic social desirability bias: You’re asking people, do they want to support peace. People seemed obliged to say yes,” he said.
Another problem with modern polling is that it isn’t, well, modern. Polls are still largely conducted using landlines, and people—especially in America—are increasingly turning to cell phones.
All this takes a toll on the reliability of 2016 polls, as Politico’s Steven Shepard wrote in 2014.
“As Americans become even harder to reach by phone—and emerging methodologies, such as Internet polling, remain unproven—the poor performance of pollsters this year casts serious doubt on the reliability of surveys during the 2016 presidential race,” he wrote.
So what are we to make of current polls? Certainly, reporting on where a candidate is polling can have an effect on the electorate, either by exciting or suppressing voters or forcing a candidate to make changes to their campaign. After several weeks of bad polls showing Democratic presidential nominee Hillary Clinton down to GOP nominee Donald Trump, the polls have shifted. Clinton is again leading Trump, after weeks of bad news coverage of the GOP nominee and the general consensus that Clinton won the first debate.
Was it all about the bad news of the day? Or were conservatives suddenly oversampled when Trump was winning and Democrats oversampled now? Or have pollsters fixed the errors of the past few years and the polls are now accurate? We won’t know until after the election.
FiveThirtyEight’s Nate Silver wrote that polling bias changes from year to year. Sometimes it favors Democrats, sometimes Republicans; there’s no grand partisan slant in polling.
“Beyond the shadow of a doubt, the polling was biased against Democrats in 1998, 2006 and 2012. However, just as certainly, it was biased against Republicans in 1994, 2002 and now 2014,” Silver wrote. “It can be dangerous to apply the ‘lessons’ from one election cycle to the next one.”
People (including this author) won’t stop writing about polls, because they’re always interesting. And even if 2016 proves once again that polls mean nothing, it won’t crater the industry. But if pollsters are wrong again, it will be difficult for anyone to take them seriously after so many major mistakes.