The Vector Blog ›

Nothing's Wrong With The Polls

By Marc Zwelling
The Vector Poll™

Most polls leading up to the recent U.S. election nailed Joe Biden's share of the vote almost to the decimal point.

The actual result was Biden 51.3%, Donald Trump 46.9%. A week before election day, Biden led by 52% to 44% in a Fox News poll and by 52% to 42% in a USA Today poll.

An NBC-Wall Street Journal poll ending three days before the election had Biden first, 52% to 42%.
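The pattern in those numbers is easy to check with a little arithmetic: the final-week polls cited above landed close to Biden's actual share but understated Trump's. A minimal Python sketch of that comparison (poll labels and layout are mine, figures are from the article):

```python
# Final-week national polls cited above: (Biden %, Trump %)
polls = {
    "Fox News": (52, 44),
    "USA Today": (52, 42),
    "NBC-WSJ": (52, 42),
}
actual = (51.3, 46.9)  # certified popular-vote shares

for name, (biden, trump) in polls.items():
    print(f"{name}: Biden off by {biden - actual[0]:+.1f} points, "
          f"Trump off by {trump - actual[1]:+.1f} points")
```

Biden's estimates are off by less than a point; the misses of 3 to 5 points are all on Trump's share.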

But Trump's bigger share in the election than in the polls led to this headline on the CBC website: "How the polls got it wrong — again."

In The Globe and Mail, columnist Gary Mason wrote, "The story of the election, it seems, is the degree to which the pollsters, many of whom were pointing toward a Democratic landslide, got it so wrong."

It's the pundits, however, not the polls, who got it wrong.

A poll is not a prediction. Claiming a poll was wrong because the vote results were different is like blaming the speedometer if you get lost on the highway.

There is no way of knowing what 235 million eligible U.S. voters actually think. So, no one can know if a poll is right or wrong.

While the pundits rejoice in schadenfreude, pollsters can blame themselves. One polling company's website boasts the firm "accurately forecasted the popular vote in the 2019 Canadian General Election." Another polling company brags that in the same election it was "once again the most accurate firm in the country."

When polling firms assert their polls were "the most accurate" they enable journalists to be the Supreme Court for surveys, ruling on the false distinction between polls that "got it right" and "got it wrong."

If a pre-election poll doesn't nail the election results, don't blame the pollsters. Blame the voters.


Like any survey, pre-election polls measure opinions on the days that polltakers conduct their fieldwork. Even a poll finished an hour before voting places open can fail to foresee an election outcome. Some people change their minds when they have a ballot in their hand.

Another reason polls can't predict election results is that not everyone votes. A charity has a list of donors. A union has a list of members. Everyone polled from a charity's list is a donor. Everybody surveyed from a union list is in the union. But no list of the people who will actually vote exists until after the election, so polltakers can't be certain their samples are exact miniature replicas of the people who vote.

There is another explanation why polls and election results disagree. In a 1982 article in Public Opinion (an American Enterprise Institute journal), researchers William Schneider and I.A. Lewis asked, "Do people lie to pollsters?" The answer is yes.

Analyzing U.S. post-election polls from 1966 through 1978, Schneider and Lewis found that the respondents' reported turnout was 5 to 9 percentage points higher than the actual turnout. Post-election polls in Canada show the same phenomenon: 70% or more claim they voted, 10 to 20 percentage points higher than the turnout.

Researchers have conducted experiments asking respondents their impression of fictitious names. In a poll in Germany (cited in the Public Opinion article) a nonexistent cabinet minister ranked ahead of 10 real ministers.

We lie because we want to convince ourselves and others that we practice desirable social behavior.


In a survey conducted by the Harris Poll in October of Canadian credit card users, 68% said they intend to shop for holiday gifts at businesses owned by women, and 66% said they would shop at Black-owned businesses. That two-thirds would check the ownership of the retailer before they buy is commendable — and unbelievable.

Perhaps the U.S. polls under-counted Trump's support because many of his voters were embarrassed to confess their true feelings to an interviewer. There is evidence of this in a Leger poll in December where one-quarter of his voters said they don't "like Donald Trump personally." Other Trump voters may refuse to answer surveys because he incites distrust of the media, which sponsor many polls.

What should pollsters do to stop the damage to their reputation from "wrong polls"?

  • Remind the news media that a poll isn't a census. Poll results are estimates. A poll poses questions to a sample of the population, not everyone.
  • Explain that a poll's margin of error refers only to possible errors resulting from skews in the sample (too many older respondents, too few young men or women). There is no way to know the range of possible errors from question wording that influences the answers or asking questions in a particular order.
  • Be modest. If polling companies stop saying "our poll was most accurate" they won't have to defend their profession from taunts that polls were wrong.
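The sampling margin of error mentioned in the second point comes from a standard formula for a proportion estimated from a simple random sample. A minimal Python sketch (the function name is mine; it covers sampling error only, not the wording or ordering effects the article notes):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate margin of sampling error for a proportion p
    observed in a simple random sample of size n, at roughly
    95% confidence (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical national poll of 1,000 respondents showing a candidate at 52%:
moe = margin_of_error(0.52, 1000)
print(f"±{moe * 100:.1f} points")  # about ±3.1 points
```

The formula says nothing about errors from leading question wording, question order, or respondents who mislead interviewers — which is exactly the limitation the bullet describes.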

Like disclaimers and warning labels on medicines, every media story on an election poll should include a caveat like this:

"The poll results in this article are not intended to predict the election outcome. To do that would require being able to identify who is going to vote. Surveys cannot do that. People who actually vote will differ in ways that are impossible to predict, and opinions can change between the time the poll was conducted and the election."

(Thanks for this suggestion to Lance M. Pollack, a University of California-San Francisco statistician.)

The U.S. election proves polls matter even if they aren't forecasts. The streams of data they produced enable pundits to read the American mind in depth the way nothing else can. The polling industry can honestly boast, if a poll didn't tell you, how else would you know?



Marc Zwelling is the founder of the Vector Poll™ and author of Public Opinion and Polling For Dummies (Wiley, 2012). This article was first published in Ontario News Watch. Posted: January 10, 2021

