Poll-dancing
Political polls simulate democracy—but often capture and convey the wrong things.
Americans are poll-obsessed. Which pollsters rank highest for accuracy and transparency so far this election year? Check out our brief guide to help you tell the good polls from the bad and navigate what’s ahead.
NEW YORK — Okay. Full disclosure. I’m a bit of a data nerd, and I’m addicted to this election year’s public opinion polls.
With autocracy on the ballot, my obsession with political polling has never been more pronounced. And now the amped-up, wildly unprecedented, age-baiting cage match for the White House is making it hard not to notice my news feed getting flooded, by the hour, with new polls and surveys.
With the presidential election just months away, U.S. voters are being inundated with news about polls telling them who’s ahead, who’s not, and, since Joe Biden’s withdrawal from the race, who else besides J.D. Vance and Kamala Harris may be joining the fray.
Are polls still relevant, or even accurate? Are some too “instant” to matter? How do political polls affect the presidential election and its outcome? How can you tell which polls are more accurate than others, and choose which ones to watch?
Read on.
Has there ever been a presidential election year so dominated (and influenced) by polls?
American historian Jill Lepore says no. From the late 1990s to 2012, she says, more than 1,200 polling organizations conducted nearly 37,000 polls by making more than 3 billion phone calls. This year alone, the number of polling organizations has doubled, as has the number of polls being conducted.
Has there ever been a presidential election year with as much misinformation swirling, as many questions circulating about both candidates’ mental acuity, and as many AI-generated “deep fakes” darting around to skew the public’s opinion of one candidate over the other, with no one held accountable for false framing? Has there ever been a contest for the White House with as many public doubts about the candidates’ ability to lead?
No. Campaigns have always been a bit nasty, but this year’s penchant for turning that ‘nasty’ into thousands of high-speed, true-and-false “attack” videos made on the cheap and spread across social media in a heartbeat is something very new. Generative AI only recently burst onto the scene, and media distribution has become nearly instant, enabling sophisticated, tech-driven campaign ‘war rooms’ to react in near real time to each turn of events, large and small.
Doctored GOP videos of President Biden, along with hundreds more highlighting Biden’s actual communication missteps in his debate against Trump, are still being shared by the thousands, and globally. Short videos of Trump’s most blatant lies about Biden, and now Harris, have become daily bulletins. Fact-checkers at CNN and NBC News say they have had trouble keeping up. [“The lies are sprinting the 100-meter dash and the fact-checking is taking a stroll on the beach, so it’s never going to catch up,” said one Democratic strategist.]
I’m not alone in my poll fixation.
A new Pew Research Center poll says that while a majority of Americans (62%) are “worn out” by 2024 presidential campaign coverage, 73% are also closely following coverage of Vice President Kamala Harris and Donald Trump, and will continue to do so, with 40% of them following the polls “very closely.”
“General election polls are blaring like sirens,” says The New York Times’ opinion columnist Michelle Goldberg, “with no end in sight. … The public’s demand for polls guarantees a robust supply of them this year.”
Voters Beware
But voters, beware. “A lot of instant Internet polls are pure publicity malarkey, which are difficult for most voters to distinguish from so-called scientific polls, and they’re sowing confusion, and making even good polls into bad polls, by influencing their results,” says historian Lepore. In other words, this year’s new breed of polling isn’t just helping us understand the atmosphere of an election. Instant polling can mislead us in many ways if we’re not careful, not up to speed on the issues and candidates, and not familiar with some of the basics of how polls are conducted.
Here’s a brief guide to help you navigate the season and distinguish the good polls from the bad, something to remember after the recent Trump-Biden debate: what we see happening isn’t always the same as what people feel about what they just saw.
Inaccuracies. Many polls got it wrong in 2016. Hillary Clinton was not a “shoo-in.” [Trump won, not Clinton.] In 2020, Biden won, not Trump, and in 2022, there was no “red wave.” Historically, even before then, pollsters often weren’t getting it right. In 1948, George Gallup famously, and wrongly, predicted Dewey would win over Truman; in 1952, Dwight Eisenhower defeated Adlai Stevenson by a far wider margin than pollsters predicted, with dozens more famous pollster misses to follow. Gallup, like today’s Nate Silver, liked to say pollsters take the “pulse of democracy,” but the late writer E.B. White advised Gallup against rushing to translate poll results into meaningful conclusions. “Although pollsters can take a nation’s pulse, you can’t be sure that the nation hasn’t just run up a flight of stairs,” he said. The takeaway for us? Don’t take any one poll too seriously. Look for patterns across a variety of polls, as in the sketch below.
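As a small illustration of what “looking for patterns” means in practice, here is a minimal sketch that averages the head-to-head margins from several hypothetical polls; the poll names and numbers are invented, not real survey results.

```python
# A minimal sketch of why averaging beats any single poll: hypothetical
# margins (candidate A minus candidate B, in percentage points) from
# made-up surveys. Any one number can be an outlier; the average and
# the spread together tell a steadier story.
polls = {
    "Poll A": +3.0,
    "Poll B": -1.0,
    "Poll C": +2.0,
    "Poll D": +1.0,
    "Poll E": +4.0,
}

margins = list(polls.values())
average = sum(margins) / len(margins)
spread = max(margins) - min(margins)

print(f"Average margin: {average:+.1f} points")
print(f"Range across polls: {spread:.1f} points")
# With a 5-point range around a +1.8 average, no single poll here
# settles who is ahead; the pattern across all of them does.
```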
Disinformation. Before ChatGPT was released, the University of Cambridge’s Social Decision-Making Laboratory created a new polling tool it called the Misinformation Susceptibility Test (MIST). In collaboration with the pollster YouGov, it used AI-generated headlines to test how susceptible Americans are to AI-generated fake news. One headline was “Certain Vaccines Are Loaded With Dangerous Chemicals and Toxins.” Another was “Government Officials Have Manipulated Stock Prices to Hide Scandals.” YouGov asked those being polled whether they believed the headlines. The results were concerning: 41% of Americans incorrectly thought the vaccine headline was true, and 46% thought the government was manipulating the stock market. By mixing real and AI-generated images to accompany such headlines, politicians can blur the line between fact and fiction and use AI to boost their political attacks. Be on the lookout for fake news. Seeing is no longer believing in a politically divided election year.
Decreasing response rates. The best and most responsible pollsters, whether Democratic, Republican or nonpartisan, want nothing so much as reliable results. Yet with response rates now often in the single digits, getting them is harder than ever. Polling shapes politics. Size and inclusion matter. Most political polls are still useful tools for uncovering public opinion, but some are not accurate representations of it, because those polled don’t always reflect the full mix of registered voters. Does the poll include a good mix of young and old people? A balanced mix of registered Republicans and Democrats? Genders? Races? Incomes? Polls that are inclusive in these ways, or that weight their samples to match the electorate, tend to be more accurate, as the sketch below illustrates.
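To see why inclusion, and the weighting pollsters use to compensate for lopsided samples, matters so much, here is a toy illustration with invented numbers. It is not any pollster’s actual method, just the basic arithmetic of reweighting a skewed sample.

```python
# A toy illustration (invented numbers, not any pollster's method) of why
# sample composition matters: if young voters are 30% of the electorate
# but only 10% of respondents, an unweighted result understates their
# views. Reweighting to match the electorate corrects some of the skew.

# Hypothetical support for a candidate by age group (share saying "yes")
support = {"18-34": 0.60, "35-64": 0.48, "65+": 0.40}

sample_share = {"18-34": 0.10, "35-64": 0.55, "65+": 0.35}      # who answered
electorate_share = {"18-34": 0.30, "35-64": 0.50, "65+": 0.20}  # who votes

unweighted = sum(support[g] * sample_share[g] for g in support)
weighted = sum(support[g] * electorate_share[g] for g in support)

print(f"Unweighted estimate: {unweighted:.1%}")      # 46.4%
print(f"Weighted to the electorate: {weighted:.1%}") # 50.0%
```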
Lack of transparency. What should you look for to know a poll was conducted well? Who conducted it, how and when was it conducted, who was interviewed, and how were they selected? All of these factors are key to the accuracy of the results. But there are challenges. A lot of polls are still conducted by phone, despite the fact that fewer people than ever answer calls in today’s smartphone environment unless they know the person calling. And who’s being polled? Sometimes those queried either know nothing about the matters the poll purports to measure, or express no opinion about them. Such problems, known as non-opinion, forced opinion and exclusion bias, can dramatically skew results. “The first question a pollster should ask but doesn’t do often enough,” the sociologist Leo Bogart famously advised in 1972, is, “Have you thought about this at all? Do you even have an opinion?” Before treating any poll as gospel, check its methodology statement and the list of questions included in its survey findings. Check whether the published poll includes transparency information about its sample size, when it was conducted, a description of respondents and its response rate; sample size alone tells you a lot about the margin of error, as the sketch below shows. The best polls make this information clear and easy for anyone to access.
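Here is a rough, back-of-the-envelope sketch of how sample size translates into the margin of error a transparent poll should report. It assumes a simple random sample, which real polls only approximate, and the sample sizes below are illustrative rather than drawn from any particular poll.

```python
# Back-of-the-envelope margin of error for a simple random sample:
# roughly 1.96 * sqrt(p * (1 - p) / n) at 95% confidence. Real polls
# use weighting and more elaborate error estimates, but the rule of
# thumb shows why sample size belongs in any methodology statement.
import math

def margin_of_error(n: int, p: float = 0.5) -> float:
    """Approximate 95% margin of error, in percentage points."""
    return 1.96 * math.sqrt(p * (1 - p) / n) * 100

for n in (400, 1000, 2500):
    print(f"n = {n:>4}: about +/- {margin_of_error(n):.1f} points")
# n =  400: about +/- 4.9 points
# n = 1000: about +/- 3.1 points
# n = 2500: about +/- 2.0 points
```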
Sponsor bias and polarization. Sponsors of polls can meddle by making questionable tradeoffs between attention and accuracy. Not only are many polls commissioned by partisan groups with obvious biases (some bordering on disinformation), some sponsors seek provocative polling results and use them to gain the stature and expert academic authority they lack. Sponsor bias is especially pronounced in political campaign polling, with results often differing by 3 or 4 percentage points or more, beyond the margin of error, says Silver. Polarization is a relatively new challenge for pollsters trying to design a fair poll.
Wording bias. What you hear from respondents depends on whom you ask, and on how you phrase the questions. Framing matters. Do you let people answer as they like, or limit them to one of two pre-set answers, controlling the options rather than listening for new insights? Example: someone asks whether you prefer cats or dogs. What if you prefer neither, but have a strong preference for parakeets? Limiting responses tends to produce inaccurate results, or certain kinds of results that can be misleading, as the sketch below suggests. How questions are phrased can manipulate the truth, or keep pollsters from finding it in the first place.
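Here is a toy example, with invented numbers, of how a forced two-way choice can manufacture a tidy result while erasing what respondents actually think.

```python
# A toy illustration (invented numbers) of forced-choice distortion:
# a two-option question has nowhere to record the "neither" group.
true_prefs = {"cats": 0.35, "dogs": 0.40, "neither": 0.25}

# Forced to pick one of two answers, the "neither" group has to land
# somewhere; assume here it splits evenly between the listed options.
forced = {
    "cats": true_prefs["cats"] + true_prefs["neither"] / 2,
    "dogs": true_prefs["dogs"] + true_prefs["neither"] / 2,
}

print("With a 'neither' option:", true_prefs)
print("Forced two-way choice:  ", forced)
# The forced version reports 47.5% vs. 52.5% and erases the quarter of
# respondents who actually wanted neither.
```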
Assumptions about voter turnout and crowd sizes. Comparing new data to historical data is fine, but it often supports only rough generalizations. Example: how many people attended Trump’s inauguration ceremony versus Obama’s? Pollsters can only extrapolate from the turnout rates of previous years, or, in this case, compare aerial photographs of both crowds. [To rebut Trump’s claim that his inaugural crowd on the National Mall was larger than Obama’s, aerial photographs of both ceremonies were made public.] And beware of polls or news stories labeling anything a “record crowd.” The last couple of election cycles have cited “record turnouts” on both sides of the aisle, especially among younger voters, but with fuzzy or incomplete data to back up the specifics, which makes historical comparisons even less reliable. Remember: voter turnout is the number of people who voted in an election, not a crowd count of those attending political events.
Polling the pollsters
Which pollsters rank the best in terms of credibility, accuracy, transparency and inclusive reach and methodology?
According to 538’s pollster ratings, the pollsters at the top of the 2024 rankings for credibility, accuracy and transparency include:
ABC News/The Washington Post
University of Massachusetts Lowell Center for Public Opinion
What’s your take on what to watch for when ranking pollsters this election year? Please share! And don’t forget to vote. Every vote counts.
This post was updated July 26th to reference Biden’s withdrawal from the 2024 presidential race and Vice President Kamala Harris’ announced bid for the White House.