When should you be suspicious of poll results? Can election polls really influence voter behavior? Do people lie to pollsters? Public opinion expert Dahlia Scheindlin has the answers.
Public opinion polls get things wrong, and not only in Israel. They failed to predict the 2016 American presidential election, Brexit, the U.K. elections in 2015, and the exit polls in the 2015 Israeli election. But anger at polls — and pollsters — can sometimes feel like misplaced anger at election results themselves. It’s as if disappointed voters believe, paradoxically, that if only pollsters had predicted Trump’s victory, the result might somehow have been different.
Social science is an extremely fallible endeavor. The great scholar of democracy and public opinion, V.O. Key, once called political advertising people “insecure, ulcer-ridden hucksters,” and this is a species with which I feel some kinship. We are paid on the premise that human behavior can be measured and predicted with rule-governed formulas, in the desperate belief that if we crack the code we can change voter decisions.
But human decision-making excels at not following rules. Further, both candidates and readers tend to forget that social or other scientists impose personal, methodological, and social biases on our reading of the world, even if unintentionally. All this can make it hard to understand what we can and can’t know from polls.
In general, I advise being a critical consumer of any kind of research, not only when voting but also when deciding what to eat, how much to sleep, which medicines to take, or how to understand the sides of a conflict or the attitudes of our compatriots.
Therefore, what follows is an attempt to answer some common complaints I hear about polls, and tips for how to get the most out of them instead. I’ll focus on the current Israeli elections, but hopefully the ideas are applicable to polls about politics in general.
There are three points at which polls can go awry: how the survey is designed and conducted, how it is reported, and how it is interpreted (by you, the readers). All of these should be considered when deconstructing the following complaints.
‘Israeli polls are of terrible quality’
As with anything, there is high-quality and low-quality polling. The only way to distinguish between them is transparency. Even a perfectly executed poll is worthless if its methodology is not made available.
Any media report or poll publication should include sample size, means of outreach (phone, internet, mobile), margin of error, and a description of the sample population (“all voting age adults,” or, in Israel, whether a poll is “representative of Israelis” – meaning Jews and Arabs – or only one social or ethnic group). Field dates are critical, since current events can influence responses. The identity of both the data collectors and the commissioning body must be stated.
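That margin of error follows directly from the sample size. As a rough sketch of the standard calculation (assuming simple random sampling and a 95 percent confidence level; the 500-respondent sample below is just an illustration, though it is a typical size for a media poll):

```python
import math

def margin_of_error(sample_size: int, proportion: float = 0.5,
                    z: float = 1.96) -> float:
    """Approximate 95% margin of error for a simple random sample.

    proportion=0.5 is the worst case, which is what published
    margins of error usually assume.
    """
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

# A hypothetical poll of 500 respondents:
print(round(margin_of_error(500) * 100, 1))  # ≈ 4.4 percentage points
```

Note that halving the margin of error requires roughly quadrupling the sample size, which is one reason small, cheap samples should raise an eyebrow.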
If the methodology isn’t accessible, it’s safe to assume there’s something to hide: a poor or paltry sample, unscrupulous data collection, or a client with a known bias. Chuck the poll.
A little-known 2002 amendment to Israel’s campaign law mandates that public polls for media outlets must be sent to the Central Election Committee and made available on its website, including full survey information. It’s a very helpful resource, but with some flaws that we’ll examine further on.
‘People lie in polls’
This is a terribly negative view of human nature. A more accurate critique is that people are not so much cunning as conformist. We have a normative bias: people may prefer to state political preferences they believe society finds acceptable, and fear expressing opinions they believe will make them look bad, resulting in social isolation.
The German political scientist Elisabeth Noelle-Neumann developed her “Spiral of Silence” theory around this idea in the 1970s, observing what might be considered an early version of political correctness. Her research was developed in Germany and the theory itself might well have a cultural bias.
But Israelis are certifiably the least politically correct species on earth. In the days of phone surveys, I used to listen in to monitor the quality of interviewing. There was no mistaking the pride people took in telling their voter choice — there’s no “shy Tory” syndrome here. It’s true that Israeli electoral behavior guru Asher Arian once related that in the socialist and collectivist Israel of the 1960s and 70s, respondents wouldn’t admit to being right wing in surveys. And when I first began polling in 1999, we knew that Shas was regularly under-represented in surveys because samples showed much lower numbers than their actual Knesset strength. It was reasonable to artificially “weight up” the number of Shas voters to match the empirical results.
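“Weighting up” a group like that is standard post-stratification: each respondent gets a weight equal to their group’s share of the population divided by its share of the sample. A minimal sketch, with invented group labels and shares:

```python
def weight_responses(responses, sample_shares, population_shares):
    """Attach a post-stratification weight to each response.

    responses: list of dicts, each with a 'group' key.
    sample_shares / population_shares: group -> share of the raw
    sample / of the target population.
    """
    return [
        {**r, "weight": population_shares[r["group"]] / sample_shares[r["group"]]}
        for r in responses
    ]

# Hypothetical numbers: a group that is 10% of the electorate but
# only 5% of the raw sample gets weight 2.0; the rest are scaled down.
sample = [{"group": "A"}, {"group": "B"}]
weighted = weight_responses(sample,
                            sample_shares={"A": 0.05, "B": 0.95},
                            population_shares={"A": 0.10, "B": 0.90})
print(weighted[0]["weight"])  # 2.0
```

Weighted tallies then count each response by its weight rather than as one vote, which is how a pollster can correct for a group that systematically under-reports.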
Nowadays, if anything, the left is shy. But mostly, the Knesset reflects the basic breakdown of ideology in public opinion fairly accurately. At least as far as parties are concerned, the instability of the system appears to be a far greater factor than poor responses in surveys.
For example, a party that’s underrepresented in a survey may have genuinely collapsed. In March 2018, Orly Levy formed a new party and polled at about five seats for a spell. In focus groups people expressed genuine appreciation for her. But the situation changed; with five parties competing on the non-Orthodox right, there aren’t enough votes to go around, so polls putting her below the threshold now are probably right. With so many new parties, breakups, mergers, or parties dropping out, the surveys aren’t necessarily wrong — it’s the political conditions that keep shifting.
For these reasons, I don’t see evidence that there are sufficient liars to skew the numbers in any consistent way. I do hear many Israelis express despair, confusion, and anger at the vagaries of their politicians, leading to indecision that may last until election day. When interpreting polls, I advise compassion rather than suspicion.
‘No poll predicted Lapid in 2013, the Pensioners party in 2006, Shinui in 2003, or Shas in 1999’
True. Many of these parties reflected what I have come to expect as the electoral surprise in most Israeli elections. There tend to be seven to nine seats that go somewhere pollsters didn’t predict. What’s the anatomy of these surprises?
These appear to be protest votes. They are people who are sick of the old parties, looking for something new, or excited by an old party newly revived. Furthermore, they tend to gravitate towards the center rather than the far left or the far right (Shinui, Pensioners, Lapid). The exception was 1999, when Shas was the recipient of the protest vote, and rose seven seats with the support of voters thought to be far outside the Mizrahi ultra-Orthodox or Orthodox base – including religiously traditional or secular voters, too. In 2003, Lapid’s dad, Tommy, increased Shinui’s vote by nine seats. In 2006 the brand new Pensioners’ party won seven seats. Yesh Atid was new in 2013 and won six to seven seats more than expected (19, compared to 12 or 13 in the final week of polls).
The number of surprise seats represents about 5-7 percent of voters — often very close to the number of undecided people in surveys by the final days of the campaign. What are we supposed to do about these undecided voters?
Every good election survey, anywhere, asks people whom they’ll vote for, then ought to follow up with the declared undecideds by asking which way they are “leaning.” That whittles the numbers down to “hard” undecideds – people who really can’t choose. A serious analysis doesn’t stop there, but should cross-tabulate the hard undecideds’ ideological self-definition (in my experience, they are disproportionately “centrist” or “don’t know”), look at their past vote and candidate or party favorability, and then create a good model for allocating those votes to existing parties based on that information.
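That allocation step can be sketched in code. The party labels, counts, and propensity scores below are invented placeholders; in practice the propensities would come from the cross-tabs just described (ideology, past vote, favorability):

```python
def allocate_undecided(decided, undecided_count, propensities):
    """Project final vote shares by allocating hard undecideds.

    decided: party -> number of decided respondents.
    undecided_count: number of hard-undecided respondents.
    propensities: party -> modeled probability an undecided voter
                  breaks that way; must sum to 1.
    """
    total = sum(decided.values()) + undecided_count
    return {
        party: (decided[party] + undecided_count * propensities[party]) / total
        for party in decided
    }

# Hypothetical sample: 93 decided respondents plus 7 hard undecideds
# who, per the cross-tabs, lean heavily toward the centrist party.
projection = allocate_undecided(
    {"center": 30, "right": 40, "left": 23},
    undecided_count=7,
    propensities={"center": 0.6, "right": 0.25, "left": 0.15},
)
```

Distributing the undecideds evenly, by contrast, would hand each party the same fixed bonus regardless of whom those voters actually resemble.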
I suspect that in many media polls, budgets and deadlines don’t allow numerous additional questions like that. So instead of allocating intelligently, the pollsters distribute the undecided voters evenly across the parties, or simply drop them (which implicitly allocates them in proportion to the decided vote). Either shortcut is a good way to get it wrong.
Unfortunately most media polls aren’t very transparent about what they do with the undecided voters. Even the Central Election Committee survey database doesn’t show the methods for probing the undecided, making it hard to know which polls to take seriously.
‘But the last poll showed X party winning’
This one’s on you, readers. We, too, have a responsibility for how we interpret polls, and it involves checking our biases while applying sound statistics.
Start with the latter. My rule of thumb for pretty much any research consumption is: never trust one study. Repeatability is key to any scientific experiment. Luckily, there is no lack of repetition in Israeli election studies.
The second rule is to focus on trends, range, and averages. You don’t need to be Nate Silver to look back over polls and see if a party is regularly going up, down, or staying the same. Likud, for example, has almost never dropped below 25 seats or broken 35 in this election cycle. That’s a pretty stable trend with a 10-point range and a 30-seat average. Barring major new trends, it’s safe to guess that Likud will come close to that.
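You really don’t need to be Nate Silver; the whole exercise is a few lines of arithmetic. A sketch, using a made-up sequence of seat projections for one party:

```python
def summarize(polls):
    """Low, high, range, and average of a party's seat numbers."""
    return {
        "low": min(polls),
        "high": max(polls),
        "range": max(polls) - min(polls),
        "average": sum(polls) / len(polls),
    }

def rolling_average(polls, window=3):
    """Simple moving average, to expose the trend across polls."""
    return [sum(polls[i:i + window]) / window
            for i in range(len(polls) - window + 1)]

# Made-up seat projections across successive polls:
seats = [29, 31, 28, 33, 30, 29]
print(summarize(seats))
print(rolling_average(seats))
```

A flat rolling average with a narrow range says “stable”; a rolling average that climbs poll after poll is the kind of momentum described below for Gantz.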
A few weeks ago, Gantz had momentum on his side, going from zero to the 20-seat range in his first few weeks. That was a clear upward trend but it stabilized. Joining with Lapid gave the party a bounce, putting them above the combination of 20 plus Lapid’s current 11 seats – polls from the last week have Blue and White at around 35-37 seats. But a bounce sometimes goes back down. The trend of polls over the next week should be taken much more seriously than any single poll showing 37 seats. For now.
Finally, don’t fall into the liberal American pundit trap of 2016, when no one wanted to believe Trump could win. Existential fears are legitimate but not sound tools for interpreting surveys. It’s best to try and disassociate emotionally for a clearer reading.
Can polls influence voting behavior?
Absolutely. That’s one reason for the “blackout” period mandated by Israeli election law — usually about three days before the elections when polls are not allowed to be published. (Polls can still be conducted for internal use by campaigns.)
We are all guilty of being influenced by polls. Knowing if a party is close to the threshold helps us decide whether to support it or not. Knowing an election is close may motivate people to turn out to help tip the balance. Knowing that polls show a potential victory may cause others to join the victory vote (if that’s the side of the map they support). For this reason, asking interviewees “which party/candidate do you think will win” is important to track.
And is that really so bad? Humans, after all, are social creatures, and that helps us survive. Our governments, for better or worse, at least in relatively democratic systems, reflect who we really are.