Election results
Many voters believed that the major U.S. political parties offered them two disappointing choices for President this year.
Although their reasons for disparaging their options varied from voter to voter, a rough consensus emerged about the candidates’ relative riskiness. One had served in public office for decades; the other had never worked in government at all. At least in terms of familiarity with the job, the first seemed safe, the second risky.
But in the Electoral College, the “safe” one lost, and the “risky” one prevailed.
Why?
You’ll have many thoughts on how to answer that question. You can find plenty of opinions anywhere you look.
But as a trial lawyer, I have a particular interest in one possible, if partial, explanation:
Does a cognitive tendency in our brains push us towards risk (or perhaps away from it) when we face a set of alternatives that we perceive as negative?
Cognitive bias?
A book I read this year, Thinking, Fast and Slow (2011), suggested an intriguing answer.
As the Nobel Prize-winning psychologist Daniel Kahneman noted in the book, decision-makers (with my italics) “tend to prefer the sure thing over the gamble (they are risk averse) when the outcomes are good.” But “when both outcomes are negative”, they tend to do the opposite: they “reject the sure thing and accept the gamble (they are risk seeking)”.
Flu study
Kahneman used a study to illustrate both concepts: risk aversion in the face of good options, and risk seeking in the face of bad ones. The study gauged participants’ reactions to information about possible ways to combat an impending flu epidemic. It told them that, unless authorities intervened, the flu’s onslaught would kill 600 people.
Participants then chose between two possible responses, Program A and Program B. Program A, the study said, “will” save 200 of the 600 people. Program B, on the other hand, had a one-third “probability” of saving all 600 and a two-thirds “probability” of saving none.
Although both programs promised exactly the same expected outcome (200 would live, and 400 would die), a large majority opted for Program A. The framing of the alternatives made the difference. Whereas Program A made saving 200 people’s lives a certainty, Program B described the same result in terms of “probability”, emphasizing risk.
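To see why the two programs are equivalent, it helps to write out the expected values. Program A saves 200 lives with certainty; Program B saves, on average, exactly the same number:

E[lives saved under Program A] = 200
E[lives saved under Program B] = (1/3)(600) + (2/3)(0) = 200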
Seeking risk
Kahneman did not stop there. The study went on to ask participants to consider a Program A that made 400 deaths a sure thing and a Program B that presented a one-third probability of zero deaths and a two-thirds probability of a complete wipe-out.
Although the expected result was precisely the same under both formulations of Programs A and B, the second framing prompted participants to choose the risky-sounding Program B. Far more of them chose to gamble, hoping that the less likely event (a one-third chance that all 600 would survive) would occur.
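The arithmetic in this loss frame mirrors the first version exactly. Program A makes 400 deaths certain; Program B produces, on average, the same number of deaths:

E[deaths under Program A] = 400
E[deaths under Program B] = (1/3)(0) + (2/3)(600) = 400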
Since all of the scenarios described the same expected end-state (200 live, and 400 die), why did participants in the study make such different choices?
As Kahneman said, decision-makers “tend to prefer the sure thing over the gamble (they are risk averse) when the outcomes are good”, but they “tend to reject the sure thing and accept the gamble (they are risk seeking) when both outcomes are negative.”
People play it safe when they choose between two good alternatives, and they gamble when they choose between two bad ones. We humans have a cognitive bias that leads us, unconsciously, to shun risk when we expect good results and don’t want to lose them, and to court risk when we expect a bad outcome and would rather defy the odds.
Election insight
What does that say about the election?
It suggests that voters who liked neither of their main options but voted for one of them anyway tended to pick the choice that seemed more risky.
In a very close election, did that make the difference?
I’ll leave that to others to decide.
But I do urge you trial lawyers out there to read Kahneman’s book. Thinking, Fast and Slow offers a wealth of insights into the cognitive tendencies, glitches, and biases that unconsciously but predictably shape how people perceive the world and make decisions. Kahneman explores the importance of framing, why losing something hurts roughly twice as much as gaining the same thing, and many other matters that may help you persuade your audiences more effectively.
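As a rough sketch of that last point (the notation here is mine, not the book’s): Kahneman and Tversky estimated a loss-aversion coefficient of roughly 2, meaning a loss is felt about twice as strongly as an equal gain:

felt value of gaining $100 ≈ +100
felt value of losing $100 ≈ 2 × (−100) = −200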
What do you have to lose?