27.3 Forming Good and Bad Decisions and Judgments

27-3 What is intuition, and how can the availability heuristic, overconfidence, belief perseverance, and framing influence our decisions and judgments?

When making each day’s hundreds of judgments and decisions (Is it worth the bother to take a jacket? Can I trust this person? Should I shoot the basketball or pass to the player who’s hot?), we seldom take the time and effort to reason systematically. We just follow our intuition, our fast, automatic, unreasoned feelings and thoughts. After interviewing policy makers in government, business, and education, social psychologist Irving Janis (1986) concluded that they “often do not use a reflective problem-solving approach. How do they usually arrive at their decisions? If you ask, they are likely to tell you … they do it mostly by the seat of their pants.”

The Availability Heuristic

When we need to act quickly, the mental shortcuts we call heuristics enable snap judgments. Thanks to our mind’s automatic information processing, intuitive judgments are instantaneous. They also are usually effective (Gigerenzer & Sturm, 2012). However, research by cognitive psychologists Amos Tversky and Daniel Kahneman (1974) showed how these generally helpful shortcuts can lead even the smartest people into dumb decisions. The availability heuristic operates when we estimate the likelihood of events based on how mentally available they are—how easily they come to mind. Casinos entice us to gamble by signaling even small wins with bells and lights—making them mentally vivid—while keeping big losses invisible.

“Kahneman and his colleagues and students have changed the way we think about the way people think.”

American Psychological Association President Sharon Brehm, 2007

The availability heuristic can distort our judgments of other people, too. Anything that makes information pop into mind—its vividness, recency, or distinctiveness—can make it seem commonplace. If someone from a particular ethnic or religious group commits a terrorist act, as happened on September 11, 2001, our readily available memory of the dramatic event may shape our impression of the whole group.

“In creating these problems, we didn’t set out to fool people. All our problems fooled us, too.”

Amos Tversky (1985)

“Intuitive thinking [is] fine most of the time…. But sometimes that habit of mind gets us in trouble.”

Daniel Kahneman (2005)


Even during that horrific year, terrorist acts claimed comparatively few lives. Yet when the statistical reality of greater dangers (see FIGURE 27.6) was pitted against the 9/11 terror, the memorable case won: Emotion-laden images of terror exacerbated our fears (Sunstein, 2007).

Figure 27.6
Risk of death from various causes in the United States, 2001 (Data assembled from various government sources by Randall Marshall et al., 2007.)

“Don’t believe everything you think.”

Bumper sticker

We often fear the wrong things (see Thinking Critically About: The Fear Factor below). We fear flying because we visualize air disasters. We fear letting our sons and daughters walk to school because we see mental snapshots of abducted and brutalized children. We fear swimming in ocean waters because we replay Jaws with ourselves as victims. Even just passing by a person who sneezes and coughs heightens our perceptions of various health risks (Lee et al., 2010). And so, thanks to such readily available images, we come to fear extremely rare events.

THINKING CRITICALLY ABOUT: The Fear Factor—Why We Fear the Wrong Things

27-4 What factors contribute to our fear of unlikely events?

After the 9/11 attacks, many people feared flying more than driving. In a 2006 Gallup survey, only 40 percent of Americans reported being “not afraid at all” to fly. Yet from 2009 to 2011, Americans were—mile for mile—170 times more likely to die in a vehicle accident than on a scheduled flight (National Safety Council, 2014). In 2011, 21,221 people died in U.S. car or light truck accidents, while none died on scheduled airline flights (as was also true in 2010). When flying, the most dangerous part of the trip is the drive to the airport.

In a late 2001 essay, I [DM] calculated that if—because of 9/11—we flew 20 percent less and instead drove half those unflown miles, about 800 more people would die in the year after the 9/11 attacks (Myers, 2001). German psychologist Gerd Gigerenzer (2004, 2006; Gaissmaier & Gigerenzer, 2012) later checked my estimate against actual accident data. (Why didn’t I think to do that?) U.S. traffic deaths did indeed increase significantly in the last three months of 2001 (FIGURE 27.7). By the end of 2002, Gigerenzer estimated, 1,600 Americans had “lost their lives on the road by trying to avoid the risk of flying.”

Figure 27.7
Scared onto deadly highways Images of 9/11 etched a sharper image in American minds than did the millions of fatality-free flights on U.S. airlines during 2002 and after. Dramatic events are readily available to memory, and they shape our perceptions of risk. In the three months after 9/11, those faulty perceptions led more Americans to travel, and some to die, by car. (Data from Gigerenzer, 2004.)

Why do we so often fear the wrong things? Why do so many American parents fear school shootings, when their child is more likely to be killed by lightning (Ripley, 2013)? Psychologists have identified four influences that feed fear and cause us to ignore higher risks.

  1. We fear what our ancestral history has prepared us to fear. Human emotions were road tested in the Stone Age. Our old brain prepares us to fear yesterday’s risks: snakes, lizards, and spiders (which, combined, now kill only a tiny fraction of the number of people claimed by modern-day threats such as cars and cigarettes). That same ancestral preparation also leads us to fear confinement and heights, and therefore flying.
  2. We fear what we cannot control. Driving we control; flying we do not.
  3. We fear what is immediate. The dangers of flying are mostly telescoped into the moments of takeoff and landing. The dangers of driving are diffused across many moments to come, each trivially dangerous.
  4. Thanks to the availability heuristic, we fear what is most readily available in memory. Vivid images, like that of United Flight 175 slicing into the World Trade Center, feed our judgments of risk. Thousands of safe car trips have extinguished our anxieties about driving. Shark attacks kill about one American per year, while heart disease kills 800,000—but it’s much easier to visualize a shark bite, and thus many people fear sharks more than cigarettes (Daley, 2011). Similarly, we remember (and fear) widespread disasters (hurricanes, tornadoes, earthquakes) that kill people dramatically, in bunches. But we fear too little the less dramatic threats that claim lives quietly, one by one, continuing into the distant future. Horrified citizens and commentators renewed calls for U.S. gun control in 2012, after 20 children and 6 adults were slain in a Connecticut elementary school—although even more Americans are murdered by guns daily, though less dramatically, one by one. Philanthropist Bill Gates has noted that each year a half-million children worldwide die from rotavirus. This is the equivalent of four 747s full of children dying every day, and we hear nothing of it (Glass, 2004).
Dramatic deaths in bunches breed concern and fear The memorable 2010 Haitian earthquake that killed some 250,000 people stirred an outpouring of justified concern. Meanwhile, according to the World Health Organization, a silent earthquake of poverty-related malaria was killing about that many people, mostly in Africa, every four months.

The news, and our own memorable experiences, can make us disproportionately fearful of infinitesimal risks. As one risk analyst explained, “If it’s in the news, don’t worry about it. The very definition of news is ‘something that hardly ever happens’” (Schneier, 2007).

“Fearful people are more dependent, more easily manipulated and controlled, more susceptible to deceptively simple, strong, tough measures and hard-line postures.”

Media researcher George Gerbner to U.S. Congressional Subcommittee on Communications, 1981

RETRIEVAL PRACTICE

  • Why can news be described as “something that hardly ever happens”? How does knowing this help us assess our fears?

If a tragic event such as a plane crash makes the news, it is noteworthy and unusual, unlike much more common bad events, such as traffic accidents. Knowing this, we can worry less about unlikely events and think more about improving the safety of our everyday activities. (For example, we can wear a seat belt when in a vehicle and use the crosswalk when walking.)

To offer a vivid depiction of climate change, Caltech scientists created an interactive map of global temperatures over the past 120 years (see www.tinyurl.com/TempChange).

Meanwhile, the lack of comparably available images of global climate change—which some scientists regard as a future “Armageddon in slow motion”—has left many people relatively unconcerned (Pew, 2014). What’s more cognitively available than slow climate change is our recently experienced local weather, which tells us nothing about long-term planetary trends (Egan & Mullin, 2012; Zaval et al., 2014). Unusually hot local weather increases people’s worry about global warming, while a recent cold day reduces their concern and overwhelms less memorable scientific data (Li et al., 2011). After Hurricane Sandy devastated New Jersey, its residents’ vivid experience of extreme weather increased their environmentalism (Rudman et al., 2013).

Dramatic outcomes make us gasp; probabilities we hardly grasp. As psychologist Paul Slovic (2007) points out, we reason emotionally and neglect probabilities. We overfeel and underthink. In one experiment, donations to a starving 7-year-old were greater when her image was not accompanied by statistical information about the millions of needy African children like her (Small et al., 2007). “The more who die, the less we care,” noted Slovic (2010). Vividness can, however, be put to good use: As of 2013, some 40 nations—including Canada, many in Europe, and the United States—have sought to harness the positive power of vivid, memorable images by putting eye-catching warnings and graphic photos on cigarette packages (Riordan, 2013). This campaign has worked (Huang et al., 2013).

Overconfidence

Predict your own behavior: When will you finish reading this module?

Sometimes our judgments and decisions go awry simply because we are more confident than correct. Across various tasks, people overestimate their performance (Metcalfe, 1998). If 60 percent of people correctly answer a factual question, such as “Is absinthe a liqueur or a precious stone?,” they will typically average 75 percent confidence (Fischhoff et al., 1977). (It’s a licorice-flavored liqueur.) This tendency to overestimate the accuracy of our knowledge and judgments is overconfidence.


It was an overconfident BP that downplayed safety concerns before its drilling platform exploded and spewed oil into the Gulf of Mexico, and then downplayed the spill’s magnitude (Mohr et al., 2010; Urbina, 2010). It is overconfidence that drives stockbrokers and investment managers to market their ability to outperform stock market averages (Malkiel, 2012). A purchase of stock X, recommended by a broker who judges this to be the time to buy, is usually balanced by a sale made by someone who judges this to be the time to sell. Despite their confidence, buyer and seller cannot both be right.

Overconfidence can also feed extreme political views. People with a superficial understanding of proposals for cap-and-trade carbon emissions or a national flat tax often express strong pro or con views. Asking them to explain the details of these policies confronts them with their own ignorance, which in turn leads them to express more moderate views (Fernbach et al., 2013). Sometimes the less people know, the more immoderate they are.

“Hofstadter’s Law: It always takes longer than you expect, even when you take into account Hofstadter’s Law.”

Douglas Hofstadter, Gödel, Escher, Bach: An Eternal Golden Braid, 1979

Classrooms are full of overconfident students who expect to finish assignments and write papers ahead of schedule (Buehler et al., 1994, 2002). In fact, the projects generally take about twice the number of days predicted. We also overestimate our future leisure time (Zauberman & Lynch, 2005). Anticipating how much more we will accomplish next month, we happily accept invitations and assignments, only to discover we’re just as busy when the day rolls around. The same “planning fallacy” (underestimating time and money) appears everywhere. Boston’s mega-construction “Big Dig” was projected to take 10 years and actually took 20. And the average kitchen remodeling project ends up costing about double what homeowners expect (Kahneman, 2011).

“When you know a thing, to hold that you know it; and when you do not know a thing, to allow that you do not know it; this is knowledge.”

Confucius (551–479 B.C.E.), Analects

Overconfidence can have adaptive value. People who err on the side of overconfidence live more happily. They seem more competent than others (Anderson et al., 2012). Moreover, given prompt and clear feedback, as weather forecasters receive after each day’s predictions, we can learn to be more realistic about the accuracy of our judgments (Fischhoff, 1982). The wisdom to know when we know a thing and when we do not is born of experience.

Belief Perseverance

Our overconfidence is startling; equally so is our belief perseverance—our tendency to cling to our beliefs in the face of contrary evidence. One study of belief perseverance engaged people with opposing views of capital punishment (Lord et al., 1979). After studying two supposedly new research findings, one supporting and the other refuting the claim that the death penalty deters crime, each side was more impressed by the study supporting its own beliefs. And each readily disputed the other study. Thus, showing the pro- and anti-capital-punishment groups the same mixed evidence actually increased their disagreement.

To rein in belief perseverance, a simple remedy exists: Consider the opposite. When the same researchers repeated the capital-punishment study, they asked some participants to be “as objective and unbiased as possible” (Lord et al., 1984). The plea did nothing to reduce biased evaluations of evidence. They also asked another group to consider “whether you would have made the same high or low evaluations had exactly the same study produced results on the other side of the issue.” Having imagined and pondered opposite findings, these people became much less biased.

The more we come to appreciate why our beliefs might be true, the more tightly we cling to them. Once we have explained to ourselves why we believe a child is “gifted” or has a “specific learning disorder,” we tend to ignore evidence undermining our belief. Once beliefs form and get justified, it takes more compelling evidence to change them than it did to create them. Prejudice persists. Beliefs often persevere.


The Effects of Framing

Framing—the way we present an issue—sways our decisions and judgments. Imagine two surgeons explaining a surgery risk. One tells patients that 10 percent of people die during this surgery. The other says that 90 percent survive. Although the information is the same, the effect is not. Both patients and physicians perceive greater risk when they hear that 10 percent die (Marteau, 1989; McNeil et al., 1988; Rothman & Salovey, 1997).

Similarly, 9 in 10 college students rated a condom as effective if told it had a supposed “95 percent success rate” in stopping HIV. Only 4 in 10 judged it effective when told it had a “5 percent failure rate” (Linville et al., 1992). To scare people even more, frame risks as numbers, not percentages. People told that a chemical exposure was projected to kill 10 of every 10 million people (imagine 10 dead people!) felt more frightened than did those told the fatality risk was an infinitesimal .000001 (Kraus et al., 1992).


Framing can be a powerful persuasion tool. Carefully posed options can nudge people toward decisions that could benefit them or society as a whole (Benartzi & Thaler, 2013; Thaler & Sunstein, 2008).


The point to remember: Those who understand the power of framing can use it to nudge our decisions.

The Perils and Powers of Intuition

27-5 How do smart thinkers use intuition?

The perils of intuition—cloudy judgments and illogical reasoning—feed gut fears and prejudices. Irrational thinking can persist even when people are offered extra pay for thinking smart, even when they are asked to justify their answers, and even when they are expert physicians or clinicians (Shafir & LeBoeuf, 2002). Highly intelligent people (including U.S. federal intelligence agents in one study) are similarly vulnerable to these thinking errors (Reyna et al., 2013; Stanovich et al., 2013). Even very smart people can make not-so-smart judgments.

So, are our heads indeed filled with straw? Good news: Cognitive scientists are also revealing intuition’s powers.

The bottom line: Our two-track mind makes sweet harmony as smart, critical thinking listens to the creative whispers of our vast unseen mind, and then evaluates evidence, tests conclusions, and plans for the future.