25.3 Forming Good and Bad Decisions and Judgments

25-3 What is intuition, and how can the availability heuristic, overconfidence, belief perseverance, and framing influence our decisions and judgments?

intuition an effortless, immediate, automatic feeling or thought, as contrasted with explicit, conscious reasoning.

When making each day’s hundreds of judgments and decisions (Should I take a jacket? Can I trust this person? Should I shoot the basketball or pass to the player who’s hot?), we seldom take the time and effort to reason systematically. We just follow our intuition—our fast, automatic, unreasoned feelings and thoughts. After interviewing policy makers in government, business, and education, social psychologist Irving Janis (1986) concluded that they “often do not use a reflective problem-solving approach. How do they usually arrive at their decisions? If you ask, they are likely to tell you . . . they do it mostly by the seat of their pants.”

The Availability Heuristic



availability heuristic estimating the likelihood of events based on their availability in memory; if instances come readily to mind (perhaps because of their vividness), we presume such events are common.

When we need to act quickly, the mental shortcuts we call heuristics enable snap judgments. Thanks to our mind’s automatic information processing, intuitive judgments are instantaneous. They also are usually effective (Gigerenzer & Sturm, 2012). However, research by cognitive psychologists Amos Tversky and Daniel Kahneman (1974) showed how these generally helpful shortcuts can lead even the smartest people into dumb decisions. The availability heuristic operates when we estimate the likelihood of events based on how mentally available they are—how easily they come to mind. Casinos entice us to gamble by signaling even small wins with bells and lights—making them mentally vivid—while keeping big losses invisible.

“Kahneman and his colleagues and students have changed the way we think about the way people think.”

American Psychological Association President Sharon Brehm, 2007

The availability heuristic colors our judgments of other people, too. Anything that makes information pop into mind—its vividness, recentness, or distinctiveness—can make it seem commonplace. While this generally helps us to evaluate people and situations quickly, it can also distort our perceptions. If someone from a particular ethnic or religious group commits a terrorist act, as happened on September 11, 2001, our readily available memory of the dramatic event may shape our impression of the whole group.

Even during that horrific year, terrorist acts claimed comparatively few lives. Yet when the statistical reality of greater dangers (see FIGURE 25.5) was pitted against the 9/11 terror, the memorable case won: Emotion-laden images of terror exacerbated our fears (Sunstein, 2007).

FIGURE 25.5 Risk of death from various causes in the United States, 2001 (Data assembled from various government sources by Randall Marshall et al., 2007.)

Although our fears often protect us, we sometimes fear the wrong things. (See Thinking Critically About: The Fear Factor below.) We fear flying because we visualize air disasters. We fear letting our sons and daughters walk to school because we see mental snapshots of abducted and brutalized children. We fear swimming in ocean waters because we replay Jaws with ourselves as victims. Even passing by a person who sneezes and coughs can heighten our perceptions of various health risks (Lee et al., 2010). And so, thanks to such readily available images, we come to fear extremely rare events.

“In creating these problems, we didn’t set out to fool people. All our problems fooled us, too.”

Amos Tversky (1985)

“Intuitive thinking [is] fine most of the time. . . . But sometimes that habit of mind gets us in trouble.”

Daniel Kahneman (2005)


THINKING CRITICALLY ABOUT

The Fear Factor—Why We Fear the Wrong Things

25-4 What factors contribute to our fear of unlikely events?

After the 9/11 attacks, many people feared flying more than driving. In a 2006 Gallup survey, only 40 percent of Americans reported being “not afraid at all” to fly. Yet from 2009 to 2011, Americans were—mile for mile—170 times more likely to die in a vehicle accident than on a scheduled flight (National Safety Council, 2014). In 2011, 21,221 people died in U.S. car or light truck accidents, while zero (as in 2010) died on scheduled airline flights. When flying, the most dangerous part of the trip is the drive to the airport.

In a late 2001 essay, I [DM] calculated that if—because of 9/11—we flew 20 percent less and instead drove half those unflown miles, about 800 more people would die in the year after the 9/11 attacks (Myers, 2001). German psychologist Gerd Gigerenzer (2004, 2006; Gaissmaier & Gigerenzer, 2012) later checked my estimate against actual accident data. (Why didn’t I think to do that?) U.S. traffic deaths did indeed increase significantly in the last three months of 2001 (FIGURE 25.6). By the end of 2002, Gigerenzer estimated, 1,600 Americans had “lost their lives on the road by trying to avoid the risk of flying.”
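The risk-substitution arithmetic behind such an estimate can be sketched in a few lines. The fatality rate and mileage figures below are hypothetical placeholders, not the actual numbers behind Myers’ estimate; only the roughly 170-to-1 per-mile risk ratio and the assumption that half the unflown miles are driven instead come from this section.

```python
# Back-of-the-envelope sketch of risk substitution. The rates and mileages
# are hypothetical; only the ~170:1 per-mile risk ratio (driving vs.
# scheduled flights) and "drive half the unflown miles" come from the text.
DRIVING_DEATHS_PER_BILLION_MILES = 7.0      # hypothetical driving fatality rate
RISK_RATIO = 170                            # driving vs. flying, per mile
FLYING_DEATHS_PER_BILLION_MILES = DRIVING_DEATHS_PER_BILLION_MILES / RISK_RATIO

unflown_billion_miles = 100.0               # hypothetical miles not flown
driven_billion_miles = unflown_billion_miles / 2    # half driven instead

extra_road_deaths = driven_billion_miles * DRIVING_DEATHS_PER_BILLION_MILES
avoided_air_deaths = unflown_billion_miles * FLYING_DEATHS_PER_BILLION_MILES
net_extra_deaths = extra_road_deaths - avoided_air_deaths
# Even though only half the miles are replaced, the far higher per-mile
# driving risk makes the net death toll rise sharply.
```

With any plausible inputs, the avoided flying deaths are negligible next to the added driving deaths, which is why substituting road miles for air miles costs lives.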

FIGURE 25.6 Scared onto deadly highways Images of 9/11 etched a sharper image in American minds than did the millions of fatality-free flights on U.S. airlines during 2002 and after. Dramatic events are readily available to memory, and they shape our perceptions of risk. In the three months after 9/11, those faulty perceptions led more Americans to travel, and some to die, by car. (Data from Gigerenzer, 2004.)

Why do we in so many ways fear the wrong things? Why do so many American parents fear school shootings, when their child is more likely to be killed by lightning (Ripley, 2013)? Why, in 2014, were so many Americans more frightened of Ebola (which killed no one who contracted it in the United States) than of influenza, which kills some 24,000 Americans annually? Psychologists have identified four influences that feed fear and cause us to ignore higher risks:

  1. We fear what our ancestral history has prepared us to fear. Human emotions were road tested in the Stone Age. Our old brain prepares us to fear yesterday’s risks: snakes, lizards, and spiders (which combined now kill a tiny fraction of the number killed by modern-day threats, such as cars and cigarettes). Yesterday’s risks also prepare us to fear confinement and heights, and therefore flying.


  2. We fear what we cannot control. Driving we control; flying we do not.

  3. We fear what is immediate. The dangers of flying are mostly telescoped into the moments of takeoff and landing. The dangers of driving are diffused across many moments, each trivially dangerous.

  4. Thanks to the availability heuristic, we fear what is most readily available in memory. Vivid images, like that of a horrific air crash, feed our judgments of risk. Shark attacks kill about one American per year, while heart disease kills 800,000—but it’s much easier to visualize a shark bite, and thus many people fear sharks more than cigarettes or the effects of an unhealthy diet (Daley, 2011). Similarly, we remember (and fear) widespread disasters (hurricanes, tornadoes, earthquakes) that kill people dramatically, in bunches. But we fear too little the less dramatic threats that claim lives quietly, one by one, continuing into the distant future. Horrified citizens and commentators renewed calls for U.S. gun control in 2015, after nine African-Americans at a Bible study were slain by a racist guest—although guns kill about 30 Americans every day, one by one, in a less dramatic fashion. Philanthropist Bill Gates has noted that each year, a half-million children worldwide die from rotavirus. This is the equivalent of four 747s full of children every day, and we hear nothing of it (Glass, 2004). Media outlets often draw readers and viewers by reporting on the immediate and the dramatic. “If it bleeds, it leads.”

Dramatic deaths in bunches breed concern and fear The memorable 2010 Haitian earthquake that killed some 250,000 people stirred an outpouring of justified concern. Meanwhile, according to the World Health Organization, a silent earthquake of poverty-related malaria was killing about that many people, mostly in Africa, every four months.

The news, and our own memorable experiences, can make us disproportionately fearful of infinitesimal risks—and lead us to spend an estimated $500 million per U.S. terrorist death but only $10,000 per cancer death (Eagan, 2015). As one risk analyst explained, “If it’s in the news, don’t worry about it. The very definition of news is ‘something that hardly ever happens’” (Schneier, 2007).

RETRIEVE IT

Question

Why does a dramatic event in the news, such as a plane crash, sway our judgments of risk more than far more common dangers do, and how can knowing this improve our decisions?
ANSWER: If a tragic event such as a plane crash makes the news, it grabs our attention more than the much more common bad events, such as traffic accidents. Knowing this, we can worry less about unlikely events and think more about improving the safety of our everyday activities. (For example, we can wear a seat belt when in a vehicle and use the crosswalk when walking.)

“Fearful people are more dependent, more easily manipulated and controlled, more susceptible to deceptively simple, strong, tough measures and hard-line postures.”

Media researcher George Gerbner to U.S. Congressional Subcommittee on Communications, 1981

Meanwhile, the lack of available images of future climate change disasters—which some scientists regard as “Armageddon in slow motion”—has left most people little concerned (Pew, 2014). What’s more cognitively available than slow climate change is our recently experienced local weather, which tells us nothing about long-term planetary trends (Egan & Mullin, 2012; Zaval et al., 2014). Unusually hot local weather increases people’s worry about global climate warming, while a recent cold day reduces their concern and overwhelms less memorable scientific data (Li et al., 2011). After Hurricane Sandy devastated New Jersey, its residents’ vivid experience of extreme weather increased their environmentalism (Rudman et al., 2013).

"Don’t believe everything you think."

Bumper sticker

“Global warming isn’t real because it was cold today! Also great news: World hunger is over because I just ate.”

Stephen Colbert tweet, November 18, 2014

Dramatic outcomes make us gasp; probabilities we hardly grasp. As of 2013, some 40 nations had sought to harness the positive power of vivid, memorable images by putting eye-catching warnings and graphic photos on cigarette packages (Riordan, 2013). This campaign has worked (Huang et al., 2013). As psychologist Paul Slovic (2007) points out, we reason emotionally and neglect probabilities. We overfeel and underthink. In one experiment, donations to a starving 7-year-old were greater when her image was not accompanied by statistical information about the millions of needy African children like her (Small et al., 2007). “The more who die, the less we care,” noted Slovic (2010).

Overconfidence

overconfidence the tendency to be more confident than correct—to overestimate the accuracy of our beliefs and judgments.

Sometimes our judgments and decisions go awry simply because we are more confident than correct. Across various tasks, people overestimate their performance (Metcalfe, 1998). If 60 percent of people correctly answer a factual question, such as “Is absinthe a liqueur or a precious stone?,” they will typically average 75 percent confidence (Fischhoff et al., 1977). (It’s a licorice-flavored liqueur.) This tendency to overestimate the accuracy of our knowledge and judgments is overconfidence.
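The gap Fischhoff and colleagues measured can be summarized with a simple calibration statistic: mean stated confidence minus actual accuracy. A minimal sketch, using invented responses chosen to mirror the 75-percent-confidence, 60-percent-accuracy pattern described above:

```python
# Overconfidence as (mean stated confidence) - (actual accuracy).
# The responses below are invented for illustration, not data from the study.
responses = [            # (stated confidence, answered correctly?)
    (0.75, True), (0.80, True), (0.70, False),
    (0.75, True), (0.75, False),
]
mean_confidence = sum(conf for conf, _ in responses) / len(responses)
accuracy = sum(correct for _, correct in responses) / len(responses)
overconfidence = mean_confidence - accuracy
# A positive value is the signature of overconfidence: here, 75 percent
# average confidence paired with only 60 percent accuracy.
```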


It was an overconfident BP that, before its drilling platform exploded and spewed oil into the Gulf of Mexico, downplayed safety concerns, and then downplayed the spill’s magnitude (Mohr et al., 2010; Urbina, 2010). It is overconfidence that drives stockbrokers and investment managers to market their ability to outperform stock market averages (Malkiel, 2012). A purchase of stock X, recommended by a broker who judges this to be the time to buy, is usually balanced by a sale made by someone who judges this to be the time to sell. Despite their confidence, buyer and seller cannot both be right.

Predict your own behavior When will you finish reading this module?

Hofstadter’s Law: It always takes longer than you expect, even when you take into account Hofstadter’s Law.

Douglas Hofstadter, Gödel, Escher, Bach: An Eternal Golden Braid, 1979

“When you know a thing, to hold that you know it; and when you do not know a thing, to allow that you do not know it; this is knowledge.”

Confucius (551–479 B.C.E.), Analects


Overconfidence can also feed extreme political views. People with a superficial understanding of proposals for cap-and-trade carbon emissions or a national flat tax often express strong pro or con views. Asking them to explain the details of these policies exposes them to their own ignorance, which in turn leads them to express more moderate views (Fernbach et al., 2013). Sometimes the less people know, the more immoderate they are.

Classrooms are full of overconfident students who expect to finish assignments and write papers ahead of schedule (Buehler et al., 1994, 2002). In fact, these projects generally take about twice the number of days predicted. This “planning fallacy” (underestimating the time and cost of a project) routinely occurs with construction projects, which often finish late and over budget.

Overconfidence can have adaptive value. People who err on the side of overconfidence live more happily. Their seeming competence can help them gain influence (Anderson et al., 2012). Moreover, given prompt and clear feedback, as weather forecasters receive after each day’s predictions, we can learn to be more realistic about the accuracy of our judgments (Fischhoff, 1982). The wisdom to know when we know a thing and when we do not is born of experience.

Belief Perseverance

belief perseverance clinging to one’s initial conceptions after the basis on which they were formed has been discredited.

Our overconfidence is startling. Equally so is our belief perseverance—our tendency to cling to our beliefs in the face of contrary evidence. One study of belief perseverance engaged people with opposing views of capital punishment (Lord et al., 1979). After studying two supposedly new research findings, one supporting and the other refuting the claim that the death penalty deters crime, each side was more impressed by the study supporting its own beliefs. And each readily disputed the other study. Thus, showing the pro- and anti-capital-punishment groups the same mixed evidence actually increased their disagreement.

To rein in belief perseverance, a simple remedy exists: Consider the opposite. When the same researchers repeated the capital-punishment study, they asked some participants to be “as objective and unbiased as possible” (Lord et al., 1984). The plea did nothing to reduce biased evaluations of evidence. They asked a different group to consider “whether you would have made the same high or low evaluations had exactly the same study produced results on the other side of the issue.” Having imagined and pondered opposite findings, these people became much less biased.

The more we come to appreciate why our beliefs might be true, the more tightly we cling to them. Once beliefs form and get justified, it takes more compelling evidence to change them than it did to create them. Prejudice persists. Beliefs often persevere.

The Effects of Framing

framing the way an issue is posed; how an issue is framed can significantly affect decisions and judgments.

Framing—the way we present an issue—sways our decisions and judgments. Imagine two surgeons explaining a surgery risk. One tells patients that 10 percent of people die during this surgery. The other tells patients that 90 percent survive. Although the information is the same, the effect is not. Both patients and physicians perceive greater risk when they hear that 10 percent die (Marteau, 1989; McNeil et al., 1988; Rothman & Salovey, 1997).


Framing can be a powerful persuasion tool. Carefully posed options can nudge people toward decisions that could benefit them or society as a whole (Benartzi & Thaler, 2013; Thaler & Sunstein, 2008).


The point to remember: Those who understand the power of framing can use it (for good or ill) to nudge our decisions.

The Perils and Powers of Intuition

25-5 How do smart thinkers use intuition?

The perils of intuition—irrational fears, cloudy judgments, illogical reasoning—feed gut fears and prejudices. Irrational thinking can persist even when people are offered extra pay for thinking smart, even when they are asked to justify their answers, and even when they are expert physicians or clinicians (Shafir & LeBoeuf, 2002). Highly intelligent people (including U.S. federal intelligence agents, in one study) are similarly vulnerable to intuition’s distortions of reality (Reyna et al., 2014; Stanovich et al., 2013). Even very smart people can make not-so-smart judgments. So, are our heads indeed “filled with straw,” as T. S. Eliot suggested?

Hmm . . . male or female? When acquired expertise becomes an automatic habit, as it is for experienced chicken sexers, it feels like intuition. At a glance, they just know, yet cannot easily tell you how they know.

Throughout this book you will see examples of smart intuition.

Critics note that some studies have not found the supposed power of unconscious thought and remind us that deliberate, conscious thought also furthers smart thinking (Lassiter et al., 2009; Newell, 2015; Nieuwenstein et al., 2015; Payne et al., 2008). In challenging situations, superior decision makers, including chess players, take time to think (Moxley et al., 2012). And with many sorts of problems, deliberative thinkers are aware of the intuitive option, but know when to override it (Mata et al., 2013). Consider:

A bat and a ball together cost 110 cents.
The bat costs 100 cents more than the ball.
How much does the ball cost?

Most people’s intuitive response—10 cents—is wrong, as a few moments of deliberate thinking reveal: if the ball cost 10 cents, the bat would cost 110 cents, and together they would total 120 cents. The ball must cost 5 cents, the bat 105.
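Checking the intuitive answer against the problem’s two constraints takes only a few lines (the `fits` helper is illustrative):

```python
# Test candidate answers against the problem's two stated constraints:
# together the bat and ball cost 110 cents, and the bat costs 100 more.
def fits(ball, bat):
    return ball + bat == 110 and bat - ball == 100

assert not fits(10, 100)   # the intuitive answer: the bat is only 90 more
assert fits(5, 105)        # the correct answer: ball = 5, bat = 105

# Or solve directly: ball + (ball + 100) = 110  =>  ball = 5
ball = (110 - 100) / 2
```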

The bottom line: Our two-track mind makes sweet harmony as smart, critical thinking listens to the creative whispers of our vast unseen mind and then evaluates evidence, tests conclusions, and plans for the future.