1.1 The Need for Psychological Science

The limits of intuition Personnel interviewers tend to be overconfident of their gut feelings about job applicants. Their confidence stems partly from their recalling cases where their favorable impression proved right, and partly from their ignorance about rejected applicants who succeeded elsewhere.

1-1 How does our everyday thinking sometimes lead us to a wrong conclusion?

Some people suppose that psychology merely documents and dresses in jargon what people already know: “You get paid for using fancy methods to prove what my grandmother knows?” Others place their faith in human intuition: “Buried deep within each and every one of us, there is an instinctive, heart-felt awareness that provides—if we allow it to—the most reliable guide,” offered Prince Charles (2000).

Prince Charles has much company, judging from the long list of pop psychology books on “intuitive managing,” “intuitive trading,” and “intuitive healing.” Today’s psychological science does document a vast intuitive mind. As we will see, our thinking, memory, and attitudes operate on two levels—conscious and unconscious—with the larger part operating automatically, offscreen. Like jumbo jets, we fly mostly on autopilot.

So, are we smart to listen to the whispers of our inner wisdom, to simply trust “the force within”? Or should we more often be subjecting our intuitive hunches to skeptical scrutiny?

This much seems certain: We often underestimate intuition’s perils. My [DM] geographical intuition tells me that Reno is east of Los Angeles, that Rome is south of New York, that Atlanta is east of Detroit. But I am wrong, wrong, and wrong.

Studies show that people greatly overestimate their lie detection accuracy, their eyewitness recollections, their interviewee assessments, their risk predictions, and their stock-picking talents. As a Nobel Prize–winning physicist explained, “The first principle is that you must not fool yourself—and you are the easiest person to fool” (Feynman, 1997).

“Those who trust in their own wits are fools.”

Proverbs 28:26

Indeed, observed novelist Madeleine L’Engle, “The naked intellect is an extraordinarily inaccurate instrument” (1973). Three phenomena—hindsight bias, overconfidence, and our tendency to perceive patterns in random events—illustrate why we cannot rely solely on intuition and common sense.

Did We Know It All Along? Hindsight Bias

“Life is lived forwards, but understood backwards.”

Philosopher Søren Kierkegaard, 1813–1855

hindsight bias the tendency to believe, after learning an outcome, that one would have foreseen it. (Also known as the I-knew-it-all-along phenomenon.)

Consider how easy it is to draw the bull’s eye after the arrow strikes. After the stock market drops, people say it was “due for a correction.” After the football game, we credit the coach if a “gutsy play” wins the game, and fault the coach for the “stupid play” if it doesn’t. After a war or an election, its outcome usually seems obvious. Although history may therefore seem like a series of inevitable events, the actual future is seldom foreseen. No one’s diary recorded, “Today the Hundred Years War began.”

“Anything seems commonplace, once explained.”

Dr. Watson to Sherlock Holmes

This hindsight bias (also known as the I-knew-it-all-along phenomenon) is easy to demonstrate: Give half the members of a group some purported psychological finding, and give the other half an opposite result. Tell the first group, “Psychologists have found that separation weakens romantic attraction. As the saying goes, ‘Out of sight, out of mind.’” Ask them to imagine why this might be true. Most people can, and nearly all will then view this true finding as unsurprising.

Tell the second group the opposite: “Psychologists have found that separation strengthens romantic attraction. As the saying goes, ‘Absence makes the heart grow fonder.’” People given this untrue result can also easily imagine it, and most will also see it as unsurprising. When opposite findings both seem like common sense, there is a problem.

Such errors in our recollections and explanations show why we need psychological research. Just asking people how and why they felt or acted as they did can sometimes be misleading—not because common sense is usually wrong, but because common sense more easily describes what has happened than what will happen. As physicist Niels Bohr reportedly jested, “Prediction is very difficult, especially about the future.”


Hindsight bias When drilling its Deepwater Horizon oil well in 2010, BP employees took shortcuts and ignored warning signs, without intending to harm the environment or their company’s reputation. After the resulting Gulf oil spill, with the benefit of 20/20 hindsight, the foolishness of those judgments became obvious.

More than 800 scholarly papers have shown hindsight bias in people young and old from across the world (Roese & Vohs, 2012). Nevertheless, Grandma’s intuition is often right. As baseball great Yogi Berra once said, “You can observe a lot by watching.” (We have Berra to thank for other gems, such as “Nobody ever comes here—it’s too crowded,” and “If the people don’t want to come out to the ballpark, nobody’s gonna stop ’em.”) Because we’re all behavior watchers, it would be surprising if many of psychology’s findings had not been foreseen. Many people believe that love breeds happiness, and they are right (we have what Chapter 11 calls a deep “need to belong”).

Indeed, noted Daniel Gilbert, Brett Pelham, and Douglas Krull (2003), “good ideas in psychology usually have an oddly familiar quality, and the moment we encounter them we feel certain that we once came close to thinking the same thing ourselves and simply failed to write it down.” Good ideas are like good inventions: Once created, they seem obvious. (Why did it take so long for someone to invent suitcases on wheels and Post-it Notes?)

But sometimes Grandma’s intuition, informed by countless casual observations, is wrong. In later chapters, we will see how research has overturned popular ideas—that familiarity breeds contempt, that dreams predict the future, and that most of us use only 10 percent of our brain. We will also see how it has surprised us with discoveries about how the brain’s chemical messengers control our moods and memories, about other animals’ abilities, and about the effects of stress on our capacity to fight disease.

Overconfidence

Fun anagram solutions from Wordsmith (www.wordsmith.org):
Snooze alarms = Alas! No more z’s
Dormitory = dirty room
Slot machines = cash lost in ’em

We humans tend to think we know more than we do. Asked how sure we are of our answers to factual questions (Is Boston north or south of Paris?), we tend to be more confident than correct.1 Or consider these three anagrams, which Richard Goranson (1978) asked people to unscramble:

Overconfidence in history:
“We don’t like their sound. Groups of guitars are on their way out.”

Decca Records, in turning down a recording contract with the Beatles in 1962

WREAT → WATER

ETRYN → ENTRY

GRABE → BARGE


“Computers in the future may weigh no more than 1.5 tons.”

Popular Mechanics, 1949

About how many seconds do you think it would have taken you to unscramble each of these? Did hindsight influence you? Knowing the answers tends to make us overconfident. (Surely the solution would take only 10 seconds or so.) In reality, the average problem solver spends 3 minutes, as you also might, given a similar anagram without the solution: OCHSA.2

“They couldn’t hit an elephant at this distance.”

General John Sedgwick just before being killed during a U.S. Civil War battle, 1864

Are we any better at predicting social behavior? University of Pennsylvania psychologist Philip Tetlock (1998, 2005) collected more than 27,000 expert predictions of world events, such as the future of South Africa or whether Quebec would separate from Canada. His repeated finding: These predictions, which experts made with 80 percent confidence on average, were right less than 40 percent of the time. Nevertheless, even those who erred maintained their confidence by noting they were “almost right.” “The Québécois separatists almost won the secessionist referendum.”

“The telephone may be appropriate for our American cousins, but not here, because we have an adequate supply of messenger boys.”

British expert group evaluating the invention of the telephone

RETRIEVAL PRACTICE

  • Why, after friends start dating, do we often feel that we knew they were meant to be together?

We often suffer from hindsight bias—after we’ve learned a situation’s outcome, that outcome seems familiar and therefore obvious.

Perceiving Order in Random Events

In our natural eagerness to make sense of our world, we perceive patterns. People see a face on the Moon, hear Satanic messages in music, perceive the Virgin Mary’s image on a grilled cheese sandwich. Even in random data, we often find order, because—here’s a curious fact of life—random sequences often don’t look random (Falk et al., 2009; Nickerson, 2002, 2005). Flip a coin 50 times and you may be surprised at the streaks of heads and tails. In actual random sequences, patterns and streaks (such as repeating digits) occur more often than people expect (Oskarsson et al., 2009).
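You can check the coin-flip claim for yourself. The sketch below (the function and variable names are ours, not from the text) flips a fair coin 50 times and reports the longest streak; run it a few times and streaks of five or more identical outcomes turn out to be routine, even though the sequence is genuinely random:

```python
import random

def longest_streak(flips):
    """Length of the longest run of identical consecutive outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

flips = [random.choice("HT") for _ in range(50)]  # 50 fair coin flips
print("".join(flips))
print("Longest streak:", longest_streak(flips))
```

Intuition expects randomness to alternate briskly; the simulation shows it doesn’t.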

However, some happenings, such as winning a lottery twice, seem so extraordinary that we struggle to conceive an ordinary, chance-related explanation. “But with a large enough sample, any outrageous thing is likely to happen,” note statisticians Persi Diaconis and Frederick Mosteller (1989). An event that happens to but 1 in 1 billion people every day occurs about 7 times a day, 2500 times a year.
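The arithmetic behind Diaconis and Mosteller’s estimate is worth making explicit. Assuming a world population of roughly 7 billion (a round figure for illustration), a one-in-a-billion-per-person daily event is expected about 7 times every single day:

```python
population = 7_000_000_000                     # world population, roughly (assumed round figure)
events_per_person_per_day = 1 / 1_000_000_000  # a one-in-a-billion daily chance

per_day = population * events_per_person_per_day
per_year = per_day * 365

print(f"about {per_day:.0f} times a day, {per_year:.0f} times a year")
# prints "about 7 times a day, 2555 times a year"
```

With enough opportunities, “miracles” become statistical certainties.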


Consider how scientific inquiry can help you think smarter about hot streaks in sports with LaunchPad’s How Would You Know If There Is a Hot Hand in Basketball?

The point to remember: Hindsight bias, overconfidence, and our tendency to perceive patterns in random events often lead us to overestimate our intuition. But scientific inquiry can help us sift reality from illusion.

LaunchPad is a research-based online learning tool that will help you excel in this course. Visit LaunchPad to take advantage of self-tests, interactive simulations, and HOW WOULD YOU KNOW? activities. For a 1-minute introduction to LaunchPad, including how to get in and use its helpful resources, go to http://tinyurl.com/LaunchPadIntro. In LaunchPad, you will find resources collected by chapter. Additional resources may be found by clicking on the “Resources” star in the left column.

Given enough random events, some weird-seeming streaks will occur During the 2010 World Cup, a German octopus—Paul, “the oracle of Oberhausen”—was repeatedly offered two boxes, each containing mussels and bearing a national flag on one side. Paul selected the correct box eight out of eight times, predicting the outcomes of Germany’s seven matches and Spain’s triumph in the final.

Question

Possible sample answer: Three phenomena—hindsight bias, overconfidence, and our tendency to perceive patterns in random events—illustrate why we cannot rely solely on intuition and common sense and why we need psychological research. With hindsight bias, when we know the outcome of an event, it seems so obvious that we are likely to think that we knew it all along. Overconfidence is our tendency to think we know more than we know. By perceiving order in random events we impose patterns where there are none.


The Scientific Attitude: Curious, Skeptical, and Humble

“The really unusual day would be one where nothing unusual happens.”

Statistician Persi Diaconis (2002)

1-2 How do the scientific attitude’s three main components relate to critical thinking?

Underlying all science is, first, a hard-headed curiosity, a passion to explore and understand without misleading or being misled. Some questions (Is there life after death?) are beyond science. Answering them in any way requires a leap of faith. With many other ideas (Can some people demonstrate ESP?), the proof is in the pudding. Let the facts speak for themselves.

Magician James Randi has used this empirical approach when testing those claiming to see glowing auras around people’s bodies:

Randi: Do you see an aura around my head?

Aura seer: Yes, indeed.

Randi: Can you still see the aura if I put this magazine in front of my face?

Aura seer: Of course.

Randi: Then if I were to step behind a wall barely taller than I am, you could determine my location from the aura visible above my head, right?

The Amazing Randi The magician James Randi exemplifies skepticism. He has tested and debunked supposed psychic phenomena.

Randi once told me that no aura seer has agreed to take this simple test.

No matter how sensible-seeming or wild an idea, the smart thinker asks: Does it work? When put to the test, can its predictions be confirmed? Subjected to such scrutiny, crazy-sounding ideas sometimes find support. During the 1700s, scientists scoffed at the notion that meteorites had extraterrestrial origins. When two Yale scientists challenged the conventional opinion, Thomas Jefferson reportedly jeered, “Gentlemen, I would rather believe that those two Yankee professors would lie than to believe that stones fell from Heaven.” Sometimes scientific inquiry turns jeers into cheers.

More often, science becomes society’s garbage disposal, sending crazy-sounding ideas to the waste heap, atop previous claims of perpetual motion machines, miracle cancer cures, and out-of-body travels into centuries past. To sift reality from fantasy, sense from nonsense, therefore requires a scientific attitude: being skeptical but not cynical, open but not gullible.

“To believe with certainty,” says a Polish proverb, “we must begin by doubting.” As scientists, psychologists approach the world of behavior with a curious skepticism, persistently asking two questions: What do you mean? How do you know?

When ideas compete, skeptical testing can reveal which ones best match the facts. Do parental behaviors determine children’s sexual orientation? Can astrologers predict your future based on the position of the planets at your birth? Is electroconvulsive therapy (delivering an electric shock to the brain) an effective treatment for severe depression? As we will see, putting such claims to the test has led psychological scientists to answer No to the first two questions and Yes to the third.

Putting a scientific attitude into practice requires not only curiosity and skepticism but also humility—an awareness of our own vulnerability to error and an openness to surprises and new perspectives. In the last analysis, what matters is not my opinion or yours, but the truths nature reveals in response to our questioning. If people or other animals don’t behave as our ideas predict, then so much the worse for our ideas. This humble attitude was expressed in one of psychology’s early mottos: “The rat is always right.”


“My deeply held belief is that if a god anything like the traditional sort exists, our curiosity and intelligence are provided by such a god. We would be unappreciative of those gifts … if we suppressed our passion to explore the universe and ourselves.”

Carl Sagan, Broca’s Brain, 1979

Historians of science tell us that these three attitudes—curiosity, skepticism, and humility—helped make modern science possible. Some deeply religious people today may view science, including psychological science, as a threat. Yet, many of the leaders of the scientific revolution, including Copernicus and Newton, were deeply religious people acting on the idea that “in order to love and honor God, it is necessary to fully appreciate the wonders of his handiwork” (Stark, 2003a,b).

Of course, scientists, like anyone else, can have big egos and may cling to their preconceptions. Nevertheless, the ideal of curious, skeptical, humble scrutiny of competing ideas unifies psychologists as a community as they check and recheck one another’s findings and conclusions.

Critical Thinking

critical thinking thinking that does not blindly accept arguments and conclusions. Rather, it examines assumptions, appraises the source, discerns hidden values, evaluates evidence, and assesses conclusions.

From a Twitter feed:
“The problem with quotes on the Internet is that you never know if they’re true.”—Abraham Lincoln

The scientific attitude prepares us to think smarter. Smart thinking, called critical thinking, examines assumptions, appraises the source, discerns hidden values, evaluates evidence, and assesses conclusions. Whether reading online commentary or listening to a conversation, critical thinkers ask questions: How do they know that? What is this person’s agenda? Is the conclusion based on anecdote and gut feelings, or on evidence? Does the evidence justify a cause–effect conclusion? What alternative explanations are possible?

“The real purpose of the scientific method is to make sure Nature hasn’t misled you into thinking you know something you don’t actually know.”

Robert M. Pirsig, Zen and the Art of Motorcycle Maintenance, 1974

Critical thinking, informed by science, helps clear the colored lenses of our biases. Consider: Does climate change threaten our future, and, if so, is it human-caused? In 2009, climate-action advocates interpreted an Australian heat wave and dust storms as evidence of climate change. In 2010, climate-change skeptics perceived North American bitter cold and East Coast blizzards as discounting global warming. Rather than having their understanding of climate change swayed by today’s weather, or by their own political views, critical thinkers say, “Show me the evidence.” Over time, is the Earth actually warming? Are the polar ice caps melting? Are vegetation patterns changing? And is human activity spewing gases that would lead us to expect such changes? When contemplating such issues, critical thinkers will consider the credibility of sources. They will look at the evidence (Do the facts support them, or are they just makin’ stuff up?). They will recognize multiple perspectives. And they will expose themselves to news sources that challenge their preconceived ideas.

Life after studying psychology The study of psychology, and its critical thinking strategies, has helped prepare people for varied occupations, as illustrated by Facebook founder Mark Zuckerberg (who studied psychology and computer science while at Harvard) and satirist Jon Stewart (a psych major at William and Mary).

Has psychology’s critical inquiry been open to surprising findings? The answer, as ensuing chapters illustrate, is plainly Yes. Some examples: Massive losses of brain tissue early in life may have minimal long-term effects (see Chapter 2). Within days, newborns can recognize their mother by her odor (see Chapter 5). After brain damage, a person may be able to learn new skills yet be unaware of such learning (see Chapter 8). Diverse groups—men and women, old and young, rich and middle class, those with disabilities and those without—report comparable levels of personal happiness (see Chapter 12).


And has critical inquiry convincingly debunked popular presumptions? The answer, as ensuing chapters also illustrate, is again Yes. The evidence indicates that sleepwalkers are not acting out their dreams (see Chapter 3). Our past experiences are not all recorded verbatim in our brains; with brain stimulation or hypnosis, one cannot simply replay and relive long-buried or repressed memories (see Chapter 8). Most people do not suffer from unrealistically low self-esteem, and high self-esteem is not all good (see Chapter 14). Opposites tend not to attract (see Chapter 13). In each of these instances and more, what scientists have learned is not what is widely believed.

Psychological science can also identify effective policies. To deter crime, should we invest money in lengthening prison sentences or increase the likelihood of arrest? To help people recover from a trauma, should counselors help them relive it, or not? To increase voting, should we tell people about the low turnout problem, or emphasize that their peers are voting? When put to critical thinking’s test—and contrary to common practice—the second option in each case wins (Shafir, 2013).

Question

Possible sample answer: A scientific attitude necessitates curiosity, skepticism, and humility about our understanding of behavior and mental processes. Humans have intuitive biases that lead them to focus on situations in which expectations were met or in which they had been correct and to focus attention away from situations in which they had been wrong or when expectations were not met. These biases lead people to draw conclusions that can be wrong. A scientific attitude instead helps us to put aside these biases. And with critical thinking, we examine assumptions, appraise the source, discern hidden values, evaluate evidence, and assess conclusions.

RETRIEVAL PRACTICE

  • “For a lot of bad ideas, science is society’s garbage disposal.” Describe what this tells us about the scientific attitude and what’s involved in critical thinking.

The scientific attitude combines (1) curiosity about the world around us, (2) skepticism about unproven claims and ideas, and (3) humility about one’s own understanding. Evaluating evidence, assessing conclusions, and examining our own assumptions are essential parts of critical thinking.