Statistical Reasoning in Everyday Life


Statistics are important tools in psychological research. But statistics also benefit us all, by helping us see what the unaided eye might miss. To be an educated person today is to be able to apply simple statistical principles to everyday reasoning. We needn’t memorize complicated formulas to think more clearly and critically about data.

Off-the-top-of-the-head estimates often misread reality and then mislead the public. Someone throws out a big, round number. Others echo it, and before long the big, round number becomes public misinformation. Here are a few examples:

[Cartoon: ©Patrick Hardin]

When setting goals, we love big round numbers. We’re far more likely to want to lose 20 pounds than 19 or 21 pounds. We’re far more likely to retake the SAT if our verbal plus math score falls just short of a big round number, such as 1200. And by modifying their behavior late in the season, baseball batters are nearly four times more likely to finish with a .300 average than with a .299 average (Pope & Simonsohn, 2011).

The point to remember: Doubt big, round, undocumented numbers. That’s actually a lesson we intuitively appreciate, by finding precise numbers more credible (Oppenheimer et al., 2014). When U.S. Secretary of State John Kerry sought to rally American support in 2013 for a military response to Syria’s apparent use of chemical weapons, his argument gained credibility from its precision: “The United States government now knows that at least 1,429 Syrians were killed in this attack, including at least 426 children.”

Statistical illiteracy also feeds needless health scares (Gigerenzer et al., 2008, 2009, 2010). In the 1990s, the British press reported a study showing that women taking a particular contraceptive pill had a 100 percent increased risk of blood clots that could produce strokes. This caused thousands of women to stop taking the pill, leading to a wave of unwanted pregnancies and an estimated 13,000 additional abortions (which also are associated with increased blood clot risk). And what did the study actually find? A 100 percent increased risk, indeed—but only from 1 in 7000 to 2 in 7000. Such false alarms underscore the need to teach statistical reasoning and to present statistical information more transparently.
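The arithmetic behind that scare is worth making explicit. A minimal sketch (the 1-in-7000 and 2-in-7000 figures come from the study as reported above; the variable names are illustrative):

```python
# Relative vs. absolute risk for the contraceptive-pill scare:
# the relative risk doubles, yet the absolute risk barely moves.
baseline = 1 / 7000    # blood-clot risk without the pill
with_pill = 2 / 7000   # risk reported for women taking the pill

relative_increase = (with_pill - baseline) / baseline  # 1.0, i.e. "100 percent"
absolute_increase = with_pill - baseline               # roughly 0.00014

print(f"Relative increase: {relative_increase:.0%}")
print(f"Absolute increase: {absolute_increase:.4%}")
```

Both numbers describe the same finding, but the headline chose the frame that sounds alarming: a “100 percent increased risk” is also an absolute increase of about one-hundredth of one percentage point.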