Gerd Gigerenzer is director of the Center for Adaptive Behavior and Cognition at the Max Planck Institute for Human Development in Berlin. He is also director of the Harding Center for Risk Literacy in Berlin. He studies how people can make effective decisions given limited time and information. Gigerenzer also explores ways to improve statistical understanding and communication. He has trained U.S. federal judges and physicians in several countries on how to understand risk and uncertainty. Behavioral sciences writer Bruce Bower asked Gigerenzer about statistical illiteracy and the nature of decision making.
How extensive is statistical illiteracy?
It’s a largely unknown problem that applies not only to uneducated people but to the highly educated, including physicians, journalists and politicians. Statistical illiteracy among physicians causes overtreatment, overdiagnosis and increased health care costs. It also affects patients, whose hopes can be unnecessarily raised by the claims they read in medication advertisements. Statistical literacy should be taught in school beginning in the primary grades.
Can you give an example of statistical illiteracy?
In October 1995, the U.K. Committee on Safety of Medicines issued a warning, widely reported by British newspapers, that third-generation oral contraceptive pills increased the risk of potentially fatal blood clots in the legs or lungs by 100 percent over the same risk from second-generation contraceptives. What the committee and the newspapers failed to report was that the absolute risk of this serious side effect had increased from 1 in 7,000 women who took second-generation pills to 2 in 7,000 women who took third-generation pills. Absolute risks are typically small numbers while corresponding relative changes tend to look big, particularly when the base rate is low.
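The arithmetic behind the headline can be made explicit. A minimal sketch in Python, using only the figures quoted above (1 in 7,000 versus 2 in 7,000), shows how the same data yield a dramatic relative number and a tiny absolute one:

```python
# Absolute risks reported for fatal blood clots, per the 1995 pill scare.
second_gen = 1 / 7000  # risk with second-generation pills
third_gen = 2 / 7000   # risk with third-generation pills

# The relative increase is what the newspapers reported: "100 percent".
relative_increase = (third_gen - second_gen) / second_gen

# The absolute increase is what was left out: one extra case per 7,000 women.
absolute_increase = third_gen - second_gen

print(f"Relative increase: {relative_increase:.0%}")        # 100%
print(f"Absolute increase: {absolute_increase:.6f}")         # ~0.000143
print(f"Extra cases per 7,000 women: {absolute_increase * 7000:.0f}")  # 1
```

Because the base rate is so low, the identical change in risk looks alarming when expressed relatively and negligible when expressed absolutely.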
The pill scare was caused by collective innumeracy. It led to an estimated 13,000 additional abortions in the following year in England and Wales. The cost in increased abortion provision for England’s National Health Service reached the equivalent of about $70 million. Teenage pregnancies also increased. Women’s confidence in oral contraceptives was undermined and pill sales fell sharply.
Among the few to profit were the journalists who got the story on the front page.
Your research focuses on simple rules of thumb as decision-making aids—fast and frugal heuristics. What are the implications of heuristics for understanding good and bad decisions?
My group has shown that using heuristics based on a few cues in the environment can lead to more accurate decisions than making extensive mental calculations. Many cognitive scientists are scared by this idea that less effort can produce better decisions than more effort does. Fast and frugal heuristics demonstrate that there’s a reason for trusting our intuitions.
Weighing and adding up the pros and cons of a tough decision is only possible in limited situations where a person knows all the consequences of his or her potential actions and has lots of time to consider those consequences. In the real world, smart thinking requires finding new alternatives using limited information. We need to trust both our brains and our guts.
Do moral decisions rely on fast and frugal heuristics?
Theories of morality have traditionally assumed that a sense of morality comes from conscious deliberations inside each person. But for many moral decisions, we ask friends for advice, imitate others who have been in similar situations and are otherwise guided by our surroundings. In many situations, people’s moral intuitions are based on unconscious rules of thumb … embedded in their social environments. Deliberate reasoning as a motivation for moral behavior occurs only in unusual contexts, such as in professional debates.
How does heuristics-based morality work in real life?
Take organ donation. Many people die waiting for suitable organs in the United States and Germany, where a minority of citizens sign donor cards. Yet France and Austria have no such problem because 99.9 percent of their citizens are eligible to be organ donors. A powerful heuristic must be at work here that is stronger than deliberate reasoning, national character or individual preferences.
The heuristic goes like this: If there is a legal default, do nothing about it. In countries such as the United States and Germany, the legal default is that nobody is a donor without registering to be one. You need to opt in. In countries such as France and Austria, everyone is a donor unless they opt out.
Collective decisions to follow these default rules have life-or-death consequences. Knowing the heuristics that guide people’s moral actions can help in designing change for the better.