Sometimes busting myths can backfire
It was the mic drop heard ’round the Internet.
On January 25, 2016, rapper B.o.B (Bobby Ray Simmons) sent out a series of tweets explaining why he thinks the Earth is flat.
The cities in the background are approx. 16miles apart… where is the curve ? please explain this pic.twitter.com/YCJVBdOWX7
— B.o.B (@bobatl) January 25, 2016
Once you go flat, you never go back
— B.o.B (@bobatl) January 26, 2016
In response, Neil deGrasse Tyson, an astrophysicist and noted science communicator at the Hayden Planetarium at the American Museum of Natural History in New York, tweeted back some facts to bust a myth that Greek thinkers had begun to dismantle as early as the sixth century B.C.
@bobatl Earth’s curve indeed blocks 150 (not 170) ft of Manhattan. But most buildings in midtown are waaay taller than that.
— Neil deGrasse Tyson (@neiltyson) January 25, 2016
@bobatl Polaris is gone by 1.5 deg S. Latitude. You’ve never been south of Earth’s Equator, or if so, you’ve never looked up.
— Neil deGrasse Tyson (@neiltyson) January 25, 2016
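For the curious, Tyson’s Manhattan figure is straightforward geometry. Below is a minimal sketch in Python (an illustration, not anything from the exchange or the study), assuming a spherical Earth with a mean radius of about 3,959 miles and ignoring atmospheric refraction and the observer’s own height, both of which reduce how much of a distant skyline is hidden; that is presumably how the raw value of roughly 170 feet gets trimmed to Tyson’s 150.

```python
# Hidden height of a distant object due to Earth's curvature.
# Small-angle approximation: h ~= d^2 / (2R), valid when d << R.
# Assumes a spherical Earth, no refraction, observer at ground level.

EARTH_RADIUS_MI = 3959.0  # mean Earth radius, in miles
FT_PER_MI = 5280.0

def hidden_height_ft(distance_mi: float) -> float:
    """Feet of a distant object hidden below the horizon."""
    return distance_mi ** 2 / (2.0 * EARTH_RADIUS_MI) * FT_PER_MI

print(round(hidden_height_ft(16)))  # ~171 ft for cities 16 miles apart
```

For B.o.B’s photo of cities 16 miles apart, the formula hides about 170 feet of skyline, which is exactly why most of midtown Manhattan still pokes above the horizon.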
A rap battle ensued, with B.o.B writing a track called “Flatline,” and Tyson’s nephew, Stephen Tyson, releasing a track in return. Finally, Tyson appeared in a special segment on The Nightly Show, where he addressed B.o.B, stating, “small sections of large, curved surfaces will always look flat to little creatures that crawl upon it.” Tyson also criticized the “anti-intellectual strain” in society and finished with a demonstration of gravity by literally dropping the mic.
This epic meeting of science and culture provided entertainment to millions. But was it actually effective at busting the flat Earth myth for B.o.B? A new study looks at what makes myth-busting work and where it can fail. It turns out that repeating a myth and then correcting it can backfire, making readers more likely to remember the myth as true. But the study also shows that to make the facts stick, it sometimes helps to make people judge each statement as they read it.
Myth-busting articles are catchy and popular. There’s a thrill in clicking on an article about why “cheese is not just like crack.” It catches your attention: who thought this was a thing? And it has the added thrill of making you feel a little bit smarter than everyone else who might have been trying to justify their cheese addiction.
But bursting mythical bubbles can also backfire. The first problem is that people are easily persuaded by things they hear more often. “The mere repetition of a myth leads people to believe it to be more true,” notes Christina Peter, a communication scientist at the Ludwig Maximilian University of Munich.
And unfortunately, our brains don’t remember myths in a very helpful way. “There’s a lot of research that tells us people have a hard time remembering negations,” says Stephan Lewandowsky, a cognitive scientist at the University of Bristol in England. We remember myths not as myths, but rather as statements that are additionally tagged as “false.” So instead of remembering “cheese is nothing like crack,” our brains remember “cheese is like crack (false).” As our memories fade, the qualifier on the statement may fade too, leaving us with the false idea that brie really is the next cocaine.
Peter and her colleague Thomas Koch decided to find out how best to combat this backfire effect (our tendency to misremember myths as fact) when people are confronted with scientific information. They recruited 335 volunteers and asked them to read three newspaper articles. The first and last were decoys. The important one was in the middle, and concerned a new in-home bowel cancer test. The article included eight statements about the new test, each immediately identified as fact or myth, with an explanation of why it was true or false.
The scientists also asked the participants to focus on different things: one group was told to form an opinion about the articles as they read them; another was told just to study the language.
After all the groups were done reading, Peter and Koch presented them with the eight statements from the bowel test article, and asked them whether they were true or false. Then the scientists asked the participants those questions again after five days to test what they retained.
Readers who focused just on the language of the articles suffered from the backfire effect. They were more likely to remember false statements as true than to remember true statements as false. This backfire effect got stronger when they saw the statements again five days later, and it influenced what they thought of the bowel test. The articles described the test in a slightly negative light. But if people remembered more of the myths as facts, they ended up with a positive view of the test. Oops.
But the backfire effect shrank if participants formed an opinion as they read. Participants who were making up their minds on the fly made errors half as often as those who were reading only for language.
Peter says the results suggest that when presenting readers with new information, “try to avoid repeating false information,” since that may be what remains in people’s minds. And in some situations, Peter says, asking readers for their opinion or getting them to form an opinion as they read might help them distinguish between what is truth and what is myth. Peter and Koch published their results in the January Science Communication.
The backfire effect in myth-busting is fairly well-established, Lewandowsky says. But no one has tried to correct for it before. “I reckon they’re on to something,” he says. “I’d like to see it replicated. It’s really neat and it meshes well with other areas [of research].”
Does this mean myth-busting worked on B.o.B? It’s hard to say. The people in Peter’s study were confronted with new information about an at-home bowel cancer test. It wasn’t something they had previous opinions about. When people already believe something (for example, that the Earth is flat) and are confronted with the truth, the backfire effect can come out in force. “I can say broadly when people have strong beliefs about something, it’s difficult to unwind those beliefs, regardless of how strong the evidence is,” says Eryn Newman, a cognitive psychologist at the University of Southern California in Los Angeles.
Whether or not B.o.B ended up convinced, Tyson had a good strategy: instead of just mocking the rapper, he offered real facts to replace the false ones. “The moment you explain something to people, the debunking problem is far less of an issue,” Lewandowsky explains. “You replace the myth with an explanation of the truth.”
And luckily, the prior beliefs of other people will probably protect them from B.o.B’s flat-Earth ideas. “Nobody believing in a heliocentric worldview will suddenly believe the world is flat just because he heard it several times from B.o.B,” Peter says.