This researcher studies how misinformation seeps into science and politics
Yotam Ophir's recent work focuses on the disconnect between scientists and the public
By Sujata Gupta
TV stars and terrorists may appear to have little in common. But after watching YouTube videos by members of a violent terrorist organization, Yotam Ophir realized the two groups deploy similar tactics to connect with remote audiences. The terrorists dressed casually, stared straight at the camera when talking and narrated their pasts in gripping, plot-driven fashion, just like actors.
When Ophir presented that theory in class as a junior at the University of Haifa in Israel, his teacher, communications researcher Gabriel Weimann, was so impressed that he encouraged Ophir to publish on the idea. That resulted in Ophir’s first academic paper, published in March 2012 in Perspectives on Terrorism.
“I think [that paper] opened the door for him, both outside and also inside him, inside his mind,” says Weimann, now at Reichman University in Herzliya, Israel.
Since then, Ophir has remained intrigued by how various people — whether terrorists, policy makers, journalists or public health officials — communicate information and beliefs to broader audiences. The last 20 years have dramatically changed the way we interact with media, says Ophir, now a communications researcher at the University at Buffalo in New York. “All of my research is about humans’ attempt to cope with the crazy and increasing amount of information that now surrounds us 24/7.”
Ophir is especially interested in understanding how misinformation — a topic he’s currently writing a book about — seeps into fields such as health, science and politics. “My hope is that our work can help [people] understand … what stands between humans and accepting the … evidence,” Ophir says.
How the media covers epidemics
Ophir hadn’t set out to become a communications researcher. “I wanted to be a musician,” he says.
But an introduction to mass communications class during freshman year — also taught by Weimann — set Ophir on a new trajectory. On the first day of class, Weimann recounted the story of Jessica Lynch, an injured U.S. soldier presumed captured by Iraqi fighters. Weimann showed the class seemingly dramatic video of Lynch’s rescue. The video, and the media frenzy surrounding its release, had turned Lynch into a war hero.
But the portrayal was misleading. Lynch had not been shot or stabbed as initially reported. And Iraqi soldiers had already abandoned the hospital Lynch was in by the time the U.S. military arrived. Reporters, who had not witnessed the “rescue,” leaned heavily on a five-minute video clip released by the Pentagon. A damning BBC investigation later called the events “one of the most stunning pieces of news management ever conceived.”
Ophir was struck by how staged the whole operation appeared — made to look like a “Hollywood movie” — and the resulting media spin. “It touched a nerve, and I was like, ‘Wow, I need to know more about this,’” he says.
Ophir went on to earn a master’s degree at the University of Haifa, studying how fictional characters can influence people’s beliefs. In 2013, Ophir moved to the University of Pennsylvania for a doctoral degree in the lab of communications researcher Joseph Cappella, whose work focused on the tobacco industry. Ophir initially investigated how cigarette companies lured people into buying products known to cause cancer and other health problems.
But his focus changed in 2014 when an Ebola outbreak began sweeping through West Africa. Ophir devoured news stories about U.S. medical personnel carrying the disease home. “It scared me personally,” he says.
Soon, though, Ophir found a disconnect between the science of how Ebola spreads and how it was being portrayed in the media. For instance, many stories focused on the subway rides of an infected doctor who had returned to New York City. But Ebola spreads through the exchange of bodily fluids, unlikely to occur on a subway, so those stories served mostly to drum up fear, Ophir says. Curious to know more, Ophir shifted his focus. “I wanted to study the way the media talks about epidemics,” he says.
One of Ophir’s early challenges was sorting out how to identify patterns in enormous troves of documents, Cappella recalls. “He took advantage of the computational techniques that were being developed and helped develop them himself.”
For instance, Ophir automated his analysis of over 5,000 articles about the H1N1, Ebola and Zika epidemics in four major newspapers: the New York Times, Washington Post, USA Today and Wall Street Journal. Those articles were frequently at odds with the U.S. Centers for Disease Control and Prevention’s recommendations for how to communicate information about infectious disease outbreaks, Ophir reported in the May/June 2018 Health Security. Few articles included practical information on what individuals could do to reduce the risk of catching and spreading the disease.
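Ophir’s published methods aren’t detailed here, but the general flavor of automated content analysis can be sketched with a toy example: scanning article text for phrases that signal practical guidance (what readers can do to protect themselves) versus coverage that lacks it. The categories, keyword list and function names below are illustrative assumptions, not Ophir’s actual pipeline.

```python
# Toy content-analysis sketch: flag whether news articles include
# actionable health guidance. The keyword list and labels are
# illustrative assumptions, not the coding scheme from Ophir's studies.
from collections import Counter
import re

# Hypothetical marker phrases for "practical guidance" content
GUIDANCE_TERMS = {"wash your hands", "avoid contact", "vaccine",
                  "symptoms to watch", "call a doctor"}

def tokenize(text):
    # Lowercase and strip punctuation so phrase matching is uniform
    return re.sub(r"[^a-z ]", " ", text.lower())

def has_guidance(article):
    # True if the article contains any practical-guidance phrase
    text = tokenize(article)
    return any(term in text for term in GUIDANCE_TERMS)

def summarize(articles):
    # Tally how many articles do / don't offer practical advice
    counts = Counter(has_guidance(a) for a in articles)
    return {"with_guidance": counts[True], "without": counts[False]}

sample = [
    "Officials urge residents to wash your hands and watch for fever.",
    "The outbreak dominated headlines as fear spread through the city.",
    "Markets tumbled amid worries over the epidemic's economic toll.",
]
print(summarize(sample))  # prints {'with_guidance': 1, 'without': 2}
```

Real studies of this kind typically replace the hand-built keyword list with statistical topic models fit to thousands of articles, but the core move is the same: turning large volumes of text into counts that can be compared against a benchmark, such as official communication guidelines.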
Ophir’s research convinced him that the United States was ill-prepared for an infectious disease outbreak. “I was warning that we’re not ready for the next epidemic because we don’t know how to talk about it,” Ophir says. “Then COVID happened.”
Turning to science and the public
In recent years, Ophir and members of his lab have looked at how political polarization shows up in nonpolitical spaces, such as app review sites. And they have begun trying to identify fringe ideas and beliefs on extremist websites before they go mainstream. All this work coheres, Cappella says, in that it “describes the movement of information, and the movement of persuasive information, through society.”
Ophir’s latest research is a case in point. While surveys commonly ask whether people trust science, Ophir wanted to understand people’s beliefs with more nuance. In 2022, collaborating with researchers from the Annenberg Public Policy Center of the University of Pennsylvania, he developed a survey for measuring public perceptions of science and scientists. The team asked over 1,100 phone respondents about their political leanings and their preferences for science funding. Ideology is linked to funding preferences, the team reported in September 2023 in Proceedings of the National Academy of Sciences. For example, when conservatives perceived scientists as biased, they were less likely to support funding. The same wasn’t true for liberals.
That work resulted in a predictive model that can assess the gap between how science presents itself and public perception of that presentation. Identifying such communication gaps is a key step in facing today’s challenges, Ophir says. “We could come up with a solution to climate change tomorrow and half the country would reject it.… We won’t be able to survive if we don’t learn to communicate better.”