Researchers, journals asked to censor data

Bird flu findings could provide terrorists a bioweapons blueprint, NIH panel worries

Scientists undertake research to advance knowledge. Normally, part of that advancement is finding as broad an audience as possible for the newly acquired data. But what happens if medically important data could be misappropriated for ruthless purposes? That question underlies the ruckus developing over two new bird flu papers.

After reviewing the manuscripts, a federally convened panel has asked the authors of both papers to censor important details of their work. It argued that “certain information obtained through such studies has the potential to be misused for harmful purposes.” That’s a thinly veiled reference to biowarfare. The concern is that human manipulation might transform some low-risk bug into a virus that triggers a localized epidemic — if not a runaway global pandemic.

As a second line of defense, the panel — and the Department of Health and Human Services — has strongly encouraged the journals reviewing the new papers to also ensure that no dangerous details are published.

Details of the situation began to emerge on December 20 when the National Institutes of Health reported generally on the findings of an independent expert review panel that it had convened to pore over the bird flu papers that had been submitted for publication.

This panel recommended that federal officials ask the reports’ authors — and the journals that were planning to publish the research (Science and Nature) — to withhold critical details of the experimental methodology employed. HHS (NIH’s parent agency) seconded the recommendation.

Viruses belonging to the H5N1 — avian flu — family are not ordinarily a big threat to mammals, including people. But the new papers described techniques that had successfully extended the infectivity of these viruses beyond their winged hosts-of-choice — to ferrets.

NIH emphasized that the self-censorship it was requesting amounts to “non-binding recommendations to the authors and journal editors.”

Science editor-in-chief Bruce Alberts acknowledged on December 20 that the National Science Advisory Board for Biosecurity had, almost three weeks earlier, verbally recommended that his journal publish an abbreviated version of the flu paper moving through its pipeline. The board argued that full publication of research details could pose security threats, Alberts noted, and that it was imperative to keep those details “from falling into the wrong hands.”

Because Science supports a formal Statement on Scientific Publication and Security, Alberts said his editors are considering compliance — contingent on yet-to-be-issued details from the federal government on how “to ensure that any information that is omitted from the publication will be provided to all those responsible scientists who request it, as part of their legitimate efforts to improve public health and safety.”

Nature’s editor-in-chief Philip Campbell said much the same thing: “We are discussing with interested parties how, within the scenario recommended by NSABB, appropriate access to the scientific methods and data could be enabled.”

The feds seem to be charting new territory: asking — but not demanding — that scientists and journals do the right thing. As if anyone truly knows what the right thing is and always will be.

How do scientists vet the work of others if the details are kept secret? If a select group of scientists gets access to oversee and peer review the work of germ scientists, who vets the overseers? And then who polices all of these individuals to ensure none spills the beans — unwittingly or intentionally — allowing crucial germ-engineering data to end up in the hands of nefarious parties? Is the solution to suddenly impose a gag rule on anyone working with germs that could be, but aren’t yet, amazingly dangerous?

There have been rules in place for decades on how to manage nuclear secrets. This system begins with not allowing anyone access to government-funded work in weapons-sensitive areas without a security clearance. (I had to have a top secret clearance once, long ago, just to work as a journalist at a national lab because there was always the risk that I might accidentally encounter sensitive materials — which, to my knowledge, I never did.)

State-imposed secrecy is not a problem the medical community has often had to wrestle with. Will garden-variety cold, flu and other infectious-disease scientists be asked to move their research to high-security biosafety facilities? And will the findings many publish end up reading like black-box science? (You know: They studied this germ by manipulating it in some undisclosed fashion and then, voila — its virulence substantially increased.)

Bioterrorism is scary and all too real. Few of us would challenge that all reasonable efforts must be made to limit the spread of details for weaponizing a natural agent already loose in our environment. What proves tricky is drawing the line at which aspects of science are too risky to share. Too much caution risks substantially slowing the creation of new knowledge; too little may put free blueprints for disaster into the hands of unstable people with a dangerous political agenda.

It’s interesting that this situation didn’t rear its head sooner. But the aspect of this new episode that most disturbs me is the idea of relying on appeals to researchers’ sense of morality as a way to contain the release of dual-use information: techniques and/or technology with the potential for both civilian and military applications. Morality is hard to legislate, much less police. Establishing what is ethically appropriate depends on an individual’s place in the universe — making it frighteningly relative.

Janet Raloff is the Editor, Digital of Science News Explores, a daily online magazine for middle school students. She started at Science News in 1977 as the environment and policy writer, specializing in toxicology. To her never-ending surprise, her daughter became a toxicologist.