Posted on 12/03/2012 6:33:35 PM PST by neverdem
Researchers are giving mixed reviews to a draft U.S. government plan to subject some grant requests for studies involving the H5N1 avian influenza virus to special reviews—and perhaps even require the work to be kept secret.
Elements of the plan have been "very controversial" within the U.S. government committee that developed it, Amy Patterson, associate director for science policy at the National Institutes of Health (NIH) in Bethesda, Maryland, told a meeting of the National Science Advisory Board for Biosecurity (NSABB) earlier this week. Patterson unveiled the proposal—formally known as A Proposed Framework for Guiding HHS [the Department of Health and Human Services] Funding Decisions about Highly Pathogenic Avian Influenza H5N1 Gain-of-Function Research (PDF)—at the 27 November meeting of NSABB, which advises the U.S. government on overseeing "dual use" biological research that could be used for good and evil.
Although it would now apply to just a handful of potential studies, the framework "is going to raise a lot of questions and concerns among researchers," predicts microbiologist Ron Atlas of the University of Louisville in Kentucky, who has worked on biosecurity policy issues for more than a decade for the American Society for Microbiology.
Some influenza scientists are already calling the plan "misguided," while others predict it will make it impossible to obtain NIH funding for an entire subset of potentially useful studies. Others, however, say it represents a needed "step forward" in government efforts to reduce the terrorism and public safety risks associated with H5N1 studies.
The debate is expected to get a full hearing over the next few months. NIH says that it will soon release for public comment a white paper describing the plan in detail. And it plans to present the framework for discussion at an international workshop on H5N1 research that it is holding in Bethesda from 17 to 18 December. "We definitely want to hear what other countries are thinking about this," Patterson said.
"This is a global concern," Atlas says, and any U.S. policy "that can't be globalized is in the long run going to be ineffective."
A long-running debate
The framework has its roots in a long-running debate over how to regulate dual use biological studies, a debate that gained momentum after the 2001 anthrax attacks in the United States. After those attacks, an expert panel assembled by the U.S. National Academies in 2003 recommended that the government and the scientific community work together to develop systems for identifying potentially risky research before it began, as well as ways to deal with study results that might pose a significant security threat if they fell into the wrong hands. The panel also recommended the creation of an advisory body along the lines of NSABB, which was established in 2004.
Both U.S. officials and scientists were caught relatively unprepared, however, when a controversy erupted in late 2011 over how to handle the results of two experiments in which scientists engineered the H5N1 virus, which normally is deadly in birds, to become transmissible between mammals, potentially opening the door to a dangerous human pandemic. The U.S. government asked NSABB to review the two studies, which were funded by NIH and had already been submitted for publication. After initially recommending that Nature and Science not publish the studies, a majority of the panel—which has no regulatory authority—lifted its objections to publishing only slightly revised versions of the manuscripts. The journals published the papers, by teams led by Yoshihiro Kawaoka of the University of Wisconsin, Madison, and the University of Tokyo, and Ron Fouchier of Erasmus MC in Rotterdam, the Netherlands, in May and June, respectively.
The fallout from the controversy is far from over, however. This past January, avian influenza researchers imposed a voluntary moratorium on experiments that might make the H5N1 virus more dangerous to humans; it was initially supposed to last just a few months, but it is still in place with no end in sight. In March 2012, the U.S. government added "highly pathogenic" versions of the H5N1 virus to its list of potentially dangerous "select agents," and required funding agencies to take a closer look at the dual use potential of proposed and ongoing H5N1 studies. (Those reviews found just a handful of potentially problematic studies and stopped none, NIH officials say.) NIH officials also say that they are working on a hefty guidance document to help universities conduct their own reviews of potentially risky research—with an eye toward having campus biosafety committees shoulder some of the responsibility.
Meanwhile, researchers worldwide have been debating whether the kind of H5N1 studies conducted by Fouchier and Kawaoka are actually needed. Known as "gain-of-function" studies, such experiments manipulate viral genes in ways that make the virus more transmissible or pathogenic, or expand its host range (increasing the kinds of animals it can infect). Many researchers say gain-of-function studies are key to understanding which kinds of genetic changes might make H5N1 more dangerous to humans and offer clues to better treatments. The results can also give biologists an early warning of what changes to look for in naturally occurring viruses. Critics, however, say the studies often offer few practical public health benefits, but pose plenty of risks if they enable a dangerous virus to escape from a poorly operated laboratory or provide a roadmap for terrorists.
To fund or not to fund?
The new framework is designed to reduce those risks, Patterson told NSABB, by adding special department-wide or government-wide reviews for experiments that are judged problematic by the first round of reviewers. It "attempts to set forth a conceptual framework for how we might approach, at least within HHS … decisions about what we would be willing to fund or not fund," she said.
Patterson emphasized that the special reviews would only occur with experiments that propose to manipulate the H5N1 virus in ways that would allow it to "gain" transmissibility, pathogenicity, or host range. It would not apply to routine studies in which public health scientists characterize naturally occurring viruses or test how they respond to various drugs. So far, she added, NIH officials are aware of just "four or five" potential H5N1 grant requests that might trigger the special reviews.
Patterson said an interagency working group composed of government science and security experts had "sketched out in pencil" seven criteria that studies would have to meet to be eligible for HHS funding. They are:
1. The research addresses a scientific question with high significance to public health;
2. The research does not intend, nor is reasonably anticipated to yield a HPAI H5N1 experimental virus which has increased transmissibility, pathogenicity, or expanded host range, unless there is evidence that such a virus could be produced through a natural evolutionary process in the foreseeable future;
3. There are no feasible alternative methods to address the same scientific question in a manner that poses less risk than does the proposed approach;
4. Biosafety risks to laboratory workers and the public can be sufficiently mitigated and managed;
5. Biosecurity risks can be sufficiently mitigated and managed;
6. The research information is anticipated to be broadly shared in order to realize its potential benefits to global health; and
7. The research is supported through funding mechanisms that facilitate appropriate oversight of the conduct and communication of the research.
Patterson told NSABB that she expects most of the debate to center on the second criterion, which requires researchers to provide evidence that the virus they want to create could arise through natural evolution, meaning it might be something of practical interest to public health officials. "I'll just say out front [that it] has been very controversial within the U.S. government discussions," she said. In part, that's because it is not clear what kind of evidence would be needed to show that an engineered virus might also arise naturally. "I've heard questions like: 'What constitutes evidence? What's the foreseeable future?' " Patterson said.
Sure enough, several NSABB members expressed doubts about the requirement, prompting Patterson to note that "I'm not hearing much love for criterion two."
"My read of this is that it really would put a stop … to most of this research," said new NSABB Chair Samuel Stanley, who was trained as a medical doctor and is now president of Stony Brook University in New York. "I'm not sure how one would get that evidence. … I think it sets a bar that may be too high in my opinion to allow you to do any gain-of-function [experiments]. … While I certainly appreciate the risks, … they are very powerful tools when used appropriately."
The idea "that we can do something that nature can't … doesn't stand the test of credibility," added Kenneth Berns, director of the University of Florida's Genetics Institute in Gainesville. The criterion may need to be fine-tuned, he said.
Practical use?
Atlas, who watched a Web broadcast of the NSABB discussion, predicts that the first criterion, which requires researchers to show that a study has "high significance" to public health, will also spark debate. "It raises the bar," he says. "What is of direct public health benefit? I think we are going to get into a gray area about the differentiation between fundamental scientific knowledge and something that can be directly applied."
Secret science
The sixth and seventh criteria, which deal with the publication of results and funding mechanisms, hint at what might happen if government reviewers decide that a study is worth funding—but the potential results are judged too risky to let into public view. "If there were a circumstance where it was deemed to be important for public health, but there were concerns about the nature of the research findings, we would reach out to the Department of Homeland Security [DHS], DOD [the Department of Defense], or other agencies that fund classified research and ask them to consider undertaking the project," Patterson said.
Another possibility, she said, is that researchers would be offered a contract (rather than a grant) that might require the work to take place in a highly secure laboratory or place restrictions on how the results could be shared.
After the anthrax attacks, HHS gained authority to classify work it funded but reportedly has never exercised that authority. In part, that is because the department's two major research arms—NIH and the Centers for Disease Control and Prevention—have policies of funding open science. In contrast, DHS and DOD have a history of conducting classified studies.
The pros and cons of secret research have been the subject of much NSABB debate, Patterson and several panel members noted. During this year's controversy over the two H5N1 studies, for instance, panelists had hoped that the U.S. government could find a legal and practical way to withhold the details from some people but not others, but it couldn't. Still, Michael Osterholm, director of the Center for Infectious Disease Research and Policy at the University of Minnesota, Twin Cities, said he hoped the government and scientists might still find some way of preferentially sharing risky results. "We seem to have somewhat divided into two camps: Those that believe this kind of functional research should be done and those that don't," he said. "To me, there has always been this middle place. … There's got to be somewhere between [publishing results in the] open source literature and classified that we've got to figure out how to get to, because until we do we are going to be caught."
Mixed reactions
The proposal is drawing mixed reactions from scientists outside NSABB. Many wonder how controversial studies that have come up in the past, such as the Fouchier and Kawaoka studies, would have fared if the criteria had been in place.
Fouchier, for one, thinks that his work would have passed—probably. "In my opinion our research meets all of these criteria," Fouchier writes in an e-mail to ScienceInsider. "Number 2 is a tough one, as I fail to see how one would get such evidence. It is not clear to me what research would be forbidden given these criteria. Of course, that depends on who will be reviewing the proposals. If it is reviewed by security people … then a lot of research (and certainly ours) would not be funded by HHS."
Atlas guesses that both studies would have ultimately gotten the green light. "They probably would have triggered [the extra] review," he says, "but they would have met that criteria that [the H5N1 viruses the scientists wanted to create] would be expected to occur naturally." And there was a reasonable argument that the information could help with surveillance efforts.
One influenza researcher, however, believes that NIH's effort is off track. "I believe the NSABB is misguided in making gain-of-function experiments using H5N1 influenza viruses such an issue," virologist Peter Palese of Mount Sinai Hospital in New York City, writes to ScienceInsider in an e-mail. "Gain-of-function experiments are almost always loss-of-function experiments for another property. For example, making H5N1 influenza viruses more transmissible in ferrets (gain-of-function) results in a loss of pathogenicity of these [viral] mutants in the ferret (loss-of-function). Thus, the NSABB looks only at one side of the coin!"
The framework is getting better reviews from researchers who say the U.S. government has been too lax in regulating dual use research. "The proposed review guidelines are a step forward," writes Richard Ebright, a biologist at Rutgers University, Busch Campus, in New Jersey, in an e-mail to ScienceInsider. "However, they have crucial flaws that need to be addressed before implementation." In particular, he believes "the proposed process does not provide for a bona fide risk-benefit assessment," and that required lab security and safety precautions are still inadequate.
The coming debate
Patterson said there will be plenty of time for scientists and the public—both here and abroad—to air their views and help shape the final policy. One key question, she said, will be whether H5N1 gain-of-function studies are needed at all. "We need to have the courage to put those issues on the table," she said. "If we don't, … any answer we come up with will be suspect, whether it is that some of this research should go forward, or none of it should go forward."
Patterson also went out of her way at the NSABB meeting to blunt the notion that the H5N1 framework is a first step toward imposing broader rules on other fields of research. "I would not like anyone to walk out of the room thinking that we are going to impose this on all of infectious disease research," she said. "But I do think it is a fair observation that for any other infectious agent that has pandemic potential, that some of these principles are applicable."
The main goal in creating the framework, she said, is to reduce the uncertainty facing researchers hoping to win U.S. funds for such studies. "People in the [H5N1] field … do need some very concrete and practical guideposts right now … to understand whether they can undertake [their work] or not," she said, adding that scientists want to know: "Are they going to get in trouble if they undertake it? Can they publish?"
Kristine Beardsley, a bioterrorism expert on the White House's national security staff who worked on the new framework, seconded that idea. Part of the intent, she said, is to provide "some sort of guidance" so that "scientists themselves can feel comfortable that the government is not going to be big brother and be asking them not to do this type of research."
She and other U.S. officials will soon find out, however, whether scientists see the new plan as comforting or corrosive.
FReepmail me if you want on or off my combined microbiology/immunology ping list.
Given those seven restrictions, this research seems reasonable. Virii are constantly mutating, and being reactive rather than proactive could cost lives.
Thought to self: “What could possibly go wrong with this?”
Ping... (Thanks, neverdem!)
Thanks for the ping!
You’re Welcome, Alamo-Girl!
1. The research addresses a scientific question with high significance to public health;
2. The research does not intend, nor is reasonably anticipated to yield a HPAI H5N1 experimental virus which has increased transmissibility, pathogenicity, or expanded host range, unless there is evidence that such a virus could be produced through a natural evolutionary process in the foreseeable future;
3. There are no feasible alternative methods to address the same scientific question in a manner that poses less risk than does the proposed approach;
4. Biosafety risks to laboratory workers and the public can be sufficiently mitigated and managed;
5. Biosecurity risks can be sufficiently mitigated and managed;
6. The research information is anticipated to be broadly shared in order to realize its potential benefits to global health; and
7. The research is supported through funding mechanisms that facilitate appropriate oversight of the conduct and communication of the research.
There are two main things that worry me here; one related to carelessness / ineptitude, the other something more sinister, but which might be given an opportunity due to carelessness / ineptitude.
First, let's take a step back, in two different directions. The Hippocratic Oath states: "First, do no harm." But there is potential for harm here, and of several different kinds.
Unfortunately, it's kind of hard to discuss, because each of the issues I bring up is kind of a double-edged sword in terms of debate. But, let's take a stab anyhow.
If confronting the issue of a potential serious health risk with viruses (rule #1), one can run into dangers either from action or inaction. IF one decides to stand pat, banking on nature not going ballistic on us, and there does happen to be a virulent outbreak, then we may be up the creek. Remember the classic example of the 1918 flu, which killed millions worldwide in a time before nearly instantaneous global travel could spread the flu (during its symptom-free incubation period): and, considering that there were people who went to bed healthy and "woke up dead" during the epidemic, that's saying something.
Complicating that is the fear that terrorists may be trying to cook up a nasty version of H5N1 akin to the 1918 Flu, and we don't want to get caught flat-footed by *that*.
The difficulty I see there is that if terrorists *do* make a virulent strain, then all of the "early warning" systems mentioned above, which are meant to track slowly evolving natural strains that are (say) one or two point mutations away from virulence, will be useless.
The other difficulty with worrying about a terrorist strain is SlingsandArrows' statement that "Virii are constantly mutating" coupled with the remarks of Peter Palese from the article:
"Gain-of-function experiments are almost always loss-of-function experiments for another property. For example, making H5N1 influenza viruses more transmissible in ferrets (gain-of-function) results in a loss of pathogenicity of these [viral] mutants in the ferret (loss-of-function). Thus, the NSABB looks only at one side of the coin!"
This is a double-edged sword, in that it implies that a very transmissible strain will be less virulent, and a more virulent strain will be less transmissible. If that's the case, can't we just be prepared for social distancing (e.g., consider the scenarios in Tom Clancy's Executive Orders) and let the virus mutate itself away? On the other hand, the 1918 flu took *months*, even *years*, to mutate away, and there were three separate waves of the flu, months apart, each of which caused its own round of fatalities. And at the other end of this, if H5N1 is constantly mutating, what good is a vaccine going to do? Look how much work we have to do just to keep up with the regular flu: standard flu shots offer maybe a 40% chance of immunity, the circulating strains mutate away from the vaccine within a year, and the new forms still manage to get plenty of people ill. I see the risk of throwing a LOT of money at something that may not do all that much good.
But there is another type of risk, too, and this risk revolves around points #2, #6, and #7.
We kind of already touched on #2, but another issue comes when dealing with engineered viruses. Genome sequencing is getting comparatively cheap and easy, as is other bioengineering; it is not beyond the realm of possibility that a terrorist group could fund a capability in this area, and just wait to see *when* the US starts a "crash program" on natural H5N1, thereby "tipping off" the terrorists that a natural strain is very close to virulence. (Telling them when and where to start experimenting.)
A second issue is that of "opsec" -- many scientists argue that "knowledge should be shared" -- and we saw how well that worked out during the Cold War. All it takes is one Klaus Fuchs or some misguided Kumbaya ninny (and it need not be a researcher, or someone with access to the Level 4 Hot Zone, it might be a secretary who Xeroxes the weekly reports) "accidentally" dropping a thumb drive at a Starbucks, and the risks of bio-proliferation (to coin a phrase) are right on top of us.
And finally, there is the danger of the government deciding to play a little bit clever with us: I posted ages ago a tin-foil hat report of a Chinese general discussing biowarfare against the US in order to free up Lebensraum for the Chinese, while sparing our infrastructure and raw materials; but there is also the possibility of Maoists or Earth-First type nuts (some of whom are prominent in academia, think tanks, and advisory roles to this administration) who would like nothing better than to thin the ranks of the World's Population by five billion or more; or, alternatively, clear out all the pesky older people in the US who are owed pensions and Social Security and the like, and of whom a disproportionate share still remember when the Constitution was taught in schools and was the law of the land. Not to mention the drug companies who would *love* to have a monopoly on a "life or death for EVERYONE" drug, to recoup their losses from Obamacare (and if you think I'm crazy, try reading this: East Germany's STASI Sold Citizens to Western Pharmaceutical Companies as Human Guinea Pigs).
And with point #7 ("appropriate oversight" usually means some god-awful boilerplate report issued once a year), I fear that the safeguards in place will not be adequate to prevent malfeasance from whatever source.
Thoughts, comments?
You wrote a pretty good essay on double-edged swords. We’re reduced to relying on morals, brains and patriotism. I hope we’re not screwed. We have enough enemies besides the useful idiots.
Thank you for your thoughtful reply. I’m studying for exams, so I’ll be brief.
To undertake the research does indeed entail risks. So does not undertaking it. All things considered, I believe the latter to be the greater risk. Harm can be done by omission as well as commission.
I don't have a lot of time, but I wanted to address this point because I think it is a rather important one.
The H5N1 research done so far pretty strongly indicates that a similarly transmissible virus arising through natural evolution is VERY possible and most likely just a matter of time. Every time I read an editorial about frankenvirus, risks of terrorism, and whatever, all I can think is that the fear is misdirected. Mother Nature is the most deadly terrorist that ever existed. We need this research. We need to know if increased transmissibility is coupled with decreased virulence.
Avian Influenza Outbreaks in Humans, 2012
Overall case fatality rate of 59.0%
WHO reports only laboratory-confirmed cases
http://www.cdph.ca.gov/programs/vrdl/Pages/AvianInfluenzaOutbreaksinHumans.aspx
For more information, see WHO Avian Influenza webpage:
http://www.who.int/influenza/human_animal_interface/en/
For comparison:
The global mortality rate from the 1918/1919 pandemic is not known, but an estimated 10% to 20% of those who were infected died. With about a third of the world population infected, this case-fatality ratio means 3% to 6% of the entire global population died. Influenza may have killed as many as 25 million people in its first 25 weeks. Older estimates say it killed 40 to 50 million people, while current estimates say 50 to 100 million people worldwide were killed.
This pandemic has been described as “the greatest medical holocaust in history” and may have killed more people than the Black Death. It is said that this flu killed more people in 24 weeks than AIDS has killed in 24 years, more in a year than the Black Death killed in a century.
http://en.wikipedia.org/wiki/1918_flu_pandemic
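As a rough sanity check, the quoted attack rate and case-fatality ratio can be multiplied out. This is a minimal sketch; the ~1.8 billion figure for 1918 world population is an assumption added here, not part of the quote.

```python
# Sanity-check the 1918 pandemic figures quoted above.
world_pop = 1.8e9               # assumed 1918 world population (~1.8 billion)
attack_rate = 1 / 3             # "about a third of the world population infected"
cfr_low, cfr_high = 0.10, 0.20  # "10% to 20% of those who were infected died"

# Population-wide fatality = attack rate x case-fatality ratio
pop_fatality_low = attack_rate * cfr_low    # roughly 3%
pop_fatality_high = attack_rate * cfr_high  # roughly 7%

# Implied absolute death tolls
deaths_low = world_pop * pop_fatality_low
deaths_high = world_pop * pop_fatality_high

print(f"{pop_fatality_low:.1%} to {pop_fatality_high:.1%} of the world population")
print(f"about {deaths_low / 1e6:.0f} to {deaths_high / 1e6:.0f} million deaths")
```

The low end of this range lines up with the quoted "3% to 6%" and the 50 to 100 million modern death-toll estimates; the high end slightly exceeds them, which is consistent with the text's caveat that the true mortality rate is not known.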