Strategies to Prevent Bioterrorism: Biosecurity Policies in the United States and Germany

30 April 2007

Jonathan B. Tucker

The mailing of letters contaminated with anthrax spores through the US postal system in September-October 2001 resulted in the infection of 22 people, five of whom died, and caused expanding ripples of fear, social disruption, and economic damage. This crime, which remains unsolved, called attention to the need for ‘biosecurity’ measures to prevent terrorists from acquiring the materials and know-how required to carry out biological attacks. Such measures include limiting access to dangerous pathogens and toxins, restricting the publication of sensitive scientific information, and overseeing dual-use research that is conducted with peaceful intent but could be misused for hostile purposes. Because biotechnology is spreading worldwide, national biosecurity laws and regulations should be harmonised to the extent possible.

To explore the feasibility of a common international approach to biosecurity, this paper compares how the United States and Germany each address the issue. The rationale for this comparison is that, at first glance, the two countries appear so similar that harmonising their biosecurity policies should be relatively simple. Both are Western industrial democracies that rank among the top five in biotechnology, and both have a political system that divides governance responsibilities between the federal government and the states. Nevertheless, Washington and Berlin have taken quite different approaches to the prevention of bioterrorism, including pathogen security and the oversight of dual-use research.[1] A systematic comparison of the two countries’ biosecurity policies should therefore shed light on the challenges involved in developing a harmonised set of international guidelines.

Background

Although biology is not the only science with potentially destructive applications, it poses a qualitatively different set of risks. Producing sufficient fissile material to build a nuclear bomb or enough chemical weapons to inflict mass casualties requires a large industrial base and, most likely, the financial and technical resources of a nation state. But terrorists who managed to obtain and deliver a highly infectious, contagious, and lethal pathogen could unleash a deadly epidemic with a far smaller investment of time, money, and effort.[2] Because of the potential destructive power that biotechnology puts in the hands of small groups and even individuals, it is essential to find ways of channelling biological research and development in beneficial directions, while minimising the risk of its deliberate misuse for hostile purposes.

At least in the near term, the greatest threat is not from ‘biohackers’ tinkering in basement laboratories but rather from military scientists working in national biological warfare programmes. Such individuals read the scientific literature and are capable of directing basic research findings into new lines of offensive development. Although the current technical capabilities of terrorist groups such as Al-Qaeda are rudimentary, they may become more sophisticated over the next decade as powerful biotechnologies such as automated DNA synthesisers spread to countries around the world.

National strategies to prevent bioterrorism have focused primarily on measures to limit access to dangerous pathogens.[3] Such efforts are consistent with Article IV of the 1972 Biological and Toxin Weapons Convention (BWC), which stipulates that each state party shall ‘take any necessary measures to prohibit and prevent’ the development, production, stockpiling, and transfer of biological and toxin weapons within its territory and any other location under its jurisdiction or control. Although Article IV does not refer to specific measures, it is generally understood to require member states to adopt implementing legislation making the prohibitions of the BWC binding on their citizens and imposing penal sanctions for violations. National biosecurity measures were discussed during the 2003 intersessional process, at a Meeting of Experts and a subsequent Meeting of BWC States Parties. Although the participating countries exchanged a great deal of information, no effort was made to distil it into a uniform set of guidelines or ‘best practices’ for national implementation.

In April 2004, the development of national biosecurity measures received a new impetus from UN Security Council Resolution 1540, which requires all states to adopt ‘appropriate’ and ‘effective’ national legislation to prevent the proliferation of nuclear, chemical and biological weapons and related materials, especially for terrorist purposes. As of April 2006, however, only 48 countries had domestic legislation in place requiring the licensing or registration of facilities and/or individuals that work with hazardous biological agents.[4]

Biosecurity measures also extend beyond pathogen security to the more contentious issue of regulating dual-use research in the life sciences.[5] Although recent advances in bioinformatics and molecular genetics promise great benefits for human health, state proliferators or sophisticated terrorists could potentially misuse the results of such research to develop more effective biological weapons. The problem of dual-use research came to the fore in early 2001, when a group of Australian scientists developing a viral contraceptive to control rodent populations reported in the Journal of Virology that inserting a gene for an immune-system protein into the mousepox virus made the normally benign virus highly lethal in mice, even when animals were vaccinated against it. Critics said this research finding should not have been published because it could provide a ‘roadmap’ for terrorists seeking to develop more lethal versions of related viruses that infect humans, such as smallpox and monkeypox.

Since then, other examples of dual-use research have appeared in the scientific literature. Of greatest concern is the development of techniques for the synthesis and assembly of entire microbial genomes, making it possible to create a variety of natural and artificial pathogens ‘from scratch’ in the laboratory. In October 2005, for example, Terrence Tumpey and his colleagues reported that they had synthesised the Spanish influenza virus, resurrecting an extinct pathogen that had killed upwards of 50 million people during the global flu pandemic of 1918-19.[6] The new field of ‘synthetic genomics’ has raised concern about its potential application for hostile purposes.[7] In order to prevent such misuse, the United States has proposed restrictions on security-sensitive research and publication, raising difficult policy questions about the appropriate balance between national security and scientific freedom.

A complicating factor for policy-makers is that advanced biotechnology is no longer the exclusive purview of advanced industrial states such as the United States, Germany, and Japan. Countries such as Brazil, China, Cuba, India, Malaysia, Singapore, South Africa, South Korea and Taiwan are also making major investments in biotechnology and, in some cases, conducting cutting-edge research and development. It will therefore be necessary to develop internationally harmonised - or at least mutually compatible - biosecurity guidelines to prevent the emergence of a patchwork quilt of national regulations, with gaps and inconsistencies that proliferators and terrorists could exploit as targets of opportunity.[8]

Domestic Legislation on Pathogen Security

The term ‘pathogen security’ refers to measures to reduce the risk of bioterrorism by making it harder for would-be perpetrators to gain access to dangerous pathogens and toxins that have legitimate uses in biomedical research but could be misused for the development of biological weapons. Although the United States and Germany share this objective, they have taken divergent approaches to ensuring that only legitimate scientists have access to dangerous biological materials.

US Legislation on Pathogen Security

The US Congress first introduced security controls on dangerous pathogens in the late 1990s, after an incident in 1995 revealed the lack of government regulation in this area. Larry Wayne Harris, a licensed microbiologist living in Columbus, Ohio, and a known neo-Nazi sympathiser, used a forged letterhead and the identification number of the laboratory where he worked to order three vials of freeze-dried plague bacteria through the mail from a major supplier, American Type Culture Collection (ATCC). After Harris aroused suspicion by making repeated calls to ATCC to check on the status of his order, he was arrested and convicted of one count of mail fraud for the use of the forged letterhead. Further prosecution was not possible, however, because no US law then in effect prohibited ordinary citizens from ordering pathogens for personal use.[9]

In response to the Harris case, the US Congress included a section in the Antiterrorism and Effective Death Penalty Act of 1996 regulating US domestic facilities that sell, transfer, or receive dangerous micro-organisms and toxins. The Department of Health and Human Services, through its Centers for Disease Control and Prevention (CDC) and in consultation with other federal agencies, developed a list of ‘select agents and toxins’ of bioterrorism concern. The federal Select Agent Rule, promulgated in 1997, required anyone who shipped or received such agents to register with the CDC and file a report on each transaction. This regulation contained a major loophole, however, in that laboratories that merely possessed or worked with select agents, but did not transfer or receive them, did not have to register.

The autumn 2001 mailings of letters contaminated with anthrax spores had a major impact on the US Congress. Two Democratic Senators (Tom Daschle and Patrick Leahy) were targets of the letter attacks, several other Senators and their staffs were urged to take prophylactic antibiotics, and the Hart Senate Office Building was closed for nearly five months for decontamination. Not surprisingly, Congress responded to this traumatic experience with a flurry of new legislation. The USA Patriot Act, passed shortly after the incident, contained a provision making it a crime to possess a biological agent, toxin, or delivery system that ‘is not reasonably justified by a prophylactic, protective, bona fide research, or other peaceful purpose’. In addition, the Patriot Act defined several categories of ‘restricted persons’ who are prohibited from shipping, receiving, transporting, or possessing select agents. This ban applies to all individuals who have been convicted of a felony, diagnosed with a mental illness, convicted of using illegal drugs, or are citizens of countries that the State Department has designated as ‘state sponsors of terrorism’.[10] No exceptions or waivers are allowed.

The Senate Judiciary Committee also held hearings on the anthrax letter attacks during which officials from the Federal Bureau of Investigation (FBI) admitted that because of the loophole in the 1997 Select Agent Rule, they could not identify all of the facilities in the United States that possessed stocks of anthrax bacteria. Seeking to close this loophole, Congress included a provision tightening the controls on select agents in the Public Health Security and Bioterrorism Preparedness and Response Act, which President George W. Bush signed on June 12, 2002. This law requires all entities in the United States that possess, use, or transfer human pathogens and toxins on an expanded Select Agent List to register with the CDC and implement enhanced laboratory security measures. Clinical laboratories are exempt from the registration requirement if they destroy medical specimens containing select agents within two weeks.[11]

The Bioterrorism Act also mandated the Animal and Plant Health Inspection Service (APHIS) of the US Department of Agriculture to develop and maintain a separate list of pathogens and toxins that pose a severe threat to livestock or crops. Laboratories that possess or work with these agents must register with APHIS. At present, the Select Agent List contains more than 80 microbial and toxin agents that affect humans, livestock, or plants.[12] This list must be reviewed and updated every two years.

The primary aims of the strengthened Select Agent Rule are to track who has access to listed pathogens and toxins, what pathogens have been handled and studied, and where they have been used, so as to reduce the risk of diversion for hostile purposes. Because of the wide variety of laboratories that work with select agents, the guidelines for upgrading physical security are not highly prescriptive. Instead, each affected institution must conduct a threat and vulnerability assessment and develop a comprehensive plan to secure and limit access to all areas containing select agents. The security plan must be approved by the US government, performance-tested, and updated periodically. CDC and APHIS officials conduct short-notice inspections of the registered laboratories to ensure that their security measures are effective. Upgrading physical security can be quite expensive; Louisiana State University in Baton Rouge, for example, spent approximately $130,000 on new security systems for its laboratories.[13]

According to the Bioterrorism Act, all scientists seeking to work with select agents and toxins must be fingerprinted and undergo an FBI background check to verify that they do not have a criminal record and are not listed on databases of known terrorists. The legal consequences for scientists who do not comply with the Select Agent Rule can be severe. Thomas Butler, a microbiologist at Texas Tech University, was arrested by the FBI in 2003 for failing to report 30 missing vials of plague bacteria. Although he was acquitted of this charge, he was later found guilty of 47 other counts, most of them involving fraud.[14]

As of October 2006, 335 facilities in the United States had registered with the CDC to work with human select agents, including private research companies, universities and hospitals, and an additional 75 facilities had registered with APHIS to work with select plant and animal pathogens.[15] Ironically, the expansion of select-agent research caused by the dramatic rise in federal biodefence spending has generated new safety and security risks. Many scientists working with select agents have little prior experience handling dangerous pathogens and were drawn to this area by the availability of research money. At the same time, the Select Agent Rule has caused a number of experienced researchers to leave the field. Microbiologists at Stanford University, for example, stopped working with select agents because the ‘administrative and security burdens of the select agent rule outweighed the scientific need to maintain stocks on campus’.[16] Other institutions have destroyed valuable pathogen strain collections accumulated over many years.

The recent emergence of synthetic genomics also promises to complicate the Select Agent Rule by making it possible to resurrect extinct viruses or to create novel viruses that do not exist in nature.[17] More generally, high-throughput DNA synthesisers will eventually bring the synthesis of large viruses and even small bacteria into the realm of feasibility. When that happens, strategies for pathogen security based on physical access controls will cease to be effective, and government policy-makers will need to regulate the synthesis of dangerous viruses in the laboratory.

In contrast to its controls on select agents, the United States regulates recombinant DNA (‘genetic engineering’) research not through legislation but through administrative rules that apply to scientists who receive grants from the National Institutes of Health (NIH). A major weakness of the NIH Guidelines is that they are not legally binding on the US biotechnology industry. Although some companies comply voluntarily with the guidelines, much industrial research and development involving recombinant DNA takes place outside the regulatory framework.

German Legislation on Pathogen Security

Perhaps because Germany has never experienced a bioterrorist attack, the level of public awareness and concern about this issue is considerably less than in the United States. Shortly after the US anthrax mailings, Germany experienced a spate of anthrax hoaxes that raised anxiety but were quickly discovered to be false alarms. Partly for this reason, Germany did not respond to the events of autumn 2001 by introducing new biosecurity legislation. Whereas the United States framed bioterrorism prevention as a security problem and responded by tightening controls on a targeted list of select agents and toxins that could be used as weapons, German officials viewed the risk of bioterrorism mainly in public health terms, as a subset of the broader challenge of infectious disease. Accordingly, whereas the US Select Agent Rule focuses narrowly on pathogens considered suitable for bioterrorist use, Germany relies on an extensive framework of ‘biosafety’ laws and regulations. These are designed to ensure the safe handling of dangerous pathogens by legitimate researchers and to minimise the risks to public health and the environment from research conducted for peaceful purposes.

The German term ‘biologische Sicherheit’ is generally understood to mean ‘biosafety’. There is no separate word for ‘biosecurity’, and German laws and regulations do not make a clear distinction between the two concepts. After 9/11 and the anthrax letter attacks, the German government examined its existing biosafety regulations to identify gaps that might be exploited by terrorists. Although some gaps were found, German officials concluded that they could be addressed without additional legislation.

German biosafety measures date back to 1900, when the Reich Epidemic Act (Reichsseuchengesetz) required scientists wishing to work with dangerous pathogens to meet certain educational prerequisites and to be licensed by the state. The list of current German laws related to biosafety includes the Genetic Engineering Act (Gentechnikgesetz) of 1993, the Health and Safety at Work Act (Arbeitsschutzgesetz) of 1996, the Plant Protection Act (Pflanzenschutzgesetz) of 1998, the Regulation on Health and Safety Related to Activities involving Biological Agents (Biostoffverordnung) of 1999, the Infection Protection Act (Infektionsschutzgesetz) of 2000, and the Animal Infectious Disease Act (Tierseuchengesetz) of 2001. The War Weapons Control Act (Kriegswaffenkontrollgesetz) of 1961, as amended, makes it illegal to ‘develop, produce or trade in biological or chemical weapons, to acquire them from or transfer them to another person, to import or export them, to transport them through or otherwise bring them into or out of federal territory, or otherwise to exercise actual control over them.’[18] In addition, the Biological Weapons Convention Act of 1983 bans the development and production of biological and toxin agents for hostile purposes. All of these laws and ordinances impose penal sanctions for violations, including fines and imprisonment.

Because German biosafety legislation has been built up incrementally over more than a century, the responsibilities for implementation and oversight are scattered over multiple agencies. The federal ministries of health, interior, agriculture, labour, and consumer protection are involved in enforcing various aspects of the biosafety laws, and a non-governmental professional association for the chemical industry (Berufsgenossenschaft der chemischen Industrie) develops and oversees certain occupational health and safety rules. In addition, under Germany’s federal system, many biosafety regulations are enforced by the state (Bundesland) authorities, who have primary responsibility for health care, law enforcement, and civilian emergency management (Katastrophenschutz). Finally, as a member of the European Union (EU), Germany must harmonise its domestic biosafety legislation with common European norms. According to the principle of ‘subsidiarity’, the implementation of EU directives is delegated to the lowest appropriate level of each member government.

Germany has extensive tracking and manifest systems for dangerous pathogens and a well-developed infrastructure for inspection and training.[19] The biosafety regulations are based on the inherent capacity of micro-organisms to cause illness and death in humans, animals, or plants. All known microbial and toxin agents are classified into four Risk Groups based on characteristics such as infectiousness, contagiousness, virulence (ability to cause illness and death), and the availability of protective vaccines and therapeutic drugs. Risk Group 4 includes pathogens such as the Ebola virus that are lethal and incurable, and hence demand highly stringent (Biosafety Level 4) containment measures, whereas Risk Group 1 agents require only minimal safety precautions.[20] Because of this comprehensive approach, the total number of microbial and toxin agents covered by the German biosafety regulations is considerably larger than the US Select Agent List.

The Infection Protection Act of 2000 spells out the biosafety precautions that must be taken when working with dangerous or genetically engineered pathogens. This law requires the licensing of facilities that store and handle dangerous pathogens and the individual scientists who work with them. Permission for individual scientists to handle pathogens in Risk Groups 3 and 4 (along with several named pathogens in Risk Group 2) must be granted by local or regional public health authorities in the form of a personal use authorisation (Umgangsgenehmigung). Scientists who repeatedly violate the biosafety rules may have their license to work with dangerous pathogens revoked.[21]

Prerequisites for obtaining an authorisation for work with dangerous pathogens include possessing the proper academic credentials and, in some cases, undergoing a personal reliability check. This vetting procedure was introduced after the events of autumn 2001 but was based on an existing law, the Security Clearance Act (Sicherheitsüberprüfungsgesetz) of 1994, which provides that ‘a person who is to be entrusted with a security-sensitive position must first undergo a security investigation’.[22] Because the language of the Act is quite broad, it can be interpreted to cover individuals who work at sensitive facilities such as nuclear power plants or who handle hazardous materials such as radioactive isotopes and dangerous pathogens. German federal agencies generally require a personal reliability check, and universities may request one on a voluntary basis if warranted by the nature of the proposed research. The vetting process for access to dangerous pathogens is essentially the same as that for classified information. Factors that may rule out eligibility include prior conviction of a felony, evidence of antisocial behaviour or mental instability, extreme political views, or membership in a subversive organisation that is monitored by the German Federal Office for Protection of the Constitution (Bundesamt für Verfassungsschutz).

Unlike the United States, Germany does not deny access to hazardous biological materials on the basis of nationality, although some sensitive research facilities are off-limits to non-citizens. Moreover, when foreigners apply to enter Germany for scientific training or to conduct research, the administrative rules for granting visas take into account the country of origin and the institution where the applicant is employed. Because the visa rules are not based on legislation, they can be readily modified in response to intelligence information. The purpose of the visa system is not to impede normal exchanges or collaborations between German and foreign scientists but rather to protect sensitive information.[23] Nevertheless, the visa-issuing authorities may lack the expertise needed to assess the intentions of scientists coming from countries of proliferation concern.

Individual experiments involving genetic engineering require a permit (Zulassung) and are assessed solely from the standpoint of biosafety. Unlike the NIH Guidelines, which are not binding on private industry, Germany’s Genetic Engineering Act of 1993 covers recombinant DNA research in academia, government and industry, without exception, and includes penal sanctions for violations. Depending on the nature of the proposed experiment, it must be reviewed either by the state authorities or by a federal body called the Central Commission for Biological Safety (Zentrale Kommission für die Biologische Sicherheit, ZKBS). The biocontainment level for genetic engineering experiments is determined by assessing whether the modified microbe is likely to be more dangerous than the wild type. If so, the experiment must be conducted at a higher level of containment.[24] The primary goal of the German genetic-engineering regulations is to ensure the safety of experiments rather than to prevent misuse. Although all scientists who receive a license to conduct recombinant DNA research must attend a three-day course on biosafety, the issues of laboratory security and biological weapons are not addressed in detail.

Despite their differences, the US and German approaches to pathogen security overlap extensively in practice. According to Dr. Volker Beck, a technical adviser to the German Foreign Ministry, ‘The US prefers the term biosecurity, whereas Europe has long focused on biosafety. In fact, 80 percent of what is called biosecurity is actually biosafety’.[25] Nevertheless, the US and German paradigms differ in three important respects. First, the German biosafety regulations cover a much larger set of microbial and toxin agents, only some of which could be used as bioterrorist weapons. Second, whereas the German regulations on genetic engineering research are more stringent than those of the United States, the rules for working with natural pathogens are less so. Third, because the German biosafety regulations aim primarily to prevent accidental infections and releases, they focus on physical containment and good laboratory practice. US biosecurity measures, in contrast, seek to prevent the deliberate theft, diversion, and misuse of pathogens and therefore focus on physical security measures, access controls, and personnel reliability.

Restrictions on Publication of Dual-Use Research

Part of the debate over dual-use research in the life sciences has focused on whether or not security-sensitive findings should be published in the open scientific literature. The main concern is not that the scientists doing the work will engage in illicit activity but rather the possibility that someone else could misuse the research findings for malicious purposes. Scientists have traditionally viewed the acquisition of knowledge as an unalloyed good that contributes to an understanding of the natural world and leads to beneficial applications, yet some types of scientific information may be dangerous in the wrong hands. As bioethicist Arthur Caplan of the University of Pennsylvania has argued, ‘We have to get away from the ethos that knowledge is good, knowledge should be publicly available, that information will liberate us. . . . Information will kill us in the techno-terrorist age’.[26] Nevertheless, US and German policy-makers differ strongly on the issue of restricting the publication of basic research. Whereas US officials may be willing in some cases to prevent the release of sensitive findings in the name of national security, their German counterparts are uncompromising in their defence of scientific freedom.

US Approach to Restrictions on Scientific Publication

In 2002, a research paper in the journal Science describing the laboratory synthesis of poliovirus outraged some members of Congress, who proposed legislation restricting the publication of dual-use research. Seeking to preempt congressional action, a group of editors of leading scientific journals issued a ‘Statement on Scientific Publication and National Security’ in which they declared that they were ‘committed to dealing responsibly and effectively with safety and security issues that may be raised by papers submitted for publication, and to increasing our capacity to identify such issues as they arise’. In rare cases, they noted, ‘an editor may conclude that the potential harm of publication outweighs the potential societal benefits. Under such circumstances, the paper should be modified, or not published’.[27] This statement made clear, however, that the responsibility for any decisions to restrict scientific publication should remain in the hands of journal editors, publishers, and the affected researchers themselves and not be delegated to government officials.

The US journal editors’ statement did not include guidelines for how articles that pose security concerns should be reviewed before publication, and to date, no scientific papers have been rejected on security grounds.[28] In 2005, the public release of a paper containing a mathematical model of how terrorists might use botulinum toxin to contaminate the nation’s milk supply was delayed after the US government requested that it not be published. Because the research had not been funded with public money, however, the federal government had no legal jurisdiction over the information and the journal went ahead with publication.[29]

In 2002, in response to the controversy over the Australian mousepox experiment, the National Research Council (the policy analysis unit of the US National Academies) assembled an expert committee chaired by biology professor Gerald Fink of the Massachusetts Institute of Technology to consider ways of preventing the misuse of biotechnology for hostile purposes without hindering progress in the life sciences. The Fink Committee’s final report, released in late 2003, concluded that dual-use research posed a real threat and identified seven categories of potentially dangerous experiments that might warrant additional discussion or review.[30]

One of the recommendations of the Fink Report was to establish a national panel of experts from the scientific and defence communities to advise the US government on how best to address the security risks associated with federally funded research in the life sciences. In response to this recommendation, the Bush Administration established the National Science Advisory Board for Biosecurity (NSABB), which met for the first time in mid-2005. Administered by the National Institutes of Health, the NSABB includes up to 25 voting members, plus ex officio representatives from 15 federal agencies that conduct or support research in the life sciences. The mandate of the NSABB is to develop criteria for identifying dual-use research, draft guidelines for the review and oversight of such experiments, and suggest restrictions on scientific publication.[31]

In July 2006, the scientific communications subcommittee of the NSABB proposed that universities conduct a risk-benefit analysis before approving the publication of research results that could be ‘directly misapplied by others to pose a threat to human health, agriculture, plants, animals, the environment, or material’. The reviewers would decide whether the results should be published immediately, after a delay, with modifications or added contextual information, or not at all.[32] Nevertheless, determining which basic research findings are directly relevant to bioterrorism is far from straightforward.[33]

US critics have also argued against restricting scientific publication on several grounds. First, the restrictions would slow progress and deter research in the areas potentially subject to censorship. Second, classifying sensitive scientific information might delay its spread only temporarily: whereas man-made designs can be kept secret indefinitely, facts of nature can be rediscovered by investigators working outside the control regime. Third, restricting the publication of scientific information would make it harder to monitor dual-use research and might hamper the development of medical defences against biological threats. For example, the discovery that poxviruses can be genetically modified to make them vaccine-resistant alerted the scientific community to the need for new antiviral drugs to treat possible engineered strains.[34]

German Approach to Restrictions on Scientific Publication

The findings and recommendations of the Fink Report have yet to attract much attention in Germany, where there is limited awareness of the problem of dual-use research. Those few German scientists and officials who are familiar with the issue oppose proposals to restrict the publication of basic research directly relevant to bioterrorism. They argue that scientific freedom is guaranteed by Article 5 of the German Constitution, or Basic Law (Grundgesetz), and that any exceptions to this rule must be well-founded. The issue of restricting scientific publication is further complicated by the fact that many life-sciences journals are international, raising concerns that other countries could restrict the publication of German research.

In general, German scientists and officials believe strongly that scientific knowledge and ideas must remain freely available, regardless of their theoretical potential for misuse. Dr. Walter Biederbick, Director of the Federal Information Center for Biological Security at the Robert Koch Institute (the German equivalent of the CDC), questions whether a bioterrorist would go to the trouble of developing a genetically engineered pathogen when standard microbial agents obtained from nature, such as the anthrax bacterium, would be quite effective at killing or causing disruption. Biederbick also doubts the feasibility of controlling the advance of biotechnology. ‘Stopping the spread of knowledge is very difficult’, he says. ‘It can be delayed for a time but not prevented entirely’.[35]

A German government working paper submitted to a 2005 BWC Meeting of Experts called proposals to restrict the publication of dual-use research ‘unacceptable’ because they ‘violate central rules of scientific research’.[36] Another working paper prepared for the meeting stated, ‘To aim at the exclusion of every possibility of misuse of data with respect to “dual use” would lead to an unacceptable situation: a major part of research in the fields of microbiology and infectious diseases, especially in molecular and cellular basic research, cannot be published anymore or just with major restrictions. The probable consequence would be to stop the accumulation and exchange of knowledge to fight [the] global emergence of old and new pathogens and infectious diseases produced by nature’.[37]

According to Dr. Alexander Kekulé, a microbiologist who heads the Biosecurity Working Group of the German Commission on Homeland Security, ‘Terrorists are unlikely to be scientifically innovative in their own right. They may copy a scientific discovery or method that already exists, but to do so they would need step-by-step instructions that are simple enough to follow without a great deal of training or expertise’.[38] Kekulé worries that restricting scientific communication would be counterproductive because it would create a black or grey market in the forbidden information. If scientific publication remains unconstrained, he argues, the broad community of scientists with good intentions should be able to maintain a technical lead over the small minority with malicious intent. But if information is censored, highly motivated terrorists might find a way to access the restricted data while ordinary scientists would not, slowing the development of medical countermeasures and giving the terrorists a relative advantage.[39]

German experts acknowledge that in rare cases, the publication of dual-use information might have to be restricted if its public release could lead directly to the development of novel agents for hostile purposes. A German government working paper states that scientific editors and publishers ‘should develop specific rules for this type of information’ but that in general ‘the exchange of ideas including publications should continue to be open on the national level as well as on the international level, taking the aspects of misuse into account’.[40]

Security Reviews of Dual-Use Research

Whereas US officials emphasise the need for a mechanism to identify and oversee dual-use experiments that could be misused for hostile purposes, their German counterparts oppose such measures on the grounds that they would be ineffective and could have a chilling effect on important areas of scientific investigation.

US Approach to Security Reviews of Dual-Use Research

US government policy with respect to the review and oversight of dual-use research in the life sciences is currently in flux because the NSABB is still in the process of developing guidelines. The key tasks facing the biosecurity board are to devise criteria for identifying ‘dual-use research of concern’ and to prepare guidelines for the review and oversight of such projects to minimise the risk that the resulting knowledge could be misused for hostile purposes. In seeking to define dual-use research, the NSABB has set the threshold fairly high. First, the research must have the potential to be misapplied directly for hostile purposes, creating an immediate risk that warrants concern. Second, the potential misuse must have significant implications for public health. For example, creating a highly virulent organism that cannot be readily transmitted would not constitute a major threat.[41]

The NSABB has also determined that whenever possible, security reviews should take place at an early stage, before a research proposal has been approved and funded by a government agency. One idea is to assign the task of reviewing research proposals to the more than 400 registered Institutional Biosafety Committees (IBCs) in the United States that currently perform risk assessments of research involving recombinant DNA technology. The overall effectiveness of the IBC system, however, has been called into question.[42] Moreover, as currently organised, the IBCs focus narrowly on biosafety and do not appear capable of fulfilling the additional biosecurity functions envisioned by the NSABB. Not only are the existing committees overworked and staffed largely by volunteers, but they lack the expertise to assess the security implications of proposed research. Thus, a new set of local oversight committees may need to be created for this purpose.

Given the globalization of biotechnology research, it is clear that any effective system of security review and oversight of dual-use research will have to be based on internationally harmonised rules and procedures. If, for example, other countries were to adopt guidelines that are considerably less stringent than those of the United States, US researchers and scientific journals would find themselves at a competitive disadvantage, and the expected biosecurity benefits of the tighter US regulations would not materialise.[43]

In addition to national mechanisms for security review and oversight, the US government favours the development of professional codes of conduct to sensitise scientists to their obligations under the BWC and encourage them to report violations by others.[44] Because biomedical research is so diverse, however, the United States contends that a ‘one-size-fits-all’ approach would be ineffective and that each organisation should develop its own code of conduct, tailored to its particular focus and the activities of its members.[45]

German Approach to Security Reviews of Dual-Use Research

German government officials and scientists have yet to grapple systematically with the problem of dual-use research. Although some preliminary discussions of the issue have taken place, there are currently no plans to establish an NSABB-like commission to advise the federal government on biosecurity. The reasons for this reluctance are twofold: (1) a strong commitment to academic freedom, and (2) the perception that German science is already overregulated and that further government intervention would curtail scientific progress and national competitiveness in the fields of biomedicine, biology and biotechnology.[46]

The German biomedical community stresses the importance of basic research on the mechanisms of pathogenesis and antibiotic resistance for combating infectious diseases and objects to the Fink Report’s proposed restrictions on experiments that enhance the pathogenicity, transmissibility or host range of bacteria and viruses. German scientists contend that the world will be more secure if such research remains fully transparent and in the public domain rather than hidden behind walls of secrecy and classification. According to a German working paper prepared for a BWC experts’ meeting, ‘An open information exchange between scientists will allow a better understanding of risks arising from the handling of infectious or toxic material or genetic modifications of organisms. This will lead to generally accepted recommendations for risk management of dangerous pathogens and toxins’.[47]

In lieu of restrictions on dual-use research, the German government favours mandating laboratory best practices and educating graduate students and postdoctoral fellows about bioethics and biological arms control. A professional code of conduct for the life sciences would ideally ‘promote awareness of the complex dual-use dilemma and at the same time obligate the research scientist to reflect on risk assessments and consider alternative approaches during the research process’. Germany objects, however, to codes banning ‘research of any kind carried out with peaceful intent’.[48]

Some German scientists admit that certain hypothetical experiments should not be carried out because they could result in engineered pathogens or dual-use information that might endanger public health and national security. In such cases, however, scientists and not bureaucrats should be the ones deciding whether or not to proceed. Dr. Biederbick of the Robert Koch Institute observes that ‘as soon as the state begins controlling information, it inadvertently creates incentives to circumvent those controls’.[49] Accordingly, German officials favour a self-governance mechanism that emerges from within the scientific community, rather than being imposed from above. Dr. Stefan Kaufmann, the director of the Max Planck Institute for Infection Biology in Berlin, supports efforts to raise the awareness of scientists about the potential for misuse so that they will agree to participate voluntarily in the oversight process. He acknowledges, however, that if scientists do not accept responsibility for reviewing dual-use research, the task may be taken out of their hands.[50]

Conduct of Biodefence Research

Although the BWC permits defensive research to protect soldiers and civilians from biological weapons, the line between defensive and offensive activities turns largely on intent and can be difficult to draw clearly. For this reason, biodefence research projects should be sufficiently transparent to avoid provoking suspicions that they are a cover for offensive research and development. Germany appears to be more sensitive to this particular dual-use dilemma than the United States.

US Approach to Biodefence Research

The US presidential directive ‘Biodefense for the 21st Century’, issued by the Bush Administration in April 2004, describes the basic elements of the US biodefence programme and defines the roles and responsibilities of various federal departments and agencies in implementing this strategy. A key pillar of the US biodefence programme is research on ‘threat awareness, including BW-related intelligence, risk and net assessments, and anticipation of future threats’.[51] To help define the nation’s biodefence priorities, the Science and Technology Directorate of the Department of Homeland Security (DHS) conducts periodic threat and risk assessments of a broad set of biological agents. According to John Vitko, Director of the Biological Countermeasures Portfolio at DHS, these assessments ‘are performed with the best available information. However, there are large uncertainties, sometimes factors of ten to a hundred, in some of the key parameters and hence in the associated risks’.[52]

To address these ‘critical knowledge gaps’, DHS conducts a programme of ‘laboratory threat characterization research’ that reportedly involves realistic tests with small amounts of weaponised pathogens and toxins, as well as genetically engineered microbes that might be used in a bioterrorist attack.[53] A maximum-containment laboratory for such research, the National Biodefense Analysis and Countermeasures Center (NBACC), is currently under construction at Fort Detrick, Maryland. Some arms control experts have criticised this research programme because it appears to skirt or even cross the line of what is permitted by the phrase ‘prophylactic, protective and other peaceful purposes’ in Article I of the BWC and because some of the projects are classified.[54] Although DHS reviews the threat-characterization experiments internally to ensure compliance with the treaty, the department does not intend to subject the research to a more objective interagency oversight process. According to an article in the Washington Post, ‘The administration . . . [insists] that the work of NBACC is purely defensive and thus fully legal. It has rejected calls for oversight by independent observers outside the department’s network of government scientists and contractors. And it defends the secrecy as necessary to protect Americans’.[55]

German Approach to Biodefence Research

The German term for biodefence is B-Schutz, meaning ‘biological protection’, avoiding the word for ‘defence’ (Verteidigung) because of its military connotations. During the 1950s, when West Germany was allowed to rearm as a member of NATO, the former Wehrmacht Chemical Troops (Nebeltruppe) were renamed the Nuclear, Biological and Chemical (NBC) Protective Troops (ABC-Abwehrtruppe). Military biodefence work is conducted by two research centres of the German Federal Armed Forces (Bundeswehr), the Institute for Microbiology in Munich and the Defence Research Institute for Protective Technologies in Munster, as well as by a number of civilian contractors.

The German biodefence programme is characterised by a strict policy of focussing on protective measures, reducing the likelihood that the research will result in dual-use findings. According to a German government statement, ‘Activities with potential for offensive use, such as investigation of the [antibiotic] resistance of microorganisms, genetic manipulation of organisms and aerosol experiments, are avoided in principle’.[56] The Bundeswehr does not conduct laboratory threat-assessment studies that might be problematic from an arms control perspective.[57] Further, the Bundeswehr eschews classified biodefence research and maintains the transparency of its activities by regularly publishing research findings and presenting them at national and international conferences.

German biodefence activities are also subjected to the following review and oversight mechanisms:

  • Internal peer review of all research projects by the relevant Bundeswehr agencies;
  • Oversight by the responsible federal and state authorities of all projects involving the use of genetically engineered organisms;
  • Submission of an annual BWC confidence-building measure declaration on the German biodefence programme to the UN Department of Disarmament Affairs;
  • Publication of the topics and goals of all medical biodefence research projects on the website of the Medical Service (Sanitätsdienst) of the Bundeswehr;[58] and
  • Parliamentary oversight in the form of an annual declaration by the Bundeswehr to the Defense Committee of the German Parliament of all biodefence research projects financed by the German Ministry of Defence that utilise genetic engineering techniques.[59]

Conclusions

Although the United States and Germany are close allies and have many values in common, their biosecurity policies differ in important ways for reasons of history, geopolitics and political culture. In the area of pathogen security, Germany relies on broad biosafety regulations rather than narrowly targeted biosecurity measures. The German biosafety regulations predated the US anthrax letter attacks of autumn 2001 and have changed little since then. The only area that has been expanded since 9/11 involves personal reliability checks of scientists who work with dangerous pathogens, and this vetting process draws on existing legislative authority. Unlike the United States, Germany does not deny access to dangerous pathogens strictly on the basis of nationality. According to political scientist Alexander Kelle, the German emphasis on biosafety rather than biosecurity ‘reflects the limited extent to which public health has been securitised in the German political and expert discourse’.[60]

With respect to dual-use research, German officials have so far rejected restrictions on scientific publication and proposals for top-down government oversight. In the area of biodefence, the two countries have also taken quite different approaches. The United States has largely ignored how other countries view its laboratory threat-characterisation programme, which includes experiments that appear to skirt if not cross the red lines laid down by the BWC. Germany, in contrast, has sought to reassure other countries about the strictly protective nature of its biodefence programme by avoiding provocative experiments and striving for maximum transparency.

German experts are troubled by the rapid expansion of the US biodefence programme since 2001 and the claim by DHS officials that it may be necessary to create small quantities of weaponised biological agents in order to guide the development of countermeasures. This logic, German officials fear, could undermine the normative restraints embodied in the BWC and inadvertently lead to a new biological arms race. According to Dr. Volker Beck, a technical advisor to the German Federal Foreign Office, ‘The Americans have a different concept of what can and should be done in biodefence research than the Europeans. So I think it will be difficult to agree on a common standard for what should be permitted’.[61]

The divergent biosecurity paradigms of Germany and the United States can be attributed to a variety of factors. First, the two countries differ in their assessments of the magnitude and urgency of the threats of biological warfare and bioterrorism. As a global power with military forces deployed around the world, the United States is more vulnerable to an asymmetric biological attack, whereas German officials assess the risk of bioterrorism in Europe as fairly low.[62] In addition, whereas US analysts tend to engage in worst-case assessments of the bioterrorism threat, their German counterparts are sceptical that terrorist groups could exploit discoveries at the cutting edge of biology to create novel biological weapons.

Second, the German biosafety regulations are based on a strong cultural tradition of placing trust in the professional integrity of scientists and the self-governance of the scientific community. Germans take it as an article of faith that individuals with the right training and credentials will comply with the rules for good laboratory practice and effective biocontainment. Nevertheless, this assumption neglects the fact that scientists may be motivated by curiosity or ambition to cut corners and perform experiments that pose risks to society at large. The German biosafety regulations also fail to address scenarios in which trusted insiders deliberately acquire and release pathogens for malicious purposes.

Third, in reaction to the heavy-handed censorship and ideological distortion of scientific research (particularly in the field of human genetics) that took place during the Third Reich, German scientists and officials perceive scientific freedom as an inalienable right and strongly resist government intervention in this area. At the same time, the memory of the unethical experiments performed by Josef Mengele and other Nazi doctors has led Germany to be more stringent in regulating applied biotechnologies such as genetic engineering.

Unless the gap between the US and German approaches to biosecurity can be bridged, it will create impediments to scientific cooperation and joint efforts to combat bioterrorism. German scientists complain that since 2001, they have had difficulty exchanging select-agent strains with US scientists or ordering microbial cultures from the American Type Culture Collection, the leading US supplier. In addition, the US Select Agent Rule has made it more difficult for American researchers to share data with colleagues from other nations. According to Dr. Bernd Appel, director of the biosafety division at the German Federal Office for Risk Assessment, exchanges of information on select agents between Germany and the United States have become a ‘one-way street’.[63] For example, during the preparation of a European manual of laboratory methods for anthrax research, researchers at the Max Planck Institute for Infection Biology sought advice from US anthrax specialists, who refused to share information. The Institute’s director, Dr. Kaufmann, observes, ‘Having always assumed that scientists from different countries could talk freely, I found the reticence of my American colleagues both troubling and sad’.[64] Such restrictions on the sharing of sensitive information risk isolating US researchers from the international scientific community.

In Search of Common Ground

What are some possible areas of common ground between the US and German approaches to biosecurity? With respect to pathogen security, the German biosafety regulations and the US Select Agent Rule differ conceptually but are substantially equivalent in terms of effectiveness.[65] In other areas of biosecurity, however, the US and German approaches remain far apart. Whereas the NSABB process in the United States is moving toward imposing some type of review and oversight process on dual-use research and publications, German officials strongly oppose this approach.

In general, German officials are less concerned than their American counterparts about the dual-use aspects of basic research in the life sciences and the professional responsibility of scientists to prevent misuse. When it comes to biodefence research, however, Germany appears to be more aware than the United States of the dual-use dilemmas in this area and has taken deliberate steps to reduce the risk of misperception by other countries. According to Dr. Kaufmann, both the United States and Germany should seek out a sensible compromise approach to biosecurity. ‘In America there is a strong awareness of dual-use issues but often to the point of hysteria, resulting in a huge biodefence complex that creates its own dangers’, he observes. ‘Germany, in contrast, has little awareness of dual-use issues, resulting in a sense of complacency that produces a different set of risks. We must find a middle way that takes the threat seriously without devoting billions of dollars to biodefence research’.[66]

Both countries agree in principle that some types of experiments should not be performed, either because they would create a highly dangerous pathogen or because the resulting knowledge would be prone to misuse. But the German side strongly opposes restrictions on scientific publication and a top-down mechanism for review and oversight of security-sensitive research. Instead, German officials favour the idea of identifying dual-use concerns early in the proposal review process and subjecting the small fraction of experiments that pose potential safety or security risks to a process of peer review (Begutachtung) before a final funding decision is made.

Preliminary approaches for conducting such reviews have already been developed. In addition to the work of the NSABB, Jan van Aken of the University of Hamburg and John Steinbruner and his colleagues at the University of Maryland have suggested useful approaches to weighing the risks and benefits of dual-use research.[67] Nevertheless, it remains to be seen if the United States and Germany can work out compatible responses to the challenge of biosecurity in an age of international terrorism.

Notes

[1] This finding is actually not that surprising. Professor Sheila Jasanoff of Harvard University has described how the United States and Germany took divergent approaches to the regulation of genetic engineering based on their different political cultures. Whereas US policy-makers framed genetic engineering as a tool for making products whose risks could be assessed according to existing regulatory principles, German officials framed genetic engineering as a novel technological process for intervening in nature that entailed certain inherent risks and uncertainties, and thus required special precautions. See Sheila Jasanoff, Designs on Nature: Science and Democracy in Europe and the United States (Princeton, NJ: Princeton University Press, 2005).

[2] This point was made by Gerald Epstein, in remarks during a panel discussion on ‘Preventing the Misuse of Biotechnology’, Carnegie International Nonproliferation Conference, Washington, D.C., December 15, 2002.

[3] See Jonathan B. Tucker, ‘Preventing Terrorist Access to Dangerous Pathogens: The Need for International Biosecurity Standards’, Disarmament Diplomacy 66 (September 2002).

[4] United Nations Security Council, ‘Letter dated 25 April 2006 from the Chairman of the Security Council Committee established pursuant to Resolution 1540 (2004) addressed to the President of the Security Council’, S/2006/257 (April 25, 2006).

[5] Ronald M. Atlas and Malcolm Dando, ‘The Dual-Use Dilemma for the Life Sciences: Perspectives, Conundrums, and Global Solutions’, Biosecurity and Bioterrorism, vol. 4 no. 3 (September 2006), pp. 276-286.

[6] Terence M. Tumpey et al., ‘Characterization of the Reconstructed 1918 Spanish Influenza Pandemic Virus’, Science (October 7, 2005), pp. 77-80.

[7] Jan van Aken, ‘Is It Wise to Resurrect a Deadly Virus?’, Heredity, online publication (October 11, 2006), doi:10.1038/sj.hdy.6800911; Jonathan B. Tucker and Raymond A. Zilinskas, ‘The Promise and Perils of Synthetic Biology’, The New Atlantis 12 (Spring 2006), pp. 25-45.

[8] Jonathan B. Tucker, Biosecurity: Limiting Terrorist Access to Dangerous Pathogens, Peaceworks Report No. 52 (Washington, DC: US Institute of Peace, November 2003).

[9] Jessica Eve Stern, ‘Larry Wayne Harris’, in Jonathan B. Tucker, ed., Toxic Terror: Assessing Terrorist Use of Chemical and Biological Weapons (Cambridge, MA: MIT Press, 2000), pp. 227-246.

[10] The current US State Department list of state sponsors of terrorism includes Cuba, Iran, North Korea, Sudan, and Syria.

[11] United States of America, ‘Specific Measures Taken by the United States Relevant to Security of Dangerous Pathogens and Toxins’, First BWC Meeting of Experts, BWC/MSP.2003/MX/WP.6 (July 4, 2003).

[12] An updated version of the US Select Agent List is available online at http://www.cdc.gov/od/sap/docs/salist.pdf

[13] David Malakoff, ‘One Year After: Tighter Security Reshapes Research’, Science, vol. 297 (September 6, 2002), pp. 1630-1633.

[14] David Malakoff and Martin Enserink, ‘Scientist on Trial’, Science Now (December 1, 2003).

[15] Dave Altimari, ‘Security Fears at Anthrax Labs’, Hartford Courant (October 8, 2006).

[16] Jennifer Gaudioso and Reynolds M. Salerno, ‘Biosecurity and Research: Minimizing Adverse Impacts’, Science, vol. 304 (April 30, 2004), p. 687.

[17] Synthetic Biology Working Group presentation, ‘Meeting of the National Science Advisory Board for Biosecurity (NSABB)’, Bethesda, MD (October 25, 2006), transcribed by Edward Hammond.

[18] Federal Republic of Germany, ‘Legislation in the Federal Republic of Germany on the Prohibition of Biological Weapons’, Fifth Review Conference of the BWC, BWC/CONF.V/5 (October 2, 2001). See also, Christine Rhode and David Smith, ‘Die Gesetze im Alltag der Mikrobiologen’ [‘The Laws in the Everyday Work of Microbiologists’], BIOforum (November 2005), pp. 2-3.

[19] Federal Republic of Germany, ‘Legislation in the Federal Republic of Germany Related to Security and Oversight of Pathogenic Microorganisms and Toxins’, Third Meeting of Experts, Geneva, Switzerland, BWC/MSP.2003/MX/WP.13 (July 28, 2003).

[20] Germany currently has two BSL-4 laboratories for work with human pathogens, at the Bernard Nocht Institute for Tropical Medicine in Hamburg and the Philipps University in Marburg, and a third is planned at the Robert Koch Institute in Berlin. A fourth high-containment facility, at the Friedrich Loeffler Institute on Riems Island in the Baltic Sea, studies highly contagious livestock pathogens such as foot-and-mouth disease.

[21] The German biosafety regulations have a loophole with respect to clinical laboratories. The registration requirement applies only to facilities that perform ‘targeted’ (zielgerichtete) experiments with dangerous pathogens but not clinical labs that culture human specimens for purposes of diagnosis. Moreover, there is no requirement for clinical laboratories to subject their staff members to personal security checks or to ensure the physical security of patient specimens that may contain dangerous pathogens.

[22] The original German text states: ‚Eine Person, die mit einer sicherheitsempfindlichen Tätigkeit betraut werden soll (Betroffener), ist vorher einer Sicherheitsüberprüfung zu unterziehen’ (‘A person who is to be entrusted with a security-sensitive activity [the person concerned] must first undergo a security check’).

[23] Interview with Professor Reinhard Burger, Vice President, Robert Koch Institute (September 28, 2006).

[24] When a foreign gene is inserted into a host organism, the experiment must be conducted at a level of biocontainment that corresponds to the Risk Group of the transferred gene or the host, whichever is higher.

[25] Interview with Dr. Volker Beck, technical advisor to the German Federal Foreign Office, Berlin (October 4, 2006).

[26] Arthur Caplan, quoted in Ronald M. Atlas, ‘Bioterrorism: The ASM Response’, ASM News [American Society for Microbiology], vol. 68 (2002), pp. 117-121.

[27] Journal Editors and Authors Group, ‘Statement on Scientific Publication and National Security’, Science, vol. 299 (February 21, 2003), p. 1149.

[28] National Science Advisory Board for Biosecurity, Minutes of meeting on March 30, 2006, Bethesda, Maryland, p. 21.

[29] Lawrence M. Wein and Yifan Liu, ‘Analyzing a Bioterror Attack on the Food Supply: The Case of Botulinum Toxin in Milk’, Proceedings of the National Academy of Sciences (July 12, 2005), pp. 9984-9989.

[30] US National Research Council, Committee on Research Standards and Practices to Prevent the Destructive Application of Biotechnology, Biotechnology Research in an Age of Terrorism (Washington, DC: National Academies Press, 2003).

[31] Dana A. Shea, Congressional Research Service, ‘Oversight of Dual-Use Biological Research: The National Science Advisory Board for Biosecurity’, CRS Report for Congress (March 29, 2006), p. 7.

[32] National Science Advisory Board for Biosecurity, ‘Tools for the Responsible Communication of Research with Dual Use Potential’, NSABB Draft Guidance Documents (July 2006), Section 2, pp. 7-15. See also, Kelly Field, ‘Biosecurity Panel Suggests Pre-Publication Review for Research that Could be Threatening’, Chronicle of Higher Education (July 24, 2006).

[33] Peter Aldhous, ‘Bioterror Special: Friend or Foe?’, New Scientist (October 14, 2006).

[34] Robert Carlson, ‘Commentary: Open development of biological technology is crucial to US domestic security and to the health of our economy’ (2005), available online at http://www.futurebrief.com.

[35] Interview with Dr. Walter Biederbick, Robert Koch Institute, Berlin (September 28, 2006).

[36] Federal Republic of Germany, ‘Regulations for the Prevention of Bioterrorism-Pros and Contras from a Scientist’s Point of View’, Third BWC Meeting of Experts, Geneva, Switzerland, BWC/MSP/2005/MX/WP.11 (June 13, 2005).

[37] Federal Republic of Germany, ‘The University Science Perspective’, Third BWC Meeting of Experts, Geneva, Switzerland, BWC/MSP/2005/MX/WP.13 (June 13, 2005).

[38] Interview with Dr. Alexander Kekulé, Director, Institute for Medical Microbiology, Martin Luther University, Halle-Wittenberg, Germany (October 24, 2006). See also, German Commission on Homeland Security web site at http://www.schutzkommission.de

[39] Interview with Dr. Alexander Kekulé.

[40] Federal Republic of Germany, ‘Infectious Diseases, Biosafety and Biosecurity’, Third BWC Meeting of Experts, Geneva, Switzerland, BWC/MSP/2005/MX/WP.14 (June 13, 2005).

[41] Dennis Kasper, ‘Research Restrictions’, Bulletin of the Atomic Scientists (March/April 2007), p. 19; World Health Organization, Life Sciences Research: Opportunities and Risks for Public Health - Mapping the Issues (Geneva, Switzerland: WHO, 2005), pp. 16-17.

[42] Sunshine Project, Mandate for Failure: The State of Institutional Biosafety Committees in an Age of Biological Weapons Research (Austin, TX: Sunshine Project, October 2004).

[43] Shea, ‘Oversight of Dual-Use Biological Research’, p. 9.

[44] Brian Rappert, ‘Towards a Life Sciences Code: Countering the Threats from Biological Weapons’, Bradford Briefing Paper No. 13 (September 2004), online at http://www.brad.ac.uk/acad/sbtwc

[45] US Representative Donald A. Mahley, ‘United States Statement to the Annual Meeting of the Biological Weapons Convention States Parties, December 5-9, 2006’.

[46] Adelheid Müller-Lissner, ‘Politische Pflanzen: Der Streit um die Zukunft der grünen Gentechnik’ [‘Political Plants: The Dispute over the Future of Green Genetic Engineering’], Der Tagesspiegel (Berlin), (September 27, 2006), p. 27; interview with Dr. Gabriele Kraatz-Wadsack, United Nations Department for Disarmament Affairs, New York City (July 6, 2006).

[47] Federal Republic of Germany, ‘Legislation and Freedom of Research’, Third BWC Meeting of Experts, Geneva, Switzerland, BWC/MSP/2005/MX/WP.15 (June 13, 2005).

[48] Federal Republic of Germany, ‘Codes of Conduct and Their Application in the Life Sciences at Universities’, Third BWC Meeting of Experts, Geneva, Switzerland, BWC/MSP/2005/MX/WP.12 (June 13, 2005).

[49] Interview with Dr. Biederbick.

[50] Interview with Dr. Stefan Kaufmann, Max Planck Institute for Infection Biology, Berlin (September 28, 2006).

[51] The White House, Biodefense for the 21st Century (Washington, D.C., April 2004).

[52] John Vitko, Jr., Director, Biological Countermeasures Portfolio, Science & Technology Directorate, Department of Homeland Security, Statement for the Record, Hearing on ‘Implementing the National Biodefense Strategy’, US House of Representatives, Committee on Homeland Security, Subcommittee on Prevention of Nuclear and Biological Attack (July 28, 2005).

[53] James B. Petro and W. Seth Carus, ‘Biological Threat Characterization Research: A Critical Component of National Biodefense’, Biosecurity and Bioterrorism, vol. 3 no. 4 (2005), pp. 295-308.

[54] Milton Leitenberg, James Leonard, and Richard Spertzel, ‘Biodefense Crossing the Line’, Politics and the Life Sciences, vol. 22 (2004). See also, Jonathan B. Tucker, ‘Biological Threat Assessment: Is the Cure Worse than the Disease?’ Arms Control Today (October 2004), online at http://www.armscontrol.org/act/2004_10/Tucker.asp

[55] Joby Warrick, ‘The Secretive Fight Against Bioterror’, Washington Post (July 30, 2006), p. A1.

[56] Federal Republic of Germany, ‘German Policies for Biodefence Research’, Third BWC Meeting of Experts, Geneva, Switzerland, BWC/MSP/2005/MX/WP.10 (June 13, 2005).

[57] Sunshine Project, A Survey of Biological and Biochemical Weapons Related Research Activities in Germany (November 16, 2004), p. 4.

[58] Website of the Medical Service of the German Bundeswehr, http://www.sanitaetsdienst-bundeswehr.de

[59] Federal Republic of Germany, ‘German Policies for Biodefence Research’.

[60] Alexander Kelle, ‘Discourses on the Securitization of Public Health - A Survey of Four Countries’, University of Bradford Regime Review Paper No. 3 (June 2006), p. 21, available online at: http://www.brad.ac.uk/acad/sbtwc/regrev/regrev.htm

[61] Interview with Dr. Volker Beck.

[62] Schutzkommission beim Bundesministerium des Innern, ‘Bericht über mögliche Gefahren für die Bevölkerung bei Großkatastrophen und im Verteidigungsfall’ [Committee on Homeland Security of the German Ministry of the Interior, ‘Report on Possible Dangers for the Population from Major Catastrophes and in Wartime’], Zivilschutz-Forschung 48 (October 2001), p. 27.

[63] Interview with Dr. Bernd Appel, German Federal Office for Risk Assessment, Berlin (November 23, 2006).

[64] Interview with Dr. Stefan Kaufmann.

[65] In another effort to build a bridge between the biosafety and biosecurity paradigms, the World Health Organization (WHO) recently published a booklet of biosecurity guidelines to complement its existing manual on laboratory biosafety. See World Health Organization, ‘Biorisk Management: Laboratory Biosecurity Guidance’, WHO/CDS/EPR/2006.6 (Geneva, Switzerland: WHO, September 2006).

[66] Interview with Dr. Stefan Kaufmann.

[67] Jan van Aken, ‘When Risk Outweighs Benefits’, EMBO Reports, vol. 7 [special issue] (2006), pp. 11-12; John Steinbruner, Elisa D. Harris, Nancy Gallagher, and Stacy Okutani, ‘Controlling Dangerous Pathogens: A Prototype Protective Oversight System’, Working Paper, Center for International Security Studies, University of Maryland (December 2005).

Dr. Jonathan B. Tucker is a Fulbright Senior Scholar in Berlin, on leave from the Center for Nonproliferation Studies of the Monterey Institute of International Studies. This paper was written during a two-month Bosch Public Policy Fellowship at the American Academy in Berlin.

© 2007 The Acronym Institute.