In their article “Neuroethics and National Security,” Canli and his colleagues (2007) raise some important ethical, legal, and social questions concerning the use of new brain mapping technologies for national security. I applaud the authors for bringing these issues to the attention of the bioethics community, and I especially concur with their call for a partnership among neuroscientists, ethicists, and government decision-makers concerning the dilemmas related to neuroethics and national security. Toward the end of their article, the authors mention some challenges that need to be addressed, including a couple of issues related to secrecy and classified research. I would like to expand the discussion of those issues in this commentary.
One of the perennial issues in the ethics of scientific research is the conflict between openness and secrecy. On the one hand, openness is clearly one of the most important ethical norms in science, as it is essential for collaboration, criticism, peer review, replication of results, and public accountability. The failure to share information in research often bears rotten fruit, including intellectual isolation and stagnation, dogmatism, corruption, and harm to the public. On the other hand, secrecy is sometimes necessary to maintain the confidentiality of research subjects, protect unpublished research from premature disclosure, or safeguard intellectual property rights, trade secrets, or military secrets. Researchers must find ways to balance these competing values when deciding whether to share or publish scientific information (Resnik 2006).
The threat of terrorism—bioterrorism in particular—has prompted greater awareness of national security issues and the ethics of scientific publication (National Research Council 2003; Resnik and Shamoo 2005). Legal systems in the United States and many other countries distinguish between “classified” and “unclassified” scientific research. Government agencies have the authority to classify research if its widespread dissemination would pose a significant threat to national security. Most classified research in the United States is sponsored by agencies that deal with military, intelligence, or security issues. Research may be classified after a study has been completed or before it has even begun. Nuclear weapons research, for example, is “born classified” in that it is treated as classified from its inception. Classified information is not disseminated to the public and is distributed on a need-to-know basis: a person may have access to classified information only if his or her job requires it (Shulsky and Schmitt 2002). Access to such information is granted only to people with the proper security clearance, and to obtain a clearance, a person must undergo an extensive background check and agree to have his or her actions monitored. Government agencies may declassify information when its dissemination no longer poses a threat to national security. For example, in 1994, President Clinton declassified thousands of documents pertaining to secret radiation experiments on human subjects sponsored by the federal government (Advisory Committee on Human Radiation Experiments 1995).
“Unclassified” research includes most academic and industrial science, as well as some types of military research that do not involve national security issues. Unclassified research carries none of the restrictions that apply to classified research: scientists do not need special permission to access or disseminate it, nor do they face the threat of reprisal for publishing or talking about it. Since 1985, United States policy has been to ensure that unclassified research can be shared freely with members of the scientific community, and the government usually does not attempt to censor or control such research. In rare cases, however, the government has sought to bring academic or industrial research that poses a threat to national security under its control by requesting that it be classified. For example, the government has attempted to control encryption technologies developed by scientists in academia and industry (Resnik 1998).
Since the government does not attempt to stop the dissemination of unclassified research, scientists sometimes face difficult questions concerning the publication or public presentation of studies that may pose a threat to national security. For example, the editors of the Proceedings of the National Academy of Sciences (PNAS) faced a difficult decision when Lawrence Wein and Yifan Liu submitted an article in the fall of 2004 describing a mathematical model for using botulinum toxin to contaminate the United States’ milk supply. The article estimated the amount of toxin needed to kill half a million people, but it also described some steps that the United States could take to safeguard the milk supply (Wein and Liu 2005). Officials from the Department of Health and Human Services (DHHS) requested that PNAS not publish the article because of its implications for national security. After meeting with representatives of DHHS, the editors of PNAS nevertheless decided to publish the article because they believed that the benefits of publication outweighed the risks. According to the editors, the article did not provide terrorists with any information that was not already publicly available, but it did contain important information that public health authorities could use to protect the milk supply (Alberts 2005).
The article by Wein and Liu (2005) serves as a salient example of what has come to be known as the dual-use dilemma in biomedical research: very often, knowledge that can be used for good purposes, such as preventing or treating disease or promoting public health, can also be used for harmful purposes, such as causing disease, death, mass destruction, or terror. In making decisions concerning dual-use research, editors (and other gatekeepers, such as conference organizers) must carefully assess and weigh the benefits and risks of public dissemination. They must also consider the alternatives to public dissemination, such as sharing the results with a limited audience of qualified scientists or publishing only part of the research (Resnik and Shamoo 2005). To make well-informed decisions, it is important to solicit advice not only from experts in the relevant area of science but also from experts in many other areas, such as international relations, anthropology or sociology, ethics, public policy, and law. Decision makers may also require information from law enforcement, military, and intelligence authorities. In 2004, DHHS formed an organization, known as the National Science Advisory Board for Biosecurity, to provide biomedical scientists with information, advice, and support for difficult decisions concerning publication of dual-use research (Alberts 2005).
Though most of the brain-imaging technologies described in the target article are still in their infancy, they may one day be mature enough to pose a threat to national security if publicly disseminated. When that happens, neuroscientists will face difficult questions related to the publication of dual-use research in their field. To ensure that they are prepared to deal with these tough choices, neuroscientists should be mindful of the ethical, social, and legal implications of their work and maintain an open dialogue with government officials. The article by Canli and his colleagues (2007) is a helpful step in that direction.
Acknowledgments
This research was sponsored by the intramural program of the National Institute of Environmental Health Sciences, National Institutes of Health. This commentary does not represent the views of the National Institute of Environmental Health Sciences or the National Institutes of Health.
References
- Advisory Committee on Human Radiation Experiments. The human radiation experiments. Washington, DC: Department of Energy; 1995.
- Alberts B. Modeling attacks on the food supply. Proceedings of the National Academy of Sciences USA. 2005;102:9737–9738. doi: 10.1073/pnas.0504944102.
- Canli T, Brandon S, Casebeer W, Crowley PJ, DuRousseau D, Greely HT, Pascual-Leone A. Neuroethics and national security. American Journal of Bioethics (AJOB-Neuroscience). 2007;7(5):3–13. doi: 10.1080/15265160701290249.
- National Research Council. Biotechnology in the age of terrorism: Confronting the dual use dilemma. Washington, DC: National Research Council; 2003.
- Resnik D. The ethics of science. New York, NY: Routledge; 1998.
- Resnik D. Openness vs. secrecy in scientific research. Episteme. 2006;2:135–147. doi: 10.3366/epi.2005.2.3.135.
- Resnik D, Shamoo A. Bioterrorism and the responsible conduct of biomedical research. Drug Development Research. 2005;63:121–133.
- Shulsky A, Schmitt G. Silent warfare: Understanding the world of intelligence. 3rd ed. Dulles, VA: Potomac Books; 2002.
- Wein L, Liu Y. Analyzing a bioterror attack on the food supply: The case of botulinum toxin in milk. Proceedings of the National Academy of Sciences USA. 2005;102:9984–9989. doi: 10.1073/pnas.0408526102.
