Abstract
Recent experiments have been used to “edit” genomes of various plant, animal and other species, including humans, with unprecedented precision. Furthermore, inserting the Cas9 endonuclease gene, together with a gene encoding the desired guide RNA, into an organism adjacent to an altered gene could create a “gene drive” capable of spreading a trait through an entire population of organisms. These experiments represent advances along a spectrum of technological abilities that genetic engineers have been working on since the advent of recombinant DNA techniques. The scientific and bioethics communities have built substantial literatures about the ethical and policy implications of genetic engineering, especially in the age of bioterrorism. However, recent CRISPR/Cas experiments have triggered a rehashing of previous policy discussions, suggesting that the scientific community requires guidance on how to think about social responsibility. We propose a framework to enable analysis of social responsibility, using two examples of genetic engineering experiments.
Controversial genetic engineering research has started to crowd the events calendar at the National Academy of Sciences. Last year the Academy convened a workshop to examine infectious disease gain-of-function research (Sharples et al.), and this September, 2015, it is scheduled to begin examining human gene editing, in particular the use of CRISPR-Cas technology. These meetings will provide a useful public setting for debate and will produce useful resources that document sentiments at particular points in time. At the same time, the National Academy of Sciences, and many others, have examined these questions, especially the wisdom of human germ line gene editing, many times before now. (Olson; Committee on Science Engineering and Public Policy) This is not to suggest that debate over such issues should be curtailed. However, the fact that ethical issues around genetic modification continue to be revisited time and again by the scientific community with little reference to previous discussion does suggest that convening meetings at the National Academy of Sciences is an insufficient means for resolving such important issues. Furthermore, recent policy discussions about gene drives, while necessary and welcome, continue to focus on biohazard containment strategies and on filling regulatory gaps exposed by the discovery of a specific technique for genetic manipulation. (Akbari et al.; Oye et al.) However, these discussions do not appear to generalize, and instead remain specific and acontextual. It is time for scientists to expand beyond a concept of ethical and social responsibility in science that is limited to “responsible conduct of research” (RCR) or to meeting regulatory requirements. We need to reconceptualize how scientists can address broader societal implications of their work as they develop research agendas and protocols.
Recent experiments in genetic engineering such as those on the H5N1 influenza virus and those using gene drives highlight the need to clarify how scientists should integrate social responsibility into their work. (Relman; Oye et al.) However, there is no consensus about what social responsibility in science means or what it looks like in practice. (Wang; Pimple; Rappert; Glerup and Horst; Zandvoort) In order to advance the discussion about what social responsibility is and how to implement it, we use these recent experiments as case studies to propose an analytic framework that can be used to assess and compare research projects along several dimensions of practice. Our intent here is not to define social responsibility, nor to make prescriptive claims. Rather we hope to open up a domain of empirical inquiry into what various stakeholder groups think constitutes social responsibility in science, with the goal of building toward conceptual clarity and consensus on this topic.
What is social responsibility in science?
Definitions of social responsibility in science, especially in US-based publications, are often vague (Pimple; Wang), such as “striving to promote the good of society through your research.” (Resnik) Nevertheless, social responsibility is sometimes seen as distinct from responsible conduct of research (RCR), which has retained a focus limited to role-related responsibilities; these typically include responsibilities to produce reliable, unbiased results; prohibitions on falsification, fabrication, or plagiarism; responsibilities to publish or share results of research; and obligations to serve as peer reviewers and to mentor future researchers. (National Academy of Sciences (NAS) On Being a Scientist: Responsible Conduct in Research [2nd Edition]; MacIntyre; Douglas; Shrader-Frechette; Weed and McKeown; Pimple; Gibbons) Pimple’s phrase “serving the common good” highlights what is missing from the current RCR-focused research ethics framework (Pimple) and is useful because it directs attention to the relationship between science and society, and suggests both positive and negative duties: how can science both protect and promote the common good? (Douglas)
Some contend that responsibilities related to the common good include negative obligations, such as to avoid unjustified risk. (Douglas; Shrader-Frechette) Such negative obligations are reflected in many regulatory approaches to potential risk, including the NSABB, the US Recombinant DNA Advisory Committee, and even the work of research ethics boards and data safety monitoring boards, and they include variants such as the precautionary principle and prudent vigilance. (Gutmann) Others have proposed that these responsibilities also include positive obligations, such as to improve health and fight disease, or to participate in policy development. (Weed and McKeown) More far-reaching arguments suggest that science has moved to a new “post-normal” stage (Funtowicz and Ravetz) which, having superseded traditional concepts of risk prediction, calls for a fundamental rethinking of how science should and can serve society. Thus many philosophers agree that science does have a broader responsibility to society that includes, but extends beyond, the role responsibilities laid out in RCR, but there is less clarity about what that responsibility is, on what it is based, or what it requires of scientists in daily practice. It is furthermore not clear whether the constituent obligations fall to science collectively (Shrader-Frechette) or to scientists as individuals. (Douglas)
Demand for greater social responsibility in science dates to the early decades of the 20th century, (Wang) often instigated by the use of science for military ends, such as the development and use of atomic weapons, (Kevles) or arising as part of broader social upheaval, as with the formation of the Union of Concerned Scientists during the Vietnam War. Until recently, however, with a few important exceptions (British Society for Social Responsibility in Science (BSSRS); Science for the People), the concept has remained largely a rallying cry expressing concern with the direction of a particular line of research. Furthermore, to the extent that broad societal concerns about science have been expressed and addressed by scientific communities, particularly in the physical and environmental sciences, this attention has not resulted in a robust, practical concept of social responsibility that can be used to assess and guide research in daily practice. Globally, but especially among EU nations, movements to make social responsibility in science into a more substantial concept that informs particular actions and practices are beginning to gain traction.
Interest in this trend is weak in the United States, however, despite the fact that the term itself appears increasingly in publications of prominent science organizations, including AAAS, NIH, the National Academy of Sciences, NSF, and the National Science Advisory Board for Biosecurity (NSABB). Clarification of the meaning and requirements of social responsibility in science is particularly important in the United States because, unlike in many other nations concerned about these issues, the implementation of social responsibility in the United States is left virtually solely to scientists.
Why do genetic engineers need an analytic framework for social responsibility?
The term social responsibility appears increasingly in US federal guidelines and policies governing life sciences research but without the content needed to make it meaningful. The NIH mission statement lists among its goals “to exemplify and promote the highest level of scientific integrity, public accountability, and social responsibility in the conduct of science.” The most recent edition of the National Academy of Sciences’ authoritative text on research ethics, On Being a Scientist (National Academy of Sciences (NAS) On Being a Scientist: Responsible Conduct in Research [3rd Edition]), was revised from “need[ing] to be aware that ultimately [scientists’] research can have a great impact on society,” (National Academy of Sciences (NAS) On Being a Scientist: Responsible Conduct in Research [2nd Edition]) to having “a responsibility to reflect on how their work and the knowledge they are generating might be used in the broader society.” The NSABB relies heavily on the concept of the culture of responsibility in developing its recommendations for handling dual use research of concern (DURC). (National Science Advisory Board for Biosecurity (NSABB))
However, the meaning and operationalization of social responsibility remains underdeveloped. For example, NIH has produced clear statements to promote two of the three elements of its mission statement, scientific integrity and public accountability, but not social responsibility. (See http://nexus.od.nih.gov/all/2013/01/17/new-resource-on-scientific-research-integrity/ and http://grants.nih.gov/grants/public_accountability/.) On Being a Scientist asserts that scientists should reflect on how their work might be used in the broader society, but it also asserts that as long as the values of honesty, fairness, collegiality, and openness “are honored, science—and the society it serves—will prosper.” In other words, all that is needed to benefit society is good science. The relationship of science to society is a frequent theme in discussions of the culture of responsibility, but the actual practices that the NSABB promotes to support this culture are largely internal to science, such as laboratory procedures, or are focused on human resources and employment practices. (National Science Advisory Board for Biosecurity (NSABB))
Other guidance, such as the US President’s Commission for the Study of Bioethical Issues report on The Ethics of Synthetic Biology and Emerging Technologies, (Presidential Commission for the Study of Bioethical Issues) has offered the principles of responsible stewardship, the responsibility to act for the betterment of all, and prudent vigilance, along with general recommendations for assessing risks and benefits before and after projects are undertaken. (Gutmann) However, while the report rejects the extremes of pursuing “technological advances without due regard for environmental or public safety” and of applying a precautionary approach that “blocks all technological approaches until all possible risks are known and neutralized,” considerable work remains to implement these ideas.
Most commentary that does provide guidance about implementing social responsibility in science emphasizes protective obligations, such as the NSABB’s culture-of-responsibility focus on keeping DURC (National Science Advisory Board for Biosecurity (NSABB)) technology out of the hands of terrorists. Equally necessary is developing social responsibility into a concept that enables positive collaboration between science and society, helps translate from science to society the benefits society needs, and can be practically operationalized. This task requires effort from many sectors, and likely in several stages. As a first stage, we propose a framework for thinking about social responsibility that is tied to elements of research design. We approach this task here as an empirical challenge, which is not to say that it is not also an ethical, moral, and prescriptive one; rather, only that at this stage it might be useful to look at what scientists talk about when they talk about doing socially responsible science. We provide these findings in the form of a proposed analytic framework.
Social responsibility analytic framework
To create this framework we analyzed two recent cases of research, chosen because they provoked sustained discussions among the life sciences community about the social responsibility of science, including at meetings convened specifically for this purpose, and because the scientists involved seemed to embrace rather different understandings of the meaning and practices of social responsibility. The cases are: manipulation of the H5N1 strain of influenza virus to increase its transmissibility and virulence (Herfst et al. “Airborne Transmission of Influenza A/H5N1 Virus between Ferrets”), and proposed applications of gene drives to alter wild populations of sexually reproducing organisms such as mosquitoes (Esvelt et al.). These lines of research are similar in being innovative, genetics-related, NIH-supported research that generated concerns not because of RCR-related issues but because of their potential harms to society. In addition, these experiments raise questions about what benefits they might provide to society, especially in relation to the risks they pose. H5N1 influenza virus has been categorized in the US as a select agent with dual use potential since 2005 (Knobler, Mahmoud and Pray; Department of Health and Human Services), and the gain-of-function objective of this particular inquiry meant the research was subject to special oversight because it qualified as DURC (National Institutes of Health). The proposed gene drive experiments fall under no particular regulations other than those that apply to all recombinant nucleic acid research, most of which do not address issues of risk outside of direct potential for harm to humans or domesticated animals and crops.
We generated this framework by examining controversial research because controversial research generates more discussion and media coverage. We then scanned for statements by investigators about what makes their research socially responsible, such as when and with whom they consulted about potential risks of the research. However, the framework is intended to apply to a wide range of life science inquiries. Seeking to identify the elements investigators considered important for the public to understand, or that they believed provided the evidence for their claims that they acted responsibly, we worked iteratively between research accounts and emerging framework categories by analyzing the stories that investigators told. We identified common elements in these accounts and used them to create an analytic framework that can be used to assess the variation and range of meaning and practices associated with social responsibility in scientific research today. We propose this as a work in progress and ask the community to assess, test, and revise it based on a variety of case studies.
We identified five common features of the investigators’ accounts and propose that together they constitute a basic framework for analyzing models of social responsibility that life scientists implicitly or explicitly incorporate into their work. The goal is not to define what it means to be socially responsible in science. Rather, we propose this framework as a tool to enable analysis of social responsibility in science. The features are:
Basis
Factors or values investigators rely on to justify their research activities. These vary in the degree to which they emerge directly from the science or emerge also from issues beyond advancing science.
Approach
Approach or reasoning used to identify and manage harms and benefits of research. These range from categorical and rule-based to more reflexive and knowledge-producing. The former derives obligations from regulatory or legal categories or considerations, such as human subjects research, research using select agents, or contractual terms, while the latter calls for more open-ended empirical or flexible inquiry into potential harms and benefits of a research project as it unfolds in a particular setting.
Timing
At what stage of research do investigators actively address issues associated with social responsibility? For instance, does this occur during the design phase, before publication, at regular, planned intervals, or only when required, e.g., in response to human subjects or animal use regulations?
Participants
Who is brought into the discussion? Only researchers? Research administrators? Various stakeholder groups?
Transparency
How easily can an observer ascertain what procedures were followed or questions asked in deliberations concerning social responsibility?
Using passages from articles written by members of the H5N1 and gene drive research teams, Table 1 provides examples of the kind of information to which these elements correspond. The passages that appear in the table were chosen for their relative brevity from among several expressing similar ideas. The cited references provide greater detail on these points.
Table 1.
 | Increasing transmissibility of H5N1 influenza virus | Gene drives to alter wild populations |
---|---|---|
Basis/values | The benefits of H5N1 virus transmission research may or may not result in immediate applications—accumulating knowledge in basic research is an incremental process. However, we believe that our best way to limit the impact of pandemics is to be better prepared than we are now by knowing more about the pathogen and how it may evolve. (Fouchier, Garcia-Sastre and Kawaoka) | Gene drives may be capable of addressing ecological problems by altering entire populations of wild organisms, but their use has remained largely theoretical due to technical constraints. Here we consider the potential for RNA-guided gene drives based on the CRISPR nuclease Cas9 to serve as a general method for spreading altered traits through wild populations over many generations. (Esvelt et al.) |
Approach | Upon signing the research contract, a new GMO permit – explicitly for conducting work with airborne-transmissible H5N1 virus and early pandemic viruses – was obtained from the Dutch Ministry for Infrastructure and the Environment (I&M) in 2007. I&M and COGEM [Dutch Committee Genetic Modification] concluded that the proposed work could be performed with negligible risk to humans and the environment under the conditions realized… (Herfst et al. “Supplemental Materials for Airborne Transmission of Influenza A/H5N1 Virus between Ferrets”) | Technologies with the potential to significantly influence the lives of the general public demand societal review and consent. As self-propagating alterations of wild populations, RNA-guided gene drives will be capable of influencing entire ecosystems for good or for ill. As such, it is imperative that all research in this nascent field operate under conditions of full transparency, including independent scientific assessments of probable impacts and thoughtful, informed, and fully inclusive public discussions. (Esvelt et al.) “We could easily have laboratory tests within the next few months and then field tests not long after that. That’s if everybody thinks it’s a good idea…If we’re going to talk about it at all in advance, rather than in the past tense, now is the time.” (De Chant and Nelsen) |
Timing | From the conception phase of the research onward, biosafety and biosecurity experts were consulted to provide assurance that facilities and working conditions were such that the safety and security could be ensured at all times. (Herfst et al. “Airborne Transmission of Influenza A/H5N1 Virus between Ferrets”) | For emerging technologies that affect the global commons, concepts and applications should be published in advance of construction, testing, and release. This lead time enables public discussion of environmental and security concerns, research into areas of uncertainty, and development and testing of safety features. (Oye et al.) (See also Approach) |
Participants | Our work on aerosol transmission of HPAI H5N1 virus was done completely openly, and the decision to perform the work was reached upon serious local, national, and international consultation. The work has been discussed among staff members of the Department of Virology at Erasmus Medical Center (MC) since 1997, followed by consultation with local biosafety officers and facility managers. Over several years, numerous international influenza specialists and other virologists operating in class-3 and -4 facilities were consulted, and a plan was drawn to obtain adequate research facilities in Rotterdam. (Fouchier, Herfst and Osterhaus) | “I thought it might be useful to get into the room people with slightly different material interests [so we invited] regulators, nonprofits, companies, and environmental groups… The idea was to get people to meet several times, to gain trust” before “decisions harden.” (De Chant and Nelsen) (See also comments about public involvement under “Transparency”) |
Transparency | See Participants, above. | Because we are all affected by the state of our ecosystems, public oversight of technologies capable of ecological management will be essential. We recommend that all future research involving gene drives and other technologies capable of altering populations and ecosystems be conducted in full public view, with all empirical data and predictive models freely and openly shared with the global community in a transparent and understandable format. Only through broadly inclusive and well-informed public discussions can we as a society decide how best to manage our shared environment. (Esvelt, Church and Lunshof) |
Applying the framework
The following summaries combine the information in Table 1 into brief accounts of how H5N1 and gene drive investigators integrated ideas of social responsibility into their research. The summaries are not definitive conclusions about these projects; rather they are meant to illustrate how the social responsibility analytic framework can be used. In particular, this exercise is meant to show how the framework might organize and standardize analysis of social responsibility in science and provide a basis for research and theory building. Abbreviations in the text refer to the features of the framework: Basis (B), Approach (A), Timing (T), Participants (P), and Transparency (Tr); they indicate correspondence between statements in the summaries and the quotes in Table 1.
H5N1 Summary
H5N1 investigators’ concerns with social responsibility focused primarily on how their research could advance the science of influenza transmission. They characterized the practical applications of their work to pandemic preparedness as uncertain in the short run. (B) They acknowledged that research to increase the virulence and transmissibility of an already dangerous virus posed serious potential harms to society, but judged the harm to science from abandoning the research to be greater. (B) To identify and manage potential research harms, investigators relied on rule-based reasoning, such as the rules governing select agent research that require various levels of bio-containment precautions, or legal considerations such as whether the research fulfills the terms of a contract. (A) Investigators reported paying close attention to these safety issues before initiating research and as it continued. (T) Having restricted social responsibility considerations to safety precautions, investigators limited the participants they consulted to safety and security experts, in addition to research team members and local institutional officials. (P) There is no mention of opening these discussions to participation or observation by others; in particular, there is no mention of how the decision was reached to publish an account of findings sufficiently detailed to allow replication. (Tr)
Gene drives Summary
Gene drive investigators cited two factors to justify their research: first, that gene drive research could benefit society, and second, that technical constraints had inhibited progress and could be addressed through research. (B) Investigators noted that gene drive research could bring great harm to society by disrupting or destroying ecosystems if not adequately controlled. To conduct a comprehensive review of potential risks, investigators cooperated in a multi-year NSF project that convened an interdisciplinary group of outside experts (P) to assess potential research harms and to develop a plan for oversight of the research, with special attention to the implications of open field experiments. (A) Investigators initiated plans for this consultation after funding for the science was established, but announced a self-imposed moratorium on experiments pending a broader public consultation. (P; T) Published accounts have not addressed the details of this second consultation, nor have they addressed how participants in the initial experts group were chosen, or how investigators will handle the possibility of disagreement among the public or the possibility that the public will conclude that the research should not proceed. (Tr)
The actions of the gene drive and H5N1 researchers illustrate different conceptualizations of scientists’ responsibility to society and different ways of implementing those concepts. H5N1 researchers embraced a traditional notion wherein researchers’ responsibility to society is limited to conducting good research and reporting its results. Actions addressing social responsibility concerns involved only insiders and emphasized separating society from science. It is of course socially responsible to create and guard a barrier that protects society from possible H5N1 infection. The inward focus and equation of responsibility with pre-determined, scientist-managed harm mitigation, however, also express the assumption that scientists alone are capable of deciding these questions.
The gene drive investigators have vigorously pursued the goal of advancing knowledge. They were open, however, to the possibility that the implications of that knowledge created concerns that went beyond what they could confidently treat as falling within their sole purview. They expanded the range of people allowed to participate in assessing the implications of the research, and they initiated this process at a time when the science was still open to influence. A finer-grained analysis of the timing and identities of outside participants would be necessary to sustain conclusions about the extent to which these actions express a substantially different concept of social responsibility than that of the H5N1 researchers, but preliminary analysis suggests that these investigators have mapped a very different route for themselves.
The contrast between these two research projects is instructive in two other ways as well. First, it suggests that life science researchers might vary dramatically in their definitions of social responsibility; second, it suggests that traditional, rule-based approaches to social responsibility might be insufficient as a basis for creating the more robust concept of social responsibility that recent trends in science demand. These examples illustrate the use of a proposed analytic framework to identify elements of social responsibility in science, in service of advancing an empirically grounded discussion of what social responsibility in science means and how to achieve it, a discussion that could be useful in advancing these efforts globally as well as in the US.
Acknowledgments
This work was supported by the US National Human Genome Research Institute, grant numbers R01HG004900 and P50HG003389.
References
- Akbari OS, et al. Safeguarding Gene Drive Experiments in the Laboratory. Science. 2015;349:927–29. doi: 10.1126/science.aac7932. Print.
- British Society for Social Responsibility in Science (BSSRS). Print.
- Committee on Science Engineering and Public Policy. Scientific and Medical Aspects of Human Reproductive Cloning. Washington, DC: National Academies of Sciences; 2002. Print.
- De Chant Tim, Nelsen Eleanor. Genetically Engineering Almost Anything. NOVA Next. 2014 Jul 17. Web.
- Department of Health and Human Services. Update on Avian Influenza A (H5N1) [CDC Health Update]. 2005. Print.
- Douglas Heather. The Moral Terrain of Science. Erkenntnis. 2013. Print.
- Esvelt Kevin, Church George, Lunshof Jeantine. “Gene Drives” and CRISPR Could Revolutionize Ecosystem Management. Scientific American Guest Blog. 2014 Jul 17. Web.
- Esvelt Kevin M, et al. Concerning RNA-Guided Gene Drives for the Alteration of Wild Populations. eLife. 2014;3:e03401. doi: 10.7554/eLife.03401. Print.
- Fouchier Ron AM, Garcia-Sastre Adolfo, Kawaoka Yoshihiro. The Pause on Avian H5N1 Influenza Virus Transmission Research Should Be Ended. mBio. 2012;3(5):e00358-12. doi: 10.1128/mBio.00358-12. Print.
- Fouchier Ron AM, Herfst Sander, Osterhaus Albert DME. Restricted Data on Influenza H5N1 Virus Transmission. Science. 2012;335. doi: 10.1126/science.1218376. Print.
- Funtowicz S, Ravetz J. Science for the Post-Normal Age. Futures. 1993;31(7):735–55. Print.
- Gibbons Michael. Science’s New Social Contract with Society. Nature. 1999;402(Supp):C81–C84. doi: 10.1038/35011576. Print.
- Glerup Cecilie, Horst Maja. Mapping ‘Social Responsibility’ in Science. Journal of Responsible Innovation. 2014;1(1):31–50. Print.
- Gutmann Amy. The Ethics of Synthetic Biology: Guiding Principles for Emerging Technologies. Hastings Center Report. 2011;41(4):17–22. doi: 10.1002/j.1552-146x.2011.tb00118.x. Print.
- Herfst Sander, et al. Airborne Transmission of Influenza A/H5N1 Virus between Ferrets. Science. 2012;336(6088):1534–41. doi: 10.1126/science.1213362. Print.
- Herfst Sander, et al. Supplemental Materials for Airborne Transmission of Influenza A/H5N1 Virus between Ferrets. Science. 2012;336(6088):1534–41. doi: 10.1126/science.1213362. Print.
- Kevles D. The Physicists: The History of a Scientific Community in Modern America. New York, NY: Alfred A. Knopf; 1978. Print.
- Knobler SL, Mahmoud AAF, Pray LA. Biological Threats and Terrorism: Assessing the Science and Response Capabilities: Workshop Summary. Washington, DC: Institute of Medicine, Board on Global Health, Forum on Emerging Infections; 2002. Print.
- MacIntyre Alasdair. After Virtue. 2nd ed. Notre Dame: University of Notre Dame Press; 1984. Print.
- National Academy of Sciences (NAS). On Being a Scientist: Responsible Conduct in Research. 2nd ed. Washington, DC: 1995. Print.
- National Academy of Sciences (NAS). On Being a Scientist: Responsible Conduct in Research. 3rd ed. Washington, DC: 2009. Print.
- National Institutes of Health. United States Government Policy for Oversight of Life Sciences Dual Use Research of Concern. National Institutes of Health Office of Science Policy; 2012. Print.
- National Science Advisory Board for Biosecurity (NSABB). Guidance for Enhancing Personnel Reliability and Strengthening the Culture of Responsibility. Bethesda, MD: National Institutes of Health; 2011. Print.
- Olson S. Biotechnology: An Industry Comes of Age. Washington, DC: National Academies of Science; 1986. Print.
- Oye Kenneth A, et al. Regulating Gene Drives. Science. 2014;345(6197):626–28. doi: 10.1126/science.1254287. Print.
- Pimple Kenneth D. Six Domains of Research Ethics: A Heuristic Framework for the Responsible Conduct of Research. Sci Eng Ethics. 2002;8(2):191–205. doi: 10.1007/s11948-002-0018-1. Print.
- Presidential Commission for the Study of Bioethical Issues. New Directions: The Ethics of Synthetic Biology and Emerging Technologies. Washington, DC: 2010. Print.
- Rappert Brian. Education for the Life Sciences: Choices and Challenges. In: Rappert Brian, McLeish Caitriona, editors. A Web of Prevention: Biological Weapons, Life Sciences and the Governance of Research. London: Earthscan; 2007. Print.
- Relman David A. The Increasingly Compelling Moral Responsibilities of Life Scientists. Hastings Center Report. 2013;43(2):34–35. doi: 10.1002/hast.156. Print.
- Resnik David B. Ethical Virtues in Scientific Research. Account Res. 2012;19(6):329–43. doi: 10.1080/08989621.2012.728908. Print.
- Science for the People. 2014. Print.
- Sharples F, et al. Potential Risks and Benefits of Gain-of-Function Research. Washington, DC: National Academies of Sciences; 2015. Print.
- Shrader-Frechette KS. Ethics of Scientific Research. Lanham: Rowman & Littlefield Pub Inc; 1994. Print.
- Wang Jessica. Ethics and Social Responsibility in Science. In: Rothenberg Marc, editor. The History of Science in the United States: An Encyclopedia. New York: Garland Publishing Inc; 2001. Print.
- Weed Douglas L, McKeown Robert E. Science and Social Responsibility in Public Health. Environmental Health Perspectives. 2003;111(14):1804–08. doi: 10.1289/ehp.6198. Print.
- Zandvoort Henk. Necessary Knowledge for Social Responsibility of Scientists and Engineers. In: Dias de Figueiredo A, Sa Furtado C, Graca Rasteiro M, editors. Proceedings of the International Conference on Engineering Education. Coimbra: INEER; 2007. pp. 1–6. Print.