Author manuscript; available in PMC: 2013 Apr 4.
Published in final edited form as: J Law Med Ethics. 2012 Winter;40(4):848–855. doi: 10.1111/j.1748-720X.2012.00713.x

Responsible Conduct in Nanomedicine Research: Environmental Concerns Beyond the Common Rule

David B. Resnik
PMCID: PMC3616622  NIHMSID: NIHMS451903  PMID: 23289687

Introduction

The Common Rule (45 CFR 46, subpart A) is a set of regulations for protecting human participants in research funded by the Department of Health and Human Services (DHHS), which has been adopted in part by 17 federal agencies.1 The regulations at 45 CFR 46 include four subparts: Subpart A (general protections for human research participants), Subpart B (additional protections for pregnant women, fetuses, and neonates), Subpart C (additional protections for prisoners), and Subpart D (additional protections for children). The Common Rule has not been significantly revised since 1981, although some significant changes may be forthcoming.2 The Food and Drug Administration (FDA) has adopted its own regulations for the protection of human participants, which are similar to the Common Rule in many key areas, such as the structure and function of the Institutional Review Board (IRB) and the criteria for approving research.3 Most institutions in the U.S. that sponsor or support research with human participants have entered into an agreement with the Office for Human Research Protections (OHRP), known as a Federalwide Assurance, Multiple Project Assurance, or Single Project Assurance, in which they affirm their intention to abide by the requirements of the Common Rule. Many institutions apply the Common Rule to all types of research with human participants, not just to studies sponsored by agencies that follow the Common Rule.

The Common Rule’s main focus is on the protection of the individual research participant (or human subject). Other than the requirements in Subpart B, it does not include any rules or procedures that protect people or organizations beyond the research participant. The criteria for IRB approval of a research project only require that risks to the subject be minimized but do not mention the minimization of other types of risks, such as risks to investigators, identifiable third parties, or communities.4 The criteria also require that risks to the subject be reasonable in relation to the benefits to the subject or the value of the knowledge gained, and they instruct the IRB not to consider the long-range effects of applying knowledge gained in research.5 The informed consent requirements focus on obtaining consent from the participant (or his or her representative) and do not mention communities, organizations, or third parties.6 The provision related to the protection of confidentiality and privacy is also aimed at the human participant.7

Nanomedicine research raises ethical and legal issues that extend beyond the scope of the Common Rule, such as risks to investigators and research staff, nanotechnology workers, family members of participants, the environment, and society. Most of these risks arise from possible exposure to nanomaterials used in clinical research and pose difficult questions for investigators, institutions, and IRBs. Since other types of clinical research also involve risks to people other than the human participants, these questions are not entirely novel. However, they have new meaning and urgency in the context of nanomedicine, since this field is moving forward rapidly and has wide-ranging social, public health, and environmental impacts that are not yet well understood. Gene therapy research also involves novel genetic modification technologies that are rapidly advancing and have wide-ranging effects. However, investigators and policymakers have been dealing with the risks of genetic modification since the early 1970s, and many of these risks are currently addressed by existing oversight mechanisms, such as the National Institutes of Health's Recombinant DNA Advisory Committee (RAC) and Institutional Biosafety Committees (IBCs).8 Since it is not clear whether existing policies and oversight mechanisms can deal with the wide-ranging effects of nanomedicine research, it is important to consider the risks beyond those that impact the research participant to determine whether additional oversight is needed.

Overview of the Risks of Nanomedicine Research

Nanotechnology involves the manipulation of matter in the range of 1–100 nanometers (nm; one nanometer is one billionth of a meter).9 For comparison, a DNA molecule is 1–2 nm across, a virus is 3–50 nm, and a red blood cell is roughly 7,000 nm. Some types of nanoparticles occur naturally, such as volcanic ash, smoke particles, and viruses. Others, such as manufactured nanoparticles, have been developed for specific purposes related to industry, scientific research, or medicine. Nanoparticles have properties different from particles of the same material at a larger scale. For example, nanomaterials are often more chemically and biologically reactive than larger materials because they have a higher surface-area-to-volume ratio, and many chemical and biological interactions occur on the surface. Also, important physical properties, such as melting point, electrical conductivity, and color, can vary with the size and shape of a nanoparticle. These unique properties give nanoparticles a variety of commercial and other applications.10
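To make the surface-area point concrete, here is a back-of-the-envelope calculation (added as an illustration; it is not part of the cited studies). For a spherical particle of radius r, the surface-area-to-volume ratio scales as 1/r, so shrinking a particle from 1 µm to 10 nm increases the relative surface available for chemical and biological interactions a hundredfold:

\[
\frac{A}{V} = \frac{4\pi r^{2}}{\tfrac{4}{3}\pi r^{3}} = \frac{3}{r},
\qquad
\frac{(A/V)_{r = 10\,\text{nm}}}{(A/V)_{r = 1\,\mu\text{m}}} = \frac{3/(10\,\text{nm})}{3/(1000\,\text{nm})} = 100.
\]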

In the next 10–15 years, nanomedicine — the application of nanotechnology to medicine — will significantly impact the diagnosis, prevention, and treatment of disease. So far, some of the most promising nanotechnology applications involve improving the effectiveness of drug delivery to target cells or tissues and testing for different compounds in blood or urine, but more applications will be forthcoming. Nanotechnology components may soon play a key role in many drugs, biologics, diagnostics, and medical devices. In the future, it may be possible to use nanomachines to locate and destroy diseased or cancerous cells in the body, or to repair damaged tissue.11

The risks of clinical trials involving current nanomedicine products are similar to the risks of clinical trials involving conventional medicine. The most significant known risks involve exposing human participants to the nanomaterials under investigation. Though prior animal studies can help researchers understand the potential hazards of exposing human participants to nanomaterials, there are always unknown risks and unexpected outcomes when a drug or other product is first tested in a human being, especially if animal models have not been established for particular effects, such as immune responses.12 However, since many commentators hold that the Common Rule and the FDA regulations provide adequate protections for human participants in clinical trials involving nanomedicine, the application of these regulations to nanomedicine research will not be reviewed in this article.

The main concern of the present inquiry is to consider the risks to people other than human participants who may be exposed to nanomaterials as a result of nanomedicine research. Potentially affected individuals include:

  • Workers involved in the manufacture of nanomaterials used in pre-clinical and clinical research.

  • Investigators and research staff who work with nanomaterials in the laboratory or administer nanomaterials to human participants in a clinical or home setting.

  • Family members or friends of research participants who may come into contact with nanomaterials.

  • People who are exposed to nanomaterials that enter the environment during the manufacturing processes, as a result of disposal of unused products, or during excretion or elimination of nanomaterials administered to human research participants.

Both animals and plants may be exposed to nanomaterials that enter the environment, but these risks will not be directly addressed here.

At present it is exceedingly difficult to assess the risks to human beings associated with potential exposures to nanomaterials, due to the lack of evidence from laboratory, clinical, environmental, and epidemiological studies of nanotechnology. Though toxicologists and other scientists have begun to investigate the effects of nanomaterials on animals, human beings, and other species, a great deal remains unknown.13 Numerous studies are underway, but because nanotechnology is a young science, many more will need to be conducted before we have a good understanding of the risks of nanomaterials.

The tremendous variation in the properties of nanomaterials also confounds risk assessment. Nanomaterials do not constitute a single chemical class but are composed of many different elements and compounds.14 The only characteristic these materials share is their size range. The evidence obtained so far indicates that different nanomaterials have distinct environmental and health effects.15 Some nanomaterials are relatively benign, while others are potentially hazardous. Some degrade easily in biological systems or the environment, while others may persist. Some can accumulate in tissues, while others do not.16 Some can cross cellular membranes or the blood-brain barrier; some may enter the bloodstream via inhalation or dermal contact; some may cause genetic damage; some may be carcinogenic; some induce immune responses; some may damage the lungs, liver, or other organs; and some may be toxic at low doses.17

Precaution Concerning Nanomaterials

Because there is so much uncertainty surrounding the risks of nanomaterials, some have argued that traditional risk management approaches do not apply to nanotechnology and that a precautionary approach is warranted.18 One key difference between traditional risk management and a precautionary approach is that, under the traditional approach, the probabilities of different outcomes can be estimated scientifically, and steps can be taken to minimize risks and maximize benefits based on those probabilities. For example, when deciding whether to approve a new drug for marketing, the FDA estimates the overall benefits and risks of the drug based on evidence from animal studies and clinical trials. The agency can then decide whether the drug should be marketed and under what conditions (e.g., available by prescription only, not for use in pregnancy, etc.). A precautionary approach may be warranted when there is insufficient evidence concerning a new technology to obtain a scientific estimate of the probabilities of beneficial and harmful outcomes.19
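In decision-theoretic terms (a simplified sketch offered here for illustration, not drawn from the article), the traditional approach presumes that an expected net benefit can be computed and maximized, where p_i, b_i, and h_i denote the probability, benefit, and harm of each possible outcome i:

\[
E[\text{net benefit}] = \sum_{i} p_i \,(b_i - h_i).
\]

A precautionary approach is invoked precisely when the probabilities p_i (and sometimes the outcomes themselves) cannot be credibly estimated, so this expectation cannot be calculated in a scientifically meaningful way.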

Under a precautionary approach to a new technology, society should take reasonable measures to avoid, prevent, minimize, or mitigate harms that are plausible and serious.20 A measure is reasonable insofar as it provides a fair balancing of the different values at stake, such as promotion of public health, protection of the environment, and impacts on economic development. In general, the degree of precaution required depends on the nature of the risks, our ability to manage those risks, and the potential loss of benefits that would result from taking the proposed precautions. Highly restrictive precautions, such as banning a technology, may be justified when the risks are catastrophic, the loss of benefits is acceptable, and there are no other effective means for managing the risk. In other situations, regulation and oversight of a new technology, rather than a ban, may be the most prudent course of action.21

What would it mean to take a precautionary approach with regard to nanomaterials? At a minimum, reasonable precautionary measures would include identifying the hazards and conducting extensive research to develop a better understanding of those risks. Reasonable precautions would also involve instituting safety measures to protect people from exposure until the risks of nanomaterials are better understood.22 For example, factory workers involved in the manufacture of nanomaterials and investigators handling nanomaterials in the laboratory should be educated about potential risks and provided with protective clothing (such as gloves) and equipment (such as negative air pressure ventilation) to minimize exposure from inhaling, ingesting, or touching nanomaterials.23 Factories and laboratories should also be designed to contain nanomaterials to protect workers and the environment. The use of automation and robotics can also minimize human exposure. Nanomaterials should be disposed of properly to prevent environmental contamination.24 Monitoring exposures would also provide useful data for risk management.

The Role of Institutions and Investigators in Minimizing the Risks of Nanomaterials

Should researchers and institutions involved in nanomedicine research address risks other than those to human participants? Before answering this question, it is important to note that many industrialized nations in Europe and North America already have regulatory mechanisms in place to manage the risks of nanomaterials to workers and the environment. In the U.S., the Occupational Safety and Health Administration (OSHA), a federal agency under the Department of Labor, establishes and enforces standards for occupational health and safety.25 OSHA has the authority to regulate many different hazardous chemicals present in the work environment, including nanomaterials.26 The National Institute for Occupational Safety and Health (NIOSH) conducts research on workplace risks and makes recommendations for safety standards and training. NIOSH has been a leader in nanotechnology safety.27 States also have their own agencies to protect workers from general occupational risks. Federal and state occupational safety and health regulations apply to workers who may be exposed to nanomaterials in factories, university laboratories, clinical research sites, or other settings.

The Environmental Protection Agency (EPA) has the authority to regulate nanomaterials classified as chemical substances under the Toxic Substances Control Act (TSCA) or as pesticides under the Federal Insecticide, Fungicide, and Rodenticide Act (FIFRA), and is developing a comprehensive approach to protect human health and the environment from nanomaterials.28 It is important to note, however, that under TSCA, an industrial chemical is regarded as safe until evidence emerges that it is potentially dangerous. Regulatory safety measures are implemented only after a chemical is found to be potentially hazardous.29 Some states also have their own laws concerning toxic chemicals. The EPA sponsors research on chemical safety and reviews data from other sources, such as academia and private industry. Some commentators have argued that TSCA needs to be strengthened to deal with the risks of nanomaterials, but a thorough evaluation of nanotechnology regulation is beyond the scope of this article.30

Clearly, investigators working with nanomaterials and institutional officials in charge of occupational safety and health should be concerned about risks to laboratory technicians and other research staff who may be exposed to nanomaterials as a result of their involvement in research with human participants. They should not only comply with occupational safety and health laws but also promote a culture of safety in the work environment. They should assume that nanomaterials are potentially toxic unless they have decisive evidence otherwise. Investigators and institutional officials should implement appropriate precautionary measures, such as containment, protective clothing and equipment, proper disposal, exposure monitoring, and education and training of staff. IBCs can help protect research staff from exposure to nanotechnology products involving potentially hazardous biological materials, such as recombinant DNA, pathogens, or toxins.31 Although IBCs were originally charged with overseeing recombinant DNA research, many have expanded their mission to include overseeing various types of biologically hazardous research. Other committees in charge of safety at universities, such as laboratory safety committees, can also help protect staff from nanotechnology hazards.

What about risks to workers who are involved in manufacturing nanomaterials in factories? Should investigators or research institutions be concerned about this risk? One might argue that this concern falls outside of the scope of their responsibility (and authority) and is a matter for factory managers, OSHA, EPA, and state agencies to handle. Investigators and institutional officials should focus on the risks of nanomaterials related to their use of these substances; they cannot be held accountable for what happens to these materials before they receive them. If they become aware of occupational safety and health problems at a factory that provides them with nanomaterials, then they can notify the appropriate officials or agencies. But, for the most part, they should concentrate on the health and safety concerns that fall within their domain of influence.

IRB Responsibilities

As mentioned earlier, the IRB’s main charge is to protect the rights and welfare of human research participants. IRBs can fulfill this responsibility by carefully reviewing research documents, such as protocols and consent forms, and ensuring that the risks to human participants are minimized and are reasonable in relation to the benefits to the participants or society. IRBs can also audit research and ensure that adverse events and other problems are reported to the appropriate agencies in a timely fashion.

How should the IRB respond to risks other than those to research participants? A strong case can be made that the IRB should not spend a great deal of time dealing with risks to research staff, investigators, or the environment since these concerns lie beyond its purview as set forth in the federal regulations. Focusing on risks other than those that affect research participants would be a distraction that takes away valuable time and effort from the protection of human research participants and a form of “mission creep.”32 Moreover, the IRB will often lack the expertise and authority to assess some of these risks, such as occupational and environmental hazards, as members are usually not experts in occupational safety or environmental health.

Even though the IRB should not devote considerable time and effort to the assessment of risks other than those that affect research participants, it should still deal with these risks in an appropriate fashion. If the IRB ignores these risks completely, then it is conceivable that no other institutional body will address them, and they will fall through the cracks. Often, the appropriate response will be for the IRB, when reviewing a nanomedicine protocol, to identify risks that need to be addressed by another body. For example, if the IRB has concerns about the safety of research staff or investigators, it can refer the protocol to a laboratory safety committee or occupational health committee for review; if it has concerns about hazardous biological materials, it can refer the protocol to the IBC. Ideally, the IRB will have enough expertise to know which risk issues related to nanomedicine need to be reviewed by another qualified group. The IRB should postpone approval of the protocol if it is not satisfied that appropriate review (and approval) by other groups at the institution has taken place. Delegation of responsibility for risk review related to human participant research already occurs in many institutions. For example, cancer studies involving ionizing radiation are usually referred to a radiation safety committee for review before the IRB makes its determination. Likewise, gene therapy research may be reviewed by an IBC before the IRB gives its approval.

The IRB may deal with some of the risks related to nanomedicine clinical trials itself, rather than deferring to another body. It is conceivable that some research protocols involving nanomedicine may involve risks to identifiable third parties, such as family members or friends who come into contact with nanomedicine research participants. For example, suppose that a research protocol requires participants to dermally administer a nanomedicine product at home. Children in the home could come into contact with this product if it sheds, which could pose a risk to their health. Participants also might not dispose of the product properly, which could pose a risk to people in the home or contaminate the environment. Exposure could also occur when nanomedicine products are excreted (e.g., in urine or breast milk). Common sense suggests that the IRB should assess these risks and make sure that the protocol has appropriate procedures in place to minimize these types of exposures. The IRB could ensure that the protocol includes procedures for instructing participants on how to minimize exposures of identifiable third parties and for disposing of products containing nanomaterials. Risk management should focus on identifiable third parties; it would be unduly burdensome and impractical if it included potential harms to all third parties who might be impacted by the research.

While it seems reasonable to suppose that the IRB should address risks to identifiable third parties in research, does this obligation have a sound ethical and legal basis? As noted above, the Common Rule does not address risks to third parties, except risks to fetuses. Although third-party risks arise in many types of biomedical research, one could argue that investigators working with nanotechnology should pay special attention to third-party risks, due to the uncertainties inherent in this rapidly advancing field and the potential for harm beyond the research participant. One could argue that IRBs have an ethical obligation to manage risks to third parties, based on the general duty of beneficence.33 Beneficence is one of the three ethical principles of research with human participants discussed in the Belmont Report, a highly influential document that provided a conceptual foundation for a major revision of the U.S. federal research regulations in 1981.34 Many institutions that follow the Common Rule also have made a commitment to apply the Belmont Report's principles to human participant research. Ethical theories, ranging from utilitarianism to Kantianism, as well as professional codes, such as the Hippocratic Oath, also support duties of beneficence.35 The principle of beneficence holds that we have a duty to promote good consequences and prevent or avoid bad ones. Since exposing third parties in research to nanomaterials has the potential to cause harm, IRBs have an obligation to ensure that investigators take steps to avoid this outcome. Beneficence implies that investigators and IRBs should be concerned about risks to family members and other third parties who may be exposed to nanomaterials used in medical research. If the IRB lacks the expertise to assess risks to identifiable third parties, it can consult with outside experts.

Negligence law may also support obligations to protect identifiable third parties from harm. The elements that a plaintiff in a negligence lawsuit must prove are the following: (a) harm, (b) causation, (c) duty, (d) standard of care, and (e) breach of the standard of care.36 Investigators, institutions, or IRB members could be found liable for negligence if an identifiable third party is harmed as a result of their conduct, if they have a duty to avoid harming that third party, and if they fail to adhere to the standard of care for protecting that third party from harm.37

One of the key issues for the injured plaintiff would be proving that investigators, institutions, or IRBs have duties to third parties. Though the federal research regulations do not establish duties to third parties (with the exception of fetuses), a court might hold that investigators have duties to identifiable third parties if it finds that the harms that occurred to the plaintiffs were reasonably foreseeable. Under negligence law, there is a duty to avoid causing reasonably foreseeable harm to others who are in the zone of danger.38 For example, if a child becomes ill as a result of exposure to nanomedicine products applied in the home, a court might find that the investigator had a duty to protect the child from harm because the harm to the child was reasonably foreseeable and the child was in the zone of danger created by the investigator.

Another key issue for the plaintiff would be establishing the standard of care: what would a reasonable person do to protect an identifiable third party from harm?39 One could argue that a reasonable investigator would take some minimal steps to protect family members or friends from harm related to nanomedicine research, such as informing the research participants about potential hazards to their family members or friends and how to minimize these risks. A reasonable IRB would address risks to third parties when reviewing a protocol. Since there have been no published cases related to biomedical research involving harms to third parties, it is not known how a court would rule on legal issues (such as duty or standard of care) or how a jury would react. However, it makes sense to assume that IRBs, investigators, or institutions could be held liable, since there is a theoretical basis for a negligence lawsuit.

Some might object that requiring IRBs to address these third-party risks would take away valuable time and energy from the protection of research participants. In response to this objection, I agree that IRBs should focus on their main mission, but addressing risks to identifiable third parties will not be a major distraction. IRBs can protect third parties by carefully reviewing the protocol to determine whether third parties may be exposed to nanomedicine products during the study. If there is a significant probability of third-party exposure, then the IRB can stipulate that the investigator take steps to minimize the risk of exposure. In most cases, this can be achieved by instructing participants on risks to third parties and the use of proper precautions when using or disposing of products. Thus, it will not take a great deal of time or effort for an IRB to take steps to ensure that investigators protect third parties. Moreover, investigators may fail to adequately address these risks without queries or guidance from the IRB.

Long-Term Risks of the Development of Nanotechnology

Many commentators have been concerned about the long-term risks of nanotechnology for public health, society, and the environment. Some have speculated that nanomaterials released into the environment could have disastrous effects on human beings or other species.40 Others have been concerned that nanotechnology could be used to enhance human traits beyond natural boundaries or to threaten privacy.41 Some have worried that nanotechnology will exacerbate socioeconomic inequalities because poor people would have limited access to nanotechnology.42 In his popular science fiction book Prey, Michael Crichton envisions a future in which swarms of nano-robots wreak havoc on human society and the environment.43

How should the IRB, investigators, and institutions respond to these long-term risks? As noted earlier, the Common Rule instructs IRBs not to consider the long-term effects of the applications of research. One could argue that limiting the IRB's authority in this way is appropriate, because examining the long-term risks of nanomedicine research would divert the IRB's attention from immediate risks to research participants and would entangle the committee in controversial and complex social and political issues that are not easily resolvable at the level of an institutional body.44 A similar argument could also be extended to investigators and research institutions, because dealing with the long-term implications of nanomedicine research would distract them from their main missions. This is not to say, however, that these issues should be ignored. One might argue that these issues should be addressed by legislative bodies, special committees established by government agencies, or professional associations.45 For example, various presidential commissions, such as the National Bioethics Advisory Commission (formed by President Bill Clinton), the President's Council on Bioethics (formed by President George W. Bush), and the Presidential Commission for the Study of Bioethical Issues (formed by President Barack Obama), have addressed the long-term implications of biomedical research.46 The Helsinki Declaration mentions that appropriate caution should be taken when conducting medical research that may harm the environment.47

Conclusion

Nanomedicine research raises ethical concerns beyond those covered by the Common Rule. Many of the risks of nanomedicine are already addressed by occupational safety and health laws and environmental laws. Investigators and research institutions should comply with these laws to protect research staff and the environment from harm. Though the IRB should concentrate on risks to human research participants, it should also consider risks to family members or other third parties who may be exposed to nanomedicine products administered to participants. Investigators should also address risks to identifiable third parties. Professional associations, legislative bodies, and government committees should deal with the long-term social, ethical, and environmental consequences of nanomedicine.

Acknowledgments

This article is the work product of an employee or group of employees of the National Institute of Environmental Health Sciences (NIEHS), National Institutes of Health (NIH). However, the statements, opinions, or conclusions contained therein do not necessarily represent the statements, opinions, or conclusions of NIEHS, NIH, or the United States government. Preparation of this article was supported by the National Institutes of Health (NIH), National Human Genome Research Institute (NHGRI), American Recovery and Reinvestment Act (ARRA) Challenge grant #1-RC1-HG005338-01 on "Nanodiagnostics and Nanotherapeutics: Building Research Ethics and Oversight" (S. M. Wolf, PI; J. McCullough, R. Hall, J. P. Kahn, Co-Is). The contents of this article are solely the responsibility of the author and do not necessarily represent the views of NIH or NHGRI. I am grateful to Bruce Androphy for helpful comments.

References

  • 1. 45 C.F.R. part 46, subpart A (2009).
  • 2. 45 C.F.R. part 46 (2009).
  • 3. 21 C.F.R. part 56 (2010).
  • 4. 45 C.F.R. § 46.111(a)(1) (2009).
  • 5. 45 C.F.R. § 46.111(a)(2) (2009).
  • 6. 45 C.F.R. § 46.116 (2009).
  • 7. 45 C.F.R. § 46.111(a)(7) (2009).
  • 8. Kimmelman J. Gene Transfer and the Ethics of First-in-Human Research: Lost in Translation. Cambridge: Cambridge University Press; 2009. p. 1.
  • 9. National Nanotechnology Initiative. Available at <http://www.nano.gov/> (last visited November 19, 2012).
  • 10. Oberdörster G, Oberdörster E, Oberdörster J. Nanotoxicity: An Emerging Discipline Evolving from Studies of Ultrafine Particles. Environmental Health Perspectives. 2005;113(7):823–839, at 823. doi: 10.1289/ehp.7339.
  • 11. Resnik DB, Tinkle SS. Ethics in Nanomedicine. Nanomedicine. 2007;2(3):345–350, at 345. doi: 10.2217/17435889.2.3.345.
  • 12. Resnik DB, Tinkle SS. Ethical Issues in Clinical Trials Involving Nanomedicine. Contemporary Clinical Trials. 2007;28(4):433–441, at 433. doi: 10.1016/j.cct.2006.11.001.
  • 13. See Oberdörster et al., supra note 10, at 823.
  • 14. See National Nanotechnology Initiative, supra note 9.
  • 15. Hoet P, Brüske-Hohlfield I, Salata O. Nanoparticles – Known and Unknown Health Risks. Journal of Nanobiotechnology. 2004;2(1):12–27, at 12. doi: 10.1186/1477-3155-2-12.
  • 16. Oberdörster G. Safety Assessment for Nanotechnology and Nanomedicine: Concepts of Nanotoxicology. Journal of Internal Medicine. 2010;267(1):89–105, at 89. doi: 10.1111/j.1365-2796.2009.02187.x.
  • 17. See Oberdörster et al., supra note 10, at 823.
  • 18. Allhoff F, Lin P, Moore D. What Is Nanotechnology and Why Does It Matter? New York: Wiley-Blackwell; 2010. p. 71.
  • 19. Resnik DB. Is the Precautionary Principle Unscientific? Studies in the History and Philosophy of Biology and the Biomedical Sciences. 2003;34(2):329–344, at 329.
  • 20. Elliott K. Nanomaterials and the Precautionary Principle. Environmental Health Perspectives. 2011;119(6):A240. doi: 10.1289/ehp.1103687.
  • 21. Munthe C. The Price of Precaution and the Ethics of Risk. Dordrecht: Springer; 2011. p. 1.
  • 22. See Oberdörster, supra note 16, at 89.
  • 23. Yokel RA, Macphail RC. Engineered Nanomaterials: Exposures, Hazards, and Risk Prevention. Journal of Occupational Medicine and Toxicology. 2011;21(6):7. doi: 10.1186/1745-6673-6-7.
  • 24. See Yokel and Macphail, supra note 23, at 7.
  • 25. Perry M, Hu H. Workplace Health and Safety. In: Frumkin H, editor. Environmental Health: From Global to Local. 2nd ed. New York: John Wiley and Sons; 2010. pp. 729–767, at 729.
  • 26. Occupational Safety and Health Administration. Nanotechnology: OSHA Standards. Available at <http://www.osha.gov/dsg/nanotechnology/nanotech_standards.html> (last visited November 19, 2012).
  • 27. National Institute for Occupational Safety and Health. Nanotechnology. Available at <http://www.cdc.gov/niosh/topics/nanotech/> (last visited November 19, 2012).
  • 28. Environmental Protection Agency. Control of Nanoscale Materials under the Toxic Substances Control Act. Available at <http://www.epa.gov/opptintr/nano/> (last visited November 19, 2012).
  • 29. Cranor C. Legally Poisoned: How the Law Puts Us at Risk from Toxicants. Cambridge, MA: Harvard University Press; 2011. p. 208.
  • 30. Davies JC. Managing the Effects of Nanotechnology. Washington, D.C.: Woodrow Wilson International Center; 2006. p. 1.
  • 31. National Institutes of Health, Office of Biotechnology Activities. Institutional Biosafety Committees. Available at <http://oba.od.nih.gov/rdna_ibc/ibc.html> (last visited November 19, 2012).
  • 32. Gunsalus CK, Bruner EM, Burbules NC, Dash L, Finkin M, Goldberg JP, Greenough WT, Miller GA, Pratt MG. Mission Creep in the IRB World. Science. 2006;312(5779):1441. doi: 10.1126/science.1121479.
  • 33. Resnik DB, Sharp R. Protecting Third Parties in Human Subjects Research. IRB. 2006;28(4):1–7, at 1.
  • 34. National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The Belmont Report. Washington, D.C.: Department of Health, Education, and Welfare; 1979. p. 1.
  • 35. Beauchamp T, Childress J. Principles of Biomedical Ethics. 6th ed. New York: Oxford University Press; 2008. p. 165.
  • 36. Weir T. Introduction to Tort Law. 2nd ed. Oxford: Oxford Higher Education; 2006. p. 29.
  • 37. Resnik DB. Liability for Institutional Review Boards: From Regulation to Litigation. Journal of Legal Medicine. 2004;25(2):131–184, at 131. doi: 10.1080/01947640490457451.
  • 38. See Weir, supra note 36, at 29.
  • 39. See Weir, supra note 36, at 29.
  • 40. See Allhoff, Lin, and Moore, supra note 18, at 71.
  • 41. Sandler R. Nanotechnology: The Social and Ethical Issues. Washington, D.C.: Woodrow Wilson International Center for Scholars, Project on Emerging Nanotechnologies; 2009. p. 1.
  • 42. See Allhoff, Lin, and Moore, supra note 18, at 71.
  • 43. Crichton M. Prey. New York: Harper Collins; 2002. p. 1.
  • 44. Fleischman A, Eckenwiler L, Grady C, Hammerschmidt D, Levine C, Sugarman J. Dealing with the Long-Term Social Implications of Research. American Journal of Bioethics. 2011;11(5):5–9, at 5. doi: 10.1080/15265161.2011.568576.
  • 45. See Fleischman et al., supra note 44, at 5.
  • 46. Id.
  • 47. World Medical Association. WMA Declaration of Helsinki – Ethical Principles for Medical Research Involving Human Subjects. Available at <http://www.wma.net/en/30publications/10policies/b3/> (last visited November 19, 2012).
