American Journal of Public Health
Editorial. 2001 Dec;91(12):1964–1967. doi: 10.2105/ajph.91.12.1964

A Bold New Direction for Environmental Health Research

Kenneth Olden 1, Janet Guthrie 1, Sheila Newton 1
PMCID: PMC1446914  PMID: 11726375

Abstract

The biotechnology revolution has opened new opportunities for addressing current inadequacies in decision making regarding environmental health. Strategic investments need to be made (1) to develop high-throughput technologies that could accelerate toxicity testing and generate a mechanistic understanding of toxicity, (2) to incorporate individual susceptibility into risk assessments, and (3) to establish a rational basis for testing and regulatory decision making. New initiatives of the National Institute of Environmental Health Sciences, including the Environmental Genome Project and the Toxicogenomics Center, are discussed.


Information is the basis of decision making in our private lives. For example, when it is time for many of us to buy a car or a house, we take great pains to study the market, examining factors such as reliability, safety, and resale value before committing ourselves to such a major investment. As a nation, however, we frequently make decisions about regulation of the levels of exposure to chemical and physical agents in the environment to protect human health—moves that cost the public and private sectors hundreds of billions of dollars—without adequate information. Are these policy decisions that affect the lives of hundreds of millions of Americans less important than the routine family matters just described? This critical lack of information is becoming more evident as we move into an era in which the biggest threats we face are from exposures to low doses of chemical and physical agents, not the high doses we have traditionally faced and tried to control.

Formal risk assessments of environmental and occupational health standards place an awesome burden on regulatory agencies, requiring a period as long as 10 to 15 years for assessment and implementation of some standards. Risk assessment is so difficult because all stages of the process (hazard identification, dose–response analysis, exposure assessment, and risk characterization) are fraught with uncertainty. Uncertainties lead to acrimonious debates among scientists, industry leaders, and public interest groups about the risks and management strategies proposed. These debates become so intense at times that the public must be confused about what is known and what is assumed. Fundamentally, the problem relates to the quality and completeness of the information and the need to extrapolate from animals to humans and from high-dose to low-dose exposure levels.

THE PRECAUTIONARY APPROACH

The foundations of many risk assessments rest on rodent studies at high doses that elicit certain toxic endpoints, such as tumors or organ damage. These studies are sometimes augmented by epidemiologic observations that associate environmental exposures with certain health endpoints. With these data, risk assessors must develop a predictive schema that defines the level of environmental exposure that would lead to disease in a portion of the population. Ideally, regulators would have detailed toxicity information on a chemical, would understand by what mechanism it operates in both rodents and humans, would know the actual exposure uptake, and would be able to factor in how subgroups (children, the elderly, the impoverished) differ in regard to their susceptibility to an environmental agent.

Typically, however, regulators must operate in a less-than-perfect world in which they have much less information on which to base their decisions. To compensate for this lack of information and to ensure that standards protect human health, regulators resort to default assumptions and the precautionary principle in making risk assessment decisions. The debates in risk assessment revolve around levels of comfort with the default assumptions and the potential for standards to be set at needlessly low levels that offer no added benefit in protecting health. Even in instances in which extensive information has been generated, there are uncertainties in transforming toxicity and exposure data into suitable standards. That fact notwithstanding, one would certainly be more comfortable with decisions based on detailed toxicity, mechanistic, and exposure data in which many of the uncertainties have been eliminated.

DOES SUCCESS BREED NEW CHALLENGES?

In part, the current dilemma in human risk assessment has resulted from the success of environmental remediation and pollution control and reduction efforts over the past 30 years. These efforts have dramatically reduced the human health threats posed by the thousands of new chemicals and technologies introduced into our environment during the 20th century. In fact, we have been so successful in improving the quality of our environment that there are those who argue that the environment no longer represents a serious threat to human health. Although polls show that 60% to 70% of Americans believe that environmental problems are still a concern, there is nonetheless a vocal minority that maintains the job is done. It is the contention of these groups that the low-dose exposures experienced by most Americans pose no significant health threat.

We have no idea what kinds of risks are posed by chronic low-dose exposures, however, because testing to this point has, out of necessity, focused on higher exposure levels. Also, some toxicants can accumulate in human tissue. Choices that are relatively easy when dealing with high-dose exposures become more difficult in the low-dose range of exposures. Poor decisions will levy huge burdens on society in the form of pain and suffering, health care costs, environmental degradation and loss of species diversity, and diminished competitiveness of American industry. Thus, it is in the national interest that we make investments in science to generate the information needed to make these important decisions.

Traditional environmentalism has concerned itself with a narrow set of issues related to the development of a complex system of laws and policies. As a consequence, the “big picture” issues have not received the attention they deserve. One example of such a neglected area involves the paucity of information on susceptibility, exposure, toxicity, and the interactive nature of chemical mixtures. Solutions to environmental health problems require a more strategic, holistic approach that targets the significant information gaps in risk assessment.1 The missing information is needed to develop the framework for accurately assessing human disease risk, and such information falls in 3 categories.

First, we must capitalize on recent advances in molecular biology to develop high-throughput technologies that can more quickly and reliably assess toxicity. Second, we must develop the knowledge base necessary to understand differences in susceptibility. Third, we must develop a more rational basis for testing and regulatory decision making based on knowledge of mechanisms of action, actual exposure, possible interactions between agents, and exposure–disease association studies.

TOXICOGENOMICS

Toxicologists are taking advantage of recent developments in human genomics to develop new carcinogenicity and toxicity test systems that are fast and efficient and involve the use of fewer animals than current approaches based on tissue pathology. The new toxicogenomics approach, based on gene-array technology, monitors precursor molecular events involved in the initiation of disease. Given that gene expression is continuously modulated by environmental cues, exposure to toxic agents can be expected to elicit unique patterns of gene expression. DNA microarray technology, which allows monitoring of the expression of thousands of genes simultaneously on small wafer-sized chips, may be useful as a highly sensitive tool to assess toxicity.2 The assumption is that toxicity is likely to evoke quantitative or qualitative changes in gene expression.

Identifying the genes transcribed under different exposure conditions in various cells, tissues, and organisms could have both evaluative and predictive potential. For example, this technology may allow toxicologists to expose cells or tissues to chemicals whose toxicity is unknown and match the results against the “signature,” or common set of changes in gene expression, produced by a known class of toxicants. Our expectation is that we will be able to use the toxicogenomic gene-array approach to survey the entire human genome and thus determine which genes are affected by specific chemicals. This approach will reduce the need for lengthy and expensive animal bioassays and could lend itself to testing for the effects of low-dose, chronic exposure and assessing the toxicity of mixtures. The approach should also be very useful for extrapolating from surrogate models to humans.
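The signature-matching idea described above can be sketched in code. The following is a minimal, hypothetical illustration (the gene panel, toxicant classes, and expression values are invented for this example, not drawn from NIEHS data): an unknown chemical's gene-expression profile is compared against reference signatures for known toxicant classes, and the best-correlated class is reported.

```python
# Hypothetical sketch of signature matching in toxicogenomics: compare an
# unknown chemical's gene-expression profile against reference "signatures"
# for known toxicant classes. All names and numbers are invented.
from math import sqrt

def pearson(a, b):
    """Pearson correlation between two equal-length expression profiles."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sqrt(sum((x - ma) ** 2 for x in a))
    sb = sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def classify(unknown, signatures):
    """Return the toxicant class whose signature best matches the profile."""
    return max(signatures, key=lambda cls: pearson(unknown, signatures[cls]))

# Log2 fold-change profiles over the same (hypothetical) five-gene panel.
signatures = {
    "peroxisome_proliferator": [2.1, 1.8, -0.3, 0.1, -1.2],
    "genotoxin":               [-0.4, 0.2, 2.5, 1.9, 0.8],
}
unknown_profile = [1.9, 1.5, -0.1, 0.3, -0.9]
print(classify(unknown_profile, signatures))  # → peroxisome_proliferator
```

In practice a real microarray panel covers thousands of genes and classification uses statistically validated methods rather than a single correlation, but the underlying logic—match the new profile to the closest known signature—is the same.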

To promote the development and use of toxicogenomic approaches, the National Institute of Environmental Health Sciences (NIEHS) has developed a national Toxicogenomics Center consisting of the NIEHS Microarray Center and 5 university-based regional centers. The NIEHS center will coordinate the national effort and serve as the national repository for gene-expression data.3 However, years of experience with the technology will be necessary to develop the confidence and appropriate databases to validate these approaches. Also, the signature patterns generated must be evaluated in population-based studies in terms of disease association. Without new, high-throughput technologies, however, we will not be able to assess the toxicity of the thousands of chemicals on which there are inadequate toxicity data.

GENETIC BASIS FOR DIFFERENCES IN SUSCEPTIBILITY

Genetic susceptibility, environmental exposure, age, sex, nutritional status, and behavior all determine an individual's unique risk for developing disease. However, we limit the brief discussion presented here to the contribution of genetics and environmental exposures. Because of the dramatic discoveries in human genetics over the past decade, many have come to believe that the problem of disease etiology will be solved with the decoding of the human genome. But, contrary to this view, a recent study showed what scientists have long recognized: that the environment—the chemical, physical, and biological agents to which we are exposed, along with our lifestyles—plays an important role in the development of most chronic diseases such as cancer.4 The current view is that most chronic diseases arise from complex interactions of multiple genes and environmental exposures. Therefore, the prevention of most human diseases will require a more thorough understanding of both the genetic and the environmental contributions to their etiology.

Recent developments in human genetics now permit more definitive studies of gene–environment interactions in the development of disease. The recent publication of the “reference sequence” of the human genome provides an important resource to assess the role of genetic polymorphism in susceptibility to environmental exposure. Evidence that genetics plays a significant role in the development of disease has come from studies of familial clusters identifying genes with 1 or several alleles that are associated with an increased risk for a specific disease. Inheritance of such alleles in the population is rare and probably accounts for fewer than 5% of known diseases. Thus, the contribution of monogenic disease genes to the overall incidence of disease is relatively small, although the risk for an individual with a specific disease allele is relatively high.

Most common human diseases appear to be polygenic, resulting from complex interactions of multiple genes. A variant of 1 gene may not be detrimental, but it might become detrimental in combination with specific alleles of 1 or more other genes. Such so-called susceptibility genes increase disease risk only a fewfold, yet they can have a major impact on the incidence of disease in the human population because of their frequency. Susceptibility genes are not sufficient to cause disease; they modify risk.
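The population-level arithmetic behind this point can be made concrete with the standard population attributable fraction, PAF = p(RR − 1) / (1 + p(RR − 1)), where p is the carrier prevalence and RR the relative risk. The numbers below are hypothetical, chosen only to show how a common, weak susceptibility allele can account for more disease in the population than a rare, high-penetrance one.

```python
# Illustrative (hypothetical) numbers: population attributable fraction
# for a rare high-penetrance allele vs a common weak susceptibility allele.

def paf(prevalence, relative_risk):
    """Population attributable fraction: p(RR-1) / (1 + p(RR-1))."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Rare monogenic allele: carried by 0.1% of the population, 50-fold risk.
rare = paf(0.001, 50.0)
# Common susceptibility allele: carried by 20% of the population, 2-fold risk.
common = paf(0.20, 2.0)

print(f"rare high-risk allele: PAF = {rare:.1%}")    # → about 4.7%
print(f"common weak allele:    PAF = {common:.1%}")  # → about 16.7%
```

Under these assumptions the weak but common variant accounts for roughly three times as many cases as the rare, highly penetrant one, even though the individual carrier of the rare allele faces far greater personal risk.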

The Environmental Genome Project was initiated in 1997 to stimulate research into the role of genetic variation in the human body's response to environmental exposures.5–8 The goal is to catalog information about human genetic polymorphism and to apply this information to understanding disease susceptibility and individual responses to environmental exposure. Among the genetic polymorphisms of interest would be those coding for the following: cytochrome P450 metabolizing enzymes, which influence risk of smoking-induced lung cancer; N-acetyltransferase-2, which influences risk of smoking-induced bladder and breast cancers; paraoxonase, which influences pesticide-induced nerve damage; and glutathione S-transferase M1, which influences toxicities and cancer risks.

The Environmental Genome Project is being carried out in 3 phases.8 The first phase will identify polymorphisms in a set of genes that are likely to play important roles in environmentally associated diseases. The second phase will involve functional analysis of the various polymorphisms occurring in coding and regulatory regions of genes. This phase will require laboratory-based as well as population-based studies to establish that a specific polymorphism is associated with a specific disease. The third phase of the project will involve the development of animal models for use in studies of how environmental agents interact with specific polymorphisms to cause human illnesses. Throughout these phases, care is being taken to predict and manage the ethical risks implicit in any project that identifies individual risk of disease, particularly environmentally associated diseases. A full-time ethicist has been hired by NIEHS to oversee this aspect of the project and to stay current in this new and rapidly evolving field.

The mechanisms by which information on susceptibility can be used to reduce risk from exposure to environmental toxicants have not yet been determined. However, several possible approaches can be envisioned, including (1) screening using genetic variation as a biomarker, (2) eliminating or reducing exposure, (3) gene therapy, and (4) pharmacologic intervention.

RATIONAL BASIS FOR TESTING AND REGULATION

Again, information gaps limit rational decision making. For example, we typically have very little information about mechanism, actual exposure dose, and how environmental toxicants interact in a mixture. Therefore, investments in these areas are critically important.

Quantitative risk assessment relies on knowledge of mechanisms to predict dose–response relationships. Studies at both high- and low-dose exposures are needed to identify thresholds when they exist. Selection of the appropriate experimental models to assess toxicity and to understand differences in susceptibility due to genetics, age, sex, behavior, and nutritional status is also improved if mechanisms of action are known. Most important, however, knowledge of mechanisms is critical for the design of primary and secondary prevention strategies characteristic of the practice of public health. NIEHS-supported research has also served as the source of information for many of the regulatory standards put forward by the US environmental health regulatory agencies to protect human health.
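The stakes of the threshold question can be illustrated numerically. In the hypothetical sketch below (all doses and slopes are invented), a linear-no-threshold model and a threshold model agree closely at a high, bioassay-range dose yet make qualitatively different predictions at an environmental exposure level, which is why mechanistic knowledge about whether a threshold exists matters so much for low-dose extrapolation.

```python
# Hypothetical illustration: two dose-response models that nearly agree at
# the tested (high) dose diverge completely at environmental (low) doses.

def linear_no_threshold(dose, slope=0.02):
    """Excess risk proportional to dose; no safe level."""
    return slope * dose

def threshold_model(dose, threshold=5.0, slope=0.02):
    """No excess risk below the threshold dose."""
    return max(0.0, slope * (dose - threshold))

high_dose = 100.0  # mg/kg/day, rodent-bioassay range (hypothetical)
low_dose = 0.1     # mg/kg/day, environmental exposure (hypothetical)

for model in (linear_no_threshold, threshold_model):
    print(f"{model.__name__}: high={model(high_dose):.3f}, "
          f"low={model(low_dose):.4f}")
```

Both models fit the high-dose observation almost equally well (2.00 vs 1.90 excess risk), yet at the low dose one predicts a small but nonzero risk and the other predicts none, so the data alone cannot choose between them.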

Of the information gaps that matter to human risk assessment, the lack of exposure data is probably the most serious. Estimation of exposure using indirect surrogates (e.g., toxic release and production inventories and environmental monitoring) is inadequate and limits our understanding of dose–response relationships. This area of environmental health is in need of development and application of innovative technologies for assessing exposure based on considerations of individual uptake, metabolism, and excretion as well as behavioral differences. We need tools designed to directly measure the amount of tissue deposition of environmental pollutants.

However, risks of exposure to environmental toxicants may be very different from current estimations and assumptions based on animal studies involving exposure to 1 agent at a time. In reality, humans are exposed to multiple agents simultaneously. Now that we have the capacity to develop technologies (e.g., DNA microarray) to assess the toxicity of mixtures, NIEHS has made this a top priority.

CONCLUSIONS

The new era of toxicogenomics, made possible by advances in human genomics, promises to revolutionize the practice of public health as it relates to environmental health protection. Understanding human genetic variation and genomic reactions to specific environmental exposures will have a significant impact on our ability to uncover the causes of variations in response to environmental exposures. The Environmental Genome Project will provide the foundation for the emerging fields of toxicogenomics and pharmacogenomics. These new disciplines hold the promise of reducing the costs and time lines associated with animal and human studies designed to assess the toxicity and efficacy of both environmental pollutants and therapeutic agents.

As with any nascent science, initial costs must be met before the promise can be fulfilled. NIEHS will commit more than $22 million this year alone to combined genomics efforts. These funds, however, are truly strategic investments that will lead to a revolution in our approach to the study of toxicity. It will be through the genomics support of the NIEHS and others that the current ritualistic approach to toxicology and risk assessment can finally give way to a more rigorous, scientifically based approach involving cutting-edge technologies of genetics and molecular biology.

K. Olden wrote the original outline, introduction, and concept sections; S. Newton wrote the areas of emphasis sections; and J. Guthrie developed graphics for the concepts and was responsible for the reference section. J. Guthrie rewrote and condensed the manuscript into a shorter form.

Peer Reviewed

References

1. Olden K, Guthrie J. New frontiers in environmental health research. In: Rom WN, ed. Environmental and Occupational Medicine. 3rd ed. Philadelphia, Pa: Lippincott-Raven; 1998:1807–1813.
2. Brown PO, Hartwell L. Genomics and human disease—variations on variation. Nat Genet. 1998;18:91–93.
3. Lovett RA. Toxicologists brace for genomics revolution. Science. 2000;289:536–537.
4. Lichtenstein P, Holm NV, Verkasalo PK, et al. Environmental and heritable factors in the causation of cancer: analyses of cohorts of twins from Sweden, Denmark, and Finland. N Engl J Med. 2000;343:78–85.
5. Kaiser J. Environment institute lays plans for gene hunt. Science. 1997;278:569–570.
6. Guengerich FP. The Environmental Genome Project: functional analysis of polymorphisms. Environ Health Perspect. 1998;106:365–368.
7. Shalat SL, Hong JY, Gallo M. The Environmental Genome Project. Epidemiology. 1998;9:211–212.
8. Olden KO, Wilson S. Environmental health and genomics: visions and implications. Nat Rev Genet. 2000;1:149–153.

Articles from American Journal of Public Health are provided here courtesy of American Public Health Association
