Published in final edited form as: Toxicol Pathol. 2014 May 22;42(5):940–942. doi: 10.1177/0192623314537135

FutureTox II: Contemporary Concepts in Toxicology “Pathways to Prediction: In Vitro and In Silico Models for Predictive Toxicology”

Susan A Elmore 1, Anne M Ryan 2, Charles E Wood 3, Torrie A Crabbs 4, Robert C Sills 1

The Society of Toxicology (SOT) held a very successful FutureTox II Contemporary Concepts in Toxicology (CCT) Conference in Chapel Hill, North Carolina, on January 16th and 17th, 2014. The 291 attendees represented industry, government, and academia; the sessions were also telecast to 9 locations, including Health Canada, the US FDA/National Center for Toxicological Research, the US EPA, and the California EPA Office of Environmental Health Hazard Assessment. The conference also included more than 50 posters as well as several vendor exhibits.

The theme of the meeting was “Pathways to Prediction: In Vitro and In Silico Models for Predictive Toxicology.” The conference was a product of the Scientific Liaison Coalition (SLC), a partnership of 16 societies, including the Society of Toxicologic Pathology, whose aim is to increase the awareness and impact of toxicology on human health and disease prevention. The focus of FutureTox II was the integration of current and developing in vitro methodologies and computational modeling approaches with advances in systems biology to facilitate human risk assessment. A goal common to each session was to articulate the current strengths and limitations of these newer approaches and their utility in prioritizing chemicals for safety testing.

The meeting co-chairs Thomas B. Knudsen (US EPA, RTP, NC, USA) and Douglas A. Keller (Sanofi US, Bridgewater, NJ, USA), along with the organizing committee, divided the two-day conference into 3 session themes: (I) current and future biological systems, (II) science of predictive models, and (III) regulatory integration and communication. Over the course of the conference, attendees heard 20 presentations across these 3 themes. The last session consisted of 4 interactive breakout sessions (regulatory toxicology, hepatotoxicity, developmental/reproductive toxicity, and cancer), each given the task of identifying the next steps in the refinement and application of these technologies to hazard identification and risk assessment.

Platform and poster presentations covered a diverse range of current research. Prominent topics included:

  • Application of high-throughput screening (HTS) data from large-scale in vitro platforms (e.g., ToxCast/Tox21) and in silico models for risk assessment.

  • Application of pluripotent stem cells to in vitro screening paradigms.

  • Developments in three-dimensional cell/tissue models as screening tools.

  • The use of zebrafish as high(er)-throughput phenotypic screens for chemical toxicity.

  • The development of adverse outcome pathway (AOP) maps and a molecular initiating event atlas for specific toxicities.

  • The use of in vitro data to differentiate adverse from non-adverse and adaptive effects.

  • Development of next-generation quantitative structure-activity relationship (QSAR) models.

The conference organizers plan to publish the conference proceedings as a special supplement to the journal Reproductive Toxicology (http://www.journals.elsevier.com/reproductive-toxicology/). The meeting overview and agenda are available at http://www.toxicology.org/ai/meet/cct_futureToxII.asp.

The general premise of this meeting was based on a 2007 report by the U.S. National Research Council titled “Toxicity Testing in the 21st Century: A Vision and a Strategy” (NRC 2007). The concept was put into practice by the US EPA in collaboration with the National Toxicology Program/National Institute of Environmental Health Sciences and the US National Institutes of Health. The proposed paradigm, now often referred to simply as “Tox21,” called for a shift in safety assessment away from traditional animal-based endpoints and toward in vitro and other HTS assays, alternative models in lower organisms, and computational systems. The objectives of this effort are to transform toxicology from a largely observational science into a more predictive one and, ultimately, to better align future toxicity testing and assessment programs with regulatory needs (Collins et al. 2008).

In a parallel effort, the European Union (EU) has launched several programs to promote more efficient safety assessment of chemicals and to reduce or eliminate unnecessary animal testing. At FutureTox II, keynote speaker Maurice Whelan, from the Institute for Health and Consumer Protection of the European Commission, summarized recently enacted EU legislative directives that have placed more stringent restrictions on the use of animals for scientific purposes. For example, since March 2013 the EU Cosmetics Regulation has banned the marketing in Europe of new cosmetic products containing any ingredient tested on animals. Other European initiatives to replace animal use in repeat-dose toxicity testing were also noted (see www.seurat-1.eu). Dr. Whelan also noted that scientific communities around the world have increasingly focused on the 3 Rs: replacement, refinement, and reduction of animal use in research. Conference speakers frequently acknowledged the scientific and legislative impetus behind these programs, as well as the current challenges in translating them to human risk assessment and regulatory acceptance.

An important rationale for the Tox21 effort is the lack of toxicity data for thousands of chemicals currently in production that cannot feasibly be evaluated using traditional approaches. This gap, along with the recognition that recent advances in molecular and computational biology have not been adequately incorporated into risk assessment, has led regulatory and other health organizations to focus intensely on advancing predictive toxicology. This rapidly evolving field draws on a mixture of scientific technologies, including various -omics data streams, advanced cell and tissue culture models, integrative bioinformatics, and in silico simulations, to forecast the interactions between chemicals and biological systems. The goal is to generate predictive models (e.g., via data mining, QSAR modeling, chemoinformatics, and bioactivity profiling) that can be used to prioritize compounds for further testing and, in the future, to predict toxicological events in response to specific chemicals. The AOP framework was also discussed as a way to organize these newer types of data and to model pathway-based relationships. The AOP construct is conceptually similar to mode of action but is intended to be used in a more prospective and quantitative manner, working forward from a molecular initiating event.
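
As a purely illustrative aside (not a method presented at the meeting), the short Python sketch below conveys the general flavor of such a prioritization model: a toy QSAR-style classifier is fit to synthetic molecular descriptors and then used to rank held-out chemicals by predicted probability of bioactivity. All descriptor columns, data, and modeling choices here are hypothetical assumptions made for illustration only.

# Illustrative sketch only: a toy QSAR-style classifier built on synthetic
# molecular descriptors. The data and model choice are assumptions for
# illustration and do not represent any specific Tox21/ToxCast model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical descriptor matrix: each row is a chemical, each column a
# physicochemical descriptor (e.g., logP, molecular weight, polar surface area).
n_chemicals, n_descriptors = 500, 3
X = rng.normal(size=(n_chemicals, n_descriptors))

# Synthetic "bioactivity" labels: 1 = active in an HTS assay, 0 = inactive.
# Activity here is driven mainly by the first descriptor, plus noise.
y = (X[:, 0] + 0.5 * rng.normal(size=n_chemicals) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A simple linear model standing in for a QSAR/bioactivity-profiling model.
model = LogisticRegression().fit(X_train, y_train)

# Rank held-out chemicals by predicted probability of activity so that the
# highest-scoring ones can be prioritized for targeted follow-up testing.
scores = model.predict_proba(X_test)[:, 1]
priority_order = np.argsort(scores)[::-1]
print("AUC on held-out chemicals:", round(roc_auc_score(y_test, scores), 3))
print("Top 5 chemicals to prioritize (row indices):", priority_order[:5])

In practice, such models would be built from curated chemical structures and HTS assay readouts rather than synthetic data, and their outputs would inform tiered testing decisions rather than serve as final risk conclusions.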

Despite remarkable progress in recent years, speakers highlighted many remaining challenges in using in vitro models to predict in vivo results. For example, while current methods allow generation of large amounts of data, tools for mining and interpreting these data to produce an end product that is reproducible and meaningful to other scientists and regulatory agencies are not always available. The assays selected for screening must also be “fit for purpose”; that is, the specificity and validation criteria for assays used to triage chemicals for further testing are very different from those of assays currently used for risk assessment. Additional challenges described for in vitro systems include the lack of metabolic capability in many cell lines, the application of consistent thresholds for positive calls, dose extrapolation to in vivo systems, relating acute in vitro exposures to chronic multi-step toxicities, and discriminating specific target effects from non-specific cytotoxicity in screening assays. While in vitro and in silico models offer value in prioritizing agents for tiered or targeted testing, they currently do not provide adequate predictive value on their own for hazard identification and risk assessment.

While the participation of veterinary pathologists in predictive toxicology has been somewhat limited to date, there are a number of important opportunities in this arena moving forward. Pathologists occupy a unique place in translating results from cell-based models to human health. The basic scientific goals of the Tox21 effort are to (1) identify mechanisms of chemical-induced biological activity, (2) prioritize chemicals based on biological activity, and (3) develop more predictive models of in vivo biological responses (Bucher 2013). These goals are clearly within the realm of expertise of many veterinary pathologists. Especially where effects in an in vitro assay are intended to replace those observed in a guideline animal assay, it is imperative that the potential physiologic and pathologic responses to a toxicant be assessed and captured. As scientists with expertise in comparative biology, veterinary pathologists may also contribute to the development of alternative in vivo models such as small fish systems and to the histopathologic evaluation of 3D tissue models (Driessen et al. 2013; Gill and West 2013). In these ways, pathologists can contribute to the critical challenges of scientifically validating emerging technologies aimed at reducing in vivo testing and of facilitating the regulatory acceptance of these assays where appropriate.

Several toxicologic pathologists participated in each of the 4 breakout groups at the final session of the FutureTox II meeting. The organizers expressed a willingness to have greater engagement by toxicologic pathologists at future scientific meetings on this topic, including FutureTox III, which is being proposed to the SOT as a CCT Conference in 2016. Preliminary plans for FutureTox III are being developed by the SLC, currently chaired by Kevin McDorman; the proposed focus of that meeting will be the regulatory acceptance of these in vitro and in silico models. Other opportunities include an upcoming STP regional meeting on this topic, to be held April 23rd at MedImmune in Gaithersburg, Maryland, which will focus on the Tox21 initiative and its potential impact on the practice of toxicologic pathology. The agenda is posted at http://www.toxpath.org/meetings.asp.

Acknowledgments

This work was supported in part by the National Institutes of Health (NIH), National Institute of Environmental Health Sciences (NIEHS).

Footnotes

Disclaimers: This article has been reviewed by the U.S. EPA and approved for publication. Approval does not signify that the contents necessarily reflect the views and the policies of the Agency. Mention of trade names or commercial products does not constitute endorsement or recommendation for use.

References

  1. National Research Council. Toxicity Testing in the 21st Century: A Vision and a Strategy. Washington, DC: National Academies Press; 2007.
  2. Collins FS, Gray GM, Bucher JR. Toxicology. Transforming environmental health protection. Science. 2008;319(5865):906–7. doi: 10.1126/science.1154619.
  3. Bucher JR. Regulatory forum opinion piece: Tox21 and toxicologic pathology. Toxicol Pathol. 2013;41:125–7. doi: 10.1177/0192623312450632.
  4. Driessen M, Kienhuis AS, Pennings JL, Pronk TE, van de Brandhof EJ, Roodbergen M, Spaink HP, van de Water B, van der Ven LT. Exploring the zebrafish embryo as an alternative model for the evaluation of liver toxicity by histopathology and expression profiling. Arch Toxicol. 2013;87:807–23. doi: 10.1007/s00204-013-1039-z.
  5. Gill BJ, West JL. Modeling the tumor extracellular matrix: Tissue engineering tools repurposed towards new frontiers in cancer biology. J Biomech. 2013. doi: 10.1016/j.jbiomech.2013.09.029.
