12 Conclusions and Recommendations

Toxicogenomic technologies provide new means to evaluate complex biologic systems and the impact of chemicals on living systems. Specifically, toxicogenomic technologies may be applied to improve cross-species extrapolation in the analysis of chemical hazard, identify susceptible subpopulations, assess effects of early life exposures to chemicals, analyze compounds’ modes of action, screen for potential toxic responses, refine exposure assessment, and analyze biologic effects of combined exposures or mixtures. Applying toxicogenomic technologies to these important problems in toxicology can improve understanding and minimize adverse effects of environmental exposures and drugs and contribute to a knowledge base of toxicity end points.

To date, applications of toxicogenomic technologies in risk assessment and the regulatory decision-making process have been exploratory. Although they clearly have great potential to affect decision making, toxicogenomic technologies are not ready to replace existing required testing regimes in risk assessment and regulatory toxicology. However, toxicogenomic technologies are assuming an increasing role as adjuncts to, and extensions of, existing technologies for predictive toxicology. Toxicogenomics can provide molecular level information and tests that add to the “weight of the evidence” for or against the safety of specific environmental toxicants and drugs. Ultimately, toxicogenomic technologies are envisioned to be more sensitive and more informative than existing technologies and may supplant some approaches currently in use, or at least be a component of batteries that will replace certain tests.

This chapter summarizes the committee’s conclusions and recommendations. The summary includes several areas of important overarching recommendations that are not discussed elsewhere in the report (recommendations for a new initiative, for general risk assessment, and to address educational needs) as well as recommendations discussed in detail at the end of individual chapters.

Recommendation 1: Regulatory agencies should enhance efforts to incorporate toxicogenomic data into risk assessment.

The following actions are needed to move toward this objective:

  1. Substantially enhance agency capability to integrate toxicogenomic approaches into risk assessment practice, focusing on exposure assessment, hazard screening, identification and understanding of variation in human susceptibility, mechanistic insight, assessment of dose-response relationships, cross-species extrapolation, and assessment of mixtures.
  2. Invest in research and expertise within the infrastructure of regulatory agencies as well as active collaboration across agencies.
  3. Develop and expand research programs dedicated to integrating toxicogenomics into challenging risk assessment problems, including the development of public and private sector partnerships.

A NEW HUMAN TOXICOGENOMICS INITIATIVE

Fully integrating toxicogenomic technologies into predictive toxicology will require a coordinated effort analogous in concept to the Human Genome Project. Such an effort would, of necessity, be multidisciplinary and multi-institutional and would require broad standardization of technologies and sharing of information. To achieve its goals, such an effort would require funding and resources significantly greater than those currently allocated to existing research programs, as well as partnerships between the public and private sectors.

A national, publicly available database is an essential element of this effort. This is because toxicogenomic technologies generate enormous amounts of data—on a scale even larger than sequencing efforts like the Human Genome Project. Current public databases are inadequate both to manage the types and volumes of data expected to be generated by large-scale applications of toxicogenomic technologies and to facilitate mining and interpretation of the data, which are just as important as their generation and storage. Finally, the ethical, legal, and social implications of collecting, sharing, and using toxicogenomic information are likely to be substantial.

The project as envisioned would require a large-scale, coordinated effort, involving partnerships of government agencies, academic institutions, and commercial organizations. One essential element is to leverage, where possible, large publicly funded studies so they can be used to generate toxicogenomic data. Another essential element is to take steps to facilitate the production and sharing of data between the public and private sectors. Because many of the most extensive applications of toxicogenomic technologies have occurred in the private sector, public-private partnerships will be a vital component of such a large coordinated effort.

This resource would strengthen the utility of toxicogenomic technologies in toxicity assessment and enable more accurate prediction of health risks associated with existing and newly developed compounds and formulations. For example, the Human Toxicogenomics Initiative (HTGI) data resource could provide context for toxicogenomic data generated by drug and chemical manufacturers. Regulatory agencies could use the data resource in risk assessment to inform exposure limits, and the data could improve understanding of the effects of a broad range of exposures, from pollution to natural disasters to terrorist attacks.

Recommendation 2: The National Institute of Environmental Health Sciences (NIEHS) should cooperate with other stakeholders in exploring the feasibility and objectives of implementing a Human Toxicogenomics Initiative dedicated to advancing toxicogenomics. The HTGI is intended to coordinate efforts by the government, academia, and industry to advance the evaluation and, as appropriate, implementation of toxicogenomic technologies.

Elements of the HTGI need to include the following:

  1. Creation and management of a large, public database to store and integrate the results of toxicogenomic analyses with conventional toxicity-testing data (see Chapter 10 for more specifics on database needs and recommendations; a minimal schema sketch follows this list).
  2. Assembly of toxicogenomic and conventional toxicologic data on a large number (hundreds) of compounds into the single database. This includes generating new toxicogenomic data from humans and animals for a number of compounds on which other types of data already exist and consolidating existing data. Every effort should be made to leverage existing research studies and infrastructure (such as those of the National Toxicology Program) to collect samples and data that can be used for toxicogenomic analyses.
  3. Creation of a centralized, national biorepository for human clinical and epidemiologic samples, building on existing efforts.
  4. Further development of bioinformatic tools (for example, software, analysis, and statistical tools).
  5. Explicit consideration of the ethical, legal, and social implications of collecting and using toxicogenomic data and samples.
  6. Specific coordinated subinitiatives to evaluate the application of toxicogenomic technologies to the assessment of risks associated with chemical exposures.
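To make item 1 more concrete, the sketch below shows one minimal way a relational database could link compounds, toxicogenomic assays, and conventional toxicity endpoints so that the two data types can be queried together. It is a hypothetical illustration in Python with SQLite; the table and column names are invented and do not represent a design endorsed by the committee.

```python
import sqlite3

# Hypothetical minimal schema linking toxicogenomic results to
# conventional toxicity data; all names are illustrative only.
conn = sqlite3.connect("htgi_sketch.db")
conn.executescript("""
CREATE TABLE compound (
    compound_id   INTEGER PRIMARY KEY,
    name          TEXT NOT NULL,
    cas_number    TEXT
);
CREATE TABLE toxicogenomic_assay (
    assay_id      INTEGER PRIMARY KEY,
    compound_id   INTEGER REFERENCES compound(compound_id),
    platform      TEXT,   -- e.g., transcriptomic, proteomic, metabolomic
    species       TEXT,
    tissue        TEXT,
    dose          REAL,
    time_hours    REAL
);
CREATE TABLE expression_result (
    assay_id         INTEGER REFERENCES toxicogenomic_assay(assay_id),
    gene_symbol      TEXT,
    log2_fold_change REAL,
    p_value          REAL
);
CREATE TABLE conventional_endpoint (
    compound_id   INTEGER REFERENCES compound(compound_id),
    endpoint      TEXT,   -- e.g., hepatotoxicity, LD50
    species       TEXT,
    value         REAL,
    units         TEXT
);
""")
conn.commit()
```

A layout of this kind lets a single query retrieve, for one compound, both its molecular signatures and the conventional endpoints against which they must be anchored.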

DATA ANALYSIS TOOLS

There is a need for bioinformatic, statistical, and computational approaches and software to analyze data. The HTGI effort described above includes developing specialized bioinformatic and computational tools, including appropriate statistical methodologies, to analyze toxicogenomic data.

Recommendation 3: Develop specialized bioinformatic, statistical, and computational tools and approaches to analyze toxicogenomic data.

  1. Develop algorithms that facilitate accurate identification of orthologous genes and proteins in species used in toxicologic research (a minimal sketch follows this list).
  2. Develop tools to integrate data across multiple analytical platforms (for example, gene sequences, transcriptomics, proteomics, and metabolomics).
  3. Develop computational models to enable the study of network responses and systems-level analyses of toxic responses.
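As an illustration of item 1, one widely used heuristic for identifying orthologs is the reciprocal-best-hit rule: two genes are called orthologous when each is the other's top-scoring cross-species match. The Python sketch below assumes precomputed similarity scores (for example, from BLAST) supplied as dictionaries; the gene names and scores are invented for illustration.

```python
def best_hits(scores):
    """Map each query gene to its top-scoring match in the other species.

    `scores` maps (query_gene, subject_gene) -> similarity score.
    """
    best = {}
    for (query, subject), score in scores.items():
        if query not in best or score > best[query][1]:
            best[query] = (subject, score)
    return {q: s for q, (s, _) in best.items()}

def reciprocal_best_hits(a_vs_b, b_vs_a):
    """Return (gene_a, gene_b) pairs that are each other's best match."""
    forward = best_hits(a_vs_b)
    reverse = best_hits(b_vs_a)
    return [(a, b) for a, b in forward.items() if reverse.get(b) == a]

# Toy example with made-up similarity scores:
a_vs_b = {("rat_Cyp1a1", "human_CYP1A1"): 950,
          ("rat_Cyp1a1", "human_CYP1A2"): 820}
b_vs_a = {("human_CYP1A1", "rat_Cyp1a1"): 940,
          ("human_CYP1A2", "rat_Cyp1a2"): 880}
print(reciprocal_best_hits(a_vs_b, b_vs_a))  # [('rat_Cyp1a1', 'human_CYP1A1')]
```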

EXPOSURE ASSESSMENT

The application of toxicogenomics to define biomarkers of exposure requires consensus on what constitutes an exposure biomarker and standardization of toxicogenomic platforms that are appropriate for identifying signatures of environmental or drug exposures in target and surrogate tissues and fluids. Additional challenges include the individual variation in response to an environmental exposure and the persistence of a toxicogenomic signature after exposure.

Recommendation 4: Toxicogenomic technologies should be adapted and applied for the study of exposure assessment by developing signatures of exposure to individual chemicals and perhaps to chemical mixtures.

The following immediate steps could be taken toward this goal:

  1. Using transcriptomic, proteomic, and metabolomic technologies to identify signatures of environmental exposures in target and surrogate tissues and fluids, primarily with animal models (a minimal sketch follows this list).
  2. Beginning to test complex mixtures for possible identification of distinct exposure signatures.
  3. Examining the time course of persistence of chemical versus toxicogenomic signatures after initial chemical exposures.
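To make item 1 concrete: in the simplest design, expression profiles from exposed and control animals train a classifier, and the genes the model retains constitute a candidate exposure signature. The sketch below uses scikit-learn on simulated data; the sample sizes, gene counts, and choice of an L1-penalized logistic model are illustrative assumptions, not an endorsed protocol.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated transcriptomic profiles: 40 animals x 500 genes.
# The first 10 genes respond to exposure; the rest are noise.
n_samples, n_genes, n_informative = 40, 500, 10
exposed = rng.integers(0, 2, n_samples)          # 1 = exposed, 0 = control
X = rng.normal(size=(n_samples, n_genes))
X[:, :n_informative] += 1.5 * exposed[:, None]   # exposure shifts these genes

# An L1 penalty drives most gene weights to zero, so the surviving
# genes form a candidate exposure signature.
model = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
print("CV accuracy:", cross_val_score(model, X, exposed, cv=5).mean())

model.fit(X, exposed)
signature_genes = np.flatnonzero(model.coef_[0])
print("signature gene indices:", signature_genes)
```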

The following intermediate steps could be taken toward this goal:

  1. Including toxicogenomic (transcriptomic, metabolomic, and proteomic) analysis of samples in large human population studies and in studies designed to assess exposures at toxicant levels commonly encountered in the workplace and in certain communities.1
  2. Including toxicogenomic analysis of samples in relevant case-control, cohort, and panel studies that involve repeated measurements over time as well as in clinical trials when possible and appropriate.
  3. Using the information collected from studies to help develop and populate a database that will support further development of toxicogenomic exposure assessment.

HAZARD SCREENING

Toxicogenomic technologies provide new and potentially useful specific end points for use in toxicity screening. In contrast to applications in evaluating new drug candidates, screening approaches for environmental chemicals will need to address a broader range of exposures and a more comprehensive set of end points. A database of signatures that are informative of the appropriate range of phenotypic end points and doses is essential. Much of the toxicogenomic data collected reside in publicly inaccessible databases. The process of deriving useful toxicogenomic signatures for screening would be accelerated if these data were publicly available.
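To illustrate what screening against such a database could look like in its simplest form, the sketch below scores a query compound's expression profile against stored reference signatures by Pearson correlation over shared genes. The signatures, gene names, and fold-change values are invented; real screening systems use more elaborate rank-based or model-based scores.

```python
import numpy as np

def score_against_signatures(profile, signatures):
    """Rank stored signatures by Pearson correlation with a query profile.

    `profile` maps gene -> log2 fold change for the query compound.
    `signatures` maps signature name -> {gene: reference fold change}.
    """
    results = []
    for name, sig in signatures.items():
        shared = [g for g in sig if g in profile]
        if len(shared) < 3:        # too few shared genes to score
            continue
        q = np.array([profile[g] for g in shared])
        r = np.array([sig[g] for g in shared])
        results.append((name, np.corrcoef(q, r)[0, 1]))
    return sorted(results, key=lambda x: x[1], reverse=True)

# Toy reference signatures and query profile (values invented):
signatures = {
    "peroxisome_proliferation": {"Acox1": 2.1, "Cyp4a1": 1.8, "Ehhadh": 1.5},
    "oxidative_stress": {"Hmox1": 2.5, "Nqo1": 1.9, "Gclc": 1.4},
}
query = {"Acox1": 1.9, "Cyp4a1": 1.6, "Ehhadh": 1.2,
         "Hmox1": 0.1, "Nqo1": -0.2, "Gclc": 0.3}
print(score_against_signatures(query, signatures))
```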

Recommendation 5: Upon validation and development of adequate databases, integrate toxicogenomic screening methods into relevant current and future chemical regulatory and safety programs.

The following steps could be taken toward this goal:

  1. Having regulatory agencies (including the Environmental Protection Agency and the Food and Drug Administration) continue to develop and refine guidance documents for their staff on interpreting toxicogenomic data. In particular, guidance for environmental chemicals must ensure that screening protocols address the types of end points most relevant for the general population, including sensitive subpopulations.
  2. Developing mechanisms to improve the quantity and quality of data available for deriving screening profiles and developing a database to organize this information:
    1. Establishing a dialogue with entities holding currently inaccessible toxicogenomic data to evaluate options for increasing data availability.
    2. Integrating toxicogenomic assays into ongoing initiatives such as the National Institutes of Health Molecular Libraries Initiative, the National Toxicology Program, and other large chemical screening programs.
    3. Establishing a dialogue among regulators, regulated industries, and other relevant stakeholders to address current disincentives to generating and submitting toxicogenomic data in regulatory settings.
    4. Convening an expert panel to provide recommendations for which model compounds, laboratory platforms, and specific data elements are necessary for building toxicogenomic databases that are useful for screening applications. Assessment of in vitro approaches for specific toxic end points should be emphasized. All processes, toxicogenomic data, and outcome data must be publicly accessible.
  3. Developing databases and algorithms for using proteomic and metabonomic data in screening.
  4. Ensuring that the regulatory framework provides incentives, or at least removes disincentives, for premarket testing of chemicals.

VARIABILITY IN SUSCEPTIBILITY

People vary in their susceptibility to the toxic effects of chemical exposures, and information about genetic variability is generally used retrospectively (that is, after safety problems are discovered). Toxicogenomic technologies provide the opportunity to use genetic information in a prospective fashion to identify susceptible subpopulations and to assess the distribution of differences in susceptibility in larger populations. Toxicogenomic technologies could reduce the uncertainty about assumptions used in regulatory processes to address population variability.

Recommendation 6: Use toxicogenomic information to prospectively identify, understand the mechanisms of, and characterize the extent of genetic and epigenetic influences on variations in human susceptibility to the toxic effects of chemicals, with the goal of improving the certainty about assumptions used in the regulatory processes to address population variability.

The following immediate step could be taken toward this goal:

  1. Using animal models to identify and study genes associated with human variation in toxicity.

The following intermediate steps could be taken toward this goal:

  1. Using genome-wide studies, ranging from anonymous dense single nucleotide polymorphism (SNP) scans to specialized arrays of putative functional SNPs, of individuals in existing cohort, clinical trial, and other population studies to identify gene variations that influence sensitivity to potentially toxic agents (a minimal per-SNP association test is sketched after this list). This will be facilitated by efficient quantitative and qualitative assessment of individual exposures to multiple compounds, which may be aided by toxicogenomics (see the exposure assessment section).
  2. Focusing more attention on investigating context-dependent genetic effects (that is, gene-gene interactions as well as interactions with other biologic contexts such as developmental age, sex, and life course factors) that reflect the state of the biologic networks underlying responses to toxicologic agents. For example, animal models can be used to better understand polygenic effects.
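At its simplest, the genome-wide approach in item 1 above reduces to a per-marker test of whether genotype frequencies differ between individuals who do and do not show a toxic response, repeated across many SNPs with multiple-testing correction. The sketch below applies a chi-square test to one SNP; the genotype counts are invented for illustration.

```python
from scipy.stats import chi2_contingency

def snp_association(genotype_counts):
    """Chi-square test of genotype counts vs. toxic-response status.

    `genotype_counts` is a 2x3 table:
        rows    = [responders, non-responders]
        columns = counts for genotypes [AA, Aa, aa].
    """
    chi2, p, dof, _ = chi2_contingency(genotype_counts)
    return chi2, p

# Made-up counts for one SNP in a hypothetical exposed cohort:
table = [[30, 45, 25],   # individuals showing the toxic response
         [60, 35, 5]]    # individuals without the response
chi2, p = snp_association(table)
print(f"chi2={chi2:.2f}, p={p:.3g}")
```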

The following long-term steps could be taken toward this goal:

  1. Conducting research on the influence of exposure, genetic differences, and their interaction on epigenetic modification of the human genome.
  2. Better characterizing the influence of epigenetic modifications on disease processes that are associated with exposure to toxicologic agents.
  3. Developing an animal model resource that mimics the genetic heterogeneity of human populations to study the distribution of gene-gene interactions and gene-epigenetic interactions and that can serve as a model for understanding population risk.

MECHANISTIC INFORMATION

Toxicogenomic studies are improving our knowledge of the underlying biology and of the regulatory networks that integrate the signaling cascades involved in toxicity. They may thus advance the introduction of mechanistic insight into risk assessment and fulfill the promise of more accurate and expedited elucidation of class-related biologic effects.

An immediate need in the field of toxicogenomics is for more accurate identification of orthologous genes or proteins across species. (There is also a need for cross-species extrapolation, as discussed below.) There are pressing needs to develop algorithms that combine and interpret multiple types of data (for example, gene expression, proteomic, and metabonomic data); better approaches to probe the complexity of toxic responses are also needed. Finally, additional comprehensive time- and dose-related investigations and the study of exposure paradigms that reproduce the human condition with fidelity are needed.

Recommendation 7: Steps should be taken and tools developed to continue advancing the ability of toxicogenomics to provide useful mechanistic insight.

The following immediate steps could be taken toward this goal:

  1. Develop richer knowledge bases and models that can integrate knowledge of the mechanisms of toxicity and the complex network information, encouraging the community to use these models to study toxicology as a global response. This requires a new paradigm of data management, integration, and computational modeling and will require the development of algorithms that combine and interpret data across multiple platforms.
  2. Encourage detailed mechanistic research that is useful for classifying toxic chemicals and to assess the public health relevance of these toxicity classifications.
  3. Facilitate the identification of orthologous genes and proteins in different laboratory species and the development of algorithms for comparing different species.

The following intermediate steps could be taken toward this goal:

  1. Advance proteomic and metabolomic analyses by promoting the integration of peptide and metabolite separation technologies into toxicologic investigations and advancing proteomic and metabolomic databases.
  2. Implement educational programs to help the toxicology and risk assessment communities incorporate data-rich mechanistic information into their professional practice.

The following long-term step could be taken toward this goal:

  1. When appropriate, encourage a shift in the focus of mechanistic investigations from single genes to more integrated analyses that embrace the complexity of biologic systems as a whole as well as the multidimensionality of dose- and time-related effects of toxic agents.

DOSE-RESPONSE RELATIONSHIPS

Toxicogenomics has the potential to improve understanding of dose-response relationships, particularly at low doses. Collecting information on dose-response relationships for a range of doses appropriately linked to time will be essential to fully integrate toxicogenomics into risk assessment decision making. To effectively address questions about risks associated with human exposures to environmental chemicals, which may be much lower than doses currently used in toxicology studies, attention must focus on characterizing toxicogenomic responses at low doses. Such efforts will be more valuable when toxicogenomic studies are tied to conventional toxicity responses, such as incorporating toxicogenomics into traditional toxicity-testing programs.

Recommendation 8: Future toxicologic assessment should incorporate dose-response and time-course analyses appropriate to risk assessment. An analysis of known toxic compounds that are well characterized would provide an intellectual framework for future studies.
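As one concrete form such a dose-response analysis could take, a sigmoidal (Hill) model can be fit to a toxicogenomic endpoint measured across doses, yielding parameters such as an EC50 from which a benchmark-style point of departure might be derived. The sketch below uses SciPy on simulated data; the model form, doses, and parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, bottom, top, ec50, n):
    """Four-parameter Hill (sigmoidal) dose-response model."""
    return bottom + (top - bottom) * dose**n / (ec50**n + dose**n)

# Simulated response (e.g., fold change of a marker gene) across doses:
doses = np.array([0.01, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
response = (hill(doses, 1.0, 4.0, 2.0, 1.5)
            + np.random.default_rng(1).normal(0, 0.1, doses.size))

params, _ = curve_fit(hill, doses, response, p0=[1.0, 4.0, 1.0, 1.0])
bottom, top, ec50, n = params
print(f"estimated EC50 = {ec50:.2f}, Hill slope = {n:.2f}")
```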

CROSS-SPECIES EXTRAPOLATION

Toxicogenomic technologies offer the potential to significantly enhance confidence in the animal-to-human toxicity extrapolations that constitute the foundation of risk evaluations. Using toxicogenomics to analyze species differences in toxicity will help explain the molecular basis for the differences, improving the translation of animal observations into credible estimates of potential human risk. In addition, by providing comparisons between humans and other species at the molecular level, toxicogenomics may assist in identifying the animal species and strains that are most relevant for specific assays. Identifying the most relevant species and strains is clearly advantageous and could decrease the number of tests required to assess toxicity.

Recommendation 9: Continue to use toxicogenomics to study differences in toxicant responses between animal models and humans and continue to use genotyped and genetically altered animal model strains as experimental tools to better extrapolate results from animal tests to human health. Algorithms must be developed to facilitate accurate identification of genes and proteins that serve the same function in different organisms and species—orthologous genes and proteins—used in toxicologic research.

DEVELOPMENTAL EXPOSURES

Although fetal and early-life exposures are recognized to be important in a number of disorders, relatively little is known about the health impacts of such exposures to many chemicals in current use. Because of their sensitivity, toxicogenomic technologies are expected to reveal more than previously was possible about the molecules involved in development and the critical molecular-level events that can be perturbed by toxicants. Toxicogenomics may also enable screening for chemicals that cause gene expression changes associated with adverse developmental effects.

Recommendation 10: Use toxicogenomics to investigate how exposure during early development conveys susceptibility to drug and chemical toxicities.

MIXTURES

Although much of toxicology focuses on the study of single chemicals, humans are frequently exposed to multiple chemicals, and it is difficult to decipher how exposure to many chemicals will influence the effects of each one. It is unlikely that toxicogenomic signatures will be able to decipher all interactions among complex mixtures, but it should be possible to use mechanism-of-action data to design informative toxicogenomic experiments, including screening chemicals for potential points of biologic convergence (overlap), such as shared activation and detoxification pathways, thereby enhancing the identification and exploration of potential interactions and moving beyond purely empirical experiments.
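One simple way to screen for such points of biologic convergence is to compare the sets of pathways each chemical perturbs and flag pairs with high overlap as candidates for interaction studies. The sketch below uses Jaccard similarity on invented pathway assignments; the similarity threshold is an arbitrary illustration.

```python
def jaccard(a, b):
    """Jaccard similarity between two sets of perturbed pathways."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical pathways perturbed by each chemical (invented for illustration):
perturbed = {
    "chemical_A": {"CYP1A induction", "oxidative stress", "glutathione depletion"},
    "chemical_B": {"oxidative stress", "glutathione depletion", "heat shock"},
    "chemical_C": {"peroxisome proliferation"},
}

# Flag chemical pairs whose perturbed pathways overlap substantially:
names = sorted(perturbed)
for i, x in enumerate(names):
    for y in names[i + 1:]:
        sim = jaccard(perturbed[x], perturbed[y])
        if sim >= 0.4:
            print(f"{x} and {y} share pathways (Jaccard = {sim:.2f})")
```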

Recommendation 11: Use toxicogenomic approaches to test the validity of methods for estimating potential risks associated with mixtures of environmental chemicals.

VALIDATION

Validation is a key step in moving toxicogenomic technologies from the laboratory to real-world applications. Although the community has learned that careful experiments using genomic approaches can provide results that are comparable among laboratories and that provide insight into the biology of the system under study, the need for standards to assess the quality of particular experiments remains, as do other levels of validation discussed in Chapter 9.

Recommendation 12: Actions should be taken to facilitate the technical and regulatory validation of toxicogenomics.

The following specific steps facilitate validation:

  1. Developing objective standards for assessing sample and data quality from different technology platforms, including the development of standardized materials such as those developed by the External RNA Control Consortium (a minimal quality-check sketch follows this list).
  2. Developing appropriate criteria for using toxicogenomic technologies for different applications, such as hazard screening and exposure assessment.
  3. Having regulatory agencies establish clear, transparent, and flexible criteria for the regulatory validation of toxicogenomic technologies.
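As an illustration of item 1, one routine check enabled by standardized spike-in materials is to regress measured signal against the known spiked concentrations and confirm linearity across the dynamic range. The concentrations, signal values, and acceptance threshold below are invented; actual thresholds would come from the standards effort itself.

```python
import numpy as np
from scipy.stats import linregress

# Known spike-in concentrations (attomoles) and measured log2 signal
# for a set of external RNA controls (values invented for illustration):
known_log2_conc = np.log2([0.5, 2.0, 8.0, 32.0, 128.0, 512.0])
measured_log2 = np.array([3.1, 5.0, 7.2, 9.1, 11.0, 12.9])

fit = linregress(known_log2_conc, measured_log2)
print(f"slope = {fit.slope:.2f} (ideal ~1), R^2 = {fit.rvalue**2:.3f}")

# A sample might be flagged if the fit departs from linearity, e.g.:
if fit.rvalue**2 < 0.95:
    print("flag: spike-in response is not sufficiently linear")
```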

ETHICAL, LEGAL, AND SOCIAL ISSUES

As toxicogenomic data with clinical and epidemiologic annotation are amassed, it is critical to ensure adequate protection of the privacy, confidentiality, and security of toxicogenomic information in health records. Safeguarding this information will further important individual and societal interests. It will also prevent individuals from being dissuaded from participating in research or undergoing the genetic testing that is the first step in individualized risk assessment and risk reduction. The potential consequences of disclosure of toxicogenomic information grow with the spread of electronic health records.

The decision to learn one’s toxicogenomic risk, including risks in the workplace, should rest with the individual. Employers have the responsibility to provide a safe and healthful workplace and to provide nondiscriminatory employment and reasonable accommodations for individuals with disabilities. Additionally, risk communication is an essential component of translating toxicogenomic information into reduced health risks for the public.

Toxicogenomics is also likely to play a role in occupational, environmental, and pharmaceutical regulation and litigation. Regulatory agencies and courts should give appropriate weight to validation, replication, consistency, sensitivity, and specificity when deciding whether to rely on toxicogenomic data.

Recommendation 13: Address the ethical, legal, and social issues that affect the use of toxicogenomic data and the collection of data and samples needed for toxicogenomic research.

The following are important areas to address and specific steps that could be taken (see Chapter 11):

  1. The lack of comprehensive legislation protecting the privacy, confidentiality, and security of health information, including genetic information—especially information relevant to vulnerable populations. This could be addressed by legislative improvements to enhance individual protection and to minimize unnecessary barriers to research while continuing to protect the privacy and welfare of human subjects.
  2. The Department of Health and Human Services should explore new approaches to facilitate large-scale biorepository and database research while protecting the welfare and privacy of human subjects.
  3. Regulatory agencies and courts should give appropriate weight to a number of important factors in deciding whether to rely on toxicogenomic data and should be careful not to apply more stringent evidentiary standards to toxicogenomic data than to other types of toxicologic evidence. Factors a regulatory agency or court should consider when deciding whether to rely on toxicogenomic data include validation, replication, consistency, sensitivity, and specificity, as described in Chapter 11.
  4. In toxicogenomic research, especially involving or affecting socially vulnerable populations, special efforts should be directed toward community engagement and consultation about the nature, methods, and consequences of the research. To minimize the risk of adverse impacts on socially vulnerable populations from toxicogenomic research and implementation, access to adequate health care for diagnostic and treatment purposes will be critical and should be a priority for funding agencies and legislators.
  5. The appropriate federal agencies should sponsor or facilitate research on ethical, legal, social, and communication issues in applying toxicogenomic technologies, including public attitudes about risk, social impacts of personalized genetic information about increased risk, the most effective methods of informed consent and data sharing, and what needs to be communicated.

EDUCATION AND TRAINING IN TOXICOGENOMICS

Given the complexity of toxicogenomics, the generation, analysis, and interpretation of toxicogenomic information represent a challenge even within the scientific community and require the collaborative, interdisciplinary efforts of teams of specialists. Therefore, it is essential that education and training in toxicogenomics become a continuous process that reflects the rapid developments in these new technologies. There is a need to develop education and training programs for health professionals, regulators, attorneys and judges, persons communicating to the public, and scientists in training.

Recommendation 14: Develop education and training programs relevant to toxicogenomic applications to predictive toxicology.

The following specific steps meet this recommendation:

  1. Conduct educational initiatives to raise awareness of the general public, vulnerable subgroups, and health professionals about toxicogenomic findings that can affect health.
  2. Establish a training program for regulators, attorneys, and judges to ensure a basic understanding of the generation and interpretation of toxicogenomic datasets, as applied in regulatory decision making.
  3. For media and communication experts, provide training that may include short courses on which types of toxicogenomic information will be helpful for the public and on how to explain such technical information clearly.
  4. For new scientists, it may be appropriate to develop programs at the master’s and Ph.D. levels that include bioinformatic and toxicogenomic applications in toxicology curricula.
  5. For scientists not specializing in toxicogenomics—such as epidemiologists, environmental scientists, and physicians—and for institutions, include didactic instruction in degree programs and curricula, with the goal of educating them on the principles and practice of toxicogenomics.
  6. Appropriate federal agencies should develop “points to consider” that identify and discuss ethical, legal, and social issues relevant to individual researchers, institutional review boards, research institutes, companies, and funding agencies participating in toxicogenomic research and applications.

Footnotes

1. See issues raised about the protection of humans in Chapter 11.

Copyright © 2007, National Academy of Sciences.