Issue Date: July 13, 2015
Improving Toxicity Testing For Better Decision-Making
Advances in chemistry create amazing molecules for commercial, medical, and other beneficial purposes. But each of these products can also raise important, vexing questions such as, “Will it cause cancer or birth defects or pollute streams or harm wildlife?” and “Will it cause other terrible effects years down the road?”
The toxicology science needed to answer those questions is still largely rooted in 20th-century toxicity testing approaches.
Evidence is mounting that traditional, largely animal-based tests do not always reliably predict results in humans. However, the approaches are time-tested and have generally provided an effective basis for risk management decisions. More than 50 years ago, we as a society developed a social contract to tacitly accept the results from these traditional 20th-century toxicity tests to inform policy decisions, even given their flaws. Despite significant test improvements, decisions made decades ago using the earliest of these tests are still considered valid. How do we capitalize on the advances and move to a faster, more efficient testing regime using new in vitro, in silico, and other smarter studies to provide better data, and, importantly, convince the public that those data are more informative?
The solution requires changes in both science and our social contract. The scientific changes may lie in technological advances that read more like science fiction than a science textbook. For instance, federal agencies are partnering to develop a “human on a chip” as a realistic way to predict toxicology to inform safety decisions.
Two years ago, I attended a lecture about “organs on a chip” by Donald E. Ingber of the Wyss Institute and was amazed by his “lung on a chip” and “gut on a chip” videos showing work funded under this effort. Wow! I was observing properties that cells exhibit in moving, breathing, three-dimensional living bodies, not the properties cells exhibit when grown in culture. These organ chips successfully mimicked systems of the body, and some of this technology is already being commercialized for drug toxicity testing.
Now is the time to capitalize on such advances to update the regulatory framework and move the evaluation of chemicals into the 21st century. The National Research Council’s 2007 report “Toxicity Testing in the 21st Century: A Vision and a Strategy” (Tox 21) provided a road map to do this. Building on the NRC report, the federal government has formed the Tox 21 Consortium (http://tox21.org) to construct a flexible new testing paradigm providing credible toxicity information to efficiently and effectively inform decisions and to improve as knowledge is gained.
To date, the Tox 21 Consortium has conducted proof-of-principle efforts showing that the NRC Tox 21 vision works to screen inventories of untested chemicals as a means to focus follow-up testing on specific chemicals and specific end points. It has engaged with other national and international organizations to go even further and develop the means to use Tox 21 information to inform risk management decisions as well as to provide tools for chemical design to develop “green” chemicals. These are important first steps in the move from an inefficient paradigm requiring extensive and sometimes irrelevant testing to an efficient and informative paradigm providing the means to use a risk-based, hypothesis-driven approach to identify the specific testing needed to inform decisions.
Full implementation of the Tox 21 vision can’t be accomplished through science alone. Major shifts in regulatory, legal, and social thinking are needed before we can reach a new social contract in which the results from Tox 21 approaches, though not perfect, are accepted as equivalent to or better than traditional toxicity test results. Congress, regulatory agencies, industry, nongovernmental organizations (NGOs), and the public at large need to address how much and what kind of proof is needed before we can comfortably use the results of these tests for major health and economic decisions.
It’s time to be bold but not reckless. Let’s move from the rodent-based 20th-century toxicity paradigm to a human-based Tox 21 approach. The Tox 21 Consortium and others are doing this, and Toxic Substances Control Act (TSCA) reform efforts now under way in Congress provide the first opportunity to improve chemical testing for regulatory decisions and ultimately move toward a new social contract (C&EN, June 29, page 6).
Tox 21 offers promise to facilitate the design of chemicals that are less likely to have human and environmental health effects as well as to evaluate existing chemicals for potential adverse human health effects more accurately, efficiently, and economically than animal testing. As chemists, let’s partner with toxicologists, systems biologists, and others to develop and apply Tox 21 effectively. Let’s help Congress, and the public, understand and accept the practice and promise of modern chemical testing to improve TSCA now and to improve life for the future.
Views expressed on this page are those of the author and not necessarily those of ACS.
- Chemical & Engineering News
- ISSN 0009-2347
- Copyright © American Chemical Society