Volume 89 Issue 10 | p. 40
Issue Date: March 7, 2011

Predicting Toxicity

Insights: Chemists should be aware of the potential risks posed by the products they make
Department: Government & Policy
News Channels: Environmental SCENE
Keywords: risk assessment
EPA is hoping to reduce the number of animals used in toxicity testing.
Credit: Shutterstock

The Society of Toxicology is holding its 50th annual meeting this week in Washington, D.C. The meeting is expected to attract more than 7,000 scientists from around the world, but probably few chemists will be in attendance.

Likewise at the American Chemical Society’s national spring meeting later this month in Anaheim, Calif., few toxicologists are expected. Although the two groups could learn a lot from one another, they rarely speak.

The need to enhance communication between chemists and toxicologists came up during a workshop last month on advancing the next generation of risk assessment. The workshop, hosted by the Environmental Protection Agency, sought input from stakeholders on developing risk-assessment methods and models that incorporate advances in molecular biology and systems biology.

When Robert Peoples, director of the ACS Green Chemistry Institute, asked how many chemists besides himself were in the room, only one hand went up. “This worries me,” Peoples said. He encouraged all workshop participants to skip the next toxicology conference and instead attend a chemistry conference to inform chemists about the latest tools for toxicity testing.

“Until we raise awareness, chemists are not going to know that these tools are available to help guide them,” Peoples said. “We’ve got to raise the awareness, so they think about toxicity during the initial design phase of their materials and molecules.”

Along the same lines, Paul T. Anastas, assistant administrator of EPA’s Office of Research & Development, pointed out that Ph.D. chemistry students are not required to take a class in toxicology. Many of those students are synthesizing chemicals without giving any thought to the potential risks the materials may pose to public health and the environment. “To not understand the consequences of what you are introducing into the world is problematic,” Anastas emphasized.

Anastas and others at the workshop advocated a green chemistry approach—one that considers the entire life cycle of a product. The challenge, Anastas said, is to empower the next generation of chemists and engineers with tools to predict toxicity so that their products don’t saddle us with the same legacy problems we face today.

Even if chemists could design the toxicity out of future products, EPA still has to deal with the tens of thousands of chemicals already in commerce. Only a few hundred of them have been assessed for their toxicity, and EPA needs to prioritize which of the remaining ones should be evaluated.

The current model of testing one chemical at a time on adult laboratory animals for two years won’t work. Such methods miss critical windows of exposure during development, such as in utero and during puberty, and they don’t account for exposures to chemical mixtures. Current methods also don’t always accurately predict human health effects, and they are animal-intensive and time-consuming.

Recognizing these limitations, EPA has ramped up its efforts to develop high-throughput-screening assays and computational methods to help predict whether chemicals will demonstrate toxic properties. EPA’s ToxCast program, in particular, has tested hundreds of chemicals using more than 500 automated, high-throughput in vitro assays that detect changes in biological pathways. Many of those assays use human cell lines.

Ultimately, EPA hopes that the in vitro methods will reduce the need for animal toxicity testing, help identify sensitive subpopulations, provide knowledge about mechanisms of action, and speed up toxicity testing.

That vision of using high-throughput in vitro screens and computational methods to identify perturbations of toxicity pathways was outlined in the widely cited 2007 National Research Council report “Toxicity Testing in the 21st Century: A Vision and a Strategy.” The problem is that not all toxicity pathways are known.

“If we really want to fully implement the NRC vision, we are going to need a project somewhat analogous to the Human Genome Project,” Daniel Krewski, an epidemiology professor at the University of Ottawa, said at the workshop. Such a project “would map all the important toxicity pathways by which adverse health outcomes can occur in response to exposures to environmental agents and identify critical perturbations of those pathways.”

That’s a huge undertaking. And even if it can be accomplished, high-throughput in vitro assays have other problems that need to be resolved before they can be used to support costly regulatory decision making on their own. For example, such assays do not work well for volatile or lipophilic chemicals, and they don’t consider effects of metabolism. In addition, in vitro tests may not capture all pathways, and human cells are not always available.

Nonetheless, EPA appears to be on the right path and at least recognizes that it must change how it conducts chemical risk assessments. Now, if only the chemists and toxicologists would start talking to each other.

Views expressed on this page are those of the author and not necessarily those of ACS.

Chemical & Engineering News
ISSN 0009-2347
Copyright © American Chemical Society
