Latest News
Web Date: March 21, 2016

Forecasting liver toxicity before the damage is done

Using toxicity data from thousands of chemicals, researchers developed a computational program that predicts whether a drug candidate is likely to cause liver damage.
Department: Science & Technology
News Channels: Biological SCENE, Organic SCENE
Keywords: Drug Development, Toxicology, Hepatotoxicity
These four drugs all share a tryptamine-like core. Despite their structural similarities, a computational model correctly identified which molecules are hepatotoxic (tadalafil, etodolac, and frovatriptan) and which is not (yohimbine).
Credit: Chem. Res. Toxicol.

The liver bears the brunt of clearing drugs from the body, which puts the organ at significant risk of toxicity. One of the biggest challenges in drug discovery is figuring out which drug candidates are likely to harm the liver before the agents are tested in humans. Now, researchers have developed a computational model that compares a drug candidate with compounds known to cause liver damage and predicts whether the new molecule is likely to do the same (Chem. Res. Toxicol. 2016, DOI: 10.1021/acs.chemrestox.5b00465).

Many pathways are involved in liver toxicity, says Denis Mulliner of Sanofi-Aventis. The standard approach for identifying hepatotoxicity is animal studies, but an agent that is fine for animals may still not be safe for humans. “It’s especially hard to predict, which is a problem for patients,” Mulliner says. Even after rounds of human clinical trials, liver toxicity might only appear after a drug is on the market because damage may occur only rarely or take a long time to develop. Ideally, if scientists could pull potentially harmful drug candidates from the pipeline early, they could avoid embarking on expensive and time-consuming animal studies and human clinical trials, says Mulliner. Computational models that predict a pharmaceutical’s potential toxicity could make drug development faster, cheaper, and safer.

Most of the earlier toxicity models provide only a yes or no answer, Mulliner says. He wanted a more nuanced response that included how confident the model is about its prediction. “Sometimes we get an outcome that says, ‘we don’t know’,” he says. “It’s not very satisfactory, but it’s better than saying there is no problem.” In addition, existing computational approaches use chemical databases that are too small or too homogenous to make good predictions for novel drug candidates.
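A three-way outcome like this can be implemented as an abstention rule layered on a binary classifier's probability output. The sketch below is illustrative only: the thresholds and function name are hypothetical, and the published model defines its own confidence measure.

```python
def classify_with_abstention(p_toxic, lo=0.3, hi=0.7):
    # Hypothetical thresholds: probabilities falling between
    # lo and hi are reported as "don't know" rather than being
    # forced into a yes/no call.
    if p_toxic >= hi:
        return "hepatotoxic"
    if p_toxic <= lo:
        return "non-hepatotoxic"
    return "don't know"

for p in (0.92, 0.55, 0.10):
    print(p, "->", classify_with_abstention(p))
# 0.92 -> hepatotoxic
# 0.55 -> don't know
# 0.10 -> non-hepatotoxic
```

Reporting "don't know" for borderline compounds trades coverage for reliability, which matches Mulliner's point that an honest non-answer beats a false all-clear.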

So Mulliner’s team developed a model that incorporates human and animal toxicity data on 3,712 compounds—at least three times the number used in most earlier models—organizing the data according to each chemical’s molecular properties and the mechanism of its hepatotoxicity. The model looks for common chemical and structural features associated with a particular type of toxicity. The researchers then tested the model on 269 proprietary compounds not included in the database. The hepatotoxicity of these compounds had already been determined in animal studies, and the new model correctly identified 72% of the hepatotoxic compounds. The researchers are sharing the database, along with the model’s source code, with the scientific community “to advance the field of predictive toxicology,” Mulliner says.
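Assuming that "correctly identified 72% of the hepatotoxic compounds" refers to sensitivity (the true-positive rate), that figure can be computed from a test set as follows; the function and toy data below are illustrative, not taken from the paper.

```python
def sensitivity(is_toxic, flagged):
    # True-positive rate: of the compounds known to be
    # hepatotoxic, what fraction did the model flag?
    tp = sum(t and f for t, f in zip(is_toxic, flagged))
    fn = sum(t and not f for t, f in zip(is_toxic, flagged))
    return tp / (tp + fn)

# Toy example: 4 truly toxic compounds, 3 flagged correctly.
is_toxic = [True, True, True, True, False, False]
flagged  = [True, True, True, False, True, False]
print(sensitivity(is_toxic, flagged))  # 0.75
```

Note that sensitivity alone says nothing about false alarms on safe compounds; a full evaluation would also report specificity.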

Gerhard Ecker of the University of Vienna is impressed by the quality of the model’s database. “It’s by far the biggest data set I’ve seen,” he says. “It’s more robust.”

 
Chemical & Engineering News
ISSN 0009-2347
Copyright © American Chemical Society
Comments
Larry E. Fink, Waterwise Consulting, LLC  (Wed Mar 23 14:17:43 EDT 2016)
This approach should be generalizable to other organs and organ systems and should become more accurate in its toxicity predictions as the database grows. To that end, does EPA have the legal authority under the existing Toxic Substances Control Act to call in all proprietary and non-proprietary drug and pesticide testing data for the purpose of developing the most robust, accurate, and reliable version of the computational model for each organ and organ system? It could then make those models publicly available, cutting the cost of dead-end drug and pesticide research as a public service while vastly improving EPA's ability to rapidly screen the more than 80,000 chemicals in commerce for withdrawal if not exonerated by multi-generation laboratory animal toxicity testing.
Robert Buntrock (Fri Mar 25 16:33:12 EDT 2016)
First, it's an all too common misconception that all of the ca. 80,000 compounds on the TSCA inventory are actually in commercial use. Second, many of those compounds are natural products. Third, I trust studies on organs and organ systems more than I do tox studies on cells in a dish. As pharmaceutical chemists well know, ADMET applies to the effects on the entire organism or organ system. Any chemical has to get to the affected site.
Robert Buntrock (Fri Mar 25 16:29:37 EDT 2016)
I hope that acetaminophen was considered, since it is the most commonly encountered liver toxin, with a very narrow safety margin between the common dose and the toxic dose, about 8:1. The actual exposure is difficult to determine since it occurs in so many OTC "remedies".