ANIMAL TESTING has long been considered the gold standard for environmental toxicology, the determination of the toxicity of chemicals found in the environment. But animal testing takes a long time, entails the deaths of animals to acquire data, and yields results that can't always be extrapolated to humans. Environmental toxicologists are turning to high-throughput methods developed in the pharmaceutical industry as alternatives to the traditional methods and as ways to prioritize chemicals for more in-depth testing.
The rat cancer bioassays used by the National Toxicology Program (NTP) take two years. NTP is an interagency program charged with evaluating the toxicity of commercial chemicals and is housed within the National Institute of Environmental Health Sciences in Research Triangle Park, N.C. Because such tests are so time-consuming, only a handful of commercial chemicals have gone through the entire battery of NTP tests.
The pharmaceutical industry uses high-throughput screening methods to identify compounds that can be starting points for drugs. But for drug toxicity testing, the industry still turns to the old standby of rodent tests. Several government organizations are working to adapt high-throughput methods for toxicity testing, with the goal of developing in vitro assays that can predict toxicity.
A recent report from the National Research Council (NRC) recommends the development of cell-based screening methods as a replacement for animal testing (C&EN, June 18, page 13). Although toxicologists share the vision that cell-based assays will eliminate the need for animal testing, they suspect it will take many years to become a reality. In the shorter term, high-throughput methods could help prioritize chemicals for current time-consuming methods of toxicity testing, says Robert Kavlock, director of the Environmental Protection Agency's National Center for Computational Toxicology in Research Triangle Park.
Environmental toxicologists face a formidable challenge. They must put together batteries of tests, and they need to understand how the patterns of results from those tests correlate with toxicity, first in animals (because that's what the bulk of current data relates to) and then in humans. From the patterns of data and responses, toxicologists hope to categorize chemicals in classes and make assumptions about the safety of particular chemicals. "It's an enormously difficult undertaking," says John Bucher, associate director of NTP.
The foremost challenge derives from the broad range of compounds found in the environment. Pharmaceutical methods are optimized for compounds with "druglike" properties, such as low molecular weight, solubility, and bioavailability. Although some environmental compounds share these druglike properties, many do not.
ANOTHER CHALLENGE is the unacceptability of false negative results in environmental toxicology. Screening in the pharmaceutical industry can tolerate a small percentage of false negatives, instances in which a compound is deemed safe when it is actually toxic, so long as safe compounds also are identified for further development. But such false negatives are unacceptable for environmental toxicology. If false negatives create an illusion of safety that diverts toxic chemicals from further testing, those misleading results could become "problematic," Kavlock says.
There are also practical issues involved with adapting pharmaceutical methods for environmental toxicology. High-throughput screening protocols in the pharmaceutical industry have been optimized in dimethyl sulfoxide (DMSO). Unfortunately, many chemicals of environmental interest aren't soluble in DMSO. What's more, DMSO is unattractive as a solvent for toxicity testing because it is itself toxic at high enough concentrations.
Although the pharmaceutical industry uses its high-throughput methods to screen for compounds that have some sort of biomedically interesting activity, it's better to think of environmental toxicologists using such methods to "profile" the biological activity of compounds, says Christopher Austin, director of the National Institutes of Health Chemical Genomics Center (NCGC) in Bethesda, Md. NCGC is teaming up with EPA and NTP to adapt its high-throughput methods to screen chemical libraries for environmental applications. "Profiling means you characterize and document the biological activity or lack thereof for every compound in your collection," he explains.
Such profiles are more useful if they include measurements at multiple concentrations. Pharmaceutical screening is typically performed at a single concentration, but environmental toxicologists want a dose-response curve. NCGC has developed a method that uses 15 concentrations over a broad range, yielding a dose-response curve for every compound (Proc. Natl. Acad. Sci. USA 2006, 103, 11473).
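To make the idea concrete, here is a minimal sketch of how a 15-point concentration-response series can be summarized with a half-maximal activity concentration (AC50). The Hill-equation parameters, the five-decade concentration range, and the interpolation approach are illustrative assumptions, not NCGC's actual protocol.

```python
import math

def hill(conc, top=100.0, bottom=0.0, ac50=1.0, n=1.0):
    """Four-parameter Hill equation: response as a function of concentration."""
    return bottom + (top - bottom) / (1.0 + (ac50 / conc) ** n)

# 15 concentrations spanning a broad range (assumed: five orders of magnitude)
concs = [10 ** (-3 + 5 * i / 14) for i in range(15)]

# simulated assay readings for a compound with AC50 = 0.5 (illustrative data)
responses = [hill(c, ac50=0.5, n=1.2) for c in concs]

def estimate_ac50(concs, responses, top=100.0, bottom=0.0):
    """Estimate AC50 by log-linear interpolation at the half-maximal response."""
    half = bottom + (top - bottom) / 2.0
    pairs = list(zip(concs, responses))
    for (c1, r1), (c2, r2) in zip(pairs, pairs[1:]):
        if r1 <= half <= r2:  # the two points bracketing the midpoint
            frac = (half - r1) / (r2 - r1)
            return 10 ** (math.log10(c1) + frac * (math.log10(c2) - math.log10(c1)))
    return None  # response never crosses half-maximal in the tested range

ac50 = estimate_ac50(concs, responses)
```

A real pipeline would fit all four Hill parameters by nonlinear regression rather than interpolate, but the point stands: with 15 concentrations per compound, every compound in the library gets a full curve instead of a single-point activity call.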
EPA is evaluating this and other new approaches to environmental toxicology through its ToxCast program. The agency has awarded nine contracts to test a broad range of assays. Organizers are sending an initial set of more than 300 compounds to each of the companies involved. The compounds selected for this first round are ones for which a rich supply of traditional toxicology data already exists.
The companies involved in ToxCast are using cell-based and biochemical assays to generate toxicity signatures for compounds of interest and to see whether those signatures correlate with known toxicity information. One of the companies is Burlingame, Calif.-based BioSeek. Each of BioSeek's assays contains different cell types that represent different tissues and disease states. The assays measure the levels of a variety of biological molecules that are related to known effects. "We're looking for the ability to detect and distinguish different toxicity mechanisms," says Ellen Berg, the company's chief scientific officer.
EPA plans to use the assay results from BioSeek and the other companies that received contracts to construct predictive signatures of toxicity. In the next phase of ToxCast, the program's managers will validate the approach using additional chemicals. Eventually, the goal is to use the signatures to predict toxicity in newly introduced commercial chemicals.
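The signature idea can be illustrated with a toy example: represent each compound as a vector of assay outcomes and label a new compound by its most similar reference compound with known toxicity. The assay profiles, the agreement-based similarity rule, and the nearest-neighbor prediction are all hypothetical simplifications; ToxCast's actual predictive models are far more sophisticated.

```python
# Reference compounds: binary assay hit profile (1 = active) plus known outcome.
# All names and profiles are invented for illustration.
reference = {
    "ref_toxic_1":  ([1, 1, 0, 1, 1], "toxic"),
    "ref_toxic_2":  ([1, 0, 1, 1, 1], "toxic"),
    "ref_benign_1": ([0, 0, 1, 0, 0], "benign"),
    "ref_benign_2": ([0, 1, 0, 0, 0], "benign"),
}

def similarity(a, b):
    """Fraction of assays on which two hit profiles agree."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def predict(profile):
    """Label a new compound with the outcome of its most similar reference."""
    best_profile, best_label = max(
        reference.values(), key=lambda ref: similarity(profile, ref[0])
    )
    return best_label

label = predict([1, 1, 1, 1, 0])  # active in most assays, like the toxic references
```

The validation phase described above maps onto this sketch naturally: run additional chemicals with known animal-test outcomes through `predict` and check how often the signature-based label matches the established result.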
AUSTIN ESTIMATES that about half of the compounds in the first set to be tested are nondruglike. He confesses that in the initial stages of the program, his center avoided analyzing some of the more challenging compounds.
"Ill-behaved" compounds could produce data artifacts that would obscure whether such an approach works at all, Austin says. "You want to use relatively well-behaved compounds; then the use of nasty compounds becomes an experiment in itself."
The NRC report calls for increased emphasis on assays that incorporate human cells rather than whole animals. Berg calls the use of human cells a "first big step." She hopes that assays using human cells will at least correlate with existing animal data.
Austin doubts that cell-based assays will eliminate animal testing entirely. "My guess is that we will end up with panels of assays that will be in vivo proxies for each of the major types of toxicities." Most types of toxicity are studied in a distinct animal model. If cell-based assays can predict how a compound is likely to act, then toxicity testing can be limited to certain animal models.
Even if in vitro tests reduce rather than eliminate animal tests, they will be a success. Further animal testing might be unnecessary for those compounds that cause obvious toxicity in many cell lines.
"We hope we will provide EPA a science-based system that's capable of prioritizing chemicals for animal testing," Kavlock says. "If we're exceptionally good, we'll be able to pinpoint what animal tests should be done."