

Consumer Safety

Getting Real About Chemical Risks

Predictive models for hazards and exposure improve, but gaps remain

by Britt E. Erickson
October 14, 2013 | APPEARED IN VOLUME 91, ISSUE 41




Many people assume that the chemicals in their detergents, floor cleaners, and other household products have undergone rigorous safety testing. But little is known about the potential risks associated with most of the estimated 80,000 chemicals in commerce today.

Industry tries to dispel claimed links between its chemicals and illnesses that go beyond what science can prove, but the public is skeptical because companies have a financial stake in showing their products are safe. Both sides therefore look to the federal government for help.

Credit: C&EN
Britt Erickson speaks with C&EN Associate Editor Lauren K. Wolf about where EPA’s ToxCast program currently stands, and about her reporting trip to EPA’s North Carolina lab.

The agency charged with overseeing the safety of chemicals in the marketplace is the Environmental Protection Agency. EPA has the authority to require industry to provide extensive toxicity data for pesticides. But for most other chemicals, EPA must show that a substance is likely to be a risk to human health or the environment in order to require industry to provide safety data. Manufacturers don’t often give toxicity data to EPA voluntarily, nor does the agency have the resources to assess tens of thousands of chemicals using traditional in vivo rodent-based studies.

Instead, EPA has turned to computational modeling. One ambitious effort, called ToxCast, aims to screen thousands of chemicals for biological activity using about 600 high-throughput biochemical and cell-based assays. The data are then integrated with existing in vivo animal toxicity data and structure-activity information to predict toxicity.
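To make the screening idea concrete, here is a toy sketch of how per-chemical activity profiles across a battery of assays could feed a simple prioritization score. The assay names, hit data, and scoring rule are invented for illustration; ToxCast's actual models are far more elaborate.

```python
# Toy sketch of battery screening: each chemical is run through many assays,
# and its activity profile feeds a naive prioritization score.
# All assay names and hit data below are invented.

assay_hits = {
    # chemical: set of assays in which it showed activity
    "chem_1": {"thyroid_receptor", "estrogen_receptor", "cytotoxicity"},
    "chem_2": {"cytotoxicity"},
    "chem_3": set(),
}

# Naive priority: chemicals active in more assays get screened further first.
priority = sorted(assay_hits, key=lambda c: len(assay_hits[c]), reverse=True)
print(priority)  # ['chem_1', 'chem_2', 'chem_3']
```

In practice the activity profile would be integrated with in vivo and structure-activity data, as the article describes, rather than simply counted.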

But ToxCast has had problems. Most of the assays were developed for drug discovery, not to assess the hazards of chemicals in the environment. For example, thyroid-disrupting compounds in the environment can work through multiple pathways, but commercial tests focus on just one—a chemical binding to the thyroid receptor. If a chemical acts on a different pathway it will test negative, even though it does disrupt the thyroid.

Credit: EPA
EPA researchers are developing a high-throughput assay for evaluating whether chemicals inhibit the enzyme thyroperoxidase.

EPA has had some success in developing an alternative thyroid assay that monitors inhibition of the enzyme thyroperoxidase. EPA has also developed a few novel tests for other chemical effects that are not detected by current ToxCast assays. But with only $7 million to $8 million per year to spend on ToxCast, it has been an uphill battle.

EPA is also struggling to get a handle on how much of each chemical people are exposed to. The agency has even less data about exposure than it does about the toxicity of chemicals. Exposure information is important because assessing chemical risk is a function of both a chemical’s toxicity and how much exposure individuals have to that chemical.
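As a rough illustration of how toxicity and exposure combine, a common screening metric is the hazard quotient: estimated exposure divided by a reference dose below which no adverse effect is expected. The sketch below uses invented chemical names and numbers; real EPA risk assessments are far more involved.

```python
# Minimal sketch of combining toxicity and exposure into a risk screen.
# All names and numbers are invented for illustration.

def hazard_quotient(exposure_mg_kg_day: float, reference_dose_mg_kg_day: float) -> float:
    """Return exposure / reference dose; values >= 1 flag potential concern."""
    return exposure_mg_kg_day / reference_dose_mg_kg_day

chemicals = {
    # name: (estimated exposure, reference dose), both in mg/kg/day
    "chemical_A": (0.002, 0.01),  # potent (low reference dose) but low exposure
    "chemical_B": (0.50, 0.30),   # weakly toxic but high exposure
}

for name, (exposure, rfd) in chemicals.items():
    hq = hazard_quotient(exposure, rfd)
    flag = "potential concern" if hq >= 1 else "low priority"
    print(f"{name}: HQ = {hq:.2f} ({flag})")
```

Note how the weakly toxic but heavily encountered chemical_B outranks the more potent chemical_A, which is exactly why exposure data matter.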

Efforts are under way at EPA to estimate exposure through a program called ExpoCast. But that program is just getting off the ground.

In contrast, the proof-of-concept phase for ToxCast was completed in 2009 when EPA scientists showed that ToxCast models could accurately predict the toxicity of about 320 data-rich pesticides. The agency is now completing the second phase of ToxCast, in which it screened another 700 or so chemicals using the same battery of high-throughput assays. This second set includes chemicals found in industrial and consumer products, food additives, and drugs that failed to pass clinical trials.

But as EPA considers using ToxCast data in regulatory decision making and risk assessments, it is getting a lot of pushback from industry and other stakeholders. C&EN recently visited scientists at EPA’s Office of Research & Development (ORD) in Research Triangle Park, N.C., to find out why it is so difficult and taking so long to get the risk assessment community to accept high-throughput in vitro data as an alternative to animal-intensive in vivo studies.

Testing For Danger

Scientists and the public have had trouble getting a firm handle on risk, creating scares and frustrations for everyone. In this three-part report, C&EN examines successes and failures in the latest attempts to assess chemical safety, different methods to ascertain public hazards, and why people perceive the results of risk studies so differently.

One of the problems, EPA scientists point out, is the limited scope of effects covered by commercially available assays.

“When we first started this program, we didn’t have the resources to do de novo assay development for relevant biologies, so we took off-the-shelf assays that seemed to have relevant biologies,” says Tina Bahadori, head of EPA’s Chemical Safety for Sustainability research program, which oversees the part of ORD responsible for ToxCast.

The ToxCast assays primarily screen chemicals for their potential to cause cancer and reproductive, developmental, and endocrine disruption effects. Some areas of toxicology are not addressed by commercially available assays, so EPA scientists have developed a handful of their own assays to help fill in the holes. In particular, they have developed high-throughput assays to evaluate thyroid inhibition, mitochondrial toxicity, neurotoxicity, and developmental effects of chemicals.

These EPA-developed “fit for purpose” assays are based on known adverse-outcome pathways. The assays rely on a mechanistic understanding of the way a chemical works, says Russell S. (Rusty) Thomas, director of ORD’s National Center for Computational Toxicology, which oversees ToxCast.

For example, thyroid-disrupting chemicals are known to work through at least six pathways, and just one of those pathways involves the thyroid receptor. Some chemicals disrupt the thyroid by interfering with production of the enzyme thyroperoxidase (TPO) and do not bind to the thyroid receptor. Screening such chemicals with currently available assays that monitor receptor-specific binding would give a negative result.

To avoid such false negatives, EPA scientists are developing a high-throughput assay to screen chemicals for their ability to interfere with the TPO enzyme. They are also studying other enzymes involved in the other pathways of thyroid disruption as potential targets for future assays.

To build the TPO assay, the scientists used fractions of cells from rat thyroids. They also built a human version of the assay by cloning the human TPO gene and developing a cell line that expresses human TPO, says Stephen O. Simmons, a scientist at EPA’s National Health & Environmental Effects Research Laboratory (NHEERL) in Research Triangle Park. “Once we analyze the rat data, we will repeat the studies using the human cell line,” Simmons says.


EPA is also interested in using the TPO assay in its Endocrine Disruptor Screening Program (EDSP). Thus far, the agency has tested the assay on 21 chemicals and gotten a few positive hits, Simmons says. The next step is to test 1,000 or so ToxCast chemicals and about 800 chemicals of interest to EDSP using the assay, he notes.

In contrast to thyroid toxicity, where much is known about adverse-outcome pathways, less is known about the pathways involved in developmental neurotoxicity. To better understand such effects, EPA is using high-content imaging. The approach allows researchers to obtain data on the size, shape, and location of hundreds of cells from a single image.

Credit: EPA
Exposure to chemicals changes how rat brain cells grow axons and form synapses.

“We don’t know all of the molecular initiating events involved in developmental neurotoxicity,” emphasizes William Mundy, a neurotoxicologist at EPA’s NHEERL. “So there may not be a target that you can measure a chemical binding to, and there may not be a gene expression assay you can use,” he says. Instead, EPA researchers are using what is essentially an automated epifluorescence microscope to examine whether exposure to chemicals changes how rat brain cells grow axons and form synapses.

Researchers at EPA are also taking advantage of microfluidics to create a network of individual cells that act as a functional neuronal network. The neurons are grown on chips, each of which has 64 electrodes. The cells are then exposed to various chemicals and their spontaneous firing rate and patterns are monitored.

It is like an “in vitro EEG,” says Timothy J. Shafer, a toxicologist in EPA’s Integrated Systems Toxicology Division at ORD. EEG, or electroencephalography, measures the change in voltage resulting from current flows within the neurons of the brain. But whereas an EEG records the average signal from many cells in a pathway, EPA’s microelectrode array device monitors the electrical signal flowing through individual cells in a network. “The advantage is that you get an integrated response, not to one channel but to many different neuronal target proteins and ion channels,” Shafer explains.

EPA is working with Atlanta-based Axion Biosystems to increase the throughput of the device. Rather than analyzing one chip at a time, the researchers have created a device in a 48-well-plate format. In each well is a separate network of neurons. And unlike typical cell culture plates, the wells are all connected by microelectrodes. The trade-off, however, is that as you increase the number of wells, you have to decrease the number of electrodes, Shafer notes.

Credit: EPA
EPA toxicologists are using zebrafish to determine a chemical’s ability to disrupt blood vessel formation.

Another major effort by EPA involves using zebrafish as model organisms to screen for developmental effects of chemicals. Zebrafish are a “wonderful model organism for what I’d call moderate-throughput assays,” says Ronald N. Hines, associate director for health at EPA’s NHEERL.

EPA researchers have already tested about 1,000 chemicals for developmental effects using zebrafish. The throughput is much greater than with traditional rodent-based assays, because zebrafish grow rapidly, from a fertilized egg to a fish in five to six days. And although the throughput is lower than that of cell-based assays, zebrafish have full metabolic capability. Such capability is lacking in many of the high-throughput ToxCast assays, Hines points out.

Zebrafish can be used to learn more about the effects of chemicals during development on a host of important biological systems. EPA researchers are using them to examine chemical effects on blood vessel formation, eye formation, heartbeat, and body shape. These systems are highly integrated, making them difficult to understand with cell-based assays, says Stephanie Padilla, a toxicologist at EPA’s NHEERL.

Mouse embryonic stem cells, and in some cases human induced pluripotent stem cells, are also being used to evaluate the developmental effects of chemicals. In particular, EPA researchers have developed an assay to look at the effect of chemical exposure on differentiation of stem cells into different cell types.

“The amount of effort and time that goes into developing these different assays is huge,” Thomas emphasizes. And even with the handful of assays that EPA has developed in-house, there are still areas that ToxCast does not currently address. One area, for example, is pharmacokinetics. “You may have a very potent chemical, but it may be cleared by your body in such a rapid fashion it may not matter,” Thomas says.

Another area not addressed by ToxCast is variability in how humans respond to chemicals. “Some individuals or life stages are going to be more sensitive than others,” says John Vandenberg, national program director of EPA’s Human Health Risk Assessment research program within ORD.

In addition to ToxCast, which is focused on predicting the hazards of chemicals, EPA also has an effort to estimate chemical exposures called ExpoCast. “We haven’t made as much progress on the exposure side as the hazard side,” Thomas says. “Tools and data for estimating chemical exposures have been lacking, but I think that is starting to change,” he tells C&EN.

The goal of ExpoCast is to develop computational models for estimating chemical exposures using data from epidemiology studies, retail information, and household consumption patterns.

“When it comes to chemical exposure, the action is within the home,” says Timothy J. Buckley, director of EPA’s Human Exposure & Atmospheric Sciences Division. “We spend a lot of time paying attention to the ambient environment. But we need to be focused on the chemicals that we bring into our homes, which tend to be chemicals in consumer products,” he stresses.

Part of the problem is that EPA doesn’t know all of the products that a particular chemical is in, or at what concentrations. To help fill in some of those data gaps, ExpoCast is using material safety data sheets posted by retail giant Walmart to extract information about what chemicals are in consumer products. Walmart has curated the data to include chemicals and concentrations across all products it sells.

EPA is also mining other household product databases, such as the Nielsen Homescan program. In the Nielsen program, about 15,000 households across the U.S. voluntarily scan the bar codes of every product they bring into their homes. Nielsen has rich demographic data to accompany the consumption data, including household income, number of occupants, and ages. By monitoring purchase patterns within a home over time, EPA can determine use, Buckley says.

Google Trends is also being explored to map product use and intensity. For example, search terms such as personal care products, automotive, landscape and yard, and home maintenance have turned up data that could indicate trends in the use of consumer products across the U.S., Buckley notes.


EPA is using the consumer-use data to develop models for predicting exposure to chemicals in consumer products. It then calibrates the models with biomonitoring data from the Centers for Disease Control & Prevention’s National Health & Nutrition Examination Survey.
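In its simplest form, calibrating a model against measured data can mean fitting a correction so predictions line up with measurements for chemicals where both exist, then applying that correction to chemicals with predictions only. The sketch below fits a linear correction by least squares; all numbers are hypothetical, and the actual ExpoCast calibration is more sophisticated.

```python
# Simplest-case calibration sketch: fit a linear correction between model
# predictions and measured values, then apply it to new predictions.
# All numbers are hypothetical.

# Predicted vs measured exposure (arbitrary units) for calibration chemicals.
predicted = [1.0, 2.0, 3.0, 4.0]
measured = [0.8, 1.9, 2.7, 4.1]

# Ordinary least-squares fit of measured = slope * predicted + intercept.
n = len(predicted)
mean_p = sum(predicted) / n
mean_m = sum(measured) / n
slope = sum((p - mean_p) * (m - mean_m) for p, m in zip(predicted, measured)) / sum(
    (p - mean_p) ** 2 for p in predicted
)
intercept = mean_m - slope * mean_p

def calibrated(prediction: float) -> float:
    """Apply the fitted correction to a new model prediction."""
    return slope * prediction + intercept

print(f"slope = {slope:.3f}, intercept = {intercept:.3f}")
print(f"calibrated estimate for a prediction of 2.5: {calibrated(2.5):.3f}")
```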

In the end, the goal is to use the hazard and exposure information predicted by the ToxCast and ExpoCast computational models to help inform different types of risk assessments and decision making at EPA.

As a first step toward that goal, EPA plans to use ToxCast data to help prioritize which chemicals will be screened in its endocrine disruptor program. ToxCast data may also be used in the near future to help managers at Superfund sites decide which chemicals to look for and to set cleanup goals, Vandenberg says.

But it is likely to be a long time before EPA stops using in vivo animal studies, particularly to support risk assessments, such as its Integrated Risk Information System assessments, Vandenberg says. “The goal is to position ourselves to use fewer animals and have more information,” he notes. “But we are not there yet.”




Katya Kean (October 17, 2013 3:37 PM)
Testing individual ingredients is important, but if that is as far as testing goes, that step is about as useful in determining product safety as is testing individual bomb ingredients to determine if a bomb is harmful.
If the EPA wants to evaluate exposure, then holistic human blood tests, not just for the dangerous compounds, but for the possibly affected biomarkers, need to be collected, along with scans of products in a household, and other health-impacting markers.
Isolating variables is a good first step, but perhaps studying the existing combination of variables is what is actually required in determining the harmfulness of the chemical "bombs" we use in our laundry, on our skin, and in cleaning our homes.
For instance, what is the effect of Britney Spears’s new perfume or Tide laundry detergent on human or aquatic reproduction or neurotoxicity? Those products need to be treated as one variable, since that is how they are consumed.
Then, once that step in reality/relevance testing is accomplished, the next phase would be to determine the toxicity of using perfume, Tide scented laundry, scented feminine napkins, Suave shampoo, Banana Boat sunscreen lotion, Secret deodorant, OPI nail polish, and Pantene hairspray in the same day, day after day, together.
Perhaps an intermediate step would be to not care so much about giving those companies bad press, regardless of how much money they might throw into lobbying for secrecy.
Here's a question: When Walmart shows material safety data sheets for each product, is "fragrance" considered an ingredient? If so, then it's almost pointless to use that information to determine toxicity, since those products have hidden ingredients. When manufacturers are required to disclose all ingredients except fragrance, safety determination is impossible. Fragrance recipes need to stop being protected by that outdated law, which is creating a ludicrous loophole for toxic contamination.
Lewis Perdue (October 17, 2013 10:09 PM)
Excellent overview. Sadly, EPA has a lot of catch-up to do. This piece didn't even get into the issue of epigenetics, which is yet another mechanism by which some compounds can affect health, as the discussion of thyroid effects illustrates.
MBP (October 24, 2013 7:02 PM)
The reason why mixture effects are often ignored is a simple matter of mathematics. Testing one chemical at ten concentrations requires (1*10 = 10) ten tests. Testing ten chemicals at ten concentrations requires (10*10 = 100) one hundred tests. Testing every combination of ten different chemicals at ten concentrations requires (10^10 = 10,000,000,000) ten billion tests. It will never be cost-effective to run ten billion tests on just ten chemicals. And that's for a single endpoint (for instance, binding to the thyroid receptor). Checking two endpoints, like binding to the thyroid receptor and checking for mutations in DNA, would be 20 billion tests. And so on. Now notice from the top of the article that there are 80,000 chemicals in use in commerce today, most of which have not been tested for safety at all, with more being added to the market all the time. This is why we have trained scientists, who use current knowledge to prioritize among all the possible experiments that COULD be run and pick the ones that are most likely to provide us with the knowledge we need the most.

Now imagine you're an agency like the EPA trying to protect human and environmental health, but the House of Representatives proposes to cut your 2014 budget by 34% compared to the 2013 budget. How are you supposed to go about testing even a tiny fraction of these chemicals and determining the level of risk they pose? Instead of blaming the EPA and complaining that their analyses aren't sophisticated enough or saying that they need to play catch-up, how about we re-examine our current system, where we have de facto human testing by allowing potentially toxic substances to be sold without any data proving their safety?
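The combinatorial arithmetic in the comment above can be checked in a few lines. Testing chemicals one at a time scales linearly with the number of chemicals, while testing every combination scales exponentially.

```python
# Checking the arithmetic in the comment above: one-at-a-time testing is
# linear in the number of chemicals; full-factorial testing of mixtures
# is exponential.

def one_at_a_time(n_chemicals: int, concentrations: int) -> int:
    """Each chemical tested alone at each concentration."""
    return n_chemicals * concentrations

def full_factorial(n_chemicals: int, concentrations: int, endpoints: int = 1) -> int:
    """Every combination of concentrations across all chemicals, per endpoint."""
    return (concentrations ** n_chemicals) * endpoints

print(one_at_a_time(1, 10))       # 10
print(one_at_a_time(10, 10))      # 100
print(full_factorial(10, 10))     # 10000000000 (ten billion)
print(full_factorial(10, 10, 2))  # 20000000000 (two endpoints)
```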
