Critics of environmental regulations often argue that as analytical methods become more precise, policymakers push for tougher regulations, because scientists can detect smaller and smaller contaminant concentrations. But a new report concludes that this criticism, known as the "vanishing zero effect," is a myth—at least when it comes to U.S. drinking water regulations. Instead, the authors find that a better understanding of a chemical's toxicology appears to be the primary driver of changes in regulatory limits (Environ. Sci. Technol., DOI: 10.1021/es101417u).
Although people have invoked the vanishing zero effect for decades, no one had examined its validity. So engineers Ryan Calder and Ketra Schmitt of Concordia University in Montreal decided to perform the first comprehensive analysis of the role that improved detection capabilities play in drinking water regulations.
The pair of myth-busters examined three possible rationales for changing a specific contaminant's regulatory limit: an improved detection technique, a better understanding of its toxicology, or a cheaper method to treat it. They specifically evaluated the evolution of regulations under the Safe Drinking Water Act of 1974, because those regulations undergo regular review.
"What we found surprised us," Schmitt says. The act regulates 67 chemical contaminants, but since it went into effect, the regulatory limits of only 15 chemicals have changed. "We were surprised that regulations aren't changing as quickly as we anticipated," she says. Even more surprising, regulations for seven of those 15 contaminants, including barium, selenium, and chromium, have became less stringent over time—the opposite direction predicted by the vanishing zero effect.
Still, regulations for six contaminants have stiffened. Of those six, only the limits on the now-banned pesticides lindane and toxaphene sit at their detection limits. Meanwhile, arsenic's regulatory limit is well above its detection limit, and policymakers based that choice on toxicology, namely its cancer risk, tempered by treatment cost concerns. Toxicological information also drove the limits for cadmium and the pesticide methoxychlor.
Calder and Schmitt also found that toxicological information plays an important role in screening emerging contaminants for potential regulation. For example, regulators first considered perchlorate as a candidate for drinking water limits when an analytical breakthrough lowered its detection limit. But the current regulatory discussions focus on mitigating its health effects.
The paper shines light on a previously ignored topic, says engineering analyst Patrick Gurian at Drexel University in Philadelphia: "It's nice to see some critical thinking and evidence brought to bear on this common belief."
Gurian thinks that improved analytical methods have actually made regulatory decisions more complex. "Once we could simply regulate to a level we could detect," he says. "Now we have to face societal choices about how much to invest in risk reduction and how much risk we are willing to accept."