A plethora of computer and mathematical models guide Environmental Protection Agency officials as they make regulatory decisions. Those decisions, in turn, can have major impacts on the health of communities and the bottom lines of companies.
Models can help policymakers determine the cleanup level for a particular chemical at a Superfund site. They also can influence whether regulators allow a new pesticide on the market or prompt them to require expensive pollution controls on smokestacks.
Some models predict how pollutants move through the environment. Others simulate how a chemical enters the body, is metabolized, and is excreted. Still others are used to estimate the costs and benefits of proposed new regulations.
EPA's use of models continues to expand. Older models are updated and improved to raise their level of sophistication. And newer ones are under development to analyze the new genomic and related "omics" data and to make forecasts based on environmental information gathered by Earth observation satellites.
Yet EPA has no policies guiding how it selects and uses computational and statistical models as it crafts regulations to control pollution and regulate commercial chemicals and pesticides. The agency, however, is seeking advice on models from the National Research Council. NRC's Committee on Models in the Regulatory Decision-Making Process has been mulling over the matter, and on Dec. 1 it held the third in a series of three workshops probing the models that EPA uses.
Models factor into at least some facet of every regulatory decision EPA makes. That's what Robert Perciasepe, chief operating officer at the National Audubon Society, told the committee at the workshop. Perciasepe served as an EPA assistant administrator for seven-and-a-half years in the Clinton Administration, first running the agency's Office of Water and later heading the Office of Air & Radiation.
Perciasepe said models help policymakers decide among various regulatory alternatives, for example, determining how deeply and how quickly to cut emissions of an air pollutant to protect public health while minimizing costs. EPA officials also use models to justify the regulatory paths they select, he added.
EPA's dependence on models for regulatory decisions is increasing, while policy choices based on scientific judgment, which were more common in the past, are declining, Perciasepe said. This transformation has occurred because regulations based on the results of models are easier to defend in court than are those involving scientific judgments, he said. Also, the federal government is documenting in increasing detail how and why it selects a given regulatory option, and modeling fits neatly into this process, he added.
J. Paul Gilman, who headed EPA's Office of Research & Development from 2002 to 2004, told the NRC committee that the public, including industry as well as environmental and health advocates, needs access to models so that interested parties can replicate the results that influence EPA decisions. During his tenure at EPA, Gilman oversaw an effort that led to EPA's posting information on its website about 90 models that it uses most frequently (C&EN Online Exclusive, Jan. 29, 2004).
Gilman, now director of the Oak Ridge Center for Advanced Studies, in Tennessee, said the public should know whether the models EPA uses were peer reviewed. In addition, the public should have access to the data fed into the models and information about how the model was applied to a particular regulatory question, he said.
Harvey J. Clewell III, director of the Center for Human Health Assessment at the industry-funded CIIT Centers for Health Research, Research Triangle Park, N.C., agreed with Gilman, especially for models used in risk assessment. Clewell told the committee that requiring the use of peer-reviewed models and giving the public the information needed to replicate the agency's modeling results would hold EPA risk assessments to the same standards that industry must meet if it submits a risk assessment to the agency.
Clewell argued that the Information Quality Act requires EPA to share its models and the data it feeds into them. Under that 2000 law and White House Office of Management & Budget guidelines implementing it, federal agencies must make public the detailed information about risk calculations they use as the basis for health or safety regulations.
Perciasepe said lobbying groups are particularly interested in seeing EPA make public a proprietary computer model that the agency used to evaluate various policy options to cut emissions of sulfur dioxide, nitrogen oxide, and mercury from power plants. This model was used to evaluate President George W. Bush's Clear Skies initiative and competing legislative proposals (C&EN, Dec. 12, page 8).
Improvements to computer and mathematical models enhance their use at EPA, Perciasepe said. But this increasing sophistication of models is also limiting the public's ability to understand the results they yield, he said. It also makes it harder for the agency to explain the results of modeling to decisionmakers, because results generated by models contain uncertainties. Although the scientific community at large accepts uncertainty as normal, Gilman said uncertainty is often viewed by Washington policymakers as meaning "scientists don't agree" and as suggesting that regulatory decisions should be put off. Perciasepe noted that getting policymakers to accept uncertainty and still make decisions is a challenge.
EPA is aware of this matter and is working to address it. Richard D. Morgenstern, a senior fellow at Resources for the Future, told the committee his organization is conducting a study for the agency on the best ways of explaining uncertainty and the results of modeling to policymakers. Morgenstern was EPA assistant administrator for policy, planning, and evaluation in the early 1990s.
In addition to the three former EPA assistant administrators, the NRC committee heard from current agency employees who use models in their work. Much of that discussion focused on risk assessment.
In the past, EPA's risk assessment modeling efforts focused mainly on using data from laboratory animal tests and, sometimes, epidemiology studies to estimate the health or environmental effects of a chemical at the levels that the substance is found in air, water, or soil. But now, the agency increasingly is using more complex models to determine how the body metabolizes chemicals and how a substance biochemically triggers health problems.
Rory Conolly, senior research biologist at EPA's National Center for Computational Toxicology, said lack of data was once the rate-limiting step in chemical risk assessment. But with the explosion of omics technologies (genomics, proteomics, metabolomics, and so on), "We're flooded with data," and analysis of this information has become the rate-limiting step in risk assessment, he told the committee. He added that his views are his own and not those of EPA.
A major challenge in understanding omics data is determining when transient perturbations to an organism's biochemical system caused by a chemical exposure, such as the activation of a gene, become a stable condition that leads to adverse health effects, Conolly said. Modeling omics data for risk assessment purposes involves complex descriptions of cell functions.
Louis J. Gross, director of the Institute for Environmental Modeling at the University of Tennessee, Knoxville, and a member of the NRC committee, noted that omics data is "hot stuff" in the scientific world. But he questioned whether detailed models of cell biology are as helpful for regulatory decisions as details about human or ecosystem exposure to a pollutant.
Conolly responded that tracking changes at the cellular level has important implications for determining dose-response estimates. EPA uses these estimates to extrapolate results from high-dose studies of a chemical in lab animals to the lower levels of the compound found in air, water, or soil, which represent real-world exposures.
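To make the extrapolation concrete, here is a minimal sketch of the simplest approach of this kind, a linear no-threshold extrapolation from a high-dose animal study down to an environmental exposure level. This is an illustration of the general technique, not any specific EPA model, and every number in it is hypothetical.

```python
# Illustrative sketch (not an EPA model): linear no-threshold extrapolation
# from a high-dose animal study to a low environmental exposure level.
# All doses and risk values below are hypothetical.

def extrapolate_risk(study_dose_mg_kg_day, study_risk, env_dose_mg_kg_day):
    """Assume risk scales linearly with dose down to zero (no threshold)."""
    slope = study_risk / study_dose_mg_kg_day  # excess risk per mg/kg-day
    return slope * env_dose_mg_kg_day

# Hypothetical inputs: 10% tumor incidence observed at 50 mg/kg-day in a
# rodent study; estimated human environmental exposure of 0.001 mg/kg-day.
risk = extrapolate_risk(50.0, 0.10, 0.001)
print(f"Extrapolated low-dose excess risk: {risk:.1e}")  # 2.0e-06
```

Real agency assessments are far more elaborate, layering in interspecies scaling, uncertainty factors, and, increasingly, the cellular-level mechanistic detail Conolly describes, but the basic high-to-low-dose logic is the same.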
EPA is also beginning to use satellite data to predict exposure to air pollutants, including particulate matter and ground-level ozone, said Gary J. Foley, director of EPA's National Exposure Research Laboratory. For instance, EPA can use satellite-derived aerosol optical depth data to track the movement of smoke from large forest fires, Foley said. Eventually, EPA will include this information in models it uses to analyze particulate-matter air pollution.
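A common first approximation in this line of work is to relate the satellite's unitless aerosol optical depth reading to ground-level fine particulate concentrations with a simple linear fit. The sketch below illustrates that idea only; the coefficients are hypothetical placeholders, not values from any EPA or NASA product.

```python
# Illustrative sketch: a linear relationship between satellite aerosol
# optical depth (AOD, unitless) and surface PM2.5 (ug/m^3). The slope and
# intercept here are hypothetical; real fits are derived by regressing
# AOD retrievals against ground monitor data for a given region and season.

def pm25_from_aod(aod, slope=70.0, intercept=5.0):
    """Estimate surface PM2.5 concentration (ug/m^3) from a unitless AOD."""
    return slope * aod + intercept

# Rough scenarios: clear day, hazy day, heavy wildfire smoke.
for aod in (0.05, 0.20, 0.60):
    print(f"AOD {aod:.2f} -> estimated PM2.5 {pm25_from_aod(aod):.1f} ug/m3")
```

In practice the AOD-PM2.5 relationship varies with humidity, aerosol type, and the vertical distribution of particles, which is why such satellite estimates supplement rather than replace ground monitors in air quality models.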
Foley said EPA has become a new "customer" for data collected by National Aeronautics & Space Administration and National Oceanic & Atmospheric Administration satellites. In the past, NOAA and NASA would supply satellite data specifically tailored to EPA's needs only if the environmental agency paid for the information. But a Bush Administration initiative that links budgets for government programs to the performance of those programs has changed that dynamic, he continued. Nowadays, satellite programs fare better under budget reviews if they supply data to agencies, like EPA, that use the data, Foley said.
The NRC panel has studied the use of models in regulatory decision-making for about two years and is beginning to formulate a report that will guide EPA in selecting and using models. Release of that report is expected in late 2006.