In years to come, researchers are expected to tease out an increasingly detailed understanding of whether, how, and when exposure to specific chemicals can cause harm to humans or the environment. This expected onslaught of new scientific data could change the minds of people who view at least some products of the chemical industry as dangerous and to be avoided, say experts on how humans perceive risk.
Or it might not, they add. Human cognition “will take the facts and turn them into how we choose to feel about them, not some mythical objective, factual, everyone-agrees-on-it truth,” says David Ropeik, a consultant on risk perception and author of two books on the topic.
Research demonstrates that information suggesting something can harm us carries more weight in the human mind than information suggesting a thing or activity is good for us, says Paul Slovic, a professor of psychology at the University of Oregon who studies risk analysis and is also president of Decision Research, a nonprofit research organization. A series of experiments by psychologist Daniel Kahneman, a Nobel laureate in economics, and Amos Tversky in the 1970s and ’80s showed how people make judgments under conditions of uncertainty. They demonstrated that people strongly prefer avoiding losses to acquiring equivalent gains (Am. Psychol. 1984, 39, 341).
Ropeik explains that this trait has been key to human survival through the millennia. This predilection means that if people sense that using products containing a certain chemical could lead to harm, they will give that warning more weight than information suggesting the chemical is safe.
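The asymmetry Kahneman and Tversky documented can be sketched numerically with their prospect-theory value function. The parameter values below (α ≈ 0.88, λ ≈ 2.25) come from their later cumulative prospect theory estimates and are used here purely for illustration; the specific dollar amounts are made up.

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function: subjective value of a
    gain or loss of size x. Gains are diminished by the power
    alpha; losses are additionally amplified by the
    loss-aversion factor lam."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

# A $100 loss feels much worse than a $100 gain feels good:
gain = prospect_value(100)    # roughly 57.5
loss = prospect_value(-100)   # roughly -129.5
ratio = abs(loss) / gain      # equals lam: losses loom ~2.25x larger
```

With these parameters the felt sting of a loss is about 2.25 times the felt pleasure of an equal gain, which is one way to see why a single harm report outweighs a stack of reassurances.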
Unfamiliarity also heightens perceptions of risk, says Brian J. Zikmund-Fisher, assistant professor of health behavior and health education at the University of Michigan. For instance, people are generally unconcerned about familiar risks such as driving a car—an activity that is responsible for tens of thousands of deaths a year, he says. But people tend to have greater fear of risks associated with the myriad unfamiliar commercial chemicals used in products from smartphones to baby shampoo.
In addition, “we have strong ideological bents that color the way we react to information,” Slovic says. People, by their very nature, associate with others who share their outlook on the world—such as distrust of government or of the chemical industry—and thus reinforce each other’s standpoints, he says.
If people receive information that aligns with their views, they support it, Slovic explains. If the data go against their views, he continues, “they find ways to denigrate and reject it.” For example, some who oppose greenhouse gas emission reductions frame the growing scientific evidence that links human activities to global warming as a ruse by climate researchers to get more money, Slovic says.
People also perceive risks that are under their control differently from risks that aren’t, Zikmund-Fisher points out. “Imposed risks are so much worse in people’s minds than things they choose to undertake voluntarily.” Smokers are a prime example of this. They voluntarily accept risks of using cigarettes but likely would strenuously object if forced to work in a basement office containing radon—even if both posed the same risk of lung cancer, he says.
Then there’s the issue of trust—or lack of it. “There’s a lot of distrust of organizations and industries if you think these groups are trying to benefit at your expense or risk,” Slovic says. For example, some people harbor negative feelings about the chemical industry, believing that it is focused on profits at the expense of the safety of its products.
What’s worse, negative events—such as a catastrophic release of radiation from a nuclear reactor or evidence that a drug can cause birth defects—can destroy the public’s faith in an institution or industry. “Trust is hard-earned, and it is quickly lost,” Slovic says.
Yet improved communication about risk can help the public understand risk information better, says Baruch Fischhoff, professor of social and decision sciences and engineering and public policy at Carnegie Mellon University. Surveys show that the public views scientists as highly trusted sources of information, so scientists are prime candidates to communicate information about chemical risks, he says. That trust, however, can evaporate if scientists politicize their message by advocating on behalf of industry or activists or fail to acknowledge scientific uncertainties, Fischhoff warns.
Ropeik cautions that scientific data on risks are often complex and difficult to communicate. One result is that the growing body of information about chemical risks often seems to do more to worry people than to reassure them, Slovic says.
One concern is that much of the risk-related data expected to come forth in the years ahead will be from traditional toxicology studies done on laboratory animals, computational toxicology, and epidemiology. The data will provide a great deal more information about hazards of chemicals, Ropeik points out, but not about exposure to them in particular circumstances. Without sufficient exposure data to combine with hazard data, the risk equation isn’t fleshed out.
Nonetheless, given the preponderance of hazard data, environmental and health advocates are likely to conclude that many commercial chemicals pose a greater danger than previously thought, he says. Chemical producers, on the other hand, are likely to point to the new information and declare that their products are less problematic than activists have alleged.
The chemical industry, Ropeik says, “is stuck in this mind-set that more information will lead to objective choice” by consumers about commercial substances. But, as has been noted, research suggests this isn’t the case.
Social science research has mapped out successful strategies for communicating scientific information about risks, Fischhoff says. For instance, scientists need to respect the audience to whom they are presenting the risk information. “Anybody who starts on the premise that people are idiots is doomed to failure,” Fischhoff says. “First, the audience will immediately pick up the disrespect.” Second, scientists who discount their audience are unlikely to expend the energy needed to make risk information as comprehensible as possible, he says. Ultimately, this attitude will turn the audience away from the scientist and the message and toward people they trust—those they identify with politically.
Despite this evidence, many of those working in natural science fields, including chemists and physicists, have to date spurned these scientific findings, he points out.
Nevertheless, effectively communicating information about chemicals’ risks is important to society’s well-being, stress Zikmund-Fisher and Ropeik. Accurate information is essential so consumers can make the best choices among products, according to Zikmund-Fisher. Otherwise, he says, they’ll simply shun items containing a chemical pegged as “bad” without considering the risks of substitute products they select instead.
“The danger,” Ropeik says, “is that we’re less informed than we need to be to make the healthiest possible choices.”