
Environment

Sowing Seeds Of Doubt

How some scientists can twist the facts to suit ends other than scientific truth

by Gavin Schmidt
January 17, 2011 | A version of this story appeared in Volume 89, Issue 3

Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming, by Naomi Oreskes and Erik M. Conway, Bloomsbury, 2010, 368 pages, $27 hardcover (ISBN 978-1-59691-610-4)

As physical scientists we tend to think of ourselves as rational creatures. We pride ourselves on our skepticism and our nullius in verba approach to scientific questions. Many people are familiar with the works of Carl Sagan, Richard Feynman, and Martin Gardner. The vague platitudes of astrology and the self-delusion of television psychics are not for us. However, scientists such as ourselves are not immune to self-delusion, and even the brightest among us can fall prey to the substitution of wishful thinking for rigorous logic when the science points to conclusions that uncomfortably conflict with our world view.

The tale told by Naomi Oreskes and Erik M. Conway in their book, “Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming,” should be read by all scientists because we need to be reminded that we too are susceptible to the comforts of pseudoscience. The authors illustrate not only that some researchers have used their talents to obfuscate and deny, but that they can be extremely good at it. Indeed, some of these “merchants of doubt” have been so successful that it now appears possible for me, as a physicist, to have a lucrative second career as a professional denier. How did this come to pass? And what are the fault lines in our thinking that have allowed some extremely smart people to profess to believe some very dumb things?

If one surveys the reams of pseudoscience floating around on the Internet, it is obvious that some fields of science are more prone to having their results abused than others. In the case of a snake-oil salesman peddling quack medicine, it is clear that the distortion is driven by a narrow financial interest. But more often the propensity appears when some kinds of science threaten closely held ethical, cultural, or political beliefs.

BUYER BEWARE: The truth can be obscured. Credit: Shutterstock

These beliefs differ among communities, and so the fields of science that are abused vary as well. For instance, the rejection of evolution by natural selection has a powerful constituency in parts of the U.S., but the topic barely raises an eyebrow in Europe. Conversely, the oftentimes antiscientific campaign against genetically modified food in France draws on local cultural affinities to terroir and a latent anticorporatism that is completely alien to most Americans. It is not that religious people in Europe or food lovers in the U.S. are unaware of these issues, but the perception of a threat from the science may not be widely shared even among people who hold similar beliefs.

The pattern of antiscientific tactics by the merchants of doubt is often constant across many different topics, the authors write. Frequently, personal attacks against individual scientists are used in lieu of addressing the substance of their conclusions. Cherry-picking of single outliers to counter well-supported general results is common. The elevation of caricatures of the real science as straw men to knock over is ubiquitous.

But one of the strongest methods to deflect attention away from what the science has actually concluded is to find ways to exaggerate the amount of uncertainty. Since there is always uncertainty in science—scientists work at the boundary between known and unknown—any strongly supported result can be politically “countered” by reference to uncertainty in an assumption, a piece of data, or an experimental procedure regardless of how well characterized that uncertainty is or how robust the original result. This tactic implicitly constructs the logical fallacy of suggesting that because we do not know everything, we therefore know nothing.

The manufacturing of doubt has been a deliberate tactic in many politically contentious circumstances, most famously in the tobacco industry’s multidecade attempt to hide the harmful effects of smoking from the public. Indeed, the famous memo from a Brown & Williamson executive declaring that “doubt is our product” finds its echo in almost every antiscience campaign.

It is rare that one finds a serious researcher supporting the dubious claims of intelligent design or homeopathy (though it occasionally happens). These topics simply don’t resonate with many scientists. There is, however, one fault line amply demonstrated in the book that does factor strongly for a significant group of (mainly U.S.-based) physicists: the idea that the technology and industry that have brought the developed world so much prosperity might in fact be harming the environment that civilization has adapted to. How this group of high-powered physicists, weaned on the postwar period of technological optimism, responded by creating and professionalizing a specific class of science distorters is described in detail by Oreskes and Conway.

New & Noteworthy

Inside the Outbreaks: The Elite Medical Detectives of the Epidemic Intelligence Service, by Mark Pendergrast, Houghton Mifflin Harcourt, 2010, 432 pages, $28 hardcover (ISBN 978-0-15-101120-9)
Tells the history of the Epidemic Intelligence Service, a part of the Centers for Disease Control & Prevention, since its founding in 1951. Because of its success, nearly 30 similar programs have been founded around the world.

Learning To Communicate in Science and Engineering: Case Studies from MIT, by Mya Poe, Neal Lerner, and Jennifer Craig, MIT Press, 2010, 256 pages, $35 hardcover (ISBN 978-0-262-16247-0)

Offers in-depth case studies and pedagogical strategies from a range of science and engineering communication-intensive classes at MIT. The book traces the progress of 17 students from diverse backgrounds in seven classes that span five departments.

Physicists William A. Nierenberg, former director of the Scripps Institution of Oceanography; Frederick Seitz, a former president of the National Academy of Sciences; Robert Jastrow, a former director of the NASA/Goddard Institute for Space Studies; and S. Fred Singer, former head of the National Weather Bureau’s Satellite Service Center, all became experts at distorting aspects of the environmental sciences in very public and political ways, starting with Seitz’s work for the tobacco industry and Jastrow’s lobbying for the Reagan Administration’s Strategic Defense Initiative, “Star Wars.” Evidence of unforeseen environmental damage caused by acid rain, lead pollution, climate change, DDT use, ozone-depleting chemicals, and so on, which might indicate the need for regulation, was seen as a threat to a libertarian dogma that the free market, and the technology used to power it, could do no wrong.

Singer is an interesting example because his activities have touched on almost every environmental controversy over the past 40 years. After his retirement from government service, he associated himself with the tobacco industry’s effort to downplay the effects of secondhand smoke. He wrote numerous “cornucopian” opinion pieces for the Wall Street Journal, forecasting in 1981, among other things, that no move away from oil dependency was needed because the free market would ensure that oil use would be cut in half by the year 2000 (Wall Street J., Feb. 4, 1981, page 23). He downplayed the threat of acid rain, then ozone depletion, and then climate change. In each case his argument was the same: The science did not show a problem. And if it did, the answer was too uncertain to suggest mitigation. In any case, mitigation would not work, and the change was probably good for us.

Singer had found the secret, not to scientific success (his scientific publication record is unimpressive), but to success as a professional denier. He has managed to parlay his skills at obfuscation into a living via an opaquely funded “institute”—the Science & Environmental Policy Project—that supports his advocacy activities to this day.

In the other cases—Jastrow, Seitz, and Nierenberg, all now deceased—the authors trace their paths from impressive science in the early parts of their careers, to success as administrators, and, finally, to increasingly strident abusers of science in their emeritus periods. The institute they founded to promote their ideas, the George C. Marshall Institute, is funded and acts in very similar ways to Singer’s. Among their goals have been to delay and water down whatever efforts are proposed to deal with a suite of environmental problems, particularly carbon emissions, and it is arguable that they have succeeded in preventing much legislation from being passed.

How did deniers such as these become so good at what they do? There are many common threads, as detailed above, but it was probably their understanding of the culture of science that enabled them to turn it against itself so successfully. By reframing a politically driven trawl through the science for results that support a predetermined position as healthy skepticism, they have tapped into scientists’ natural tendency to be suspicious of new results. By exploiting physicists’ confidence that they can understand any new field without too much study, they provide just enough misinformation (and false authority) to fool the unwary into thinking that mainstream specialists had overlooked some bone-headedly obvious factor: for example, that water vapor is a greenhouse gas, that ozone reactions depend on temperature, or that correlation does not prove causation. And by appealing to the scientific community’s normal sense of fair play, they have parlayed their fake controversies and unjustified attacks on individual scientists into “equal time” in many media and political outlets.

Oreskes and Conway have unearthed a treasure trove of primary documents covering decades of this sort of activity that leaves one enormously impressed at the scope of their efforts. But this is not just a history of a time before we all became far too sophisticated to fall for such foolishness. It is worth noting that the heirs of this tradition, such as William Happer, a Princeton University professor of physics and current chairman of the George C. Marshall Institute, continue to operate in the same way.

Oreskes and Conway’s book is no mere history of science, but rather a warning to us all that even scientists with stellar intellectual credentials are not immune from dabbling in pseudoscience that fits their prejudices. For those of us searching for answers, caveat emptor: The merchants of doubt are still peddling their wares.
