Early in the pandemic, as many states started shutting down businesses to slow the spread of the coronavirus, Daniel Sarewitz felt a sense of optimism. Sarewitz, an expert on the interplay between science and society, reasoned that unlike the polarized issues of climate change, genetically modified organisms, and nuclear energy, the pandemic presented an immediate, tangible threat. Surely the necessary role of science in the US response to COVID-19 would bring the country together.
C&EN has made this story and all of its coverage of the coronavirus epidemic freely available during the outbreak to keep the public informed.
“For once, we all agree,” Sarewitz wrote in an editorial that appeared in Slate in March. The cause of the crisis was clear. The consequences of inaction were obvious. “We are thus unified by the shared value of preserving life,” he wrote. “For this crisis, the things that unite us are outranking those that divide us.” He continued, “The threat of COVID-19 is bringing out the best in both science and politics.”
The article didn’t age well. “Dumbest thing I ever wrote,” says Sarewitz, who is codirector of the Consortium for Science, Policy, and Outcomes at Arizona State University. “I underestimated, like so many have, how profound the divisions are in this country.”
The Pew Research Center studies those divisions by the numbers, and it turns out that Sarewitz’s perception of togetherness wasn’t wrong at the time. “One of the first things that we noticed back in March was that we had a brief time of unity,” says Cary Funk, director of Pew’s science and society team. Upward of 8 in 10 people in the US agreed that the unprecedented measures to shut down businesses, schools, and travel were necessary. The agreement began to crumble from there.
“About 4 to 6 weeks later, we started to see partisan divisions in pretty much everything connected to the handling of the coronavirus, including the degree to which it poses a public health threat,” Funk says. By November, in what Funk describes as an unusual rift, 84% of Democrats said that COVID-19 was a major threat to public health, compared with only 43% of Republicans. Americans can’t even agree that the pandemic is a problem. No wonder they’re split on what to do about it. Nearly half the country seems ambivalent about, if not hostile to, the recommendations and warnings of researchers, many of whom feel that the public is losing confidence in their expertise.
Throughout 2020, Trump and his team sidelined, ignored, or lied about science on countless occasions. In spite of epidemiologists’ predictions, Trump repeatedly claimed that the virus would simply “go away”—by Easter, then by summer, then in time for the new school year. His refusal to wear a mask fueled resistance to our most accessible tool for containing the coronavirus. His rallies made a joke of social distancing. In hyping an old drug, he politicized the question of whether hydroxychloroquine could treat COVID-19, ultimately making it harder for scientists to find the answer. Under his watch, the US Food and Drug Administration authorized the use of convalescent plasma without data proving that it worked. Trump’s attempts to accelerate authorization of COVID-19 vaccines sparked distrust among an American public already hesitant about the shots. And he routinely turned to his own experts—whose opinions aligned with his own, rather than the scientific consensus—to guide and justify his action, and inaction, during the pandemic, culminating in more than 400,000 needless deaths in the US by the time he left office.
“It is just extraordinary what we’ve been through in the past year,” says Eric Topol, founder and director of the Scripps Research Translational Institute. “If you were to write a script about how to destroy the credibility of science, we just saw it. It couldn’t have been more of a comprehensive, systematic takedown, because it happened at every level.”
Scientists are worried. And a bit defeated. Over the past year, many have dropped all else to study coronavirus biology, model its spread, or develop new diagnostics, drugs, and vaccines. Yet despite their mobilization to defeat a once-in-a-century public health threat, a sizable portion of the US population roundly rejects their work. Many people continue to say that the virus was made in a lab—to name one lingering conspiracy—or believe the death toll is exaggerated. Others think the threat is real, yet they refuse to get tested, answer calls about contact tracing, or make plans to get a vaccine. And vaccination is where the threat of eroding trust looms largest as the country’s most massive vaccination campaign in history gets off to a fumbling start.
84% of US adults said in November 2020 that they had a great deal or fair amount of trust in scientists to act in the public’s best interests.
Source: Pew Research Center.
Convincing people to take the vaccine is an immediate concern—it’s key to ending the pandemic. But a November Pew poll found that only 29% of US adults said they would definitely get the vaccine, and 31% said they would probably get it. Some researchers are also worried that this vaccination campaign could be a make-or-break moment for a public whose trust already seems to be on the brink. “This is really going to have some long-term effects,” says Emily Brunson, a medical anthropologist who studies vaccine acceptance at Texas State University. “If we do this right, there could be a huge increase in public trust of science, public health, and vaccines. But if this is done poorly, we run the risk of undermining everything, including our childhood vaccination programs. And that would be an absolutely critical loss for our country.”
A lack of trust seems tied to everything that has gone wrong with the pandemic response in the US. The rapid evolution of COVID-19 science, mixed messaging from leaders, a torrent of misinformation, political interference in federal science agencies, and peak levels of polarization threaten to disintegrate public trust in science. Can it survive the pandemic?
When Trump was elected in 2016, researchers at science-driven government institutions like the Environmental Protection Agency and the Department of the Interior began to worry about their independence and the potential disappearance of previously public data. Sure enough, the Trump administration muzzled scientists at environmental and health agencies, stripped information about climate change from government websites, and censored scientific reports and agency press releases.
The public backlash to the perception of an antiscience administration was embodied in the March for Science in April 2017. It was easy for researchers to feel that political leaders were waging a war on science, or at least on data that stood in the way of their economic or ideological agendas. Worse yet was the fear that an anti-intellectual public was increasingly devaluing expertise, including that of scientists. Yard signs popped up across intellectual hubs like Boston and Washington, DC, proudly proclaiming that the occupants believe “science is real,” a seemingly innocuous statement with antagonistic undertones: it implies some people believe science isn’t real.
Given the administration’s treatment of science, you’d be forgiven for thinking that people simply don’t trust science anymore. The data tell a surprisingly different story.
From June 2016 to January 2019, the portion of US adults who said they had a great deal or fair amount of confidence in scientists to “act in the best interests of the public” rose from 76% to 86%, according to Pew surveys. The public’s confidence in religious leaders, journalists, business leaders, and elected officials was far lower. Science seems to be doing all right globally, too. A 2018 Wellcome survey of over 140,000 people in more than 140 countries found that 72% said they trust scientists, and 73% said they would trust a doctor or nurse for medical advice more than anyone else.
Trust in science has hovered around that high-water mark for decades. The General Social Survey has collected data on public perceptions of science and other institutions since the 1970s. “Compared to every other group, confidence in science has stayed relatively stable over time” and is currently second only to confidence in the military, says John C. Besley, a Michigan State University professor who compiles the General Social Survey data for a biennial report from the National Science Foundation. “I spend a lot of time telling people that science is the one institution in society that generally hasn’t lost trust,” he says.
Of course, all these surveys have limitations. Social scientists tend to squirm a bit when you ask them about public trust in science—and rightly so. What constitutes the public? What kind of science are we talking about? The surveys typically ask people about confidence, which is not quite the same as trust, a word that could refer to any number of things, like the credibility of the research or the integrity of the researchers. Those distinctions are important, although the data suggest that no matter which way you slice it, trust in science at large doesn’t seem to be as big a problem as scientists would expect.
“Scientists think science is in crisis,” says Kathleen Hall Jamieson, director of the Annenberg Public Policy Center at the University of Pennsylvania. “There is lower trust in some areas that have become politicized, but the global statement that trust in science is being eroded is simply factually incorrect.”
That doesn’t mean that scientists and the public always see eye to eye. If you ask people whether they trust certain areas of science, “you start to see that trust fall apart, and it falls apart differently for different publics,” explains Timothy Caulfield, research director of the Health Law Institute at the University of Alberta. For example, many people who support strict measures to curb climate change are opposed to genetically modified organisms (GMOs), while some people who tout the benefits of GMOs balk at tougher environmental regulations, he says.
43% of Republicans and 84% of Democrats said in November that COVID-19 was a major threat to public health.
Source: Pew Research Center.
The rift between scientific and public opinion on those two issues is huge. Although nearly 9 in 10 members of the American Association for the Advancement of Science agree that climate change is mostly caused by human activity and that GMO foods are safe to eat, only half of US adults say the same for climate change, and just over one-third for GMOs, according to a 2015 Pew study. Even though the public largely maintains a generic confidence in science, “people are quite capable of viewing scientists as lousy experts when it comes to specific issues that don’t fit their notions of what’s true,” says Sharon Dunwoody, professor emerita of journalism and mass communication at the University of Wisconsin–Madison.
That disconnect has been on full display during the pandemic. The public hailed hospital workers as heroes yet doubted the predictions of epidemiologists and decried the advice of public health officials who urged people to stay home or wear masks if they couldn’t. And the rapidly evolving science of COVID-19 has spurred confusion about how the virus is transmitted and political foot-dragging on instituting simple measures to curb its spread.
The pandemic has placed science under the public microscope. Never before has a scientific issue so immediately and directly affected the lives of everyone at the same time. And never before have scientists created so much new information on one subject so quickly. By early January 2021, PubMed had documented more than 90,000 publications mentioning “COVID-19.”
Most of those papers were not flashy enough to make headlines. The ones that did make the spotlight in the news or on social media typically revealed new, surprising, or contradictory results, with implications that put people either at ease or on edge.
Take, for instance, the vacillating evidence on whether getting COVID-19 once will protect you from getting infected again. It’s a question with personal implications for how to live your life—and of course societal significance for the prospect of one day reaching herd immunity. Hypotheses on this critical question of reinfection ran the gamut. Some asserted that reinfection was impossible, while others offered evidence that immunity was short lived. Epidemiologists, immunologists, and virologists—and a slew of people less qualified to chime in on this issue—brought their own perspectives on what the latest antibody and T-cell studies meant for getting our lives back.
COVID-19 has made armchair immunologists of us all. “Even most other scientists can’t stand immunology. The science is confusing, and the vernacular is awful,” says E. John Wherry, an immunologist at the University of Pennsylvania. Well-meaning scientists who “suffered from a lack of looking at our textbooks” made mistakes interpreting the studies, he adds. Everyone wants answers about how long immunity lasts, but scientists, and the public, need to follow the data and be careful about jumping to conclusions, he says.
Scientists usually debate and resolve questions like reinfection over years, sometimes decades, in conference halls and academic journals. The urgency of the pandemic has warped that scientific process and put its accelerated results on display. “All of a sudden everyone is watching scientists, and they are seeing all the messiness of how science happens,” says Kasisomayajula “Vish” Viswanath, professor of health communication at the Harvard T. H. Chan School of Public Health. Or, as Caulfield puts it: “People are seeing the sausage being made, and they don’t like what they see.”
Rapidly evolving science understandably caused some confusion during the pandemic, but the problem was compounded by mixed messaging from the health officials charged with making sense of it all. Those leaders floundered on the most critical of issues: how the virus spreads and how best to curb it.
62% of US adults worried in September that the “FDA will rush to approve a coronavirus vaccine without making sure that it is safe and effective due to political pressure from the Trump Administration.”
Source: KFF Health Tracking Poll.
60% of US adults said in November that they would definitely or probably get a COVID-19 vaccine.
Source: Pew Research Center.
Early in the pandemic, the US Centers for Disease Control and Prevention (CDC) and World Health Organization (WHO) both emphasized that the virus primarily spread through close contact. Social distancing guidelines were based on the assumption that droplets of viral particles from sick people who were coughing or sneezing traveled only short distances in the air before falling to the ground. People quickly grew accustomed to the plexiglass sneeze guards installed at every cash register and floor markers telling customers to stand 6 feet apart in grocery stores. And health officials said not to wear masks.
“STOP BUYING MASKS!” the US surgeon general, Jerome Adams, wrote on Twitter in late February. Health-care workers need masks when caring for sick patients, but they are “NOT effective” for the general public, he said. And in an interview with 60 Minutes on March 8, 2020, Anthony Fauci, the National Institute of Allergy and Infectious Diseases director, said, “There’s no reason to be walking around with a mask.” Those strident remarks would linger in the minds of many and foment distrust of future public health guidelines. “The worst thing a scientist can do in this uncertain situation is to say that we know all the facts, because that can be quickly disproven, and that can quickly lead to breakdown of trust,” says Declan Fahy, who studies the public communication of science at Dublin City University.
What came next was an academic debate with real-world consequences. By the end of March, some scientists began pointing to emerging evidence that the virus could spread through microscopic aerosols that linger in the air and drift across rooms more easily than the larger virus-laden droplets that social distancing measures were based on. In common parlance, the new data suggested that the virus was airborne, but admitting that would have frightening implications for businesses, hospitals, or any place with lots of people indoors.
By April, in light of evidence that infected people could spread the virus before showing symptoms, the CDC began to support wearing masks as a way to help stop that spread when people can’t maintain social distancing. Adams and Fauci chimed in and encouraged people to wear masks too. The WHO wouldn’t make that recommendation until early June.
Yet on the topic of airborne transmission, despite pressure from scientists around the world, both organizations continued to waffle. And even when the CDC and WHO ultimately acknowledged the possibility of airborne transmission, they downplayed its importance.
Throughout it all, people kept on scrubbing surfaces and spraying disinfectants—measures that a growing number of scientists said would likely do little to stop coronavirus infections. And at many testing centers in the US, people without symptoms weren’t allowed to get tested for COVID-19—a guideline that the CDC temporarily promoted and that likely aided the virus’s spread. “The CDC used to be the gold standard in this country,” says Jo Handelsman, who was the point person for the Ebola epidemic when she served as the associate director for science at the White House Office of Science and Technology Policy from 2014 until early 2017. “The CDC was the first place I’d call to get answers, and it is just so sad that they’ve lost so much credibility.”
The about-face on masks did little to help a country already deeply cynical and skeptical of the government. Many science communication experts think the generalized statements about masks could have benefited from more transparency and nuance; for example, officials could have cited evidence showing that some masks are better than others but that any mask is usually better than no mask.
“COVID will be used someday as the worst example of risk communication in the modern era,” says David Rejeski, who studied communicating risks of synthetic biology and nanotechnology when he was director of the Science and Technology Innovation Program at the Woodrow Wilson Center. And the implications could be broader than this pandemic. Research suggests that “once respect is lost, it is very hard to get it back,” UW-Madison’s Dunwoody says. “Even if the loss of trust is focused on a specific issue, I can imagine that loss of trust generalizing to blaming the larger population of experts in the long term.”
The rapidly evolving science surrounding the coronavirus overlaid with mixed messaging from political leaders and public health authorities has allowed misinformation to take root. A Pew study in June found that one in four adults in the US thought the pandemic was definitely or probably “planned by powerful people.” Likewise, a Yahoo News and YouGov poll in May found that 28% of US adults believed that Bill Gates wanted to use COVID-19 vaccines to implant microchips in people and track them. Another 32% said they were unsure, and only 40% said it was outright false.
Gordon Pennycook, a behavioral scientist at the University of Regina, worries that such unfounded conspiracies may become “increasingly normalized” and thinks we should take the results of the survey “seriously.” The WHO warned of an “infodemic of false information about COVID-19”—what some call a misinfodemic. Caulfield says the abundance of misinformation is one of the biggest problems of the pandemic. “On a scale of 1 to 10, it is an 11,” he says. “Misinformation has resulted in hospitalizations and deaths, financial loss, increased discrimination, and confused science and health policy.”
Misinformation does not always have conspiratorial roots. Sometimes it arises from a reasonable, albeit unproven, idea. Early in the pandemic, scientists began looking for old drugs that could be repurposed as new treatments for the coronavirus. Two old, inexpensive antimalarials—chloroquine and its chemical cousin hydroxychloroquine—quickly emerged as contenders after small studies in monkey cells showed they could stop the coronavirus from spreading. By mid-March, David Boulware, an infectious disease doctor at the University of Minnesota Twin Cities, had launched the first controlled clinical trial to test whether hydroxychloroquine could prevent infections in people exposed to COVID-19.
Then the drug got swept up in the misinfodemic. The confusion started at a White House briefing March 19, when Trump touted hydroxychloroquine as a way to stave off the virus, saying that even “if things don’t go as planned, it’s not going to kill anybody.” He also said that the FDA had rapidly approved the drug for COVID-19; in reality, the FDA had merely given Boulware permission to begin his clinical study.
Trump doubled down on his promotion after a small, and widely criticized, study in France suggested that people with COVID-19 who got hydroxychloroquine pills with or without the antibiotic azithromycin recovered faster than people who didn’t get the drugs. He tweeted that the two drugs could be “one of the biggest game changers in the history of medicine.” Three days later, an Arizona man died after ingesting a fish tank cleaner containing chloroquine. Politically conservative networks promoted hydroxychloroquine incessantly. To Boulware’s shock, on March 28, the FDA issued an emergency use authorization (EUA) permitting doctors to prescribe the drug to hospitalized COVID-19 patients. “There was no data to actually support it,” he says.
Then in April, a flurry of small, conflicting studies, many of which lacked appropriate controls and were plagued by confounding variables, began to suggest that the drug didn’t work or, worse yet, could cause heart problems. The FDA issued a warning about the risks, and enrollment in Boulware’s study and others tanked. Through it all, Trump boasted about taking the drug himself. “Everything became very political, and if you were anti-hydroxychloroquine, you were anti-Trump,” Boulware says.
The mess morphed into crisis May 22 when a study published in the Lancet concluded that people who got either chloroquine or hydroxychloroquine were twice as likely to die as people who didn’t get either drug. But scientists quickly questioned the reliability of the data, which came from a little-known company called Surgisphere, and the journal retracted the paper shortly thereafter. Tweeting about the fiasco, conservative commentator Laura Ingraham said, “Next time they say ‘trust the science,’ remember this.”
It was not a good look for science. Some researchers applauded the retraction as an example of science at work, but others wondered how the study got through peer review. “The self-correcting nature of science says that new studies show other studies are wrong. We retract studies because they should not have been published in the first place,” says Dietram A. Scheufele, a professor of life sciences communication at UW-Madison.
Despite being stalled by the hydroxychloroquine hype, the normal scientific process ran its course, and data from controlled trials of the drug began emerging. The day before the Lancet retraction, Boulware published his study showing that hydroxychloroquine couldn’t prevent people from getting COVID-19. Two days later, investigators of a large trial in the UK said they would stop testing hydroxychloroquine because data showed it did not prevent deaths. The FDA eventually revoked the EUA for the drug on June 15.
Study after study showed that hydroxychloroquine didn’t work, yet it remained a partisan flash point into the fall. In September, a quarter of US adults, including half of all surveyed Republicans, claimed that hydroxychloroquine was an effective treatment for COVID-19, according to a KFF Health Tracking Poll. “Once a particular position becomes associated with an ideological perspective, it becomes increasingly difficult to talk about it in a dispassionate manner,” Caulfield says. Boulware saw this phenomenon as well. “People had made up their mind. They either believed it worked, so why would you want to be in a clinical trial and potentially get a placebo, or they believed it was a dangerous drug” and didn’t want to enroll in a trial, he says. “In truth, neither group was correct. It was a relatively safe medicine; it just didn’t work.”
Scientists would have studied hydroxychloroquine regardless of Trump’s affection for it. But they probably would not have launched more than 200 trials, many of which never produced meaningful data. “When the president says something, journalists can’t just ignore it. They are forced to cover it,” Harvard’s Viswanath says. “It becomes a spiral of amplification, and that’s how we have mainstream misinformation.”
The hydroxychloroquine fiasco was a product of slapdash, contradictory scientific studies coupled with politically motivated misinformation. The furor among doctors and scientists in particular wasn’t just about this one drug but about the FDA’s rejection of its scientific credo in favor of making pills more easily available. The FDA’s handling of hydroxychloroquine “was an outrage,” says Kenneth Kaitin, the recently retired director of the Tufts Center for the Study of Drug Development. He commends the agency for ultimately revoking the EUA, “but that actually made them look even more capricious and that they didn’t know what they were doing.” The agency’s integrity would soon be questioned again.
In the dog days of summer in the US, two topics on the country’s mind—vaccines and the impending election—began to meld. While Moderna and Pfizer enrolled volunteers in large clinical studies to determine if their vaccines were effective at preventing COVID-19, many people began to worry that Trump would push to get the shots authorized in October to boost his reelection campaign. A KFF Health Tracking Poll in September found that 62% of US adults, including 85% of Democrats, worried that the FDA would rush the vaccine approval because of political pressure. And the portion of US adults who said they would definitely or probably get the vaccine dropped from 72% to 51% between May and September, according to Pew surveys.
The FDA did little to help its own credibility. On Aug. 23, the eve of the Republican National Convention, Trump, Department of Health and Human Services (HHS) secretary Alex Azar, and FDA commissioner Stephen Hahn gathered in the West Wing of the White House for a press conference. They were there to announce an EUA for using convalescent plasma to treat people hospitalized with COVID-19. Some doctors were already giving people the experimental plasma treatment, which is the antibody-rich fraction of blood collected from people who recovered from the disease. Researchers at the Mayo Clinic were tracking the results. “It’s had an incredible rate of success,” Trump said. Plasma improved survival by 35%, Hahn added. The headline of the accompanying FDA press release called it “Another Achievement in Administration’s Fight Against Pandemic.”
Scientists instantly and roundly criticized Hahn. The nation’s top drug regulator had bungled his description of the data, which were questionable for several reasons. Mayo’s retrospective study had not undergone peer review, it lacked a control or placebo group, and Hahn overstated plasma’s benefit by some 30 percentage points.
Scientists were furious at the contortion of the science for use as a political lever. “They lied about the level of benefit,” Scripps’s Topol says. And by granting the EUA, the agency had hijacked efforts to get good answers about plasma’s effectiveness—after all, why would someone enroll in a rigorous, placebo-controlled trial and chance not getting the therapy when it was readily available? “It also undermined the credibility of not just Trump and the HHS but also the FDA,” Topol says. “That was a very low point in the year for the FDA, perhaps the lowest, much worse than hydroxychloroquine.”
25% of US adults said in June that the pandemic was definitely or probably “planned by powerful people.”
Source: Pew Research Center.
The way the FDA communicated to the public about plasma “will erode precious public confidence,” former FDA commissioner Scott Gottlieb said on Twitter. “You earn public confidence in small drops and you [lose] it in buckets.”
In September, Hahn said the FDA would institute stricter requirements for the vaccine EUA than what it demanded of hydroxychloroquine and convalescent plasma. In addition to requiring the vaccine to be at least 50% effective, the FDA said companies needed a median of 2 months of safety data after participants got their shots—a decision that would make it unlikely for the vaccines to be authorized before the election. In a press briefing, Trump said the decision “sounds like a political move.” Meanwhile, scientists applauded Hahn. “I give him a lot of credit. I don’t know if that would have happened if the plasma thing hadn’t been such a disaster,” Topol says of the FDA’s stricter EUA requirements. “I had no confidence in the process in September, and now I have very high confidence.”
In November, Pew and Gallup polls found that the number of people willing to get the vaccine had risen to about 60%, likely because of a combination of Joe Biden’s election win and preliminary data showing that Pfizer’s and Moderna’s vaccines are more than 90% efficacious.
Now that the FDA has authorized the two vaccines, the coming months will present the new challenge of convincing people to take them. The Reagan-Udall Foundation for the Food and Drug Administration, a nonprofit that works with the FDA, has conducted listening sessions with essential workers and Black, Hispanic, and Indigenous people to hear their concerns about the vaccines and ultimately address them in future FDA messaging.
Among the common concerns: the lightning-speed discovery and clinical testing of the vaccines as well as racial biases in their development. Some voiced fears that people of color would be used as “guinea pigs” or that the shot they’d get would be different from the ones that White people would get—distrust rooted in a history of medical research’s exploitation of Black people and ongoing bias and inequity in medical care.
Other themes were the public’s general distrust of government and health care, and worries that politics and money are taking priority over science, according to Reagan-Udall CEO Susan Winckler.
“The politics has not helped at all,” National Institutes of Health director Francis Collins told C&EN in early December. “And we have worked really hard and will continue to emphasize that none of this is happening in a fashion that is driven by anything other than the best science and the most rigorous evaluation by objective evaluators who are not political appointees.” Collins hopes that the public will take that message to heart once it’s time for people to roll up their sleeves. “It will be a terrible tragedy if this resistance persists to the point where we fail to get to the end of this epidemic when we could have.”
Researchers are also worried about someone getting sick or dying soon after they get a COVID-19 vaccine. “It is inevitably going to happen, and it may have nothing to do with the vaccine. It just has to do with the fact that people get sick and die every day,” says William Hallman, a psychologist at Rutgers University–New Brunswick and former chair of the FDA’s Risk Communication Advisory Committee. But people will likely blame the vaccines, and that could cause people to lose trust in the process that brought us the shots to begin with, he says.
If you were trying to write a playbook for how to destroy public trust in science, the combination of mixed messaging, misinformation, and political interference would be a good start. But against all odds, Pew found that there hasn’t been a large decrease in that trust during the pandemic. From January 2019 to November 2020, the percentage of US adults who said they had a great deal or fair amount of trust in scientists to act in the public’s best interests decreased only slightly—from 86% to 84%. The portion saying they had a great deal of confidence in scientists even increased slightly.
At first glance, trust in science seems stable, but if those numbers seem to belie reality, it’s because there’s an important caveat: confidence increased among Democrats but not Republicans. The differences are starkest when focusing on those who said they had a great deal of confidence: 55% of Democrats said they had a great deal of confidence in scientists in November 2020, up from 43% in 2019. The portion of Republicans saying the same was 22%, down from 27% in 2019. That 33-point rift is the largest Pew has seen since it began asking the public about confidence in scientists in 2016.
Some researchers worried about this division early on in the outbreak. Dominique Brossard, chair of the Department of Life Sciences Communication at UW-Madison, recalls that when she was asked in March for her most dire prediction about the pandemic, she said that “COVID-19 could become like climate change, where it is no longer a public health issue but a highly political and social issue.” Unfortunately, she says, her prediction came true. Researchers from the University of Michigan and UW-Madison found that the levels of politicization and polarization in news coverage of COVID-19 from March to May equaled or exceeded levels found in climate change coverage.
“Trust in science and public health officials is really secondary to partisan tribal loyalties or affiliations,” says Chris Jackson, senior vice president of US public affairs at the polling group Ipsos. Numerous surveys indicate that the division between Republicans and Democrats holds up in just about every aspect of the pandemic related to science, whether it’s about COVID-19 death counts, social distancing, mask wearing, confidence in vaccines, or regulatory and scientific institutions themselves. Democrats are even twice as likely as Republicans to say they trust National Institute of Allergy and Infectious Diseases director Anthony Fauci, whose popularity has put his face on doughnuts and bobblehead figures.
“We now have a highly polarized political environment in which partisans find it convenient to discredit science that is inhospitable to their views,” the University of Pennsylvania’s Jamieson says. The polarization happened over years with climate change and seemingly overnight with COVID-19. “Science didn’t create the polarization; it is a victim of the polarizing environment.”
Part of the problem is identifying science’s proper role in decision-making. Science helps people describe problems, narrow down a list of possible solutions, and predict their potential outcomes. But science is not prescriptive, and it is one of many factors that politicians and society should weigh. “Many in the scientific community were hoping that science would not only inform but determine policy,” UW-Madison’s Scheufele says. But social, ethical, and economic considerations may factor more heavily in the decision-making process for some people. People can agree that everyone would be better off wearing masks and getting vaccinated but still oppose mandating those measures.
“Everyone will say they want evidence-based policy making, but they disagree on what constitutes sound science,” says Shobita Parthasarathy, director of the Science, Technology, and Public Policy Program at the University of Michigan. Since there will always be gaps in our knowledge, people can always argue that we need more data before taking action. “These are really values debates masquerading as scientific debates,” Parthasarathy says. “And putting them into the language of science is bad for science, but it is also bad for politics and democracy, because we are not being honest about what we are debating. There are legitimate value differences that we can and should be able to have honest conversations about.”
Decisions about what research to fund or whether we should mandate masks or enact regulation to curb climate change are policy questions and thus inherently political. “But science shouldn’t be partisan,” Brossard says. “That is the danger, and that is what happened in this country. Wearing a mask became a partisan statement.” It doesn’t help that many states and cities have adopted slipshod measures that attempt to control the virus in one sliver of society but allow it to run rampant in others. What’s the scientific basis for instituting nightly curfews and closing churches and parks but allowing bars and restaurants to open for indoor dining?
Trump’s detractors like to label him, his former administration, and his followers as antiscience. But many scholars of science policy and science communication, despite openly despising Trump’s policies, say the label is inaccurate. After all, the US has spent more than any other nation on developing and manufacturing COVID-19 vaccines and novel therapies—including Regeneron Pharmaceuticals’ antibody cocktail that Trump was privileged to get when he was infected. The problem, rather, is that Trump’s interest in science is highly selective. Vaccines and antibody therapies sounded like magic bullets and were largely made in America, to boot. Masks, shutdowns, and social distancing, in contrast, demanded both personal responsibility and policy mandates.
“We are all looking to science for this technological fix,” Rutgers’s Hallman says. “The reason we like technological fixes is that we don’t have to change our individual behaviors. We trust the science that leads to the gadgets. Buying stuff is easy. But changing our own behaviors and opinions is hard.”
People who do change their behaviors have a tendency to insult people who don’t—names like “antimaskers” and “covidiots” come to mind. Michigan State’s Besley worries about these communication choices. “We can’t find any evidence that it helps to call people idiots,” he says. And he’s not joking. Besley published a study in 2019 in which he took an article about the “war on science” and reframed it in two ways: the “challenge for science” and the “neglect of science.” Liberal-leaning readers who perceived the “war on science” version as more aggressive than the other two framings rated scientists as more credible; conservative-leaning readers who perceived it as more aggressive rated scientists as less credible.
The war-on-science framing is a good way to get Democrats riled up, but it comes at the potential cost of growing the partisan divide. “In wars we have winners and losers. We take sides, and the solution is conquering and defeating your enemy,” Besley says. “Do we want people to see scientists as angry, frustrated people or people who are doing our best to solve problems to make the world better?”
Back in April, the University of Alberta’s Caulfield optimistically predicted the pandemic would lead to “a greater appreciation of the value of good science,” he recalled in an op-ed published in July. But his hopes had diminished by then. “I now fear that this pandemic will cause trust in science to be irreparably harmed,” he wrote. Today, he thinks the pandemic’s legacy will be more complex. “There is this greater recognition of the importance of science, but in addition to that, maybe because of that, you are seeing science and representations of science being twisted by various entities,” he says.
That twisting underscores the power and appeal of scientific rhetoric. Trump, for instance, sensationalized science that supported his cause—like hydroxychloroquine and convalescent plasma—and attempted to silence or sideline science that was at odds with it. The problem could be worse. “If people were to not trust the scientific process in general and science as the best way society has for identifying and working through knowledge, then we have an almost unsolvable problem, because that is the problem of Enlightenment,” Scheufele says. People on both sides of a debate invoke science and reason in their arguments, says Arizona State University’s Sarewitz. “If one side says something is about the facts, will the other side really say they don’t care about the facts? You’d be crazy to do that.”
It is tempting, but inaccurate, to boil the problem down to one of science education. People who argue against the reality of human-caused climate change, the usefulness of masks, or the safety and effectiveness of vaccines often appeal to scientific, albeit erroneous, logic. “These people are not antiscience,” explains Maya Goldenberg, a philosopher who studies vaccine acceptance and hesitancy at the University of Guelph. “We just disagree on who is the legitimate expert and who’s got the right science.”
Of course, that disagreement is no small problem, and in her upcoming book, Goldenberg argues that the root cause of vaccine hesitancy is an eroding trust in scientific institutions rather than science itself. People who vaccinate their kids don’t understand the science of vaccination better, she says; they just trust the scientific consensus that vaccines are safe and effective. And if you think that the CDC or the FDA primarily serves corporate or government interests rather than the interests of the people, then why would you trust their consensus?
The erosion of confidence in government institutions at large has been well documented by the General Social Survey, Pew, and others. Although that declining trust hasn’t done the CDC or FDA any favors during the pandemic, these agencies are not blameless. The CDC was slow to respond to the latest scientific consensus about the virus, and the FDA bowed to political manipulation; in doing so, these institutions drew harsh criticism from their closest allies: scientists themselves. “I think someone found the CDC’s list of what to do in a pandemic and thought it was a list of what not to do,” Texas State’s Brunson says.
But worried citizens and scientists should remember that the heads of the CDC and the FDA are political appointees. “With the right president who makes the right appointments, I think we can recover quickly,” says Handelsman, the former White House scientist. “These directors come and go, but that agency is filled with career scientists who still do their jobs.”
It’s never a good time for a pandemic, but it’s hard to imagine a worse year for one in the US than 2020, when polarization reached fever pitch. Far too often, science did not have its proper place, or any place, at the table. Yet for all the mistakes, negligence, and lies by political leaders, and in spite of the perfect storm of mixed messaging, misinformation, and unapologetic politicization, the public somehow retained a sizable, and in some groups slightly growing, trust in science. Incredibly, of the more than 30 researchers, social scientists, and science policy experts interviewed for this article, nearly all think science will emerge in a positive light at the end of the pandemic—especially if those technological fixes like therapies and vaccines are ultimately responsible for putting an end to COVID-19.
But the outlook is not entirely upbeat. As this article went to press, pro-Trump rioters breached the US Capitol, fueled by some Republicans’ unfounded claims that the presidential election was stolen. Compared with that physical assault on democracy, the erosion of public trust in science may seem like a cerebral threat. Yet both attacks, on democracy and science, are rooted in misinformation and compounded by polarization.
The science of COVID-19, like the science of climate change before it, has been dragged into the widening rift between Democrats and Republicans. And as those “science is real” yard signs demonstrate, science is increasingly finding itself tacked onto a list of liberal political values. Much like Republicans positioned themselves as the party that prioritized national security during the Cold War, Democrats are increasingly branding themselves as the party that cares about science, Sarewitz says. That framing will likely only extend partisan fissures.
“By signaling that science is a cause that aligns with Democratic causes, we’re taking it away from [more than 74] million voters who just voted non-Democrat in the last election,” Scheufele says. Scientists lean liberal, and Scheufele worries that if people start viewing researchers and their institutions as agents of the Democratic Party, science’s bipartisan support could crumble.
The same polarization that’s been tearing the US apart for the past decade has sabotaged its response to the pandemic. If the vaccines are ultimately successful, science may well put an end to this crisis, but the US didn’t have to lose more than 400,000 lives in the process. Scientists warned that a pandemic was inevitable. They never said it was going to be easy, but it never had to be this hard.