April 16, 2021 | David F. Coppedge

Who Decides What Is Misinformation About Science?

Can scientists recognize their own fallibility? Must they portray themselves as judges over the public?

PNAS published a colloquium on “Misinformation in and about science” last week. Here are short synopses of the submitted papers:


Introduction by Scheufele et al., “Misinformation about science in the public sphere.” PNAS 13 April 2021.

Bouncing off the Covid-19 pandemic and recalling some misleading statements by scientists, this group concedes some fallibility on science’s side, such as retractions by leading institutions. Most of the misinformation, though, it blames on the public.

The misinformation crisis exemplified and intensified by the COVID-19 pandemic lays a gauntlet at the door of all science communicators. Scholars, experts, educators, activists, organizers, public servants, and philanthropists share an obligation to engage in “difficult, broad-based negotiation of moral, financial, and other societal trade-offs alongside a collective investigation of scientific potential”. In the end, it is our hope that this colloquium issue will stimulate deeper explorations of the causes and cures for misinformation, conducted in closer collaborations among researchers and practitioners.


Howell and Brossard, “(Mis)informed about what? What it means to be a science-literate citizen in a digital world.” PNAS 13 April 2021.

This paper contains gobbledygook about “scientific literacy” that presupposes that scientists have it but the unwashed masses do not. Howell and Brossard ramble about how best to get the public in line with what Big Science knows to be true.

If science literacy is to truly enable people to become and stay informed (and avoid being misinformed) on complex science issues, it requires skills that span the “lifecycle” of science information. This includes how the scientific community produces science information, how media repackage and share the information, and how individuals encounter and form opinions on this information. Science literacy, then, is best conceptualized as encompassing three dimensions of literacy spanning the lifecycle: Civic science literacy, digital media science literacy, and cognitive science literacy.


Valerie Reyna, “A scientific theory of gist communication and misinformation resistance, with implications for health, education, and policy.” PNAS 13 April 2021.

Reyna makes a distinction between heart and mind, alleging that getting the “gist” of a scientific claim is more a matter of emotion than of understanding. For instance, trading on the association fallacy, she advances the biased big lie that Darwin skeptics don’t understand evolution:

Topics such as evolution, vaccination mandates, and stem-cell therapy seem to involve the heart, or motivational biases, more than the mind. Misinformation often pulls at the heart, such as case stories of children who develop autism or rare neurological diseases shortly after vaccination.

Darwin skeptics could point to many examples of emotional rants against them by Darwinists.


Jevin T. West and Carl T. Bergstrom, “Misinformation in and about science.” PNAS 13 April 2021.

These authors, refreshingly, allow some of the blame to fall on the scientists. This is the only paper in the series that mentions “fraud,” and then only once, stated as a possible risk: “Given that top journals often look for exciting results of broad impact, these policies encourage researchers to hype their work. Worse still, they may encourage fraud.”

One can sniff out their political biases, though. They repeat the myth that science is self-correcting: “We stress that while science has its problems, it incorporates mechanisms to correct mistakes” (but see 6 Jan 2021). That is not a virtue unique to science; any publisher should incorporate mechanisms to correct misinformation. Moreover, scientists are not particularly good at it when consensus paradigms are at stake. Note: “filter bubbles” (echo chambers) are groups whose members agree with one another without hearing opposing opinions.

Humans learn about the world by collectively acquiring information, filtering it, and sharing what we know. Misinformation undermines this process. The repercussions are extensive. Without reliable and accurate sources of information, we cannot hope to halt climate change, make reasoned democratic decisions, or control a global pandemic. Most analyses of misinformation focus on popular and social media, but the scientific enterprise faces a parallel set of problems—from hype and hyperbole to publication bias and citation misdirection, predatory publishing, and filter bubbles. In this perspective, we highlight these parallels and discuss future research directions and interventions.


Watts, Rothschild, and Mobius, “Measuring the news and its impact on democracy.” PNAS 13 April 2021.

The use of the codeword “democracy” (i.e., socialism) by these anti-Trumpers signals an upcoming rant about the 2016 election. With understatement, they imply that the misinformation came from the pro-Trump side, although they concede some culpability on the part of the mainstream media (how?) for repeating the Republican misinformation.

The origin of this extraordinary surge in interest in a previously sleepy topic was of course the 2016 US presidential election, which, along with other events that year such as Brexit, raised widespread concerns about a possible rise of populist/nationalist political movements, increasing political polarization, and decreasing public trust in the media. Early reporting by journalists quickly focused attention on fake news circulating on social media sites during the election campaign. The philanthropic and scientific communities then responded with dozens of conferences and thousands of papers studying various elements of fake news. Reinforced by continued mainstream media attention and increasing congressional scrutiny of technology companies, the conjecture that the deliberate spread of online misinformation poses an urgent threat to democracy subsequently hardened into conventional wisdom.


Michael F. Dahlstrom, “The narrative truth about scientific misinformation.” PNAS 13 April 2021.

Dahlstrom advocates more storytelling! “While narrative can indeed lead to scientific misinformation, narrative can also help science counter misinformation by providing meaning to reality that incorporates accurate science knowledge into human experience.” He knows that storytelling itself risks creating misinformation, but he advocates it anyway as a way to get people emotionally activated. That is dangerous. It is not the business of science.

Acknowledging these complex intersections, narrative remains meaningfully distinct from science based on contrasting evaluations of truth. However, does this difference necessarily lead to scientific misinformation? While narrative may be distinct from science, that does not necessarily mean it is deficient. Narratives may lack the generalizability of science, but they retain more of the surrounding complexity that scientific formats commonly discard, such as the emotional meanings and motivating factors for action. Additionally, in science communication contexts where understanding this type of content is desirable, narrative may serve the preferred role.


Yeo and McKasy, “Emotion and humor as misinformation antidotes.” PNAS 13 April 2021.

These two go further, advocating humor as a way for scientists to combat misinformation about their laudable, squeaky-clean work. This is another attempt to manipulate the public without sufficient self-reflection about scientists’ own fallibility. Be forewarned. Evolution is already funny enough without the help of these storytellers.

Recent research sheds light on how funny science and emotions can help explain and potentially overcome our inability or lack of motivation to recognize and challenge misinformation. We identify some lessons learned from these related and growing areas of research and conclude with a brief discussion of the ethical considerations of using persuasive strategies, calling for more dialogue among members of the science communication community.

Scientists at the University of Utah were quick to echo this opinion (immediately amplified by popular science outlets like Phys.org and Science Daily) about the usefulness of “funny science” for propaganda. “Understanding how emotion and humor shape the public’s understanding of science is one more resource that can aid communicators’ efforts to combat misinformation,” they say, provided it is “used ethically.” How to use it ethically was not explained; that detail was pushed off into futureware: “It is essential that we engage in dialogue about the ethical considerations that face science communication in the digital media era.” Will that dialogue include the public? Once again, the assumption seems to be that science communication must be one-way: from the elite scientists who hold the information to the misinformed laypeople.


Michael A. Cacciatore, “Misinformation and public opinion of science and health: Approaches, findings, and future directions.” PNAS 13 April 2021.

This paper focuses primarily on retracted scientific papers, which keep spreading misinformation through the “continued influence effect” (CIE) among members of the public who never hear about the retractions (see 6 Jan 2021). However, Cacciatore fails to address the problems of scientific fraud and misconduct. This omission tends to perpetuate the myth of the morally neutral scientist in the white lab coat who didn’t mean to publish a flawed result. He also locates conspiracy theories (e.g., skeptics of climate change on social media) only outside the scientific consensus, revealing a bias toward Big Science as his own filter bubble.

A common thread in much of the literature cited in this paper is a focus on individuals—typically everyday citizens—and their perceptions. Of course, mis/disinformation can also influence other populations, including political elites, the media, and funding organizations. Indeed, it is arguably most impactful when these audiences are reached as they represent potentially powerful pathways to political influence. Unfortunately, there is a relative dearth of work in this space, at least as compared to studies focused on individual perceptions.

Cartoons by Brett Miller; used by permission.


Not part of the colloquium but related to the topic of misinformation is a short press release from the University of Waterloo, published 8 April 2021, “The truth about doublespeak: is it lying or just being persuasive?” As expected, it shows that manipulating language can sweeten fibs and sidestep charges of lying.

“Like the much-studied phenomenon of ‘fake news,’ manipulative language can serve as a tool for misleading the public, doing so not with falsehoods but rather with the strategic use of euphemistic language,” said Alexander Walker, lead author of the study and a PhD candidate in cognitive psychology at Waterloo. “The avoidance of objectively false claims may provide the strategic user of language with plausible deniability of dishonesty, thus protecting them from the reputational cost associated with lying.”

Where have you seen that taking place in the news recently? The authors could have just read the CEH Baloney Detector for definitions and examples of loaded words, big lies, half-truths, and the whole toolkit of propagandists and deceivers.

How to lose respect for scientists. Look what the consensus promoted in the past!

As usual, these papers by scientists and for scientists downplay the role of scientists in the propagation of misinformation. It’s always the unwashed masses who are guilty. Members of Big Science, wearing their D-Merit Badges, hold truths from on high that get twisted by you, the layperson. Communication of truth is always one-way.

CEH thinks that any scientific education should include courses and good books on the philosophy of science (to teach students about science’s limitations), the history of science (to remind them of terrible blunders by the scientific consensus), and the sociology of science (to warn them about filter bubbles and echo chambers). Some reading on the rhetoric of science (how narratives are framed) would be helpful as well.

It would be so refreshing for scientists to actually interview people with the goal of learning from them. There are very smart and wise thinkers outside the scientific establishment who have much to say about science and its role in policy and public understanding. One recent good example was posted April 5 on Prager U by Dr. Brian Keating, who debunks the myth of “Following the Science.” While CEH does not agree with everything he says, we appreciate his candor about the fallibility of scientists.
