Science’s Got Troubles
Numerous news articles point to moral shortcomings in Big Science that threaten public trust.
The US Constitution was a great idea. But John Adams once said, “Our Constitution was made only for a moral and religious People. It is wholly inadequate to the government of any other” (US Archives). Similarly, the “Scientific Method,” as it is popularly conceived, is a great idea with a long train of spectacular successes. But science is always mediated by fallible human beings. Misuse of scientific methods could produce fake science or even evil science.
We have all heard how foreign countries have tried to manipulate elections with disinformation campaigns. Imagine what would happen to public trust in science if political entities, or even artificial intelligence (AI) algorithms, became so clever with scientific disinformation that journal editors and reviewers could not tell the true from the false. An experiment like this was actually run recently. The Wall Street Journal tells how three researchers submitted 20 bogus papers to journals. Seven were accepted, and four were published – including one that quoted sections of Hitler’s Mein Kampf as supporting evidence.
Even the most rigorous safeguards, rules, and regulations are only as good as those who follow them. Professors and grad students are busy, distracted by various temptations and motivations that can be less than noble (q.v. the Ig Nobel Prizes). For a dose of reality about how scientific sausage is made, consider some of the worries roiling Big Science right now, and pay attention to the implication in each proposal: science has been failing in many ways.
Predatory publishers: the journals that churn out fake science (The Guardian). Pay a fee and pad your resume with published papers, good or bad. The worry here is predatory journals and the temptation of filthy lucre, showing that the love of money is the root of all kinds of evil, even in science. On the flip side, the article complains that mainstream journals routinely deny publication to maverick ideas.
Ghost authorship haunts industry-funded clinical trials (Nature). Big Science is haunted. “Drug companies make big contributions to analysis in the trials they fund but can fail to report their contributions,” Matthew Warren writes. There are even ghosts in the data. In a large number of trials, funders had access to the data and a hand in the methods:
About 21% of the academic authors indicated that a funder, or one of their contracted employees, had been involved in the design, analysis, or reporting of the research in a way that had not been declared in the paper. This “ghost authorship” could potentially be more widespread than this, write Rasmussen and her colleagues, as academic researchers who had a relatively small role in a study may not have been aware of the extent of industry involvement.
Rasmussen says she was surprised by how common these undeclared contributions and associated issues were. “It’s incredibly inaccurately reported,” she says. “The roles of the funder were often downplayed or even omitted in the publications, funder employees rarely had first or last authorship despite having played a role in every single part of the trial.”
No mixed motives in those papers. Money can buy politics; it can also buy science. And sometimes lives are at stake, when patients trust the results of a clinical trial that could have been manipulated to profit the funder. Science Daily posted a related story on this problem.
How three research groups are tearing down the ivory tower (Nature). The subtitle points out another shortcoming in Big Science: overlooking indigenous people. “The people who should benefit from research are increasingly shaping how it’s done,” the authors say, complaining that “traditional research” has tended to be “myopic.”
What ‘data thugs’ really need (Nature). Keith Baggerly argues, “Science needs to develop ways and means to support the checking of data.” Retracted papers, lawsuits, halted clinical trials, sloppy research, faulty statistics, retaliation against whistleblowers – these are all addressed in Baggerly’s tour of the sausage factory. “Corrections are much rarer than they should be,” he worries. You can’t expect vigilantes to shore up science’s ideals of self-correction.
Biased Estimates of Changes in Climate Extremes From Prescribed SST Simulations (Geophysical Research Letters). Lack of integrity is not the only potential source of fake science. Carelessness about bias can also produce it. In this paper, researchers found that data on sea surface temperatures (SSTs) can be fraught with bad assumptions or bad methods. “Our results illustrate the importance of carefully considering experimental design when interpreting projections of extremes.” Note to world leaders: these are the climate scientists who inform politicians, telling them that “science says” we must take drastic measures or we will die (e.g., “Terrifying climate change warning: 12 years until we’re doomed,” Fox News). They’re also the ones telling politicians how to nudge skeptics into following the consensus without questions (“Confronting Climate Science in the Age of Denial,” PLoS Biology).
Science’s credibility crisis: why it will get worse before it can get better (The Conversation). Bad news: Science has a credibility crisis. Worse news: It will get worse before it gets better, argues Andrea Saltelli, because poor ethics has invaded modern science. Psychology and economics have taken embarrassing hits, but other branches of science cannot escape what Jerome Ravetz warned of in a 1971 book: that science can become diseased without ethics. Social scientists, still smarting from the “science wars” of the 1990s, are reluctant to confront the problem, fearing for their image (by popular opinion, “scientific realists” won the war).
John Ioannidis has recently gained prominence for producing statistics on the “science of science,” showing how widespread fake science has become, but he is optimistic that science’s reputation can be resuscitated. The author of this article, Andrea Saltelli of the University of Bergen, does not share his optimism.
Here we clash with another of science’s contradictions: at this point in time, to study science as a scholar would mean to criticise its mainstream image and role. We do not see this happening any time soon. Because of the scars of “science wars” – whose spectre is periodically resuscitated – social scientists are wary of being seen as attacking science, or worse helping US President Donald Trump.
Scientists overall wish to use their moral authority and association with Enlightenment values, as seen in the recent marches for science.
If these contradictions are real, then we are condemned to see the present crisis becoming worse before it can become better.
Austrian agency shows how to tackle scientific misconduct (Nature). This optimistic headline quickly informs the reader that Austria got worse before it got better: “A decade on from a major academic scandal, officials there have got their act together,” the editorial says. Of course, it will never happen again, will it? The Editors list four lessons learned from the scandal and describe laws intended to prevent future scandals. But like constitutions, laws are “made only for a moral and religious People.” They are “wholly inadequate to the government of any other.”
There are so many recent articles on this subject that we will continue tomorrow.