Big Science Failing Integrity Test
What happens when the purveyors of knowledge admit they are unreliable?
Editors of the leading Big Science journals, Science and Nature, continue to wring their hands over rampant scientific misconduct. Conflicts of interest, sloppy work, fudging data, plagiarism, non-reproducible results and plain old dishonesty (“fake science”) are just some of the problems they admit to. Paradoxically, industry is more alarmed about reproducibility than academia is, Nature says this week. Notice how academic scientists engage in “questionable practices.”
Despite the advent of important new therapeutics, the number of innovative treatments reaching the patient is disappointingly low. To help rectify this, industry is investing in drug-discovery alliances with peers and academic groups, and in precision medicine. It sees high standards of research quality as the route to the most promising drug candidates and to maximum return on investment.
By contrast, academic scientists may be reluctant to devote extra time and effort to confirming research results in case they fail. That would put paid to publication in high-impact journals, damage career opportunities and curtail further funding. Evidence of questionable practices such as selective publishing and cherry-picking of data indicates that rigour is not always a high priority.
Aren’t those things supposed to be the highest priority? Isn’t the scientific method supposed to be the most dispassionate, disinterested, truth-seeking approach to knowledge? Science seems to deserve the remark Gandhi allegedly made about Western civilization: “I think it would be a good idea.” Without integrity, science is nothing, and you can’t get integrity by the scientific method.
The Science of Science
The AAAS flagship journal Science recently ran a special section devoted to scientific integrity. This would not have been necessary had serious flaws not been worrying the editors. The article titles are instructive:
Research on research (Martin Enserink, Science). It seems like a return to the “science wars” of the 1960s. The efforts below may sound promising, but who will watch the watchers?
Given the billions of dollars the world invests in science each year, it’s surprising how few researchers study science itself. But their number is growing rapidly, driven in part by the realization that science isn’t always the rigorous, objective search for knowledge it is supposed to be. Editors of medical journals, embarrassed by the quality of the papers they were publishing, began to turn the lens of science on their own profession decades ago, creating a new field now called “journalology.” More recently, psychologists have taken the lead, plagued by existential doubts after many results proved irreproducible. Other fields are following suit, and metaresearch, or research on research, is now blossoming as a scientific field of its own.
Journals under the microscope (Jennifer Couzin-Frankel, Science). This article mentions “threats to the scientific enterprise, such as reproducibility, fake peer review, and predatory journals.”
The metawars (Jop de Vrieze, Science). Scientists don’t like being scrutinized. They expect to be trusted just because they are scientists. A meta-analysis is an “analysis of analyses,” like a report card on grading methods. John Ioannidis is one such meta-researcher, whose findings about bias and malpractice have been widely reported. Has it helped? “Meta-analyses were supposed to end scientific debates,” de Vrieze writes. “Often, they only cause more controversy.” What is the situation after years of meta-analysis? Stalemate. Hostility. Resistance to change. Scientists continue to balk at calls for the transparency that meta-researchers say is essential for improvement.
The Truth Squad (Erik Stokstad, Science). Some “metaresearchers” (those researching research) are stepping on toes. Stokstad recounts incidents in which integrity investigators get tough with perpetrators of fake science, while the perpetrators resist the bad report cards. Young scientists who report malpractice often suffer. “At the current pace, it’s going to be 2100 before things are really different,” said one researcher into the non-reproducibility of psychological studies, who felt the backlash from furious institutions exposed with bad grades.
A recipe for rigor (Kai Kupferschmidt, Science). This article promises “a simple strategy to avoid bias” that is “rapidly catching on,” called preregistration.
Preregistration, in its simplest form, is a one-page document answering basic questions such as: What question will be studied? What is the hypothesis? What data will be collected, and how will they be analyzed? In its most rigorous form, a “registered report,” researchers write an entire paper, minus the results and discussion, and submit it for peer review at a journal, which decides whether to accept it in principle. After the work is completed, reviewers simply check whether the researchers stuck to their own recipe; if so, the paper is published, regardless of what the data show.
Simple enough, but scientists are pushing back against this corrective policy. And journal editors don’t like publishing negative results; they want flashy discoveries. Some researchers don’t want to be confined to a hypothesis, arguing that they cannot know in advance of their experiments what they will discover. One standout comment deserves focus: “A lot of scientists are more like lawyers than detectives. They have a theory and they are trying to use the evidence to support it.” The danger of confirmation bias and sophistry is evident.
Toward a more scientific science (Policy Forum, Science). The title implies that what we have now is an insufficiently scientific science. The forum opines on various subjects and acknowledges sources of bias, but never addresses the underlying fact that without integrity, all research is worthless.
French science behemoth launches research-integrity office (Nature). Protecting whistleblowers is one of the goals of France’s new office of research integrity. Great; who will give them their report card?
Science learns from its mistakes, too (Phys.org). Like the French, the Germans are working on repairs. Their goal is to “do everything possible to maintain social trust in science.” One way to do that is to report negative results, which are less popular but important for a complete scientific picture.
This entry will be continued tomorrow. —Ed.