Making People Drunk for Science
Even when an experiment is declared legal, ethical, and transparent, and the participants give their approval, there are things scientists should not do.
It was for a good cause. Shouldn’t science determine whether alcohol has harmful effects on human judgment? Four scientists just wanted to know. They asked whether drinking makes people selfish. Does it make them hoard rewards for themselves rather than share resources with others? This would be good to know, wouldn’t it? Maybe government leaders need such data to write better regulations about alcohol.
Four researchers at Claremont Graduate University used double-blind, controlled experimentation. They got the necessary approvals, obtained permission from the participants, and paid them for their involvement.
Methods: The Institutional Review Board of Claremont Graduate University approved this study (IRB #2175) with sessions held at the Center for Neuroeconomics Studies in Claremont, CA. Written informed consent was obtained from all participants before they were included in the experiment. There was no deception of any kind and participants were assigned an alphanumeric code to mask their identities. Participants were informed that they would either consume alcohol or a placebo during the study. Tasks were incentivized with money and participants could earn up to $65 depending on their decisions and the decisions of others. Anonymity was maintained by having a lab administrator who was not associated with the study pay participants their earnings in private at the study’s conclusion.
And so the experiments were done. The results were published in PLoS One by Zak et al., “Alcohol unleashes homo economicus by inhibiting cooperation.” What species is “homo economicus,” you ask? That’s Selfish Man, as opposed to homo reciprocus, [lower case theirs], Cooperative Man. The authors believe that the latter evolves into the former when given alcohol. It’s all very Darwinian, you see; cooperation and defection are fitness strategies in Evolutionary Game Theory.
Cooperation in one-shot settings is ubiquitous. But then so are defection and free-riding. A substantial set of mathematical models, laboratory experiments, and empirical analyses have sought to determine when people are likely to cooperate or be selfish. Cooperative behaviors are predicted by multi-level evolutionary models in which individuals in a group out-compete other groups. Costly signaling of one’s value as a future collaborator, known as indirect reciprocity, also supports cooperation. But, indirect reciprocity requires observation of behavior by others.
The results showed that alcohol inhibits cooperation. It was all statistically verified.
We found that moderate alcohol consumption reduced contributions to a public good pool by 32%. Those who consumed alcohol earned 64% more money because they interacted with more cooperative placebo participants. Alcohol also doubled the number of participants who were complete free-riders, contributing nothing to the public good.
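For readers wondering how the drinkers could contribute less yet walk away with more money, it helps to spell out the payoff arithmetic of a standard linear public goods game (a sketch only; the paper’s actual endowment, group size and multiplier may differ). Each of n players receives an endowment e and contributes some amount c_i to a common pool; the pool is multiplied by a factor m, with 1 < m < n, and split equally among all players, so each player’s payoff is

\pi_i = e - c_i + \frac{m}{n} \sum_{j=1}^{n} c_j

Because m/n is less than 1, every unit contributed returns less than a unit to the contributor. With, say, e = 10, n = 4 and m = 2, a free-rider grouped with three players who each contribute 10 earns 10 + 0.5 × 30 = 25, while each of the cooperators earns only 0.5 × 30 = 15. That is the mechanism behind the result: the less-cooperative alcohol participants pocketed their endowments while still collecting a share of what the more generous placebo participants put into the pool.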
A government wishing to ensure its citizens contribute to the public good could use this study to nudge people away from drinking, right? Maybe there are other drugs that could be used to promote cooperation, too.
Our findings show that homo economicus is alive and well and that alcohol is enough to bring him out. A variety of factors besides alcohol reduce prosocial tendencies, including high levels of testosterone and serotonin depletion. The present study was not designed to capture the contribution of changes in neurotransmitters on cooperation, but this is a rich area for future research.
Yes, experimenting on human lab rats (12 May 2021) remains a “rich area for future research.” Giving males drugs can help reduce their toxic masculinity (14 Aug 2020) or put them in cooperative mental states. Adjusting hormones and drugs in citizens can help totalitarian governments reach utopia, like the soma drug that kept the populace compliant in Aldous Huxley’s Brave New World.
If it is dangerous, or wrong, or both, and if it doesn’t need to be done, we just ought not to do it. —Robert Pollack
Ethics Should Rule Science, Not Vice Versa
In the aftermath of last month’s lifting of restrictions on human embryo experimentation (27 May 2021), several scientists wrote to Nature to voice their concerns. Josephine Johnston, Françoise Baylis and Henry T. Greely, writing in Nature on June 22, warned of the “grave omission of [an] age limit for embryo research.” They do not assume the Christian pro-life position that a fertilized embryo has human rights, but
At some point, the developing human embryo reaches a stage at which it should not be used for research. There is disagreement about when that happens, but scientists need to acknowledge that it does, and reassure the public that they accept limits. The latest guidelines do not prohibit the development or research use of ex vivo embryos at any stage.
… but they should. These scientists “share deep concerns about the latest guidelines from the International Society for Stem Cell Research (ISSCR).”
Just Say No
In the same issue of Nature (June 22), Matthew Cobb and Robert Pollack offered a rule of thumb for all scientists: “A dangerous, wrong or unneeded experiment? Don’t do it.” They recalled the 1975 gathering of scientists at Asilomar, California, after recombinant DNA technology became possible. After intense discussion, the scientists agreed to place a moratorium on such experimentation. A letter that year by Pollack and Joe Sambrook stated what should be a lesson for today’s scientists, including those at the ISSCR:
Amid today’s debates about heritable gene editing, viral gain-of-function research and embryo experiments beyond 14 days, these words from the letter resonate: “We ought to ask ourselves whether the experimental results are worth the calculable and unknown dangers to ourselves and to the general population … we are obliged to ask ourselves whether the experiment needs to be done, rather than if it ought to be done, or if it can be done.”
Sadly, Cobb and Pollack point out, this letter was never sent. The two didn’t want to risk “antagonizing senior colleagues who might be hostile to the idea of limiting research.” Well, Pollack is now a senior colleague himself, and Sambrook has since died. Not wanting the message to be forgotten, Pollack resurrected his letter:
The letter concluded: “If it is dangerous, or wrong, or both, and if it doesn’t need to be done, we just ought not to do it.” Then, as now, what is the right experiment to do should not be determined by scientists alone.
Will today’s senior colleagues be hostile to the idea of limiting research? Who will have the courage to stand up to them when they step over the line?
What do you think about the alcohol experiment? Send us your comments. Are there better ways to find out what alcohol does to impair cooperation? Don’t we have enough information about that from police reports and other sources? Were the experiments dangerous, wrong, or necessary? Is it OK if the participants agreed to take part?
Apply the warnings about dangerous and wrong experiments to the current controversy about the origin of the coronavirus. Four health experts at The Conversation say that ethical standards for gain-of-function research are “constantly evolving” –
The main point is that our understanding is constantly evolving. Just before the COVID-19 pandemic began, the U.S. government had started to review and update its policies. It is an open question what lessons will be learned from this pandemic, and how that will reshape our understanding of the value of gain-of-function research. One thing that is likely to happen, though, is that we will rethink the assumptions we have been making about the relationships between biological research, security and society. This may be an opportunity to review and enhance systems of biosecurity and biosafety governance.
Too little, too late? The Asilomar moratorium was 46 years ago! With nearly four million people dead from Covid-19, possibly due to a lab leak (which is looking more plausible every day), can we trust the scientists to act in the best interests of the public?
Look at the conflict of interest that came to light when Peter Daszak directed US funds to the Wuhan Institute of Virology to support gain-of-function research on bat viruses before the pandemic began (WND). Once the pandemic was underway, Daszak in turn spearheaded the “Statement in support of the scientists” published by The Lancet on Feb 19, 2020, which defended the Wuhan scientists against “rumors and misinformation” claiming that the lab had released the virus, either accidentally or intentionally. This month, Daszak also had to walk back his lie that no bats were studied at the Wuhan lab after videos surfaced showing that, in fact, they were. Now The Lancet has had to publish an Addendum admitting an undisclosed conflict of interest by Daszak. See the complete story in the Epoch Times.
Remember other historic occasions when Big Science supported “research” with profound consequences and questionable ethics: the atomic bomb, the Tuskegee study, eugenics. Science in 2021 has power to kill and destroy like never before, even to change human nature. There are some using climate change as an excuse to genetically engineer human beings (see Tucker Carlson Tonight 6/22/2021). Scientists should be expected by default to act in their own interests, not necessarily the public interest. That is why Big Science cannot regulate itself. Even external ethicists, however, cannot be trusted if they do not build on a foundation of righteousness and integrity.