Will Elitist Science Lead to Mind Control?
When you can’t convince the public, try zapping or manipulating them.
A disturbing paper appeared in PNAS recently: “Selectively altering belief formation in the human brain.” Why would anyone want to do that? The abstract explains,
Humans form beliefs asymmetrically; we tend to discount bad news but embrace good news. This reduced impact of unfavorable information on belief updating may have important societal implications, including the generation of financial market bubbles, ill preparedness in the face of natural disasters, and overly aggressive medical decisions. Here, we selectively improved people’s tendency to incorporate bad news into their beliefs by disrupting the function of the left (but not right) inferior frontal gyrus using transcranial magnetic stimulation, thereby eliminating the engrained “good news/bad news effect.” Our results provide an instance of how selective disruption of regional human brain function paradoxically enhances the ability to incorporate unfavorable information into beliefs of vulnerability. (Sharot et al., PNAS, September 24, 2012, doi: 10.1073/pnas.1205828109.)
Much as altruistic people might like to help the misinformed exercise better judgment in financial decisions and disaster preparedness, or to alleviate the sufferings of those with phobias or schizophrenia, this Frankenstein method of “magnetic stimulation” to alter beliefs resurrects visions of psychopolitics (think Soviet Russia or North Korea). People with knowledge of 20th-century history might be justly alarmed by scientists who arrive bearing the promise, “we just want to help you.” Why not put the magnets away and try reasoning with people as fellow human beings?
A reference to “the evolution of overconfidence” reveals where these psychologists are coming from. Evolution, they say, has left us vulnerable to bad beliefs. Humans don’t engage in good Bayesian reasoning. They don’t update their beliefs when presented with new information. “The consequences of readily integrating good news into our beliefs while underweighting bad news are likely to be considerable for an individual and for society.” This raises fearsome questions about what should be done – and by whom. Yikes; look at what they did to their subjects, right out of Frankenstein:
TMS [trans-cranial magnetic stimulation] pulses were delivered by a Magstim Rapid2 Stimulator (Magstim) at 40% of the maximum stimulator output using a small TMS coil (figure-of-eight shape, 50-mm diameter). We used an off-line continuous cTBS protocol, which consisted of three pulses at 50 Hz repeated at 200-ms intervals for a total duration of 40 s. A 5-min rest period was implemented after the termination of cTBS before participants started the task.
Oh sure, they were nice to the 30 adult participants, all of whom had given informed consent. The interviewers were undoubtedly courteous as they asked the subjects questions before and after the “treatment.” But this smells like something out of The Twilight Zone. If “psychologists” and “neuroscientists” can find the secret to manipulating our beliefs with magnets, where will this lead? Will politicians or a scientific oligarchy armed with this information care about “informed consent” when they decide it’s time to “fix society”?
There are even scarier ways, believe it or not, to manipulate people. One is the “nudge” tactic advocated by Cass Sunstein, President Obama’s czar for “Information and Regulatory Affairs.” Don’t force new ideas on people, he advises; they’ll just react. Instead, “nudge” them bit by bit, and over time, they will come around. (This sounds like the “frog in the pot” method.)
Be very worried, then, about what’s on the minds of the Economic and Social Research Council, a group whose motto is “Shaping Society.” In a press release echoed uncritically by PhysOrg, the council discussed the pros and cons of the “nudge” strategy vs. the “think” strategy. Whew; they didn’t toss out the “think” strategy. But what they decided works best could be as deceptive as a half-truth: use both! If people think you’re making them think, when you are simultaneously nudging them, they may think you’re looking out for their best interests.
They didn’t use magnetic pulses on their experimental subjects. They didn’t have to. They thought nudging, and nudged thinking. Remember “community organizing”? Here’s how they envisioned their goal: “The findings are very positive and supports the idea that a local approach using nudge and think techniques can lead to citizens getting involved in collective neighbourhood activities.” Very nice. Strangely, notions of individual liberty were absent from the strategizing about “techniques.”
The ESRC press release seems harmless in itself. After all, rhetoric (the art of persuasion) has a long history. People should want to influence other people. But what if leaders of a society determine that its people need to be deprogrammed from “misinformation”? People can be misinformed, can’t they? Sure; look how many people think Justin Bieber has talent. What’s disturbing is (1) when powerful leaders set themselves up as judges of what comprises misinformation, and (2) when the art of rhetoric turns to the science of manipulation.
Look now at Science Daily’s entry, “Misinformation: Why It Sticks and How to Fix It.” Here’s how the article starts: “Childhood vaccines do not cause autism. Barack Obama was born in the United States. Global warming is confirmed by science. And yet, many people believe claims to the contrary.” The concern here is not whether these statements are true (although regarding global warming, be sure to read yesterday’s entry). What should raise eyebrows is that psychologists are experimenting on humans to figure out how to “fix” their erroneous beliefs. “Misinformation is especially sticky when it conforms to our preexisting political, religious, or social point of view,” the article said, ignoring pre-existing “scientific” points of view, which, by implication, are infallible (10/24/2011). “Because of this, ideology and personal worldviews can be especially difficult obstacles to overcome.” There’s the rub: a worldview, in the minds of these self-proclaimed experts, is an obstacle to be overcome. Think here about how manipulators might want to
nudge, er, overcome a Christian’s views on gay marriage, abortion, or the deity of Jesus Christ. After all, religious people have an ideological worldview, whereas scientists have none. Scientists are objective, rational, and unbiased. They care. They want to help, but all those misinformed Christians with their preexisting religious worldview ideologies are hindering progress.
There’s no disputing that misinformation is rampant in society. A misinformed person, by definition, is “someone who disagrees with me.” Who doesn’t want to win friends and influence people? These psychologists want to do it, too, using reasoned discussion, evidence, and persuasion. So far so good. But the manipulation shows through in some of their methods: (1) “Provide people with a narrative” (code for “talking points”). (2) “Focus on the facts you want to highlight, rather than the myths” (presupposing the psychologists know the difference). Fairfax’s Law comes to mind: “Any facts which, when included in the argument, produce the desired result, are fair facts for the argument.” Points 3 and 4 are sound rhetorical counsel (keep your point brief, know your audience). Point 5, though, is “Strengthen your message through repetition.” Ah yes, brainwashing.
Repetition is not necessarily bad. Every good teacher uses it; students know that practice makes perfect. But let’s say a psychologist wants to deprogram conservatives from their opposition to gay marriage or gun control. Or let’s say “The New Teacher” walks in the door to welcome the children to the new regime (see must-read commentary to 12/21/2005 entry). When the speaker views himself or herself as the elite, and the listeners are the misinformed who need to be educated out of their myths, training in these manipulative arts becomes a powerful and dangerous tool for indoctrination (think 1984).
These worries are not just fiction. A Nature book review on Sept. 6 pointed out the dark history of eugenics. In The Science of Human Perfection: How Genes Became the Heart of American Medicine (Yale, 2012), Nathaniel Comfort described the disturbing legacy of an “ever-evolving group of geneticists, eugenicists, psychologists, medics, public-health workers, zoologists and statisticians intent on using heredity to improve human life” over the span of a century. Lest one think that detour is behind us, the reviewer warned, “Today’s hybridized discipline, he says, is noble in intent but rife with social and ethical questions centred on the ‘illusion of perfectibility’.”
Psychologists are our friends, aren’t they? They just want to help us, don’t they? Just don’t let the populace know that evolutionary psychologists have an unsavory history of fraud (9/05/2012, 8/15/2012, 11/05/2011) and are overwhelmingly leftist in ideology (9/07/2012). Ditto for evolutionary anthropologists and sociologists (2/16/2011).
Maybe the take-home technique should be: Psychologist, nudge thyself.
One of Rod Serling’s most memorable Twilight Zone episodes was “The Obsolete Man,” a parable about an elitist who eliminated others, only to become the victim when the regime came to view him as obsolete too. Psychologists, suppose a time came when your opponents were in the majority. Would you want them to use these techniques of manipulation on you?
Forewarned is forearmed. We may want four arms to combat the manipulative “techniques” that elitists desire to use on all us uncooperative sons of liberty, but one is sufficient, if it is armed with the truth – a word sadly lacking in all the above articles.