Your Brain on Self-Deception

Pathological Altruism

Barbara Oakley, Ariel Knafo, Guruprasad Madhavan and David Sloan Wilson (editors). 2012. Oxford University Press, New York. 496 pages.

Thinking, Fast and Slow 

Daniel Kahneman. 2011. Farrar, Straus and Giroux, New York. 512 pages.

A Mind of Its Own: How Your Brain Distorts and Deceives 

Cordelia Fine. 2005, 2008. W.W. Norton & Company, New York. 256 pages.

A cartoon published in the British magazine Punch in 1909 depicts a young member of the fledgling Boy Scout movement lending his arm to an elderly lady (portraying England). The image, although eventually stripped of its intended satire, quickly caught the public’s imagination and has persisted in Western pop culture ever since: an enduring symbol of the best that human nature has to offer.

But the icon did not remain unblemished. An old joke asks, “Why did it take three large Boy Scouts to help the old lady across the street?” The answer, of course, is “Because she didn’t want to go.”

Is it just possible that we, like those Boy Scouts, might sometimes deceive ourselves into thinking we are helping when our actions are in fact hindering? This phenomenon, among other forms of killing kindness, is the focus of an intriguing collection of essays titled Pathological Altruism. The book is edited by a varied team of researchers headed by Barbara Oakley, an engineering professor whose work focuses on the connection between neuroscience and social behavior.

Pathological Altruism is not a straightforward piece of storytelling. Rather, it’s a compilation of scholarly research in which each chapter offers insight into a different manifestation of self-sacrifice’s dark side. While not exhaustive, the book serves as an effective introduction to a very new concept whose implications, though not yet fully understood, are staggering. “Many harmful deeds—from codependency to suicide martyrdom to genocide—” Oakley and her colleagues declare, “are committed with the altruistic intention to help companions or one’s own in-group.” What researchers want to know is how these good intentions disintegrate into pathology—social abnormalities or malfunctions that have a negative effect on the altruist and/or the targeted beneficiary.

Of course, some will argue that a “pathological” altruist cannot be a true altruist and that the term is therefore something of an oxymoron. This is the stance taken by philosopher Bernard Berofsky in his essay, “Is Pathological Altruism Altruism?” Berofsky argues that when it comes to defining the term, motivation is everything: if the motives of pathological altruists differ from those of “normal” altruists, then they are not altruists at all. Although in the end Berofsky concedes the label for convenience, he maintains that some would-be benefactors only appear to have unselfish intentions; a distortion takes place between motive and intention, and it is this distortion that distinguishes them from real altruists. Admittedly, pathological altruism must involve distorted thinking—“cognitive distortion”—or all altruists would have to be considered pathological. But must these distortions occur only between motivation and intention? Undoubtedly the human mind is capable of utterly deceiving itself about either one. Perhaps in the case of suicide bombers and dictators, both of whom are identified as possible pathological altruists in this volume, the distortion comes after the motivation to save the world (a motivation few would argue with) and before the intention to implement some twisted plan for accomplishing that goal.

The problem goes well beyond such obviously misguided individuals, however. Consider those who hoard animals with the aim of saving them from harm. Researchers Jane N. Nathanson and Gary J. Patronek suggest that for such people the distortion may occur in the motivation itself. “Hoarders often report social histories characterized by dysfunctional human relationships,” they write. “Childhood history may interact with loss to create even more difficult challenges in coping, which predisposes hoarders to pursue compulsive animal ‘caregiving’ as an avenue of self-repair.” Of course, the hoarder’s brain doesn’t interpret this motivation as one of self-repair. The addictive rush of positive feelings the hoarder experiences from animal caregiving is interpreted by the brain as the byproduct of selflessness (a cognitive distortion). A hoarder whose animals are in tragic states of neglect due to their unmanageable numbers is certain he or she is rescuing the creatures from a fate far worse. Surely in such a case it could be said that an altruistic motive, as much as the ensuing intention, rests on a distortion.

“Some of human history’s most horrific episodes have risen from people’s well-meaning altruistic tendencies.”

Barbara Oakley, Ariel Knafo and Michael McGrath, “Pathological Altruism—An Introduction” in Pathological Altruism

We are further assured by psychiatrists Madeline Li and Gary Rodin in “Altruism and Suffering in the Context of Cancer” that altruistic motivations, like all human behaviors, arise from a combination of influences. Genetic predispositions may not show themselves unless or until they are activated by aspects of a child’s home environment, and there’s no question that mental health in adulthood is closely tied to the quality of early attachment, bonding and socialization experiences. Psychologists Carolyn Zahn-Waxler and Carol Van Hulle add that a troubled childhood can leave us vulnerable to a variety of thinking distortions that affect how we express altruism.

Still, we don’t need the excuse of a troubled childhood to become addicted to the pleasurable chemical rush we get from certain states of mind. One especially insidious example is self-righteousness, a form of self-deception that can also lead to misguided benevolence. In a particularly compelling chapter titled “Self-Addiction and Self-Righteousness,” science writer David Brin suggests that self-righteousness is indeed at the bottom of at least some forms of wrong-headed altruism. He writes, “We all know self-righteous people. (And, if we are honest, many of us will admit having wallowed occasionally in self-righteousness ourselves.)” But while it may be easy for us to admit many of its drawbacks, we might hesitate to acknowledge our enjoyment of what he describes as its “heady, seductive, and even . . . well . . . addictive” qualities; “any truly honest person will admit that the state feels good.” He adds that you can’t help but love “the pleasure of knowing, with subjective certainty, that you are right and your opponents are deeply, despicably wrong. Or, that your method of helping others is so purely motivated and correct that all criticism can be dismissed with a shrug, along with any contradicting evidence.”

Self-righteousness and its accompanying indignation at the faults of others can feel so validating that we can’t resist returning over and over again for more of the drug. We are all potential addicts. “Indeed,” points out Brin, “one could look at our present-day political landscape and argue that a relentless addiction to indignation may be one of the chief drivers of obstinate dogmatism and an inability to negotiate pragmatic solutions to a myriad [of] modern problems.”

Of course, cognitive distortions of various kinds are closely tied to addictions, even addictions to states of mind like self-righteousness. They underlie all forms of pathological altruism, but perhaps they do not always fall in the space between motivation and intention, as Berofsky suggests. Self-deception can occur at any stage in our thinking—even on the metacognitive level. Metacognition, or the ability to think about how we are thinking, usually allows us to root out at least some of the errors and biases that can potentially distort the workings of our mind.

Unfortunately, sometimes people simply fail to entertain the notion that they may have biases. In other cases they may know biases are possible but believe themselves capable of recognizing their own, even in the face of ample evidence that they are not. Forensic criminologist Brent E. Turvey calls this “metacognitive dissonance.” Clearly we are all subject to the problem at various levels, deceiving ourselves about our biases as well as through them.

This book doesn’t pretend to answer every question about what causes self-sacrifice to go awry, and it is certainly not a fireside read. It does, however, succeed in its aim: to open the door to further study in this burgeoning field and to suggest that the quality we most respect in human nature—altruism—may not always be as honorable as we think.

Best Face Forward

Sadly, a belief in human nature’s virtue is only one of many forms of self-deception we can fall prey to. In Thinking, Fast and Slow, Daniel Kahneman reveals what he has learned from his Nobel Prize–winning research in judgment and decision making: human beings are not the rational agents economists and decision theorists have traditionally assumed. This is not to say that humans are irrational. Rather, says Kahneman, they “are not well described by the rational-agent model.” Although a psychologist, he was awarded the Nobel Prize in Economic Sciences precisely because his research challenges traditional economic theory. Old-school economists have held that the test of rational thinking is whether a person’s beliefs and preferences are internally consistent. For instance, rational thinkers would not reverse their preferences simply because a choice is framed in different words, but real people do. According to Kahneman, expecting people to think the way economists have traditionally theorized is “impossibly restrictive” and “demands adherence to rules of logic that a finite mind is not able to implement.”

“The voice of reason may be much fainter than the loud and clear voice of an erroneous intuition, and questioning your intuitions is unpleasant when you face the stress of a big decision. More doubt is the last thing you want when you are in trouble.”

Daniel Kahneman, Thinking, Fast and Slow

In other words, we cannot be internally consistent because, far more than we imagine, we are ruled by hidden influences when making judgments and decisions. This is the human mind’s System 1—in Kahneman’s terminology, the fast process that operates automatically and usually outside our awareness. It’s that impressionable gut feeling that looks for patterns and coherence and that makes us feel complacent unless challenged. And it does its best to ignore challenges if it can. Most of our thoughts originate here, but it’s the job of System 2 (the logical, controlled process) to step in when the input becomes more demanding or when effortful self-control is required. Normally, says Kahneman, System 2 gets the last word, but only if it can get one in edgewise; and it is often too busy or lazy to intervene. As long as we feel the sense of “cognitive ease” that System 1 is so willing to provide, we don’t call in System 2 forces.

System 1 is what allows us to make snap decisions in situations that call for them. But it is highly vulnerable to error. Because it operates on emotions, impressions, intuitions and intentions, says Kahneman, it is “gullible and biased” toward believing whatever it is confronted with, and it searches its experience for information that confirms these biases and beliefs. Its job is to jump to conclusions, but these are necessarily based on limited evidence because System 1 operates on the basis of what Kahneman dubs “What you see is all there is.” Trusting this notion, we construct whatever reasonably coherent story we can using whatever information might be within easy reach of our intuition (and our intuition does not allow for information that fails to come to mind, much less information it never had in the first place). Among other problems, this leads to overconfidence. Fallible though it may be, we prefer the “evidence” of our own memory and experience to any kind of fact-checking from outside ourselves.

To illustrate the fallibility of experience, Kahneman tells of the time he taught Israeli Air Force flight instructors the science of effective training. After he explained that rewards work better than punishment in improving performance, one of the instructors objected that his long experience as an instructor had taught him otherwise. Whenever he had praised a flight cadet for excellence, the cadet did worse on the next try. Conversely, when he reprimanded a cadet for bad execution, the next try was better. “So please don’t tell us that reward works and punishment does not,” the instructor said, “because the opposite is the case.” Any of us might be tempted to jump to the same conclusion, but as Kahneman explains, the experience described by the instructor did not, in fact, teach a lesson about reward and punishment but about a principle known as “regression to the mean.” Both high and low performances are usually followed by an attempt that is closer to average, simply because significant variances from the mean usually have more to do with luck than anything else. Rather than viewing the individual’s performance over an extended period, the instructor was making judgments about the effects of praise based on the cadet’s single next performance, which statistically would almost certainly be closer to the average than his more memorable attempt. The instructor’s experience was true, but the conclusion his intuition drew from the experience was wrong.
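To see why the instructor’s intuition misfired, it may help to make the statistics concrete. The short Python sketch below is an illustration of regression to the mean, not anything from Kahneman’s book: it assumes each attempt is simply a fixed skill level plus random luck (the numbers 70 and 10 are hypothetical), and that praise or reprimand has no effect whatsoever on the next attempt.

# A minimal simulation of regression to the mean (an illustration,
# not from Kahneman's book). Each attempt is modeled as fixed skill
# plus random luck; feedback has no effect on performance at all.
import random

random.seed(42)
SKILL = 70.0        # hypothetical average performance score
LUCK_SPREAD = 10.0  # hypothetical attempt-to-attempt noise

def attempt():
    """One performance: skill plus luck, nothing else."""
    return random.gauss(SKILL, LUCK_SPREAD)

def mean(values):
    return sum(values) / len(values)

# Pairs of consecutive, statistically independent attempts.
pairs = [(attempt(), attempt()) for _ in range(100_000)]

# Attempts good enough to earn praise, or bad enough to earn a reprimand.
praised = [(a, b) for a, b in pairs if a > SKILL + LUCK_SPREAD]
scolded = [(a, b) for a, b in pairs if a < SKILL - LUCK_SPREAD]

print(f"after praise:    {mean([a for a, _ in praised]):.1f} "
      f"-> {mean([b for _, b in praised]):.1f}")  # next try looks worse
print(f"after reprimand: {mean([a for a, _ in scolded]):.1f} "
      f"-> {mean([b for _, b in scolded]):.1f}")  # next try looks better

Both follow-up averages land near the cadets’ true skill level of 70, so the praised group appears to decline and the reprimanded group appears to improve; the apparent effect of feedback is produced entirely by chance.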

Those who have a hard time believing this or any of Kahneman’s other claims about the quirks of System 1 need only try his examples at home; doing so will make a believer out of even the most severe skeptic. It is humbling as well as enlightening to see oneself defeated by the wiles of System 1 despite having been warned in advance of its methods. “Changing one’s mind about human nature is hard work,” Kahneman observes, “and changing one’s mind for the worse about oneself is even harder.”

The Mirror Crack’d

With our cherished illusions about human nature’s heart and mind by now well and truly shattered, it may be time to revisit Cordelia Fine’s 2005 heads-up on the state of the human brain. Seven years after its publication, A Mind of Its Own: How Your Brain Distorts and Deceives is still bang on the mark. Fortunately it is also easier for today’s busy readers to sink their teeth into than the two more recent, weightier volumes already discussed.

“You might think that if there’s one thing in this world you can trust, it’s your own brain,” Fine begins in the introduction. “You two are, after all, as intimate as it is possible to be.

“But the truth of the matter—as revealed by the quite extraordinary and fascinating research described in this book—is that your unscrupulous brain is entirely undeserving of your confidence. It has some shifty habits that leave the truth distorted and disguised. Your brain is vainglorious. It’s emotional and immoral. It deludes you. It is pigheaded, secretive, and weak-willed. Oh, and it’s also a bigot. This is more than a minor inconvenience.”

“Your poor, deluded gray matter sees what it expects to see, not what is actually there. The moral? Treat with the greatest suspicion the proof of your own eyes.”

Cordelia Fine, A Mind of Its Own

Fine’s point, not to put too fine a point on it, has been made before. In fact, long before Kahneman, Oakley and their respective colleagues began their research—or even walked the earth—an ancient writer recorded a similar assessment. Like modern researchers, he found that humans are not fully rational and that the brain distorts much of the information that comes its way. Further, he described the human mind as “deceitful above all things and beyond cure,” and then asked the rhetorical question, “Who can understand it?” (Jeremiah 17:9, New International Version). The prescribed antidote was a mercilessly honest examination of the mind’s inner workings.

Interestingly, researchers today come to a similar conclusion. Only by being scrupulously vigilant about how we are thinking, they tell us, can we hope to get a handle on our thoughts and judgments. As Kahneman puts it, “little can be achieved without a considerable investment of effort. . . . System 1 is not readily educable.” Still, he offers a seemingly simple principle: “Recognize the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2.” Unfortunately System 2 is often sluggish in coming to our aid. Illusions and biases in our thinking are hard to recognize, and we would rather not recognize them if we can help it. As a result, we fail to exercise caution when we need it most. If we do happen to stumble upon our mind’s biases in a mirror, we quickly look the other way.

Fine notes it isn’t just our biases that lead us astray, however. Indeed, she points out, “an ignoble agenda—the desire to see evidence for a belief we’d secretly prefer to hold—can wreak its prejudicial influence on our opinions.” Worse, she declares, “even when we genuinely seek the truth, our careless data collection and appraisal can leave us in woeful error about ourselves, other people, and the world.”

We don’t like to look in that inward mirror; it can be very unpleasant to identify our self-deceptions, or even to entertain the possibility that we may have them. We feel much safer avoiding what Fine calls “disturbing self-revelations.” But the downside of safety is that shrouding the mirror in a cloak of self-deception leaves us unable to compare “what I am doing” to “what I should be doing”—an exercise, Fine points out, that is “essential for keeping the self in line.”

“Ironically, it is part and parcel of the vanities and weaknesses of the human brain that we secretly doubt that we ourselves are vulnerable to those vanities and weaknesses.”

Cordelia Fine, A Mind of Its Own

Of course, there is more to keeping the self in line than simply identifying differences between how we think and how we should think. The clear implication of all this research is that an actual change in the way we use the mind is required. Having seen our real face in the mirror, the next logical step must be to do something about it. But what? Fine suggests the way forward is to resist our brain’s wiles. “Wisdom is the principal thing,” she quotes from the Bible; “therefore get wisdom: and with all thy getting get understanding” (Proverbs 4:7, King James Version).

What the research leads us to understand is that we have no real basis for the trust we routinely place in our own thoughts and judgments. The brain’s capacity for self-deception is almost limitless, and unless we accept and act on that understanding, wisdom will remain elusive.