Delusions—beliefs divorced from reality and yet firmly, oftentimes fiercely, maintained by people who are otherwise rational—are of great interest to me. People experiencing these kinds of delusions don’t hear voices or see visions or display signs of mental disturbance. “Functioning is not markedly impaired, and behavior is not obviously bizarre or odd,” according to the most recent edition of the Diagnostic and Statistical Manual of Mental Disorders, published by the American Psychiatric Association. Even if you were to spend extended periods of time with someone harboring an isolated delusion, you most likely wouldn’t notice anything out of the ordinary—unless the conversation veered toward the subject of the delusional belief.
I once assumed that delusions were rare and almost always associated with mental illness, but as I discovered, delusions frequently occur among perfectly “normal” people. In a 2011 study, 1,000 participants drawn from the general population were asked to rate the intensity of their conviction (“weak,” “moderate,” or “strong”) about what researchers determined were delusion-like beliefs, which were defined as “false beliefs different from those that almost everyone else believes.” Ninety-one percent reported some level of acceptance of one or more such beliefs, and 39 percent expressed strong beliefs, most notably regarding political, social, and science-related topics. A Harris poll found that a substantial minority of adults in the United States report a belief in ghosts. Such findings suggest that delusions are widespread, often subtle, and sometimes merge imperceptibly into normal thinking.
If, as seems reasonable to assume, we have delusions more often than we think we do, then how, in a given instance, can we be certain that we aren’t suffering from a delusion? This question is especially important today, when 85 percent of U.S. adults regularly use the Internet, which can spread delusional beliefs with the rapidity and randomness of plagues. Delusion-inspired ideologies, such as those that sanction violence as a response to religious differences, proliferate on the Web. They harken back to pogroms, witch hunts, and other persecutions that targeted individuals and groups thought to be associated with devils and demons.
We may assure ourselves that nothing like that could happen today. We may take comfort in the thought that delusions can always be confidently separated from logically valid conclusions. But can they? To distinguish the offbeat from the just plain crazy, most of us rely on our inner feelings of certainty about such things as, say, our name, location, close friends, and family. Yet research reveals equally strong feelings of certainty about delusional beliefs. It’s fair to say that for “normals” as well as for those harboring delusions, the strength of one’s convictions isn’t a totally dependable indicator of the correctness of one’s conclusions.
SOMETHING IS GOING ON
As an entry point to understanding the delusional state of mind, consider this account by English anthropologist and explorer Francis Galton (1822–1911) of a deliberately induced delusion using himself as the subject:
The method tried was to invest everything I met, whether human, animal, or inanimate, with the imaginary attributes of a spy. … I started on my morning walk from Rutland Gate, and found the experiment only too successful. By the time I had walked one and a half miles, and reached the cabstand in Piccadilly … every horse on the stand seemed [to be] watching me, either with pricked ears or disguising its espionage. Hours passed before this uncanny sensation wore off, and I feel that I could only too easily re-establish it.
Galton recognized that the horses were not singling him out for attention; he was self-inducing the delusion-like experience. Psychiatrists refer to this ability to stand back and take a critical stance toward one’s inner thoughts and experiences as insight. It distinguishes imagination and fantasy from psychosis.
As Galton’s description makes clear, a delusion may start with a misperception or misinterpretation of what’s happening. Since all of us are susceptible at every waking moment to misperceptions and misinterpretations, a high prevalence of delusion in the population should come as no surprise.
Contrast Galton’s deliberately induced and controlled experience with this description (originally published in 1960 in the Canadian Medical Association Journal) by a schizophrenic woman during what she referred to as an “exaggerated state of awareness.”
At first it was as if parts of my brain “awoke” which had been dormant, and I became interested in a wide assortment of people, events, places, and ideas which normally would make no impression on me. … I felt that I was duty-bound to ponder on each of these new interests, and the more I pondered the worse it became. The walk of a stranger on the street could be a “sign” to me which I must interpret. Every face in the windows of a passing streetcar would be engraved on my mind, all of them concentrating on me and trying to pass me some sort of message.
Based on description alone, distinctions between Galton’s auto-experiment and the patient’s psychotic delusional experience aren’t easy to make. Both share a common aura of threat described as a “delusional atmosphere” by German psychiatrist and philosopher Karl Jaspers (1883–1969): “Something seems in the air which the patient cannot account for, a distrustful, uncomfortable, uncanny tension invades him. There is some change which envelops everything with a subtle, pervasive and strangely uncertain light.” As one of Jaspers’s delusional patients expressed it, “Something must be going on.”
SEEING ISN’T ALWAYS BELIEVING
Any one of us may explain inexplicable experiences in delusion-like terms yet stop short of succumbing to a full-blown delusion. We know this on the basis of an experiment carried out by psychologist Brendan Maher.
Maher asked volunteers to track a target on a computer screen while watching the movement of what appeared to be their hands manipulating a joystick. But things were not as they seemed. A trick was being played on some of the volunteers: the hands shown on the screen were not their own and failed to comply with their intentions. If they willed the joystick to move the arrow up and to the left, the arrow moved down and to the right instead. As a result, the volunteers performed poorly on the tracking task. When asked later to explain their poor performance, they offered an assortment of delusion-like responses. “My hand was controlled by an outside physical force”; “I tried hard to make my hand go to the left, but my hand tried harder and was able to overcome me and went off to the right.” But after initially expressing their puzzlement at such a strange experience, the volunteers moved on to other concerns. None of the volunteers maintained delusion-like conclusions.
Maher’s experiment suggests that in order for a delusion to form, one must make the decision to give greater credence to an anomalous perception (one’s hands moving on their own) than to long-held beliefs about causation (“I am in control of my hands, and they do what I want them to do”). The person prone to delusions adopts a “seeing is believing” approach. This choice of perception over logic, of intuition over reasoning, forms the nidus (origination point) for the delusion. As one delusional patient described her experience, “If I’m mad, so be it, but this is the most real thing I’ve ever known.”
On occasion we can observe in ourselves the kind of mental rebalancing required to prevent a delusional explanation for an unusual experience. Several years ago I took an overnight flight from Washington, D.C., to Munich, where I was to deliver a lecture on the human brain the day after my arrival. I had been up late the night before and was unable to doze on the plane. Despite my fatigue, I forced myself to stay awake all that day, resulting in more than 36 hours of sleep deprivation.
Walking at dusk that evening along a fashionable street, I saw a woman step out of a department store into a waiting limousine. Although I could not have seen her for more than a few seconds, her looks, gait, and style of dress quickly convinced me that the woman was my wife. I speculated that she had flown to Munich to surprise me and had hired the limousine to do some shopping before heading over to my hotel—an extravagance that I mentally registered as uncharacteristic of my wife. Nevertheless, I was convinced that when I arrived back at my hotel room, she would be waiting there for me. So strong was my emotional identification with the woman that I briefly considered approaching the vehicle, tapping on the window, and surprising her. But as the limousine pulled away, the psychological linkage to the woman abruptly faded, leaving me with the realization that I had observed a stranger who, at a distance and in the fading light of early evening, resembled my wife.
This unsettling incident conformed in several respects to the conditions that could lead, in the predisposed, to the development of a delusion. First was the momentary intensity of feelings upon noting physical similarities between the unknown woman and my wife. But in this instance I was able to recognize my misperception as resulting from nervous tension and my lack of sleep.
If, however, I had given more credence to my misperception, I might well have taken the second step toward a delusion. For instance, if I called home and my wife wasn’t there for some reason, this could have strengthened the conviction that the woman I had briefly observed really was my wife. Various delusional beliefs could then follow.
As with my Munich experience and those of Maher’s joystick-wielding subjects, something in addition to altered perception is required to induce a delusion.
EXPLAINING AWAY THE UNCANNY
To discover that added something, I suggest a 10-minute experiment designed by Giovanni Caputo, a psychologist at the University of Urbino in Italy. It will enable you to experience a discomfiture that can lead to the onset of a delusion. (You may prefer to read the following description rather than carry out the experiment.)
Set out two chairs about two feet apart in a dimly lit room. Place a large mirror on one chair and sit on the other so that you can stare at your reflection in the near darkness. “After about one minute of mirror-gazing, most people begin to perceive a sense of unsettling distortion in their reflected face,” according to Caputo. “The eyes start to move or shine, the mouth opens, or the nose becomes very large. If you continue to gaze, very big changes occur, until completely new faces appear.” The participants in Caputo’s experiment reported perceiving strange faces—often unknown, human or animal, living or dead, along with “fantastical and monstrous beings.”
Strong emotional responses accompanied these apparitions. Some people felt that the “other” in the mirror watched them with an enigmatic, even threatening expression that created anxiety and dread in the viewers. Dynamic deformations of the new faces (“pulsations or shrinking”) resulted in “an overall sense of inquietude for things out of control.”
Although an explanation of Caputo’s mirror illusion remains conjectural, the dim lighting was an important component of the effect: turn up the lights, and the perceived facial alterations disappear, which suggests that viewing one’s face in the semidarkness disrupted the brain’s ability to bind together the facial components into a recognizable pattern. “This long-term viewing of face stimuli of marginal strength may generate a haphazard assembly of face traits that generate deformed faces or scrambled faces,” Caputo wrote in “Strange-face-in-the-mirror illusion,” published in 2010 in the journal Perception. Haphazard assembly results in a temporary disturbing interruption in the mirror-gazer’s sense of identity. The subjects know who they are—the sense of identity remains basically intact—but somehow the feeling of their identity seems altered.
In Vladimir Nabokov’s short story “Terror,” an unnamed character undergoes an experience similar to those of Caputo’s subjects:
I now stood considering my own reflection in the glass and failed to recognize it as mine. And the more keenly I examined my face … and the more insistently I told myself “This is I,” the less clear it became why this should be “I,” the harder I found it to make the face in the mirror merge with that “I” whose identity I failed to grasp. When I spoke of my odd sensations, people justly observed that the path I had taken led to the madhouse. In point of fact, once or twice, late at night, I peered so lengthily at my reflection that a creepy feeling came over me and I put out the light in a hurry.
If you decided to try Caputo’s experiment for yourself, you likely experienced at the very least an eerie sensation of disquiet. Now hold that feeling for a moment and consider several classic delusions.
Capgras delusion: the belief that a family member or close intimate is an imposter rather than the person he or she claims to be.
Fregoli delusion: the belief that a person is capable of changing his appearance to resemble others while maintaining his psychological identity.
Cotard delusion: the belief that one is dead. The original patient, described by 18th-century scientist and philosopher Charles Bonnet, not only claimed to be dead but also reclined in a coffin and demanded to be buried.
Intermorphosis: the delusion that people are changing both their physical and psychological identities.
Doppelgänger delusion: the belief that one has a double or impersonator.
Think of these delusions as an attempt to make sense of misperceptions and altered experiences similar to those described by Caputo and Nabokov. Delusion thus becomes a response to the tension between what is seen or heard and what is believed. But instead of stepping back from the strange misperceptions and realizing that things are not what they seem, the person given to delusions unquestioningly accepts the “evidence” provided by his anomalous perceptual experience.
A COMMUNITY OF DELUSIONS
Delusions can transfer to others by a kind of mental infectivity. French psychiatrists in the latter part of the 19th century coined the term folie à deux to describe a deluded person successfully persuading another person in close contact with him to accept the delusional belief. In folie simultanée the delusion occurs simultaneously in a pair of “predisposed” closely associated individuals who live in isolation from others.
Although usually occurring in persons afflicted with a psychiatric illness, the communication of a delusion or delusion-like idea from one person to another can easily occur, under certain circumstances, in people without any mental disturbance. In 1942 a British intelligence officer with advance knowledge of an impending secret military mission in Dieppe, France, was idly paging through a newspaper when he came upon an advertisement. The headline “Beach Coat from Dieppe” was accompanied by a drawing of a young woman in a house coat snipping branches with a pair of garden shears. The officer, in a moment of “insight,” concluded that the advertisement was a code intended to warn the enemy of the upcoming military operations. “Coat” represented “Combined Operations Attack,” which was the official designation for the Dieppe raid. What’s more, the number of buttons on the woman’s coat conformed to landing points of the raid. Finally, the use of shears represented the British use of tanks against barbed wire, which was part of the planned Dieppe operation.
Thoroughly convinced of the validity of his conclusions, the officer persuaded his superiors to allow him to convey the information to Scotland Yard. Experts there quickly concluded that the items in the ad and the planned raid shared only purely coincidental features. The explanation convinced the officer that the ad was not intended to convey secret information. In contrast to formulating a delusion, the intelligence officer readily abandoned his misinterpretation when another interpretation seemed more plausible.
Contemporary examples of widespread delusion-like ideas include the conspiracy theories that persist regarding the assassination of President Kennedy. According to polls, between 61 percent and 81 percent of American adults believe that Lee Harvey Oswald did not act alone but was part of a conspiracy. Such theories endure despite painstaking investigations that have failed to support any explanation for Kennedy’s death other than a lone gunman. As Adam Gopnik points out in a New Yorker article, conspiracy-theory-based explanations of Kennedy’s assassination merge imperceptibly into delusion. “It is possible, in other words, to construct an intricate scenario that is cautiously inferential, richly detailed, on its own terms complete, and yet utterly delusional,” Gopnik writes.
THE ABSENCE OF DOUBT
How do you go about recognizing a delusion? The first step is to distinguish bizarre from nonbizarre delusions. If someone asserts that he or she has recently been kidnapped by Martians, taken into a spaceship, and brainwashed, the bizarreness of the claim makes for an easy determination that the person is delusional. Compare this with a nonbizarre delusion: a man alleges that a next-door neighbor has planted cameras and other electronics in his house and is monitoring his every move. In this case the detection of a delusion is harder because—however weird and unlikely it may sound—the surveillance could conceivably be true. Such cases call for a different approach.
When evaluating a nonbizarre delusion, it’s important not to get caught up in the question of whether the allegation is objectively true or false. It doesn’t matter. The distinction is that a person with the delusion is so convinced of its reality that he or she will not even consider other possible explanations. Indeed, suggesting alternative possibilities usually leads to impatience and arguments, followed by verbal and in some instances even physical aggression. I learned this firsthand several years ago when I encountered a tragic example of a delusion in a woman imprisoned for murder.
Grace H. admitted to killing her lesbian partner of 15 years after she became convinced the partner was sexually abusing the 11-year-old boy whom they had adopted as a baby. So strong was her conviction that she set up concealed cameras around the house, placed baby powder on the floor outside the bedrooms “to detect footprints,” and began a “secret diary,” which she password-protected. (Software analysis later revealed the password to be death.)
After finding “strange events … someone has moved things around the house,” Grace H. redoubled her surveillance efforts. A grim diary entry the night of the murders became a crucial piece of evidence: “IT IS DONE! I finally caught her in his bedroom after she had been with him.” By the time she wrote this, she had hacked both partner and child to death. As a defense consultant on the case, I had access to police files, pictures of the recording equipment, and Grace H.’s diaries. But these sources of information established only what Grace H. believed, not what was actually going on.
In the first few minutes of our interview, Grace H. became angry when I asked her to describe what she had seen when she entered the bedroom. She ignored my request and continued to speak only of her unalterable conviction that her lover was sexually molesting the child. When I pointed out that her diary entry stated she had found her lover in the boy’s room but did not say that she had witnessed anything happening, Grace became increasingly agitated. I asked if it were possible that she had been mistaken about the alleged abuse, and I suggested alternative explanations. The more accounts I proposed, the more agitated and enraged she became. At one point in the interview I felt physically threatened by a shackled woman considerably smaller than I am. Eventually her shouting and frenzied excitement prompted the guard to check on my safety in the interview room.
Was the alleged sexual abuse actually going on? I don’t believe there is any way of answering that question, since the only ones who knew the facts are now dead. But there was no doubt in my mind then, nor is there now, that at the time of the murders Grace H. was suffering from a psychotic delusion and therefore was not fully responsible for her actions.
CAN DELUSIONS BE CONTROLLED?
The tragic case of Grace H. raises a question: to what extent can a delusional person control his or her aberrations? Although delusions may be resistant to all counterarguments, they seldom are accompanied by a need to convince or intimidate nonbelievers with force. In many cases, some degree of control is possible. People in the thrall of a delusion are often evasive about their delusional beliefs and go to great lengths to avoid speaking about them, especially to people who are likely to react critically—the principal reason delusions often go undetected. What’s more, deluded people perform normally on tests of logical reasoning and possess at least average intelligence. In the few instances when reasoning deficits are found, they are limited to what has been referred to as a “reasoning bias”: drawing inferences and reaching conclusions much more rapidly and impulsively than the average person. A substantial subgroup (between a third and a half) is willing to arrive at a conclusion on the basis of examining just one line of evidence. This “jumping to conclusions bias,” as it has been called, leads to premature determinations that form the groundwork for delusions.
But eventually, as with mathematician John Nash, the subject of A Beautiful Mind (Sylvia Nasar’s biography and Ron Howard’s movie), the deluded person may progress to the point of being able, as Nash describes it, to “discriminate between delusional and real experience.” Over time and with repeated effort, Nash gradually became able to reject “some of the delusionally influenced lines of thinking which had been characteristic of my orientation.” Nash attributes this ability to discriminate between the delusional and the real to the exercise of willpower. “If one makes an effort to ‘rationalize’ one’s thinking then one can simply recognize and reject the irrational hypotheses of delusional thinking.”
My experience with delusional patients leads me to agree with Nash on this point. Many times the delusion resolves itself and eventually disappears as part of a slowly evolving healing process. Recovery most often begins with a “double-awareness phase,” during which the deluded person starts to question the validity of the delusional belief while continuing to maintain it. Only gradually does the patient decide to abandon the belief. Treatment of a delusion takes advantage of this willingness to consider alternative possibilities by encouraging and affirming the first stirrings of the patient’s doubts about the delusion. But the process cannot be rushed. Since delusion is an affliction of belief, reasoning can go only so far in combatting it. What proves most effective is a calm acceptance on the part of the listener of the existence, but not the content, of the delusion.
DELUSIONS AND THE BRAIN
Although delusions have been recognized for centuries, physical explanations for them are of comparatively recent origin. The late-in-life delusion of Henry James provides a window into neuropsychiatric causes of the disorder. On December 2, 1915, the novelist collapsed in the bedroom of his London flat. Still conscious when his secretary found him, James told her that his left leg had buckled and correctly concluded that he had suffered a stroke. Two days later, another stroke left James paralyzed on his left side. Following the second event, James ascribed to himself a “sketchy state of mind” and expressed the wish that others not speak of his “madness.”
On December 12 James dictated to his secretary a letter that would later be known as his Napoleonic fragment. In the words of his secretary, “he dictated perfectly clearly and coherently two letters from Napoleon Bonaparte to one of his married sisters [and her husband].”
Dear and most esteemed brother and sister,
I call your attention to the precious enclosed transcripts of plans and designs for the decoration of certain apartments of the palaces, here, of the Louvre and the Tuileries, which you will find addressed in detail to artists and workmen who are able to take them in hand. I commit them to your earnest care till the questions relating to this important work are fully settled.
James ended the letter, “This will be the case with all further projects of your affectionate”—and there the dictation breaks off.
James died two months later. No autopsy was performed, but certain conclusions about his brain injury can be drawn with a reasonable degree of confidence. The occurrence of left-sided paralysis suggests that the stroke originated in the right hemisphere, which controls the left side of the body. In the 30 years preceding James’s terminal delusion, neurologists and neuropsychiatrists had uncovered abundant evidence that delusions could result from brain injury involving in almost all instances the right hemisphere.
After right-hemisphere damage, patients often experience a host of strange alterations in identity and self-image. The most common is anosognosia: unawareness of disability after brain injury. Despite left-sided paralysis, the patient denies being unable to move his left arm or leg. On occasion he declares that his left limbs don’t belong to him. It’s likely that delusions in such settings provide sudden “clarification” for puzzling and inexplicable impairments. (“My left arm doesn’t seem to be working and doesn’t follow any commands that I give it. Therefore it doesn’t belong to me but is somebody else’s arm.”)
Such a delusional belief is accepted because it provides comfort and an antidote to perplexity. But so far no one has come up with a fully satisfactory explanation for why delusions occur much more frequently after right-sided brain damage than after left-sided damage.
On occasion the delusions caused by brain damage can be extraordinarily bizarre. Several years ago, I examined a woman in her 60s who had suffered a stroke that affected that part of her brain responsible for vision. She was blind but seemed curiously unconcerned about her visual loss. In fact, she denied any problem with her vision. When I tried to tell her as gently as I could that her claim of normal vision was impossible in light of the severe damage done to her brain’s visual centers, she angrily leapt out of bed, strode across the room, and collided with a wall. Did my patient really believe that her vision was normal? Her willingness to get out of bed and injure herself suggests that she did.
The Roman philosopher Seneca described a similar situation in Epistulae Morales ad Lucilium. “You know Harpaste, my wife’s female clown. … Now this clown suddenly became blind. The story sounds incredible, but I assure you that it is true: she does not know that she is blind. She keeps asking the attendant to change her quarters; she says that her apartments are too dark.”
What do we conclude in such instances about a person’s “real” belief when words and actions are so inconsistent with reality?
Whether resulting from brain damage in the right hemisphere or arising from no discernible cause in people whose brains seem perfectly normal, delusions provide an unsettling insight that we are not always the reasonable creatures we consider ourselves to be.