Split Decisions

A renowned neuroscientist examines human experience

Neil Williamson/Flickr

The Consciousness Instinct: Unraveling the Mystery of How the Brain Makes the Mind by Michael S. Gazzaniga; Farrar, Straus and Giroux, 288 pp., $28

In his engaging and wide-ranging new book, The Consciousness Instinct, neuroscientist Michael S. Gazzaniga explores a conundrum that has long baffled scientists: “Gazillions of electrical, chemical, and hormonal processes occur in our brain every moment, yet we experience everything as a smoothly running unified whole,” he writes. “What is the organization of our brain that generates conscious unity?”

Gazzaniga, director of the SAGE Center for the Study of the Mind at the University of California, Santa Barbara, and the author of numerous books on the brain, leads us through three possible approaches to answering this question. The first, and the one to which he devotes the most attention, is the modular theory of brain functioning. It holds that the brain, rather than operating in a holistic fashion, relies instead on thousands of independent processing units, or modules—localized neuronal networks that serve a specific function.

The inner workings of these networks are often revealed when people sustain brain damage to a specific area, allowing scientists to identify the module responsible for normal processing in a particular domain. Patients who suffer damage to the parietal lobe on the right side of the brain, for instance, experience spatial neglect: everything on the left side is ignored, almost as if it didn’t exist. They will not eat food on the left side of a plate, or shave or apply makeup to the left side of the face. Some even go so far as to deny the existence of a left arm or leg. This strange behavior, fully described in the 1950s, provided early evidence that the parietal lobe on the brain’s right side is responsible for bodily and spatial orientation.

Some explanation is required here. Since the brain’s incoming and outgoing tracts cross in the brain stem and spinal cord, the right hemisphere controls sensation and movement related to the left side of the body; the left hemisphere controls them on the right. But when it comes to representations of space, the arrangement is somewhat different: the right hemisphere mediates representations of both sides of space, while the left hemisphere controls only the right. In the event of damage to the right hemisphere, the intact left hemisphere maintains awareness of the right side of space. Because of this, neglect almost always involves impaired appreciation of the left side of space.

A similar dynamic, worked out in the second half of the 19th century, applies to speech. The brain’s left hemisphere contains modules devoted to language. Damage in one area affects a person’s ability to produce comprehensible speech, whereas damage in a nearby area impairs the ability to understand the speech of others.

What are the modular brain’s implications for consciousness? “Each mental event is managed by brain modules that possess the capacity to make us conscious of the results of their processing,” Gazzaniga writes. He suggests an illustrative metaphor: the bubbles in a pot of boiling water. Each bubble has in itself the capacity to evoke the feeling of being conscious, but since they percolate continuously, our sense of consciousness flows without interruption. In other words, consciousness is the product of separate, yet-to-be-identified modules somehow working together. As Gazzaniga puts it, “A lot of bubbles are conjoined by the arrow of time and produce something like what we call conscious experience.”

The idea that consciousness results from the confluence of multiple sources comes naturally to Gazzaniga, who along with his mentor, the neuropsychologist Roger Wolcott Sperry, conducted the so-called split-brain studies beginning in the early 1960s—work for which Sperry later shared a Nobel Prize. When the fibers connecting the two cerebral hemispheres are cut, each hemisphere of the split-brain patient functions independently according to the sensory information that it receives. As Gazzaniga explains, the resulting “tug of war” between the hemispheres exposes “the illusion of a unified consciousness”—illusion because, although consciousness seems like a “coherent, flawlessly edited film,” it is actually more like a stream of “single vignettes,” occurring and recurring in an unpredictable sequence.

The theory of modularity provides a rich and useful approach to understanding consciousness, but it is nothing new. Neuroscientists who treat patients (neurologists, neurosurgeons, and neuropsychologists) have been studying it for well over 100 years. In terms of pedigree, a modular theory of mind was propounded by philosopher Jerry A. Fodor in his 1983 book, The Modularity of Mind, although it made no claim to anatomic localization. A decade later, in my own book, The Modular Brain, I argued, based on my clinical experience, that modules provide the best explanation for brain functioning. Over the ensuing decades, as neuroscientists have learned more about the elaborate and intricate circuitry within the brain, knowledge about the number and specificity of modules has increased as well. Modular theory is a natural evolution of cerebral localization, dating back to French physician Pierre Paul Broca’s 1861 observation that language is encoded in the left hemisphere.

Gazzaniga’s second approach to explaining consciousness—and the brain in general—involves the application of quantum physics, specifically the principle of complementarity, which states that quantum objects possess complementary properties that cannot be measured simultaneously. Applied to the mind, this suggests a gap between the subjective experience of an event (“I had so much fun bodysurfing”) and the event itself as observed by someone else (“A person went swimming in the ocean”). A related complementarity prevails within the brain of the bodysurfer himself: on the one hand, the subjective experience (the fun of bodysurfing), and on the other, the associated brain activity (as revealed by neuroimaging).

This distinction between mind and brain was articulated more than half a century ago by British philosopher Gilbert Ryle, who coined the term “category mistake.” Brains and minds belong to two different categories, and the workings of one cannot be adequately described in terms appropriate for the other. A similar dynamic holds true on the microscopic level: a neuron and its function represent two separate entities with different protocols. Therefore, a thought cannot be reduced to something as mechanical as the interplay of multiple neurotransmitters. Failure to heed these category distinctions can have real-world implications. The current debate about the role of neuropsychiatry in determining criminal responsibility, for example, rests on some variation of the often quoted but unattributed belief that “behind every crooked thought lies a crooked molecule,” a dangerous and wrong-headed approach to understanding criminal behavior.

Finally, in his third approach to explaining the nature of consciousness, Gazzaniga posits that in the future, neuroengineers will be tasked with explaining the various levels of brain processing and “crack[ing] the protocols that allow one layer to interpret the processing results of its neighbor layers.” But before we place confidence in the success of this approach, Gazzaniga encourages us to see consciousness as a “slippery complex instinct,” not “tangible, like an apple, or elusive, like a democracy.” Certainly whenever we speak of consciousness, we encounter the slippery paradox of duality: although we know consciousness exists because we experience it in ourselves and can infer it in others, we cannot be certain about consciousness in other creatures. Is a lobster conscious when we toss it into a pot of boiling water? Observing it, many of us would conclude that it “feels” pain. Assuming this to be true, does that pain sensitivity imply conscious appreciation? We can never know for certain, since lobsters possess not a brain like ours but an arrangement of segmented nerve clusters. Thus the efforts of neuroengineers to explain consciousness must remain limited to human consciousness. Ultimately what’s involved is consciousness itself explaining its own processing, which reminds me an awful lot of children trying to jump onto their own shadows.

Richard Restak is clinical professor of neurology at George Washington University School of Medicine and Health Sciences, and the author of 25 books on the brain, including the forthcoming The Complete Guide to Memory: The Science of Strengthening Your Mind.
