Over my career as a neuroscientist and neuropsychiatrist, I’ve become convinced that our brain’s organization and functional activity powerfully influence our judgments and conclusions about topics traditionally explored by philosophers. Included among these topics are time, cause-effect, simultaneity, perception, logic, and free will. Our understanding of these concepts can be greatly enriched by taking into account several important recent discoveries from neuroscience. Consider, for instance, how our spatial experience influences how we think about time.
Imagine that you’ve just received this email: “Next Wednesday’s staff meeting has been moved forward two days.” On what day would you appear for the meeting now that it has been rescheduled?
Your selection of either Monday or Friday is determined by whether you are operating under what psychologist and neuroscientist Lera Boroditsky terms an ego-moving perspective or a time-moving perspective. If you think of yourself as moving forward through time (the ego-moving perspective), then moving the meeting forward means moving it in your direction of motion—from Wednesday to Friday. But if you conceive of time as coming toward you (the time-moving perspective), then moving the meeting forward means moving it closer to you—from Wednesday to Monday.
During her tenure at MIT’s Department of Brain & Cognitive Sciences, Boroditsky put that “moved forward” question to hundreds of people under varying circumstances. She discovered that the answers depended very much on what people were doing when they were questioned. People who had traveled to an airport to pick up an arriving passenger were about equally likely to pick Monday or Friday. In contrast, the arriving passengers, having just experienced themselves during their flight as moving forward through space from departure point to destination, overwhelmingly selected Friday.
Boroditsky’s experiment suggests that our brain’s processing of time is closely coupled with how we envision ourselves in space. To see this, simply substitute the word push for move and the sentence becomes disambiguated: “Next Wednesday’s staff meeting has been pushed forward two days.” While moved can refer to movement in several different directions depending on one’s perspective, pushed nearly always implies movement in a forward direction. When we push something, we use the muscles of our arms and trunk to propel the object away from us in a forward direction.
The linking of spatial motion to our understanding of time comes as a surprise to anyone familiar with our brain’s organization. Spatial and temporal information are processed differently within the brain. The frontal and parietal lobes, important in spatial processing, are located, respectively, behind the forehead and toward the top of the skull. Temporal information, in contrast, doesn’t really have a clearly defined location in the brain.
But Albert Einstein wouldn’t be surprised at Boroditsky’s findings. As he famously established, time’s passage depends on the location and circumstances of time measurement.
The discrepancy between clock time and subjective time can be demonstrated in the laboratory. If I ask you to press a button and hold it for exactly five seconds, your response may vary by as much as 20 percent from one trial to the next. If I ask you to hold the button for 50 seconds, your responses will typically fall between 40 and 60 seconds, again a variation of about 20 percent.
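That proportional scaling of timing error (a rough Weber’s-law relationship: the spread grows with the target, so the percentage error stays constant) can be illustrated with a small simulation. This is only a sketch; the 10 percent coefficient of variation is an assumed value, chosen so that most simulated responses land within roughly 20 percent of the target.

```python
import random
import statistics

def reproduce_interval(target_s, cv=0.1, n=1000, seed=42):
    """Simulate n attempts to hold a button for target_s seconds.

    Scalar timing: each reproduction is drawn with a standard deviation
    proportional to the target (cv * target_s), so the percentage error
    is the same whether the target is 5 or 50 seconds. cv=0.1 is an
    illustrative assumption, not a measured value.
    """
    rng = random.Random(seed)
    return [rng.gauss(target_s, cv * target_s) for _ in range(n)]

for target in (5, 50):
    reps = reproduce_interval(target)
    sd = statistics.stdev(reps)
    # The absolute spread grows tenfold, but the relative spread stays ~10%.
    print(f"target {target:>2}s: sd = {sd:5.2f}s ({100 * sd / target:.0f}% of target)")
```

Run it and the standard deviation comes out about ten times larger for the 50-second target, while the percentage of the target stays the same.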
Attention is the most important variable influencing our appreciation of the passage of time. The closer attention we pay to time, the slower it seems to pass (hence the adage that a watched pot never boils). But there’s a paradox here. While we may experience the passage of time as agonizingly slow when we’re waiting for a pot to boil or sitting in a doctor’s waiting room, we’re likely to underestimate rather than overestimate the duration when we’re later asked how much time actually passed.
In an experiment illustrating this, people watching an action movie experienced time passing faster than people sitting in a waiting room. No surprise here. Yet when the two groups were later asked to estimate how much time had actually passed during these two experiences, the results were surprising. Despite their subjective feeling that time had passed quickly, the movie watchers later estimated their elapsed time at about 10 percent longer than the waiting-room group.
The explanation for such paradoxical findings, according to John Wearden of Keele University in Staffordshire, is that the two groups based their time assessment on the amount of information processed by their brains. In the waiting room, not much was happening and time seemed to drag on. But later, the time span seemed shorter because not much had happened and as a result little information was processed. For the viewers of the action movie, in contrast, time passed quickly while watching the movie because a lot was happening. But when recalling the movie, it seemed that more time had passed than actually had, thanks to the sheer number of events that transpired in the movie.
As a general rule, the more that happens during an interval, the longer that interval seems in retrospect. Eventful episodes such as an auto accident or a mugging, for instance, tend to be remembered as lasting much longer than they actually did.
Peter U. Tse, a neuroscientist at Dartmouth, proposes an explanation based on a “counter” model. Imagine that the brain estimates time in bits (units) based not on clock time but on its own rate of information processing. Under most conditions, one bit of processed information corresponds to the passage of one unit of objective (clock) time. Now imagine the rate of processed information suddenly doubling to two bits per unit of clock time during an emergency, such as when you are swerving to avoid a collision with another car. Because of the heightened attention you direct at the danger, the “counter” now registers two bits rather than one per unit of objective time, and the interval is judged to have lasted twice as long as it actually did.
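The counter model reduces to a few lines of code. The sketch below is only an illustration of the idea as described above; the one-bit-per-second baseline and the doubled emergency rate are assumptions for the example, not parameters from Tse’s work.

```python
def subjective_duration(clock_seconds, rate_fn):
    """Counter-model sketch: the brain accumulates bits of processed
    information, and judged duration is the bit count measured against
    a baseline of one bit per clock second.

    rate_fn(t) -> bits processed during clock second t.
    """
    bits = sum(rate_fn(t) for t in range(clock_seconds))
    return bits  # at the 1 bit/s baseline, bits == judged seconds

routine = subjective_duration(10, lambda t: 1)    # ordinary driving
emergency = subjective_duration(10, lambda t: 2)  # near-collision: attention doubles the rate
print(routine, emergency)  # 10 20: the same 10 clock seconds feel twice as long
```

The same mechanism runs in reverse for dull stretches: halve the processing rate and the counter registers fewer bits, so the interval seems shorter in retrospect.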
Our brain’s role in time processing becomes increasingly apparent as we get older. As anyone over 40 can attest, the rate of time’s passage seems to increase as we age. But time “dragging on” is a complaint I frequently hear from many of my retired patients. With less to do during retirement, they pay more attention to the passage of time—resulting in boredom and a stultifying sense of time “standing still.”
Here’s the tradeoff: if you want to slow down the subjective sense of time, do little (sit in a lot of waiting rooms), and while waiting, the time will seem endless. But, as a cruel irony, if you experience enough of those boring episodes, your life will seem in retrospect to have “raced by” at an accelerated pace. In the absence of much happening, your brain didn’t have much information to process.
Simultaneity provides another example of how our reality is determined for us by our brain’s processing. Think back to the last time you watched a movie or television broadcast marred by a lack of synchronization between the actor’s lip movements and speech. Such events would occur far more often were it not for a roughly 10th-of-a-second window within the brain that allows it to hold “on-line” the faster-arriving visual signal (the speaker’s lip movements) for subsequent synchronization with the slower-arriving auditory signal (speech). As long as this 10th-of-a-second grace period isn’t exceeded, we remain blissfully unaware that our brain is providing us with what neurophilosopher David Eagleman of the Baylor College of Medicine refers to as postdictive awareness: incorporating data from a window in time after an event and delivering a retrospective interpretation of what happened. In a sense, we experience the present moment as a synthesis of information reaching the brain at different speeds, a reconstruction that takes place outside our conscious awareness.
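The grace period can be sketched as a simple binding rule: two signals are fused into one perceived event only if the lag between them stays inside the window. The arrival times below are invented for illustration.

```python
FUSION_WINDOW_S = 0.1  # the ~10th-of-a-second window described in the text

def perceived_in_sync(visual_arrival_s, audio_arrival_s):
    """Postdictive-binding sketch: the brain holds the earlier-arriving
    signal "on-line" and fuses the pair into a single simultaneous event
    as long as the lag stays inside the window."""
    return abs(audio_arrival_s - visual_arrival_s) <= FUSION_WINDOW_S

print(perceived_in_sync(0.00, 0.08))  # True: lips and speech seem synchronized
print(perceived_in_sync(0.00, 0.25))  # False: the lip-sync error becomes noticeable
```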
As another example of time distortion, in this case time expansion, imagine yourself watching a series of black dots flashed in rapid succession on a computer monitor. If at a certain moment I introduce a red dot, you’ll report that the red dot remained on the screen longer than any of the black dots, even though every dot appeared for exactly the same duration. Further, using a similar setup, I can reverse your perception of cause and effect. Imagine yourself pushing a button that produces a brief flash of light. After a few minutes I introduce a slight, imperceptible delay between your button push and the flash. A few minutes later I remove the delay. You will then experience a reversal of your judgment about cause and effect: the flash will seem to occur before you pushed the button.
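A minimal sketch of the recalibration behind that reversal, assuming for illustration an injected delay of 80 milliseconds and complete adaptation: once the brain has learned to subtract the delay when timestamping the flash, removing the delay backdates the flash to before the button press.

```python
def perceived_flash_time_ms(press_ms, actual_delay_ms, adapted_delay_ms):
    """Recalibration sketch: after a few minutes of a constant
    press-to-flash delay, the brain adapts and subtracts that delay
    when assigning a time to the flash. The 80 ms figure used below
    is an illustrative assumption, not the experiment's parameter."""
    flash_ms = press_ms + actual_delay_ms
    return flash_ms - adapted_delay_ms

# During adaptation: 80 ms injected delay, fully adapted.
print(perceived_flash_time_ms(0, 80, 80))  # 0: flash feels simultaneous with the press
# Delay removed, but the adaptation persists.
print(perceived_flash_time_ms(0, 0, 80))   # -80: flash now seems to precede the press
```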
As the above examples suggest, time is a reconstruction of the brain. “No matter how much we may feel that our thought takes weightless flight, or that its velocity transcends time, mental processes work within biological materiality and have actual duration,” writes essayist Eva Hoffman in her 2009 book, Time. When we factor in the circumscriptions and limitations imposed on our thinking by the way our brains operate, we find ourselves ensnared in conceptual conundrums.
For instance, imagine yourself at the health club walking on a treadmill. You’re told that at a certain time the resistance of the treadmill will be increased and when that happens you should increase your efforts in order to keep up the same walking speed. During experiments testing this, people tend to increase their efforts several seconds before they’ve become conscious of the change in resistance of the treadmill. Despite their impression to the contrary, their brain detected the change in treadmill resistance and altered their walking pattern prior to their first noticing that change.
The treadmill experiment illustrates one of the significant insights of modern neuroscience: the lion’s share of the brain’s information processing occurs outside our conscious awareness, in what neuroscientists refer to as the cognitive unconscious. And that arrangement makes a good deal of sense. Imagine how laborious walking would be if we had to consciously plan each movement of our legs.
But it’s not just our perceptions of time, space, simultaneity, and unconscious cognitive processing that neuroscience is providing new insights about. It’s also telling us a lot about such distinctly human activities as empathy. Important here is an area toward the front of the brain and immediately behind our forehead called the medial prefrontal cortex (MPFC). Whenever we are introspective about our own feelings or imaginatively intuit the feelings of others, the MPFC comes into play. For example, the MPFC is at work when we wince upon encountering someone writhing in pain, when we become anxious looking at a picture depicting a scene of horror, or when we otherwise put ourselves in someone else’s shoes. Thus the MPFC is concerned with representing our own thoughts, feelings, and beliefs, as well as providing us with representations of the mental states of other people.
The research finding that our thoughts and feelings about ourselves and others are processed in the same brain areas confirms what sages and religious thinkers have been saying throughout the ages: we’re not isolated components in an impersonal social network but, rather, deeply social creatures capable of imagining each other’s internal experiences. Thanks to the MPFC we can mentally transcend our own perspective and see things from another person’s perspective.
The MPFC also helps us imagine what others are thinking about us. As an example of this process, taken from an experiment, picture yourself at a computer, rapidly shifting your attention from a word game to a briefly presented target flashed on the screen. After a few minutes you are told you are being observed by a video camera during some parts of the experiment but not others. Further, you will be informed when the camera is on and when it is off. What effect do you think the camera-on segments will have on your experience, in contrast to the camera-off periods?
If you’re like the subjects who participated in this study, you’ll become slightly uneasy when the camera is on. “I feel like people are watching me” and “I wonder how I look” were typical comments. Your reaction time will also increase when the camera is on. Presumably, your interior dialogue about your appearance and performance, accompanied by your self-evaluation and self-criticism, take a toll on the rapidity of your response. But the really interesting findings appear on images showing your brain’s activity pattern.
During the camera-on segments, your MPFC, that self-awareness and empathy site, springs into action. You will become acutely focused on how you will appear to others who may be watching you on the monitor. During the camera-off segments, your internal focus disappears and you will concentrate on doing your best in the experiment. In an instant your focus can shift from worrying about how other people may be judging your performance to doing your best in the word game. We’re dealing here with the age-old conflict between reason and emotion. Fortunately, brain research suggests a resolution wherein reason can gain the upper hand.
We can enhance our reasoning powers and lessen the effects of our emotions by the simple act of identifying and labeling them. For instance, when we look at a picture of an angry face, a portion of our brain that’s important for emotional experience (the amygdala) activates alongside our emotional response of fear. But if we identify and label the emotion that we’re experiencing, we can lessen its effect. As we do this, we activate our prefrontal cortex—the reasoning and executive center toward the front of the brain—which dampens the activity of the amygdala and lessens our fear.
Underlying all of the processes described so far is the human brain’s inherent mutability—its plasticity, as neuroscientists refer to it. Plasticity is a comparatively new insight into the brain’s functioning. As recently as a decade ago, it was believed even by experts that the brain didn’t change very much after a person reached adulthood. In my medical school and neurology training, my teachers claimed that the brain’s structure and functioning remained largely unaltered after adulthood. Subsequent research has revealed that the brain is dynamic and continues to change across our entire lifespan, varying from person to person and from moment to moment, based on an individual’s life experience. We can recognize the truth of this insight when we reread without enjoyment a once-favorite book and wonder what about it had formerly appealed to us. Thanks to brain science, we now have a plausible explanation for such experiences. We didn’t enjoy the book that second time around because, as a result of the lifetime plasticity of our brain, we’re literally a different person from the person who read the book the first time.
Plasticity also exerts a powerful effect on our memories. We do not so much forget the events of our past as we reinterpret and recontextualize them. As Viktor Mayer-Schönberger writes in Delete: The Virtue of Forgetting in the Digital Age, “Even if we are confronted with an exact record of a past event we were involved in, it would be impossible for us to block out the changes in our minds that have happened since: the knowledge gained (or lost), the values changed, preferences adjusted, emotions felt.” Substitute the word brain for mind in that quote and you have a perfect explanation for the malleability of memory. Thanks to the brain’s plasticity and its effect on the brain’s memory encoding and retrieving systems, remembering is not like watching the digital images of a DVD. It’s more like examining and interpreting a faded photograph from the predigital age.
New insights such as these into the nature of memory are forcing revisions in our ideas about identity, the unity of personality, the exercise of free will, and the soundness of our decisions. Looking back on an earlier decision doesn’t necessarily help us to understand the how and why of our original choice, or why we acted in the past in ways that puzzle us now. Indeed, according to Mayer-Schönberger, “we might be at odds with our original decision, wondering how we could have erred so starkly and acted so wrongly.”
According to neuroscientist Karl H. Pribram of Georgetown University, our memories are embedded in the brain like an image on a hologram. If you cut a hologram in half, each half shows the entire image but in slightly degraded detail. If you further divide the hologram by cutting the remaining portions in half, the resulting images lose even more detail. Time exerts a similar effect on the brain’s memory for detail. That’s why, although we continue to remember the highlights of important events in our lives, as the years pass the specifics of those memories become vaguer and harder to retrieve—for example, the identities of each of the relatives and friends who attended our wedding and what they were wearing.
Thanks to the malleability of our memories, it’s even possible for us to be convinced of the reality of something that never happened. “Memory morphing” is the technical term for this process whereby our memories can be distorted or drastically altered by suggestion. In one illustrative experiment, researchers created a fake picture by splicing a childhood snapshot of a boy and his uncle into a photograph of a hot-air balloon, making it look as though the two of them had ridden together years earlier. A doctored snapshot of this kind was created for each participant in the experiment. Even though no such trip had ever occurred, 50 percent of those shown the picture recalled a childhood hot-air balloon ride.
As another example of the fragility of human memory, consider what marketers refer to as “backward framing.” “Consumers can be influenced to recall prior experiences differently without being aware that their recall has changed,” according to marketing researcher Kathryn Braun. In a study confirming Braun’s point, people tended to incorrectly recall their original impression of a movie after reading a review. Deliberate revision of opinion played no part in this: the subjects were specifically instructed to recall how they originally felt about the movie, not how the review may have changed their view.
In another experiment illustrating “backward framing,” Braun prepared a rigged Disney advertisement suggesting that children visiting Disneyland would have an opportunity to meet and shake hands with Bugs Bunny. Although such a meeting would be impossible at Disneyland (Bugs Bunny is a Warner Bros. character), 16 percent of parents who had themselves visited Disneyland as children recalled meeting Bugs Bunny. Among those who hadn’t been provided with the ad, no one recalled such a meeting. “The power of memory alteration is that consumers are not aware they have been influenced,” cautions Braun.
Consider the implications of such findings: The more we learn from neuroscience research about the malleability of human memory, the less secure we should feel about the reliability of our own memories. This has nothing to do with the normal memory decrement that accompanies aging, nor the exaggerated memory loss associated with Alzheimer’s disease. Rather, brain research is showing that individual memories are highly unstable and can be modified by new information, whether accurate or inaccurate. Furthermore, the act of recalling a specific memory renders that information vulnerable to suggestion and change. A famous illustration of this phenomenon occurred to the Swiss psychologist Jean Piaget.
Throughout his life Piaget frequently spoke of a vivid memory of an incident from his early childhood. Even as an old man he could recall a specific afternoon spent with his nanny pushing him in a pram down the Champs-Élysées in Paris. Suddenly a man leaped out from the bushes and, in an attempt to kidnap Piaget, scuffled with the nanny. She put up a fierce resistance and successfully fought him off, but not before he inflicted superficial scratches on her face. Piaget’s memory of the frightening event was exquisitely detailed. He remembered the exact location of the assault, the uniform of the policeman, the sympathetic faces of the people gathered at the scene, even the scratches on his nanny’s face. And yet, as Piaget and his family subsequently learned, the episode had never taken place.
Years later, the guilt-ridden nanny wrote to Piaget’s parents and confessed to fabricating the whole incident (including the scratches). Apparently she felt guilty enough about her deceit to return the gold watch Piaget’s parents had given her as a reward for her bravery. Yet even though Piaget now knew that the incident hadn’t taken place, he continued to remember it for the rest of his life. He had assimilated the nanny’s account of the incident to his parents, along with conversations about the event that he had overheard as a child, and in the process stored the fictitious incident as a true memory. Indeed, the false memory proved so resistant to modification that Piaget ironically termed it “a memory of a memory, but false.”
Piaget’s experience suggests the existence of a kind of double bookkeeping system within the brain whereby it’s possible for us to believe that something is true when intellectually we recognize that it is not. Piaget’s vivid memory of an event that hadn’t occurred is reminiscent of Winston Smith’s “doublethink” in George Orwell’s novel 1984. To Smith, doublethink implies the capacity “to forget whatever it was necessary to forget, then to draw it back into memory again at the moment when it was needed, and then promptly to forget it again.” In Piaget’s version of doublethink, his early “memory” was both true (experientially) and false (intellectually). So far, neuroscience doesn’t have an explanation for such a phenomenon. Perhaps it never will. But even at the purely descriptive level, such examples provide us with new ways of conceptualizing our mental life.
Thanks to neuroscience, we’ve learned more in the last decade about topics like time, simultaneity, cause and effect, empathy, memory, and our mental representations of ourselves and others than in the previous several centuries. Over the next few years I believe we can expect additional brain-based contributions to the understanding of our internal and external experiences. Indeed, we’ve already witnessed the formation of such hybrid disciplines as neurophilosophy and neuroethics.
Nor should the incorporation of neuroscience into traditionally humanistic areas of inquiry be surprising. Why would we expect to reach persuasive conclusions about the nature of our inner and outer worlds without reference to the one organ that enables us to explore and understand them?