The Emergence of the Neurosociety
Brain Imaging: Peering into Bertino’s Brain
As a first step in appreciating the impact of social neuroscience, it helps to understand the power of imaging techniques to provide a window into events happening within the brain.
The earliest techniques capable of revealing the brain’s inner processing carried a definite risk of injury and sometimes even death. Consequently, they were restricted to patients suffering from various brain diseases. As a result of this emphasis on disease, we presently know more about the functioning of abnormal brains than we know about normal ones. As a neurologist, I’m especially aware of this paradox. Ask me about the brain dysfunctions associated with strokes or autism or even some forms of learning disability and I can explain the difficulties in more detail than you probably want to hear. But ask me how the brain of a genius differs from that of his or her less intellectually gifted counterparts and the explanation isn’t going to take long at all.
Not that we can’t learn a lot about the normal brain on the basis of studying abnormal brains. Even a study of the diseased brain often provides some helpful insights toward furthering our understanding of the normal brain. My favorite example of this comes from the observations of the late-nineteenth-century Italian experimenter Angelo Mosso.
In the course of his research Mosso encountered a peasant, Bertino (no last name is recorded), who several years earlier had suffered a head injury severe enough to destroy the bones of the skull covering his frontal lobes (located immediately behind the forehead). The resulting opening, covered only by skin and fibrous tissue, provided Mosso with a window through which he could directly observe the pulsations of Bertino’s brain. Similar pulsations can be observed in a newborn baby during the first few weeks of life prior to the growth and fusion of the skull bones. When the baby cries or strains, the pulsations increase; when the baby sleeps, the pulsations subside.
One day when Mosso was observing the pulsations he noticed a distinct increase in their magnitude coincident with the ringing at noon of the local church bells. At this point Mosso, in an act of inspiration, asked the peasant if the ringing of the Angelus reminded him of his obligation to silently recite the Ave Maria. When Bertino responded yes, the pulsations increased again. Intrigued at this sequence, Mosso asked his subject to multiply eight by ten. At the moment Mosso asked the question, the pulsations increased and then quickly decreased. A second increase occurred when Bertino responded with the answer. From this simple but elegant experiment Mosso correctly concluded that blood flow in the brain could provide an indirect measurement of brain function during mental activity.
Inspired by Mosso’s findings with Bertino, students of the brain during the early and middle parts of the twentieth century developed more accurate techniques for measuring blood flow and metabolism in the human brain. For instance, dyes and radioactive substances injected into the arteries leading to the brain helped pinpoint the relevant structures responsible for vision, movement, and sensation. But one important limitation lessened the usefulness of these probes into the brain’s functioning: All of them were invasive, dangerous, and on occasion fatal. While undergoing one of the tests the subject could suffer a stroke, blindness, even death. Fortunately, that problem is now a thing of the past thanks to the safety of newer techniques, which carry little risk.
Current imaging techniques are often described using a kind of alphabet soup terminology: “The patient’s CAT was normal but a contrast-enhanced MRI showed a small SOL in the frontal lobes later confirmed by PET.” Such a sentence isn’t very helpful to anyone other than a doctor or someone else trained in the use of this off-putting terminology. In place of acronyms and obfuscation, here’s a simplified way of thinking about brain imaging.
Basically, imaging techniques are either structural or functional. If you’ve ever undergone a CAT (shorthand for computerized axial tomography) scan or an MRI (magnetic resonance imaging) scan, the doctor ordering that scan was interested in capturing an image of your brain’s structure. Perhaps your doctor thought that you might have suffered a stroke or developed a brain tumor. Tumors and strokes can be recognized by the alterations that they bring about in normal brain anatomy. CAT scans and MRI scans provide a picture of those alterations.
Functional imaging, in contrast, depicts what the brain is doing over a certain period of time ranging from seconds to minutes. All of the functional imaging techniques (functional MRI, or fMRI, PET scans, and SPECT scans are the most common) are based on a simple principle: Brain activity leads to changes in blood flow (as with Bertino), electrical discharges, and magnetic fields.
As I’m writing this sentence an fMRI would show increased activity in those areas of my brain associated with thought (especially the frontal areas), vision, and the movement of my fingers across the keyboard of my word processor. An fMRI of your brain would show activation of the visual areas, which process the words on this page, along with the frontal areas, which grasp the meaning of the sentences, and the motor areas, which control the movement of your hand as it reaches up and turns the page.
Ideally, an imaging technique should accurately pinpoint both the structure and the function, the “where” and the “when” of brain activity. On the “where” scale currently available techniques are accurate within millimeters. But the “when” determinations leave a lot to be desired. The temporal resolution of PET (positron-emission tomography) scans is tens of seconds or even minutes. The most technologically advanced fMRI does a bit better, with a resolution on the order of a tenth of a second. But even that is woefully insufficient as a measurement of how rapidly things are happening in the brain. To give you some perspective, consider that an activation originating in the motor neurons of your brain takes only about 150 milliseconds (thousandths of a second) to reach the muscles of your forefinger when you press a doorbell. Or consider that you can accurately identify an object that suddenly enters your field of vision within a few hundred milliseconds.
In short, in order to establish meaningful relationships between our mental lives and events occurring in our brain, it is important to achieve a temporal resolution of milliseconds. But here’s the sticking point. The most accurate technique for doing that involves inserting a tiny needle into the brain and then threading it into a single brain cell. While Dr. Strangelove might consider this invasive, potentially risky procedure acceptable in healthy brains, most others would consider it totally unacceptable.
To further appreciate the challenges in depicting brain activity, think back for a moment to the Heisenberg uncertainty principle in quantum physics: You cannot simultaneously determine the position and the velocity of a particle because of the effect created by the act of measurement (“The more precisely the position is determined, the less precisely the momentum is known in this instant, and vice versa,” Heisenberg wrote in 1927). Neuroscientists also encounter a kind of uncertainty principle when studying the brain: They have to choose between achieving either an accurate positional fix (within millimeters) or an accurate temporal fix (within fractions of a second). So far no single technique exists that can provide both; only the use of multiple techniques can make possible the desired integration of spatial and temporal information. To further complicate matters, the brain’s operation can’t be understood by measuring one neuron at a time; instead, we must focus on thousands of neurons firing together to form “circuits.”
Everything You’ll Need to Know About the Brain
Although a lot will be said about the brain in this book, a detailed knowledge of that incredibly complex structure won’t be required. Indeed, all that you’ll need is to remain mindful of two useful distinctions.
The first is between controlled and automatic processes. As an example of a controlled process, recall the last time you worked on your income tax or balanced a budget. Your thoughts followed each other in a sequential manner; you remained consciously aware of what you were doing; if requested, you could explain your thought processes to somebody else. In addition, if you continued your efforts long enough you were likely to experience fatigue or boredom.
As an example of an automatic process, think back to the last time you took an immediate dislike to someone you had just met. Or an occasion when you discerned a hint of condescension in a coworker’s voice as she explained a new procedure to you. Or an afternoon when you paused while walking down a street and looked appreciatively as an attractive person passed by. If asked at the time about such occurrences, you would have come up with various explanations to justify your impressions, but these would only have been guesstimates. That’s because with automatic processes things just happen. You can’t really explain why you disliked the new acquaintance while everybody else liked him. Nor why you were the only person who perceived condescension in the coworker. Nor why other people weren’t stopping to gawk at that man or woman you found so attractive. Automatic processes, in contrast to controlled processes, involve more than one avenue of thought occurring at a time, don’t involve consciousness, aren’t accompanied by a sense of effort, and can’t easily be explained to anyone else. And since we’re not consciously aware of them, automatic processes don’t make it onto our mental radar screens.
Most of the things we do involve a necessary balance between controlled and automatic processing. Too much automatic processing and we behave impulsively; too much controlled processing and we become paralyzed by indecision, such as when we mentally rehearse “perfect” responses to every conceivable question we might be asked during the next morning’s job interview.
The second distinction is between cognitive and emotional (affective) processes. While cognition is usually defined as “thinking,” that doesn’t quite capture the elements of what’s meant by cognition. Cognition refers to your perceptions of everything that is going on around you, to all of your thoughts and all of the actions that you might take in response to your outer and inner experiences. Here’s a definition from a current textbook: “The ability of the central nervous system to attend, identify, and act on complex stimuli.” But let’s keep it simple: think of cognition as a shorthand term for all of the ways we come to know the world. Cognition isn’t a thing but a process; nobody can hold it in her hand and show it to you, nor can scientific instruments reveal a picture of it. It includes thinking, remembering, daydreaming, mentally calculating—indeed, any mental activity you select can be included under the umbrella term cognition.
When you explain to your accountant all of the income tax deductions that you’re claiming, you’re involved in a cognitive process, essentially repeating aloud the thoughts you previously entertained while working out for yourself the amount of money that you think you owe. Everything is very intellectualized and rational: You talk and he listens. Emotions play little role here. But suppose later that evening you get a call from your accountant informing you that you actually owe additional money to the IRS. At this point, emotional processes—anger and resentment—are likely to displace cognitive processing (at least momentarily). Indeed, you can define an argument as essentially a dialogue in which affect displaces cognition, emotion overpowers reasoning.
While distinguishing between emotions and cognition and between controlled and automatic processes helps in understanding our own and other people’s behavior, it still doesn’t help us to answer fundamental questions such as: What happens in my brain when I make an impulsive purchase? Or decide that a certain person is trustworthy? Or invest money in a stock after listening to a broker? In order to address such questions from the perspective of the new social neuroscience, look at the diagram of the human brain on page 16. It is the only brain diagram that you will need to understand the topics that we will take up in this book.
The key structures are the medial prefrontal cortex (MPFC), the dorsolateral prefrontal cortex, the temporal lobes, the superior temporal sulcus (STS), the anterior cingulate, the insula, the parietal lobes, the amygdala, the basal ganglia, and the cerebellum.
In order to understand the diagram, keep one organizational principle in mind: automatic and controlled processes occur at different locations in the brain. Those regions that are involved with automatic activity are concentrated toward the back (occipital), top (parietal), and side (temporal) lobes. Controlled processes, in contrast, occur mainly in the front (orbital and prefrontal) areas, with the prefrontal cortex especially important since it integrates information from all other parts of the brain, fashions long- and short-term goals, and directs our overall behavior. Think of the frontal lobes as the CEO of the most complex organization in the world, the human brain. And thanks to the exponential growth of the prefrontal lobes over the past several million years, we are capable of mentally outperforming any other creature on earth.
Since the frontal lobes will be playing a starring role in this book, it’s worth spending a few moments to give you a more complete picture of what the frontal lobes do. So let’s look first at the kinds of problems that can arise when frontal lobe functioning is compromised.
The Frontal Lobes of Jonathan Meaden
Meet Jonathan Meaden, a sixty-three-year-old man brought by his wife to my office for neurological consultation. After retiring five years ago from a successful career in business, Jonathan devoted himself full time to his lifelong passion: chess. Following a year of concentrated chess study—including tutoring by a local chess master—Jonathan improved his game to the point he could hold his own against highly rated amateur players.
Suddenly things started to go terribly wrong. On several occasions Jonathan became lost while driving to a tournament. In addition, his level of play deteriorated to the point that he placed last in several competitions, losing to players he had consistently defeated in the past. Most of these losses resulted from careless mistakes due to failures of concentration. But his problems involved more than just chess and finding his way on the highway.
Always a meticulous and careful investor, Jonathan started buying questionable oil and gas investments that, according to his wife, “he would never have had anything to do with before.” Books that formerly interested him now sat unread on the shelves; if sent to him by relatives or friends, the books sometimes weren’t even taken from their packaging. Even more distressing to his wife were Jonathan’s “memory problems.” In addition to forgetting the names of several of their close friends, Jonathan’s memories were “mixed up,” according to his wife; he recalled things out of sequence. But if she raised any questions about accuracy, Jonathan flew into a rage and began shouting.
On a recent vacation trip to Los Angeles Jonathan left his wife in the baggage area and rushed ahead to arrange for their rented car. With the paperwork done, he drove off. Three hours later Jonathan called his wife from a cell phone and told her to meet him at a ticket counter in another terminal. When she arrived at the counter the ticket agent told her that her husband had become impatient after a few minutes of waiting and had gone off to have lunch. “I was so upset, I was beyond anger. I was utterly flabbergasted that he could be so rude and inconsiderate,” his wife said. When she confronted Jonathan about his behavior, he told her, “You weren’t there and I was hungry, so I went and got some lunch.” He seemed unconcerned by the episode and expressed surprise that his wife was upset about it.
Formerly gregarious, Jonathan now spent most of his time alone. Always a voracious consumer of newspapers, he now only glanced at the headlines of the five morning papers delivered to his home, then turned on the TV and sat for hours “changing channels with the remote button at a rate that makes me dizzy,” said his wife. Once a stickler for manners, courtesy, and protocol, Jonathan now frequently used profane language and recounted crude jokes to whoever would listen. As another deviation from his usual behavior, he established accounts with several Internet pornography sites. Jonathan’s personal habits were also starting to deteriorate. His wife had to remind him to shower, and he frequently went for days at a time without shaving.
At my first meeting with Jonathan at my office I encountered a middle-aged man with a three-day growth of beard and food stains on his necktie. He made little eye contact and during the greater part of our interview sat looking down at the floor. When I asked him why he had come to the office, he denied any problems. “My wife brought me. She has the problems,” he responded with a wry smile. This was followed by short, unelaborated responses to my questions. He volunteered no comments of his own.
Routine tests of Jonathan’s mental functioning turned up a slightly impaired memory (he could remember only three out of four items four minutes after being asked to remember them for later recall). Most striking was his failure in what’s called verbal fluency: rapidly generating as many words as possible within one minute beginning with a particular letter of the alphabet. While the average person can come up with sixteen to twenty words beginning with the letters S or F, Jonathan could only come up with five words apiece. Even more striking, when I asked him to name as many animals as he could in one minute, he named only nine (the normal response is anywhere from twenty to twenty-five).
When asked about a typical day, Jonathan responded that he was spending his time reading chess magazines and engaging in regular chess matches with friends. His wife, seated across the room, indicated by a vigorous shaking of her head that none of this was true. Later I learned from her that Jonathan was very apathetic and didn’t get excited by anything, including visits from his daughter and infant granddaughter.
An MRI of Jonathan’s brain showed atrophy of the left and right frontal lobes. A similar pattern of abnormality appeared on a PET scan, with reduced activity in both frontal lobes. The diagnosis was frontal lobe dementia.
The frontal lobes occupy almost a third of the total area of the cerebral hemispheres. Each forms a pyramid with the apex pointing to the front and the base extending toward the back of the brain. Damage anywhere within these pyramids results in serious impairments that can affect every aspect of cognitive life. Two principal disorders occur. If the damage involves the upper and lateral surface of the prefrontal cortex, the result is a lack of initiative, loss of creativity, impaired concentration, and a personality marked by blandness and apathy—in short, a great reduction in activity. Damage to the underside of the frontal lobes lying just above the eyes results in too much rather than too little activity: angry outbursts, impulsive acts that may include buying and selling sprees, and tactless, self-destructive behavior toward family, friends, and coworkers. Usually, as with Jonathan, signs of both impairments coexist.
In summary, Jonathan’s personality changes included a generalized indifference to other people and events (his self-imposed social isolation), impaired social judgment (the crude jokes and frequent recourse to profanity), diminished emotional responses (the bland, emotionless attitude toward his daughter and granddaughter), faulty practical judgment (the risky oil and gas ventures), and problems with self-control (the unpredictable angry outbursts). Oversimplifying a bit here, Jonathan had great difficulty acknowledging any point of view other than his own. For example, he was genuinely puzzled at his wife’s anger at him for needlessly delaying them at the airport. He saw the episode in a different way: Since he had become hungry while waiting for her, he had every right to leave the ticket counter before her arrival and get something to eat.
Jonathan illustrates an extreme example of impairment in empathizing with the thoughts and feelings of other people; he wasn’t deliberately choosing to treat his wife and others so callously. Rather, the damage to his frontal lobes robbed him of the ability to place himself in another’s shoes and thereby imagine how he would feel under similar circumstances. Neuropsychologists describe people like Jonathan as suffering from a theory of mind (TOM) disorder: a loss of the capacity to infer the internal mental state of another person.
If you’ve ever interacted with anyone like Jonathan, you will also observe another puzzling aspect of their thinking: They have great difficulty acting in their own best interests. As neurologist Antonio Damasio explains it, damage to the frontal lobes deprives the affected person of a somatic marker (defined as the emotional state triggered when thinking of the potential outcomes of one’s choices). For example, if you think of not filing a tax return this year, you’re likely to experience repetitive feelings of anxiety and dread at the thought of the consequences you’ll face if your failure to file is detected. That anxiety and dread serve as somatic markers. In people like Jonathan the somatic markers aren’t functioning normally: such people experience no internal disquiet at the prospect of something bad happening to them.
Frontal lobe damage also brings about a strange dissociation between hypothetical and practical decisions. For example, if asked a hypothetical question (“What would you do if you found a stamped addressed envelope on the street?”), the person with frontal lobe damage is likely to give you the correct answer (put the letter in the nearest mailbox). But in a real situation where he actually finds a letter on the street, he is just as likely to open it as to place it in a mailbox. Such a split between saying and doing is customarily considered a hallmark of damage to the frontal lobes. Since people with damage to the frontal lobes can’t process anything except from a purely personal perspective, questions of right and wrong are hopelessly muddled.
When Less Is More
Now that we have a general picture of what the frontal lobes contribute to our mental lives, I want to make one final point about brain anatomy and organization: At any given moment our brain is largely under the influence of automatic processes. If that strikes you as a strange arrangement for “rational” creatures such as ourselves, think for a moment about the consequences if everything we did involved controlled processes.
When you get up from your chair to walk to the refrigerator, you would have to decide whether to lead with your right or left leg. When you pick up a fork, you would have to stop talking so that you could concentrate on bringing the food to your mouth. And you couldn’t even think of driving and talking at the same time. Life would be pretty dull if you had to consciously attend to things such as walking, eating, and driving, wouldn’t it? The philosopher Alfred North Whitehead captured how difficult it would be to function effectively if deliberate controlled mental processing were the brain’s default mode:
It is a profoundly erroneous truism . . . that we should cultivate the habit of thinking what we are doing. The precise opposite is the case. Civilization advances by extending the number of operations that we can perform without thinking about them. Operations of thought are like cavalry charges in battle—they are strictly limited in number, they require fresh horses, and must only be made at decisive moments.
A similar suggestion appears in one of my previous books, Mozart’s Brain and the Fighter Pilot, where I provided a rule for enhancing brain function: Let the brain be the brain. By that I meant, don’t pay too much conscious attention (don’t “think too hard”) when you’re trying to learn something new; rather, allow the brain to work on things at its own pace. That’s because in many cases we decrease accuracy and efficiency by thinking too hard.
For example, think back to the last time you took a multiple-choice test. If you’re like most people, you sometimes found yourself doubting the accuracy of your initial selections on some questions. And the more you thought about those questions, the more uncertain you became about the correct response. According to numerous studies of this experience, your original answer was most likely correct, and additional thinking about the question was likely to lead to errors—one of the reasons students are encouraged after responding to one question to move on to the next question. (Assuming, of course, they’re prepared for the test and know the material.)
Now neuroscientists have an explanation for why we sometimes do better by exerting less rather than greater mental effort. British psychiatrist Paul Fletcher observed via fMRI the brain activity of volunteers instructed to push one of four buttons in response to the highlighting of four boxes on a computer screen. The button they pushed depended on which one of the boxes was highlighted. The subjects were initially asked to press the corresponding button as rapidly as possible without making any errors. Unknown to the subjects, there was a repeating sequence of highlighting (eighteen repetitions of a ten-item sequence). When the volunteers simply relaxed and pushed the buttons, they responded faster and more accurately as time went on, even though they strongly denied noticing any sequence. Later the volunteers were told about the sequence, and asked to try to learn it. But when they actually tried to discern the sequence their responses were slower and less accurate. In other words, those who learned the pattern without consciously trying to do so did better than those who put conscious effort into divining it.
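For readers curious about the structure of this kind of experiment, the design can be sketched in a few lines of code. This is only an illustrative reconstruction: the particular ten-item pattern below is invented, and the function names are my own; the source specifies only that a ten-item sequence of highlighted boxes was repeated eighteen times, with the subject pressing the button matching each highlighted box.

```python
# Sketch of the serial reaction-time design described above.
# The specific sequence is a hypothetical example, not Fletcher's actual one.

SEQUENCE = [3, 1, 4, 2, 4, 1, 3, 2, 1, 4]  # ten-item pattern of box positions (boxes 1-4)
REPETITIONS = 18

def build_trials():
    """The full run: the same ten-item sequence repeated eighteen times (180 trials)."""
    return SEQUENCE * REPETITIONS

def correct_response(highlighted_box):
    """The required response: press the button matching the highlighted box."""
    return highlighted_box

trials = build_trials()
```

Because the pattern repeats, subjects who simply respond trial by trial gradually speed up without ever being able to report the sequence; it is this dissociation between improved performance and absent conscious knowledge that marks the learning as implicit.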
The fMRI results also differed. Among those consciously trying to learn the pattern, a significant increase in activity occurred in the right frontal lobe. And since, as we now know, the frontal lobes are involved in the making of executive-type decisions—decisions that often require the time-consuming weighing of various options—it’s no surprise that enlisting the right frontal lobe lengthened response times.
Memory provides another good example of the wisdom of acting spontaneously rather than thinking too much about what response to make. For example, imagine yourself looking very briefly at twenty objects. If I asked you to recall them later, how well do you think you would do? If you’re like most adults, you will only be able to come up with about twelve to fifteen. In contrast to this lackluster performance in recollective memory, you will do much better in recognition memory, successfully recognizing up to ten thousand objects when they are re-shown to you after some delay.
Such findings will seem less incredible to you if you consider that most of the things we know exist outside of our conscious awareness; we always know more than we can say. The philosopher Michael Polanyi referred to this as “tacit knowledge.” Thanks to tacit knowledge (also called implicit knowledge) we can successfully ride a bike or drive a car even though we aren’t able to describe exactly how we do it.
In essence, our brain is organized so that once an activity becomes routine it doesn’t require conscious effort but occurs automatically. Indeed, controlled processes come into play only when we encounter something unexpected that forces us to sit back and figure out our response. Or when we have to make a decision about something, especially if we may later have to explain our decision to someone else. Neuroscientists refer to this thinking without conscious effort as the “cognitive unconscious.” It plays a pivotal role in social neuroscience and the evolution of the neurosociety.
How the Brain Processes Information
The Cognitive Unconscious
In the 1940s movie Random Harvest the main character played by Ronald Colman is first introduced as John “Smithy” Smith, a World War I shell shock victim who can’t remember anything about his life and identity prior to the war. Through a series of events unlikely in anything but a movie, Smithy meets and marries Paula, an entertainer played by Greer Garson. Months afterward, Smithy is struck by a car while visiting a neighboring town in search of work. The resultant head injury deprives him of his memory of Paula and their recently born child. Now freed from his battlefield-induced amnesia, he reverts to his original identity as Charles Rainier, the scion of a wealthy family, and returns to the magnificent English manor where he grew up.
Throughout the remainder of the movie Rainier becomes uneasy and perplexed whenever he encounters certain people or goes to places such as the pub and tobacco shop Smithy had frequented prior to the car accident.
While it wouldn’t be true to claim that Rainier remembers these past associations—he can’t consciously recall them—they nonetheless exert a powerful influence on him. He regularly experiences the feeling that he has been to these places before, yet he can’t arouse any conscious memory supporting his feelings. Throughout the movie similar examples of unconscious mental processes provide the main dynamic motivating the actions of Rainier/Smithy.
Although Random Harvest is a work of fiction, similar examples of unconscious mental processing can be traced back almost two hundred years. I’m not referring here to the “unconscious” associated with nineteenth-century Vienna, Freud, or the pseudoscience of psychoanalysis. Indeed, the unconscious that I’m referring to has nothing to do with sex, violence, or any of the other factors much beloved by psychoanalysts; rather, it involves how our brain processes information. The operative term now is the cognitive unconscious.
One of the first discoverers of the cognitive unconscious was Michael Faraday, the scientist whose discovery of electromagnetic induction made possible the generators that ultimately charge your laptop computer. In 1853 he published a paper in the English periodical Athenaeum, “Experimental Investigation of Table Turning.” During his research for the paper Faraday had attended several table-turning sessions where, typically, a group of people gathered around a light table and rested their hands on it. After a few minutes—varying from one session to another—the table would rock from side to side, or even turn clockwise or counterclockwise. Since each of the participants denied willfully pressing, pushing, or exerting any force whatsoever on the table, spirits or other supernatural agents were deemed responsible for the table’s movement, according to the spiritualist movement popular at the time.
Faraday, however, was convinced that the participants were moving the tables but weren’t aware that they were doing so. In order to prove his point, he placed force measurement devices between the participants’ hands and the table. Sure enough, the source of the table movement turned out to be the participants. A similar explanation also applied to Ouija boards and automatic writing: The participants were unknowingly (Faraday never claimed that they were lying or attempting to deceive anybody) exerting a force absent the usual subjective feeling that accompanies willed action.