Mapping the Mind charts how human behaviour and culture have been molded by the landscape of the brain. It shows how our personalities reflect the biological mechanisms underlying thought and emotion and how behavioural eccentricities may be traced to abnormalities in the geography of an individual brain. Obsessions and compulsions, for example, seem to be caused by a stuck neural switch in a brain area which monitors the environment for danger. Addiction, eating disorders, and alcoholism stem from dysfunction in the brain's reward system. Inability to change one's ideas suggests a lack of activity in the frontal lobes where plans and high-level concepts are constructed. Even belief in God has been linked to activity in a particular brain region. The differences between men's and women's brains and the distinctive characteristics of the brains of people with disorders such as dyslexia, autism, attention deficit, depression, mania, and mood swings are also explored.
THE EMERGING LANDSCAPE
The human brain is made of many parts. Each has a
specific function: to turn sounds into speech; to process
colour; to register fear; to recognize a face or distinguish
a fish from a fruit. But this is no static collection of
components: each brain is unique, ever-changing and exquisitely
sensitive to its environment. Its modules are interdependent and
interactive and their functions are not rigidly fixed: sometimes
one bit will take over the job of another, or fail, owing to some genetic
or environmental hiccup, to work at all. Brain activity is controlled by currents
and chemicals and mysterious oscillations; it may even be subject
to quantum effects that distort time. The whole is
bound together in a dynamic system of systems that does millions
of different things in parallel. It is probably so
complex that it will never succeed in comprehending
itself. Yet it never ceases to try.
IF YOU PLACE A FINGER ON THE NAPE of your neck then edge it up and outwards, you will come across a bump formed by the base of your skull. Have a little feel. According to Franz Gall, the founder of phrenology, this particular protuberance marks the location of the Organ of Amativeness, 'the faculty that gives rise to the sexual feeling'. Now slide your finger an inch or so further up towards your crown. You are now passing over the Organ of Combativeness.
Those of a warm and peaceful disposition should, in theory, find the second area of their skull flatter than the first. But don't be too concerned if your bumps do not match your self-perceptions — Gall arrived at the Amativeness Organ by seeking out the area of greatest heat in the skulls of two recently widowed and 'emotional' young women, and he fixed the Organ of Combativeness by observing that area's smallness in 'most Hindoos and Ceylonese'. It was dubious methodology even for the early nineteenth century.
Bump-reading was nonsense, anyway, because the soft tissue of an individual brain generally has no effect on the shape of the skull. Yet it wasn't all wrong. Feel your skull again, on top this time, just a little forward and to the left of the crown. This is where Gall placed his Organ of Mirthfulness. Surgeons at the University of California Medical School recently applied a tiny electric current near to this part of the left brain of a sixteen-year-old girl.
The patient had intractable epilepsy and the stimulus was part of an established procedure designed to locate the foci of her seizures prior to their removal. She was conscious, and as the stimulus passed through this particular part of her cortex she started to laugh. It was no empty rictus but a genuine, humour-filled chortle, and when the surgeons asked her what was so amusing she had an answer: 'You guys are just so funny - standing around.' The doctors applied the current again and this time the girl suddenly saw something comic in a picture, which happened to be in the room, of a perfectly ordinary horse. A third time she was struck by something else. The surgeons seemed to have come across a part of the brain that creates amusement, however unpromising the circumstances. Gall's pinpointing of it nearly two hundred years ago was no doubt accidental. Yet his basic idea - of a brain made of functionally discrete modules - was prescient.
Phrenology's downfall came, ironically, from the discovery of real brain modules. Towards the end of the nineteenth century a craze for biological psychiatry took root in European universities, and neurologists started to use localized electrical brain stimulation and animal lesion experiments to find out which bits of the brain did what. Others observed the association between certain behaviour and certain brain injuries. Many important landmarks were identified during this first era of brain mapping, including the language areas discovered by the neurologists Paul Broca and Carl Wernicke. Embarrassingly for the phrenologists, these areas were found in the side of the brain, above and around the ear, while Gall's Organ of Language was firmly located near the eye.
The language areas identified by Broca and Wernicke still bear those names today. If early twentieth-century scientists had continued the search for functional brain areas, today's charts might now be crowded with the names of other long-dead individuals instead of the dull labels — primary auditory cortex, SMA, V1 — now attached to each newly discovered region. Instead scientific brain mapping fell out of fashion along with phrenology, and the modular brain theory was largely abandoned by scientists in favour of the theory of 'mass action', which held that complex behaviour arose from the action of all the brain cells working together.
On the face of it, the mid-twentieth century should have been a bad time for anyone with ambitions to use physical methods to treat mental illness or alter behaviour. Yet psychosurgery flourished. In 1935 the Lisbon neurologist Egas Moniz heard about some experiments in which aggressive, anxious chimps had certain fibres in the front of their brains lesioned. The operation — leucotomy — made them quiet and friendly. Moniz promptly tried it on similarly afflicted humans and found it worked. Frontal leucotomy (which later evolved into the more radical frontal lobotomy) rapidly became a standard treatment in mental hospitals, and during the 1940s at least twenty thousand such operations were carried out in America alone.
Looking back, the approach to brain surgery at that time seems shockingly cavalier. It was used for almost any mental disorder — depression, schizophrenia, mania — even though no one then had a clue what was causing the symptoms or why making cuts in the brain should relieve them. Travelling surgeons went from hospital to hospital with their implements in the back of their cars, knocking off a dozen or so operations in a morning. One surgeon described his technique like this:
'[There's] nothing to it. I take a sort of medical icepick...bop it through the bones just above the eyeball, push it up into the brain, swiggle it around, cut the brain fibers and that's it. The patient doesn't feel a thing.'
Unfortunately, the lack of feeling extended, in some patients, to a long-term flattening of emotion and a curious insensitivity that left them seemingly only half-alive. It didn't always cure aggression either: Moniz's career was brought to an end when he was shot by one of his own lobotomized patients.
On balance, the mid-century craze for cutting up the brain probably relieved more suffering than it caused, but it created a deep sense of unease in the medical profession and a profound and still-lingering suspicion of psychosurgery among lay people. When effective psychotropic drugs came along in the 1960s, surgery for mental disorders was almost entirely abandoned.
As we enter the twenty-first century the idea of modifying behaviour and relieving mental anguish by manipulating brain tissue is creeping back. This time, though, any tinkering that is to be done will be based on a much greater understanding of how the brain works. Recently developed techniques — functional brain scanning in particular — allow researchers to explore the living, working brain. Watching the brain at work gives new insight into both mental illness and the nature of our everyday experiences.
Take, for example, pain. Common sense might predict a specialized pain centre in the brain, connecting, perhaps, with another bit that registers sensation in the affected part of the body. In fact, scans show that there is no such thing as a pain centre. Pain springs mainly from the activation of areas associated with attention and emotion. Once pain is seen in terms of neurological activity, it becomes clear why we feel it so much more when we are emotionally stressed and why we often don't notice it — even when our bodies are quite badly injured — when more pressing things have captured our attention.
While apparently simple functions like pain are showing themselves to be more complex than might be expected, some seemingly imponderable qualities of the mind are looking to be surprisingly mechanistic. Religious belief and experience are usually regarded as beyond scientific exploration, yet neurologists at the University of California San Diego have located an area in the temporal lobe of the brain that appears to produce intense feelings of spiritual transcendence, combined with a sense of some mystical presence. Canadian neuroscientist Michael Persinger, of Laurentian University, has even managed to reproduce such feelings in otherwise unreligious people by stimulating this area. According to Persinger:
'Typically people report a presence. One time we had a strobe light going and this individual actually saw Christ in the strobe... [another] individual experienced God visiting her. Afterwards we looked at her EEG [electroencephalogram] and there was this classic spike and slow-wave seizure over the temporal lobe at the precise time of the experience; the other parts of the brain were normal.'

The fact that we seem to have a religious hot-spot wired into our brains does not necessarily prove that the spiritual dimension is merely the product of a particular flurry of electrical activity. After all, if God exists, it figures He must have created us with some biological mechanism with which to apprehend Him. Nevertheless, it is easy to see that being able to get your God Experience from a well-placed electrode could at the very least undermine the precious status such states are accorded by many religions. How believers will cope with what many might see as a threat to their faith is one of many interesting challenges that brain science will throw up in the coming millennium.
How does the conglomeration of neuronal clumps and cat's cradle wiring actually do what brains do? Essentially the neurons get fired up, joined up and dancing - on a huge scale.
The firing of a single neuron is not enough to create the twitch of an eyelid in sleep, let alone a conscious impression. It is when one neuron excites its neighbours, and they in turn fire up others, that patterns of activity arise that are complex and integrated enough to create thoughts, feelings and perceptions.
Millions of neurons must fire in unison to produce the most trifling thought. Even when a brain seems to be at its most idle a scan of it shows a kaleidoscope of constantly changing activity. Sometimes, when a person undertakes a complex mental task or feels an intense emotion the entire cerebrum lights up.
New neural connections are made with every incoming sensation and old ones disappear as memories fade. Each fleeting impression is recorded for a while in some new configuration, but if it is not laid down in memory the pattern degenerates and the impression disappears like the buttocks-shaped hollow in a foam rubber cushion as you stand up.
Patterns that linger may in turn connect with, and spark off, activity in other groups, forming associations (memories) or combining to create new concepts. In theory, each time a particular interconnected group of neurons fires together it gives rise to the same fragment of thought, feeling or unconscious brain function; but in fact the brain is too fluid for an identical pattern of activity to arise — what really happens is that similar but subtly mutated firing patterns occur. We never experience exactly the same thing twice.
Little explosions and waves of new activity, each with a characteristic pattern, are produced moment by moment as the brain reacts to outside stimuli. This activity in turn creates a constantly changing internal environment, which the brain then reacts to as well. This creates a feedback loop that ensures constant change.
Part of the brain's internal environment is a ceaseless pressure to seek out new stimuli. This greed for information is one of the fundamental properties of the brain and it is reflected in our most basic reactions. People can have their conscious mind totally destroyed, yet their eyes will still scan the room and lock on to and track a moving object. The eye movements are triggered by the brainstem and are no more indicative of consciousness than the turning of a flower to the sun. Yet even when you know this, it is deeply disturbing to have your movements followed by the eyes of a person you know is for all intents and purposes dead.
The loop-backs between brain and environment are a bootstrap operation par excellence. Computer simulations of neural networks show that the simplest network can develop phenomenal complexity in a short time if it is programmed to replicate patterns that are beneficial to its survival and junk those that aren't. Similarly, brain activity evolves in the individual.
This process - sometimes referred to as neural Darwinism — ensures that patterns that produce thoughts (and thus behaviour) that help the organism to thrive are laid down permanently while those that are useless fade. It is not a rigid system — the vast majority of brain patterns we produce are quite irrelevant to our survival — but overall this seems to be the way that the human brain has come to be furnished with its essential reactions.
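The selection process described above can be sketched as a toy simulation. Everything in it is invented for illustration (the 'usefulness' test, the reinforcement and decay rates), but it captures the core idea: patterns that happen to help are strengthened each time they recur, while the rest fade.

```python
import random

def useful(pattern):
    # Hypothetical usefulness test: patterns with mostly-active units
    # "help the organism"; everything else is irrelevant to survival.
    return sum(pattern) >= 4

random.seed(1)
strengths = {}  # firing pattern -> connection strength

# Generate candidate firing patterns at random; reinforce the useful
# ones, let the useless ones decay.
for _ in range(2000):
    pattern = tuple(random.randint(0, 1) for _ in range(5))
    if useful(pattern):
        strengths[pattern] = strengths.get(pattern, 0.0) + 1.0   # laid down
    else:
        strengths[pattern] = strengths.get(pattern, 0.0) * 0.5   # fades

# Only patterns that were repeatedly reinforced survive.
survivors = {p for p, s in strengths.items() if s > 1.0}
print(len(survivors), "patterns laid down out of", len(strengths), "tried")
```

No individual trial decides anything; the useful patterns simply accumulate strength faster than the decay can erase it, which is the sense in which brain activity 'evolves' in the individual.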
Some of this furniture has been built in at genetic level. Certain patterns of brain activation — even quite complex ones like speech production — are so strongly inherited that only an extraordinarily abnormal environment can distort them. The pattern of brain activation during, say, a word retrieval task is usually similar enough among the dozen or so participants who typically take part in such studies for their scans to be overlaid and still show a clear pattern. This is how brain mappers can confidently talk of a chart of 'the' brain, rather than 'a' brain.
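The overlaying of scans mentioned above is, at bottom, a voxel-wise average. A minimal sketch, with an invented 8x8 'brain' and made-up noise levels, shows why a region active in every participant survives the averaging while each individual's idiosyncratic activity washes out:

```python
import numpy as np

rng = np.random.default_rng(0)

# A region (rows 2-3, cols 2-3) that is active in every participant,
# plus individual noise for each of a dozen scans. All sizes and
# signal levels are invented for illustration.
shared = np.zeros((8, 8))
shared[2:4, 2:4] = 1.0

scans = [shared + rng.normal(0, 0.5, (8, 8)) for _ in range(12)]

# "Overlaying" the scans is a voxel-wise average across participants.
group_map = np.mean(scans, axis=0)

# The shared region stands far above the residual noise floor.
print(group_map[2:4, 2:4].mean(), group_map[5:, 5:].mean())
```

Averaging twelve scans shrinks the noise by roughly a factor of the square root of twelve, which is why a group map can speak of 'the' brain even though every individual scan is noisy and unique.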
This is not to say that everyone thinks alike. Thanks to the infinitely complex interplay of nature and nurture no two brains are ever exactly the same. Even genetically identical twins — clones — have different brains by the time they are born because the tiny divergence in the foetal environment of each is enough to affect their development. The cortex of human twin babies is visibly different at birth, and structural variations inevitably produce differences in the way brains function.
During foetal development the brain develops, bulb-like, at the upper end of the neural tube that forms the spine. The main sections of the brain, including the cerebral cortex, are visible within seven weeks of conception and by the time the child is born the brain contains as many neurons — about 100 billion — as it will have as an adult.
These neurons are not mature, however. Many of their axons are as yet unsheathed by myelin (the insulation that allows signals to pass along them), and the connections between them are sparse. Hence large areas of the brain, and particularly the cerebral cortex, are not functioning. PET studies of newborn brains show the active areas are those associated with bodily regulation (the brainstem), sensation (thalamus) and movement (deep cerebellum).
The uterine environment has a profound effect on the wiring of the infant brain. Babies born to drug addicts are frequently addicted at birth, and those born to mothers who eat curries while they are pregnant take to spicy food more readily than others, suggesting that their tastes are primed by exposure to the residue of such foods in their mothers' blood.
Life in the womb provides a good example of how genes and environment are inextricably combined. A male foetus, for example, has genes that provoke the mother's body to produce a cascade of hormones, including testosterone, at certain times in its development. This hormonal deluge physically alters the male foetus's brain, slowing the development of certain parts and speeding the development of others. The effect of this is to masculinize the foetal brain, priming it to produce male sexual behaviour. It also creates many of the typical differences seen between the sexes, like girls' superiority at speech and boys' at spatial tasks. If a male foetus does not get the appropriate pre-birth hormone treatment, his brain is likely to remain more typically female; if a female foetus gets exposed to a male-pattern hormonal sequence, she is likely to be more typically masculine.
Inside the developing brain individual neurons race about looking for a linked team of other neurons to join as though in some frantic party game. Every cell has to find its place in the general scheme, and if it fails it dies in the ruthless pruning process known as apoptosis or programmed cell death. The purpose of apoptosis in the immature brain is to strengthen and rationalize the connections between those that are left, and to prevent the brain becoming literally overstuffed with its own cells. This 'sculpting' process, though essential, may also have a price. The connections that get killed off during it include some that may otherwise confer the sort of intuitive skills we label gifts. Eidetic (photographic) memory, for example, is quite common among young children but usually disappears during the years of cerebral pruning. Incomplete apoptosis may account for the astonishing abilities of so-called idiot savants, as well as being one causal factor of their deficits. Conversely, apoptosis that runs wild and strips out far too many connections is thought to be one of the causes of impaired intelligence in Down's syndrome. It is also probably the reason why Down's people are more likely than others to develop Alzheimer's disease.
Climbing to consciousness
A baby's brain contains some things that an adult's does not. There are, for example, connections between the auditory and visual cortices, and others between the retina and the part of the thalamus that takes in sound. These connections probably give the infant the experience of 'seeing' sounds and 'hearing' colours — a condition that occasionally continues into adulthood and is known as synaesthesia. Babies show emotion dramatically, but the areas of the brain that in adults are linked to the conscious experience of emotions are not active in a newborn baby. Such emotions may therefore be unconscious.
'Unconscious emotion' sounds like a contradiction in terms: what is emotion if not a conscious feeling? In fact, the conscious appreciation of emotion is looking more and more like one quite small, and sometimes inessential, element of a system of survival mechanisms that mainly operate — even in adults — at an unconscious level.
This does not necessarily mean that early traumas do not matter. Unconscious emotion may not be, strictly speaking, experienced, but it may lodge in the brain just the same. We cannot remember things before the age of about three because until then the hippocampus — the brain nucleus that lays down conscious long-term memories — is not mature. Emotional memories, however, may be stored in the amygdala, a tiny nugget of deeply buried tissue that is probably functioning at birth. As the baby gets older myelinization creeps outward and brings increasing numbers of brain areas 'on-line'. The parietal cortex starts to work fairly soon, making babies intuitively aware of the fundamental spatial qualities of the world. Peek-a-boo games are endlessly intriguing once this part of the brain is working because babies then know that faces cannot really disappear behind hands — yet the brain modules that will one day allow them to know why have not yet matured.
The frontal lobes first kick in at about six months, bringing the first glimmerings of cognition. By the age of one they are gaining control over the drives of the limbic system — if you offer two toys to a child of this age, they will make a choice rather than try to grab both. Up until the age of about a year babies are, in the words of one developmental psychologist, 'robotic looking machines' — their attention can be caught by more or less any visual stimulus. After that age they get an agenda of their own — not always one that fits in with other people's.
The language areas become active about eighteen months after birth. The one that confers understanding (Wernicke's area) matures before the one that produces speech (Broca's area), so there is a short time when toddlers understand more than they can say — a frustrating condition that probably does much to fuel the tantrums that typify the 'Terrible Twos'.
Around the same time as the language areas become active myelinization gets under way in the prefrontal lobes. Now children develop self-consciousness: they no longer point at their reflection in the mirror as though they see another child, and if a dab of coloured powder is put on their face while they look at their reflection, they rub it off rather than rubbing the mirror as younger children do. This self-consciousness suggests the emergence of an internal executor — the 'I' that most people say they feel exists inside their heads.
Certain brain areas take many years to mature. A nucleus called the reticular formation, for example, which plays a major role in maintaining attention, usually only becomes fully myelinated at or after puberty, which is why prepubescent children have a short attention span. The frontal lobes do not become fully myelinated until full adulthood. This is one reason, perhaps, why younger adults are more emotional and impulsive than those who are older.
Human brains are at their most plastic during infancy. You can take away an entire hemisphere from a child's brain and the other will rewire itself to take on the tasks of both. It will even manage to develop functions that are usually exclusive to its other half. As we age, however, brain functions become more rigid and more distinctive. By the time we are adults our mental landscapes are so individual that no two of us will see anything in quite the same way. A couple watching the same film, for example, would probably have entirely different patterns of neural activity because each would be cogitating on different aspects of the show and associating what they see with personal thoughts and memories. She might be wondering when the tiresome couple on screen will finally reach their feel-good ending so she can get some dinner; he might be thinking how the heroine's cute upper lip reminds him of his ex-girlfriend.
That is why experiments designed to reveal which brain areas are responsible for what have to involve artificially rigid and narrow tasks. The subjects who lay for more than two hours in a PET scanner doing nothing more than lifting a finger in response to a given signal, for example, must at times have wondered what possible insight could be gained from such a tedious manoeuvre.
In fact, wonderful discoveries are emerging from these unpromising exercises. The finger-lifting experiment, carried out by Chris Frith and colleagues at the Wellcome Department of Cognitive Neurology in London, revealed something that, not so long ago, would have been expected to remain one of life's eternal mysteries: the source of self-determination. They arrived at it by designing a procedure that first narrowed down what was going on in the participants' brains to a few things that they already knew from previous work would show up as particular patterns of activity in certain brain localities. In this case they got the subjects to move a specified finger on cue, a task that duly provoked activity in the auditory cortex (when the cue was a noise) and the motor cortex (the area that controls movement). They then added the element of the task for which they wanted to find a brain location: self-willed activity. Instead of telling the subjects which finger to lift they left it to them to decide which one to move. Then they watched to see how the brain activity involved in doing this differed from the brain activity involved in lifting an externally specified finger.
The difference was clear: as soon as the participants started to make their own decisions a previously 'dead' area of their brains sprang into life. Various controls were in place to ensure that this new activity did not just represent the extra effort required to think about the task once it was not a simple matter of obeying an order. The elegant and careful design of the experiment made sure that the bit of brain identified is almost certainly that which, quite specifically, allows people to do things of their own volition.
Can identifying the brain activity involved in deciding which of two fingers to lift possibly shed any light on decision-making in the messy and infinitely more complicated world outside a brain research laboratory?
Indirectly, yes. The region of brain in which the self-will area was found is the prefrontal cortex, a region of the frontal cortex which lies mainly behind the forehead. Injuries to this area often produce a characteristic change in behaviour that includes loss of self-determination on a grand scale. The classic case is that of Phineas Gage, a nineteenth-century rail-worker who lost a large chunk of forebrain when an iron rod was blown through his skull by a mistimed explosion. Gage survived, but from the time of the accident he changed from a purposeful, industrious worker into a drunken drifter. John Harlow, the doctor who treated him, described the new Gage as 'at times pertinaciously obstinate, yet capricious and vacillating, devising many plans of future operations which are no sooner arranged than they are abandoned... a child in his intellectual capacity and manifestations yet with the animal passions of a strong man.' Ladies were advised not to stay in his presence. The hallmark of Gage's new condition was his complete inability to direct or control himself.
If self-determination lies in a specific bit of tissue, it follows that those who appear not to have it may simply be unlucky — victims of a sluggish brain module. So is it reasonable to blame the Phineas Gages of today for their ways? Should we be unsympathetic to addicts who fail to conquer their habit, or punish recidivist criminals?
The current discoveries about the brain should breathe new life into this old debate. If antisocial behaviour can be linked to malfunctioning brain modules, perhaps we should start investigating how to turn the modules on or off. This is not science fiction — it is already being tried. A technique called Transcranial Magnetic Stimulation (TMS) uses a powerful magnetic field to stimulate or inhibit precise areas of the brain. Trials in several countries have shown that it can help relieve depression, and doctors at the US National Institute of Neurological Disorders and Stroke are currently trying it on people with obsessive-compulsive disorder (OCD), post-traumatic stress disorder (PTSD) and mania. Current research into rage, psychopathy and the sort of behaviour typified by Phineas Gage suggests that these might be treated in the same way. If the idea sends shivers down your back, think of what we do to such people now. Is an artificially induced change of mind worse than a stretch in prison?
Windows on the Mind
The safety video that comes with one brand of MRI scanner shows a man walking up to the machine with a metal wrench in his hand. When he gets within a few feet the hand with the wrench in it suddenly shoots up and his arm locks into a rigid horizontal salute with the wrench pointing straight at the scanner. The next few seconds have a cartoon quality as the man engages in a tug-of-war with an unseen rival. As he inches nearer to the machine the wrench starts to quiver like a banner in a wind-tunnel, then slips through his clenched fingers towards the mouth of the scanner. The man clutches it with both hands and leans back, but he obviously cannot keep hold. Suddenly the tool jettisons forward into the tube, where it smashes into a strategically placed brick. The force of the impact is so great the brick is pulverized.
The scene is meant to show the dangers of taking metal near an MRI scanner. These machines are basically massive, circular magnets. Their magnetic field is some 40,000 times stronger than that of the earth — it is easy to see that the consequences of entering one with, say, a heart pacemaker in place, would be catastrophic. If you do not happen to have any metal about your person, however, MRI scanning seems to be perfectly safe — no one has yet reported any detrimental biological effect from it.
Powerful scanning techniques like fMRI are opening the brain to scrutiny in a way undreamt of until a couple of decades ago. But brain mapping began long before fancy scanning machines were invented.
The two main language areas, which are still among the most important cortical landmarks on the map, were identified by Broca and Wernicke more than a hundred years ago. They did it by looking at the brains of people with speech disorders and noting that those with the same problems all had damage in the same place. Broca located the area that allows us to articulate speech by post-mortem examination of the brains of people who were unable (usually as the result of a stroke) to get words out. His classic case was a man named Tan.
Tan was so-called because that was what he said when he was asked his name. It was also what he said when he was asked his date of birth, his address and what he wanted for dinner. 'Tan' was all that Tan could say, even though he understood speech perfectly well.
Broca had to wait until Tan was dead before he could look at his brain and see which bit was injured. Today, scanners allow neuroscientists to locate injured tissue in living patients, but the basic technique of deducing a brain area's normal activity by seeing what happens in people in whom it is damaged remains an important one.
Another time-honoured technique is to stimulate different areas of the brain directly and see what effect it has. It was by doing this that the surgeons in California brought about so much amusement in their epilepsy patient and thus identified what seems to be the (or a) humour module.
Direct stimulation was pioneered in the 1950s by the Canadian neurosurgeon Wilder Penfield, who charted large regions of the cerebral cortex by applying electrodes to different areas in the brains of hundreds of epilepsy patients. He demonstrated in this way that the entire body surface is represented on the brain's surface as though it had been drawn on: the bit that affects the arm lies next to the bit that affects the elbow, and the bit that affects the elbow is next to the bit that affects the upper arm and so on. More famously, he found that stimulating points in the temporal lobes produced what seemed to be vivid childhood memories, or snatches of long-forgotten tunes.
Most patients reported these recollections as dream-like, yet crystal clear. 'It was like ... standing in the doorway at [my] high school,' reported a twenty-one-year-old man. 'I heard my mother talking on the phone, telling my aunt to come over that night,' said another, '... my nephew and niece were visiting at my home ... they were getting ready to go home, putting on their coats and hats ... in the dining room ... my mother was talking to them. She was rushed — in a hurry.'
Penfield's findings have been widely interpreted as showing that memories are stored in discrete packets ('engrams') just waiting to be revived, but recent work indicates a more complex arrangement. Research by Steven Rose and colleagues at the Open University in Britain suggests that memories are cloned and that each clone is laid down in a different sensory area of the brain: visual, auditory and so on. Stimulation of one of these clones may in some way trigger the others, to give an integrated, multimedia experience. Penfield was probably stimulating just one sensory facet of the memory but eliciting a response in many.
The area stimulated in the laughing patient may similarly be just one node of a much larger brain module. Indeed, many of the neat little 'spots' currently marking particular functions may turn out to be exposed peaks of mainly buried neural conglomerates — icebergs of the mind.
It is possible, too, that areas that light up when a certain mental task is performed are not themselves responsible for that task, but are simply passing on a stimulus to the bit that is. There is an apocryphal story of a scientist who claimed to have discovered that frogs heard through their legs. When challenged to prove it he produced a frog that he had trained to jump on command. After demonstrating the frog's trick he picked the animal up and cut off its legs. Then he put it back on the table and told it, again, to jump. Naturally, it did not. 'There!' said the scientist, triumphantly. 'You see — it can no longer hear my voice!'
Brain mappers are trying hard to avoid falling into that trap but sometimes, inevitably, they do. According to some there is still too much of the gold rush about this science — too many researchers trying to stake out new claims rather than replicating the findings of others. Yet the ground is firming up. In the last couple of years standard scanning protocols have been adopted that have drastically reduced the number of rogue results, and methodology — particularly the problems of designing experiments that give unambiguous results — is a constant concern. The New Phrenologists are determined that their discoveries — unlike those of Franz Gall — will stand the test of time.
Posted January 11, 2002
The book is quite fascinating and gives one true insight into the workings of the mind. However, I was bothered by the way the author inserted her own opinions into what is otherwise a collection of scientific studies. Her views on religion, justice, and the future of mankind, whether right or wrong, simply don't belong in the book. What should be a purely scientific work is laced with the author's personal flavoring. The book is well worth the read, just be sure to distinguish between fact and opinion.
Posted April 23, 2001
Absolutely fascinating stuff and very well-written. My only complaint is that the sidebars were very long and also fascinating but disruptive to the reader. I had a hard time figuring out when I was supposed to read them. I'd finish a page in mid-sentence and have to decide whether to turn it or go back and read the sidebar. Anyway, the illustrations were very helpful and the text was surprisingly readable, considering the subject matter. I'd recommend this book to anyone.