NEURO GEEK: THE SECRETS OF EINSTEIN’S GLIA
Albert Einstein left clear instructions on what to do with his remains in the event of his death: he wanted to be cremated and his ashes scattered secretly. Instead, when he died on April 18, 1955, the pathologist on call, Thomas Harvey, stole Einstein’s brain and took it home. He sliced it into 240 pieces, put them in a preservative, and then placed the pieces in two jars in his basement. Eventually, he sent some of the slices to scientists around the world.
One of those scientists was a neuroanatomist named Marian Diamond. I remember Dr. Diamond well from her popular anatomy class at Berkeley. She was the first scientist to show, in rats, that an enriched environment, in the form of toys and other rats, increased the thickness and performance of their brains. But her greater fame came in 1985, when she reported the results of her study of four slices of Einstein’s brain.1
What she found was that Einstein had significantly more glia — the often ignored brain cells that surround and protect neurons — than the average male brain. This opened the door to our understanding of the importance of glia as more than mere bystanders in the brain’s development.
We now understand that the approximately 85 billion glia feed neurons with nutrients and oxygen, insulate them from each other, destroy invading pathogens, remove dead neurons, and enhance their communication.2
Beyond Memory and IQ
After the second year of medical school, all would-be physicians must take a grueling day-long test: Step 1 of the United States Medical Licensing Exam. It’s an eight-hour multiple-choice test on anatomy, biochemistry, behavioral sciences, genetics, immunology, pathology, pharmacology, physiology, plate tectonics, quantum mechanics, rocket science, and the Upper Paleolithic history of Siberia.
Some of that was an exaggeration, but the exam is painful, because you’re scored against about 18,000 other medical students who all excel at memorizing and taking tests.
It’s not quite true that a physician’s entire future depends on their score, but it’s close. That score is the chief element determining not only how prestigious an institution they can train at for residency, but also the specialty for which they can train.
Preparation requires literally hundreds of hours of memorization. I scored better than most, but there was a guy at my residency rumored to have gotten the highest score in the country that year. He was the same guy I mentioned in the first chapter, the one initially selected to become the neurosurgery resident at our hospital who was then kicked out after just a few months.
He did not lack for IQ. He was great at acing tests. But, from what I’ve been told, he didn’t know when to ask for help with a dangerously sick patient or when he could handle one on his own. The constant crises left him flustered. Juggling a neurosurgery service with twenty patients as a rookie surgeon requires multitasking and judgment — skills that have very little to do with multiple-choice questions and raw knowledge.
The best scorers on tests in medical school are given the opportunity to train in surgery, but they are never evaluated for technical ability and performance under pressure. So, as you might expect, there is often a mismatch between smarts and ability.
Of course, intelligence matters. The question, however, is how much it matters. Bill Gates and Oprah Winfrey didn’t become giants in their fields without intellectual firepower. But then again, they didn’t succeed at turning their smart ideas and insights into business juggernauts by IQ alone. They needed good judgment, determination to succeed, and the ability to manage, delegate, and inspire those around them.
Let’s take a look at how all those essential capacities emerge from the brain and how to maximize your own natural gifts.
NEURO GEEK: THE FLYNN EFFECT
In 1984, a New Zealand academic named James R. Flynn made a curious discovery.1 While examining IQ scores going back to the early twentieth century, he saw that the average had been steadily rising by about three points per decade. The middle-of-the-pack score back in 1920 would now be graded as 70 — which by today’s standards would be considered mildly intellectually impaired. The average person taking the test today would have been given a score of 130 — nearly “genius” level — had they been graded back in 1920.
Some researchers explain away the so-called “Flynn effect” by arguing that people have simply gotten better at taking tests. Improved teaching methods, the introduction of kindergarten and preschool, and higher graduation rates have all resulted, they claim, in young people simply testing better.
Flynn, however, insists that kids are literally getting smarter in part because of school, but also because of better nutrition and fewer childhood illnesses than their grandparents experienced. Even more important, Flynn says, is that our world has become ever more cognitively challenging. A hundred years ago, nearly one-third of Americans lived on a farm; today that figure is less than 2 percent. Only in the 1920s did radio become popular; not until the late 1950s did television reach most homes; as recently as the year 2000, fewer than half of all U.S. homes had access to the internet; the smartphone as we know it didn’t exist until Apple released the iPhone in 2007.
“We are the first of our species to live in a world dominated by categories, hypotheticals, nonverbal symbols, and visual images that paint alternative realities,” Flynn has written.2 “We have evolved to deal with a world that would have been alien to previous generations.”
The undisputed fact of the Flynn effect demonstrates that human beings are capable of getting smarter, that intelligence is not determined simply by DNA.
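For the arithmetically inclined, here is the three-points-per-decade rescaling written out as a tiny back-of-the-envelope sketch (my own illustration, not anything from Flynn’s papers), just to show where the 70 and 130 come from:

```python
# Back-of-the-envelope sketch of the Flynn-effect arithmetic (my own
# illustration): raw performance rises about 3 IQ points per decade, so
# re-scoring someone against norms from a different era shifts their
# score by 3 points for every decade between the two sets of norms.

POINTS_PER_DECADE = 3  # Flynn's approximate rate of rise

def rescore(iq: float, norm_year: int, new_norm_year: int) -> float:
    """Re-express an IQ measured against norm_year's average (100)
    using new_norm_year's tougher or easier average as the new 100."""
    decades = (new_norm_year - norm_year) / 10
    return iq - POINTS_PER_DECADE * decades

# An average (100) test-taker from 1920, graded on roughly today's norms:
print(rescore(100, norm_year=1920, new_norm_year=2020))  # 70.0

# An average (100) test-taker today, graded on 1920 norms:
print(rescore(100, norm_year=2020, new_norm_year=1920))  # 130.0
```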
HOW YOUR BRAIN REMEMBERS
For much of the twentieth century, scientists believed that each memory in the brain was stored as a web of connections between neurons — not in any single neuron or collection of neurons, but in how they linked together. This view was seemingly proved beyond doubt in a famous paper published in 1950 by psychologist Karl Lashley.3 He performed hundreds of experiments on rats in which he taught them to remember a maze, a task, or an object and then made surgical incisions in various spots of their brains. No matter where he cut, the rats still remembered what they had learned — but just a little less well. Two incisions made them a little more forgetful; three made them even worse, and so on. No one spot mattered more than any other.
“There are no special cells reserved for special memories,” Lashley wrote.
That orthodox view first came under attack in 1984, when neuroscientist Richard Thompson trained rabbits to blink whenever they heard a particular musical tone by repeatedly pairing the tone with a puff of air to their eye. Once the rabbits had learned to associate the musical tone with the puff of air, they blinked every time they heard the tone, even without the puff. But then, contradicting Lashley, Thompson found that if he removed just a few hundred neurons from a section of the cerebellum near the brain stem, the rabbits no longer blinked.4 Somewhere in those neurons, he concluded, the rabbits’ memory linking the puff of air and the musical tone had been stored.
By 2005, scientists were showing that individual neurons are involved in recognizing particular faces. Upon being shown a picture of Jennifer Aniston, for instance, a single neuron in the hippocampus reacted.5 Another lit up in response to pictures of the actress Halle Berry.
Since then, researchers have developed an array of neuromolecular tools to create false memories in mice — the sort of thing seen in the Christopher Nolan film Inception.
Other techniques have been used to make the fear associated with a particular stimulus disappear — a treatment that could one day be of value to people suffering from phobias or post-traumatic stress disorder.
LEARN FROM THE GERM
Memory of one kind or another is at the heart of all life. What else is DNA but a way for life to remember its own blueprint for reproduction?
You might assume that a brain is necessary for remembering, but that’s not so. Consider E. coli, the single-celled bacteria that live in our guts and the guts of most other warm-blooded creatures, where they are usually harmless but occasionally cause food-borne sickness. Incredibly, they have a version of short-term memory. When they’re swimming around in your intestines, looking for food, they will keep going in a more or less straight line as long as they find nothing worth snacking on. But once they find something nutritious, they will stop, eat, and then pirouette from that spot, turning in a tight circle in hopes of finding more deliciousness close by. Once they run out of treats in that localized area, they will then continue on their way.
Almost all animals do this: It’s called area-restricted search.6 If a pigeon finds a single crust of bread under a chair, it will keep pecking for other crumbs nearby until nothing is left. Then it flutters off in search of another spot.
Both parts of this strategy are really important: making sure you find every last crumb in a given area, and then systematically searching other areas.
What’s really strange is that memory works the same way: by area-restricted search. If I ask you to list all the animals you can think of, you are likely to begin with the category of “pets” and list cats, dogs, goldfish, parakeets. Once you run out of items in that category, you’ll move on (like the pigeon who can find no more crumbs) to another category: farm animals such as cows, chickens, pigs, goats, and horses. When you can’t think of any more in that category, you switch to jungle animals: lions, tigers, monkeys. And so on. The same process by which E. coli looks for food in your gut is how you try to remember that last grocery item you were supposed to buy. (Is it in dairy? Fruits and vegetables? Meats?)
A really cool study of this was published in the journal Memory and Cognition.7 It found that smarter people can list more animals overall than less intelligent people, but only because they are better at thinking up more categories in which to mentally search. When the researchers ran the test again with another set of participants, they required everyone to use a provided list of categories (pets, farm animals, jungle animals, forest animals, and so on). As a result, the gap between the smarter and less smart people disappeared.
On the other hand, people with early signs of dementia tend to show the opposite problem: When trying to recall a long list of items, they move on to another category before exhausting the first.
So when trying to remember stuff, try to follow the lesson of E. coli and pigeons: Intentionally practice area-restricted search. Diligently scour your brain first for categories and then for items in each category.
For instance, let’s try an easy exercise that will take you less than five minutes. First, take a paper and pencil or open a new document on your computer, and set a timer for two minutes. When you are ready, write down the names of as many kinds of water-dwelling creatures as you can think of in the time given.
Ready? Begin!
Okay, once you are finished, I would like you to try it again. This time, however, I want you to use the following categories, again giving yourself just two minutes to list as many as you can.
When you’re ready, begin!
Freshwater fish
Ocean-dwelling fish
Ocean-dwelling mammals
Dangerous fish
Sea creatures that have shells
If you gave it a shot, you probably found that the list of five categories helped you think of new kinds of water-dwelling creatures. Area-restricted search works for people just like it works for germs!
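If you like seeing a strategy spelled out step by step, here is a minimal sketch of that same categories-first procedure (my own illustration, with made-up categories and items, not anything from the studies above): pick a category, drain it completely, and only then move on to the next.

```python
# Minimal sketch (my own illustration) of "area-restricted search" applied
# to recall: exhaust one category of items before hopping to the next,
# just as E. coli exhausts one food patch before moving on. The category
# names and items below are made up for the example.

from typing import Dict, List

def area_restricted_recall(memory: Dict[str, List[str]]) -> List[str]:
    """List items category by category, fully draining each 'patch'
    before switching to a new one."""
    recalled = []
    for category, items in memory.items():  # settle on one patch
        for item in items:                  # scour it for every last crumb
            recalled.append(item)
        # only when the patch is empty do we move to the next category
    return recalled

animal_memory = {
    "pets": ["cat", "dog", "goldfish", "parakeet"],
    "farm animals": ["cow", "chicken", "pig", "goat", "horse"],
    "jungle animals": ["lion", "tiger", "monkey"],
}

print(area_restricted_recall(animal_memory))
```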
THE LEARNING TREE
An amazing series of experiments by Monica Gagliano, an evolutionary ecologist at the University of Western Australia, has shown that a plant can learn.8
The first plant Gagliano studied, Mimosa pudica, is a perennial herb in the pea family. Popularly known as the “sensitive plant” or “touch-me-not,” it is famous for having leaves that fold inward and droop whenever it is touched or shaken. After a few minutes, the leaves reopen.
Gagliano decided to test whether the plant could learn to ignore a particular type of disturbance. She placed dozens of them in holders that would periodically allow them to drop down by about a foot. At first, the plants’ leaves immediately folded inward after the fall. But after repeated descents, the leaves stopped reacting; they remained open. Apparently, they had become habituated to the drops. They had learned.
Then, in 2016, Gagliano published an even more astonishing report.9 Most plants grow toward the light, right? She designed an experiment to see if plants could learn a conditioned response in the same way that Pavlov had taught dogs to salivate in response to the ringing of a bell. This time she used forty-five seedlings of another kind of pea plant, Pisum sativum, and placed fans and a light source either on the same side of the plants or on opposite sides. After three days of this, she tested the plants’ growth on a fourth day, when only the fan — without any light — was turned on.
Sure enough — without a brain! — a majority of the pea plants grew toward or away from the fan, depending on where their three days of training had taught them to expect the light to appear.
Gagliano and others have put forward some clever hypotheses for how plants can learn such tricks, but for now, this much is certain: the ability to learn and remember is so important to life, even plants and bacteria do it!
NEURO BUSTED: IS BRAIN TRAINING BS?
Plenty of news articles these days claim that brain training is junk science, so when the most popular commercial provider of online brain games, Lumosity, was fined $2 million by the Federal Trade Commission for making unsubstantiated claims, the media pounced.10
But as a neurosurgeon and neuroscientist who has studied the scientific literature and seen the benefits of brain training on my own patients, I know that at least some kinds of brain training — not Lumosity, perhaps, but other types that have been far better studied — can significantly improve people’s functioning.
One of the most incredible demonstrations of brain training’s effectiveness was reported in the summer of 2016. With funding from the National Institute on Aging, the Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) study recruited 2,832 healthy older adults with an average age of 73.6 years at the beginning of the trial. The researchers then randomly divided them into four groups. One group received no brain training at all; two groups were taught tricks for improving memory and reasoning, respectively; the fourth and final group spent ten hours playing a video game designed to improve their so-called “speed of processing.”
Five years later, the speed-of-processing group had had half as many car accidents as people in the other groups.
Ten years later, those who had completed the most hours of training in the speed-of-processing group had their risk of developing dementia nearly cut in half — a finding that no drug or any other treatment has ever come close to achieving.11
So what’s this speed-of-processing training? Developed by a company called BrainHQ, it involves looking at a center target on a computer screen while tiny icons appear briefly on the screen’s periphery. The challenge is to keep your eyes firmly fixed on the center yet still correctly identify exactly where those icons appeared. The better you get, the faster the icons on the screen’s edge appear and disappear.
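To give a feel for the adaptive logic, here is a stripped-down, text-only mock-up of my own (not BrainHQ’s actual software): flash a peripheral target, shorten the flash after a correct answer, and lengthen it after a miss.

```python
# Text-only mock-up (mine, not BrainHQ's software) of the adaptive rule
# behind speed-of-processing training: the flash gets shorter every time
# the player identifies the peripheral target correctly, and longer after
# a miss. The "player" here is simulated so the example runs on its own.

import random

def run_session(trials: int = 20, start_ms: float = 400.0) -> float:
    positions = ["top-left", "top-right", "bottom-left", "bottom-right"]
    flash_ms = start_ms
    for _ in range(trials):
        target = random.choice(positions)
        # Simulated player: more likely to spot the target when the flash
        # lasts longer (a crude stand-in, purely for illustration).
        p_correct = min(0.95, flash_ms / 400.0)
        guess = target if random.random() < p_correct else random.choice(positions)
        if guess == target:
            flash_ms *= 0.9   # got it: icons flash for less time next trial
        else:
            flash_ms *= 1.1   # missed it: ease off a little
    return flash_ms           # lower value = faster processing by session's end

print(f"final flash duration: {run_session():.0f} ms")
```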
I don’t usually like to recommend a commercial product, but BrainHQ is one of the best-researched programs available for brain training. If you want to try computerized training, I don’t know of a better site to try.
Older adults are hardly the only ones who benefit from brain training. Because I specialize in surgery on brain cancer, I have long been concerned about the cognitive effects of the chemotherapy and radiation my colleagues usually offer after the operation. So-called “chemo brain” is not just a feeling of exhaustion; children, in particular, are known to experience lifelong decreases in IQ after brain surgery followed by chemotherapy or radiation. Yet pilot studies of a brain-training program called Cogmed have found that it may help to prevent or reverse such changes in children.
Offered only by psychologists who have been trained by the company, Cogmed includes a series of computerized exercises that demand close attention and focus. 3D Grid, for instance, challenges you to click on a series of panels in the same sequence in which they briefly light up. Another exercise asks you to type in a series of numbers after you hear them spoken aloud, but you need to list them in reverse order. Easy at first, the exercises get harder — and the results on attention and focus get better — as the sequences get longer.
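For the curious, here is what that backward-ordering idea looks like as a bare-bones drill you could run yourself (my own mock-up, not Cogmed’s program; the digits are printed rather than spoken, since the sketch has no audio):

```python
# Bare-bones mock-up (mine, not Cogmed's) of a backward digit-span drill:
# see a string of digits, type them back in reverse order, and the string
# grows by one digit every time you get it right.

import random

def backward_digit_drill(rounds: int = 5, start_length: int = 3) -> None:
    length = start_length
    for _ in range(rounds):
        digits = [str(random.randint(0, 9)) for _ in range(length)]
        correct = "".join(reversed(digits))
        print("Remember:", " ".join(digits))
        answer = input("Type them back in REVERSE order, no spaces: ").strip()
        if answer == correct:
            print("Correct!")
            length += 1                              # harder next round
        else:
            print("Not quite. The answer was", correct)
            length = max(start_length, length - 1)   # ease off a little

if __name__ == "__main__":
    backward_digit_drill()
```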
Even for healthy young adults, brain training appears to pay off. One of the most rigorously designed studies in the field, published in 2018, was carried out by researchers at Oxford, Harvard, and a division of Honeywell Aerospace. After recruiting 113 students from leading universities, they tested the effects of a brain game called Robot Factory either alone or in combination with a form of mild, external brain stimulation called transcranial direct current stimulation (tDCS).12 After just three weeks, they found that the students who had undergone both the stimulation and the brain training saw significant gains on intelligence tests, whereas the other students did not.
BEYOND INTELLIGENCE
Memory and computational prowess are important, but if you want to be anything other than a mathematician, you will probably require a few other brain traits:
EMOTIONAL INTELLIGENCE. From the sandbox to the corner office, the ability to play well with others is huge. As science journalist Daniel Goleman showed in his bestseller of the same name, emotional intelligence is the ability to “rein in emotional impulse; to read another’s innermost feelings; to handle relationships smoothly.”13
As ethereal and slippery as these qualities might seem, they have their basis in the brain — primarily in the frontal lobe. Although popularly characterized as the headquarters of human IQ, the frontal lobe is also where our emotional and social self-control emerges. Damage the frontal lobe, and you become an emotional wreck. Individuals affected by frontotemporal dementia likewise lose control of their emotions; they cry at the drop of a hat, laugh during a funeral, or fly into a rage over nothing.