Chapter 17. Learning and Memory
Claire Cain Miller

Boys are falling behind. They graduate from high school and attend college at lower rates than girls and are more likely to get in trouble, which can hurt them when they enter the job market. This gender gap exists across the United States, but it is far bigger for poor people and for black people. As society becomes more unequal, it seems, it hurts boys more. New research from social scientists offers one explanation: Boys are more sensitive than girls to disadvantage. Any disadvantage, like growing up in poverty, in a bad neighborhood or without a father, takes more of a toll on boys than on their sisters. That realization could be a starting point for educators, parents and policy makers who are trying to figure out how to help boys — particularly those from black, Latino and immigrant families. “It’s something about family disadvantage itself,” said David Figlio, a Northwestern University economist and co-author of a new paper, presented publicly for the first time on Thursday. “Black people in America are more disadvantaged than white people in America, and if we were to reduce the disadvantage, we may see a reduction in the relative gender gap as well.” Marianne Bertrand, an economist at the University of Chicago who has studied the gender gap with Jessica Pan, also found that boys fare worse than girls in disadvantaged homes, and are more responsive than girls to parental time and resources. “Their findings were very consistent: Families that invest more in children are protective for boys,” she said. The reasons that boys react more negatively to disadvantage are varied and hard to pinpoint. Even in utero, boys are more sensitive to extreme stress than girls, and tend to have more unruly temperaments. Society discourages boys from showing vulnerability. Low-income families are often led by single mothers, which has been found to affect boys differently than girls. © 2015 The New York Times Company
Alzheimer's disease can be detected decades before onset, using a virtual reality test, a study suggests. People aged 18 to 30 were asked to navigate through a virtual maze to test the function of certain brain cells. Those with a high genetic risk of Alzheimer's could be identified by their performance, according to German neuroscientists. The findings could help future research, diagnosis and treatment, they report in the journal Science. The scientists, led by Lukas Kunz of the German Centre for Neurodegenerative Diseases in Bonn, say the high-risk group navigated the maze differently and had reduced functioning of a type of brain cell involved in spatial navigation. The findings could give an insight into why people with dementia can find navigating the world around them challenging, they say. "Our results could provide a new basic framework for preclinical research on Alzheimer's disease and may provide a neurocognitive explanation of spatial disorientation in Alzheimer's disease," they report. Although genes play a role in dementia, their effects are complex, with many unknowns. Dr Laura Phipps of Alzheimer's Research UK said the latest study focused on healthy younger people at higher genetic risk of Alzheimer's, suggesting they may already show alterations in spatial navigation several decades before the disease could start. © 2015 BBC.
Link ID: 21555 - Posted: 10.23.2015
Keikantse Matlhagela

Susumu Tonegawa unlocked the genetic secrets behind antibodies' diverse structures, which earned him the Nobel Prize in Physiology or Medicine in 1987. Having since moved fields, he tells Keikantse Matlhagela about his latest work on the neuroscience of happy and sad memories.

You started as a chemist, then you moved into molecular biology and now you are a neuroscientist. Why change fields?

Strangely, the only people to ask me about this are journalists — my students never ask. I see myself as a scientist who is interested in what's going on inside of us. It doesn't matter whether it is chemistry or immunology or neuroscience, I just do research on what I find interesting. The switch from chemistry to immunology did not seem like a big shift when I was young, but immunology to neuroscience was. After about 15 years spent researching immunology I wanted to explore an area of science where there are still big, unresolved questions. The brain is probably the most mysterious subject there is.

Do you keep up to date with the field in which you won your Nobel prize?

I am sorry to say that I haven't been paying a lot of attention to immunology in recent years because I am preoccupied with my work on memory. I have friends, of course, from that time — very close friends. But my friends are not young. Even though they are experts, they are also retired. We tend not to talk about immunology a whole lot. © 2015 Macmillan Publishers Limited
Keyword: Learning & Memory
Link ID: 21542 - Posted: 10.22.2015
By Emily Underwood CHICAGO—In 1898, Italian biologist Camillo Golgi found something odd as he examined slices of brain tissue under his microscope. Weblike lattices, now known as "perineuronal nets," surrounded many neurons, but he could not discern their purpose. Many dismissed the nets as an artifact of Golgi's staining technique; for the next century, they remained largely obscure. Today, here at the annual meeting of the Society for Neuroscience, researchers offered tantalizing new evidence that holes in these nets could be where long-term memories are stored. Scientists now know that perineuronal nets (PNNs) are scaffolds of linked proteins and sugars that resemble cartilage, says neuroscientist Sakina Palida, a graduate student in Roger Tsien's lab at the University of California, San Diego, and co-investigator on the study. Although it's still unclear precisely what the nets do, a growing body of research suggests that PNNs may control the formation and function of synapses, the microscopic junctions between neurons that allow cells to communicate, and that they may play a role in learning and memory, Palida says. One of the most pressing questions in neuroscience is how memories—particularly long-term ones—are stored in the brain, given that most of the proteins inside neurons are constantly being replaced, refreshing themselves anywhere from every few days to every few hours. To last a lifetime, Palida says, some scientists believe that memories must somehow be encoded in a persistent, stable molecular structure. Inspired in part by evidence that destroying the nets in some brain regions can reverse deeply ingrained behaviors, Palida’s adviser Tsien, a Nobel Prize-winning chemist, recently began to explore whether PNNs could be that structure. Adding to the evidence were a number of recent studies linking abnormal PNNs to brain disorders including schizophrenia and Costello syndrome, a form of intellectual disability.
© 2015 American Association for the Advancement of Science.
By LISA SANDERS, M.D. The middle-aged couple knocked at the door of the townhouse. When no one answered, the woman took her key and let them in. She called her daughter’s name as she hurried through the rooms. They had been trying to reach their 27-year-old daughter by phone all day, and she hadn’t answered. They found her upstairs, lying in bed and mumbling incoherently. The mother rushed over, but her daughter showed no signs of recognition. She and her husband quickly carried her to the car. Four months before, the mother told the emergency-room doctor at SSM Health St. Mary’s Hospital in St. Louis, her daughter had a procedure called gastric-sleeve surgery to help her lose weight. She came home after just a couple of days and felt great. She looked bright and eager. Once she started to eat, though, nausea and vomiting set in. After almost every meal, she would throw up. It’s an unusual but well-known complication of this kind of surgery. The cause is not clearly understood, but the phenomenon is sometimes linked to reflux. The surgeon tried different medications to stop the nausea and vomiting and to reduce the acid in her stomach, but they didn’t help. She had the surgery in order to lose weight, but now she was losing weight too quickly. After a month of vomiting, her doctors thought maybe she had developed gallstones — a common problem after rapid weight loss. But even after her gallbladder was removed, the young woman continued to vomit after eating. © 2015 The New York Times Company
Susan Gaidos

CHICAGO — Eating a high-fat diet as a youngster can affect learning and memory during adulthood, studies have shown. But new findings suggest such diets may not have long-lasting effects. Rats fed a high-fat diet for nearly a year recovered their ability to navigate their surroundings. University of Texas at Dallas neuroscientist Erica Underwood tested spatial memory in rats fed a high-fat diet for either 12 weeks or 52 weeks, immediately after weaning. After rats placed in a chamber-filled box containing Lego-like toys became familiar with the box, the researchers moved the toys to new chambers. Later, when placed in the box, rats that ate high-fat foods for 12 weeks appeared confused and had difficulty finding the toys. But rats that ate high-fat foods for nearly a year performed as well as those fed a normal diet. Underwood repeated the experiment, posing additional spatial memory tests to new groups of rats. The findings were the same: Over the long term, rats on high-fat diets recovered their ability to learn and remember. Studies of brain cells revealed that rats on the long-term high-fat diet showed reduced excitability in nerve cells from the hippocampus, the same detrimental effects seen in rats on the short-term high-fat diet. “The physiology that should create a dumber animal is there, but not the behavior,” said Lucien Thompson of UT Dallas, who oversaw the study. Underwood and Thompson speculate that some other part of the brain may be compensating for this reduction in neural response. © Society for Science & the Public 2000 - 2015.
By Martin Enserink Researchers who conduct animal studies often don't use simple safeguards against biases that have become standard in human clinical trials—or at least they don't report doing so in their scientific papers, making it impossible for readers to ascertain the quality of the work, an analysis of more than 2500 journal articles shows. Such biases, conscious or unconscious, can make candidate medical treatments look better than they actually are, the authors of the analysis warn, and lead to eye-catching results that can't be replicated in larger or more rigorous animal studies—or in human trials. Neurologist Malcolm MacLeod of the Centre for Clinical Brain Sciences at the University of Edinburgh and his colleagues combed through papers reporting the efficacy of drugs in eight animal disease models and checked whether the authors reported four measures that are widely acknowledged to reduce the risk of bias. First, if there was an experimental group and a control group, were animals randomly assigned to either one? (This makes it impossible for scientists to, say, assign the healthiest mice or rats to a treatment group, which could make a drug look better than it is.) Second, were the researchers who assessed the outcomes of a trial—for instance, the effect of a treatment on an animal's health—blinded to which animal underwent what procedure? Third, did the researchers calculate the needed sample size in advance, to show that they didn't just accumulate data until they found something significant? And finally, did they make a statement about their conflicts of interest? © 2015 American Association for the Advancement of Science
By Hanae Armitage Schools of fish clump together for a very simple reason: safety in numbers. But for some, banding together offers more than just protection. It’s a way of getting to the head of the class. Schooling fish learn from each other, and new research shows that when they’re taken out of their normal social group, individuals struggle to learn on their own. Scientists have long known that schooling fish observe and learn from each other’s failures and successes, behaviors that stimulate neural development, especially in the part of the brain responsible for memory and learning. But this is the first time they have found evidence of that link in spatial learning. To test their theory, scientists divided a school of social cichlid fish into two categories: 14 social fish and 15 loners. Researchers kept the social fish grouped together while they partitioned the loners into single-fish isolation tanks. They ran both groups through a simple T-shaped maze, color coding the sides of the maze—a yellow mark for food, a green mark for no food. Seven of the 14 socialized fish learned to associate yellow with food (high marks for the cichlids, which are not the brightest fish in the animal kingdom), whereas only three of the 15 isolated fish successfully made the same association. Writing in this month’s issue of Applied Animal Behaviour Science, the researchers say this suggests fish in group settings are able to learn better and faster than their singled-out counterparts. The moral? Simple: Fish should stay in school. © 2015 American Association for the Advancement of Science
By Kimberly G. Noble What if we could draw a line from key areas of a low-income child’s brain to a policy intervention that would dramatically reduce his or her chances of staying in poverty, dropping out of school and entering the criminal justice or social welfare system? Wouldn’t we want to make that policy prescription as widely available as any vaccination against childhood disease? Thanks to remarkable advances in neuroscience and the social sciences, we are closing in on this possibility. In a study published this year in Nature Neuroscience, several co-authors and I found that family income is significantly correlated with children’s brain size — specifically, the surface area of the cerebral cortex, which is the outer layer of the brain that does most of the cognitive heavy lifting. Further, we found that increases in income were associated with the greatest increases in brain surface area among the poorest children. Not surprisingly, our findings made many people uncomfortable. Some feared the study would be used to reinforce the notion that people remain in poverty because they are less capable than those with higher incomes. As neuroscientists, we interpret the results very differently. We know that the brain is most malleable in the early years of life and that experiences during that time have lifelong effects on the mind. Work by social scientists such as Sendhil Mullainathan at Harvard University and Eldar Shafir at Princeton University has shown that poverty depletes parents’ cognitive resources, leaving less capacity for making everyday decisions about parenting. These parents are also at far greater risk for depression and anxiety — poverty’s “mental tax.” All of this has important implications for children.
Gareth Cook talks to Douwe Draaisma

Much has been written on the wonders of human memory: its astounding feats of recall, the way memories shape our identities and are shaped by them, memory as a literary theme and a historical one. But what of forgetting? This is the topic of a new book by Douwe Draaisma, author of The Nostalgia Factory: Memory, Time and Ageing (Yale University Press, 2013; 176 pages) and a professor of the history of psychology at the University of Groningen in the Netherlands. In Forgetting: Myths, Perils and Compensations (Yale University Press, 2015; 288 pages), Draaisma considers dreaming, amnesia, dementia and all the ways in which our minds—and lives—are shaped by memory’s opposite. He answered questions from contributing editor Gareth Cook.

What is your earliest memory, and why, do you suppose, have you not forgotten it?

Quite a few early memories in the Netherlands involve bicycles; mine is no exception. I was two and a half years old when my aunts walked my mother to the train station. They had taken a bike to transport her bags. I was sitting on the back of the bike. Suddenly the whole procession came to a halt when my foot got caught between the spokes of a wheel. I am pretty sure this memory is accurate because I had to see a doctor, and there is a dated medical record. It is a brief, snapshotlike memory, black-and-white. I do not remember any pain, but I do remember the consternation among my mom and her sisters. Looking back on this memory from a professional perspective, I would say that it has the flashlike character typical for first memories from before age three; “later” first memories are usually a bit longer and more elaborate. © 2015 Scientific American
Keyword: Learning & Memory
Link ID: 21474 - Posted: 10.05.2015
By Lisa Sanders, M.D. On Thursday we challenged Well readers to solve the case of a 27-year-old woman who had vomiting, weakness and confusion months after having weight loss surgery. More than 200 readers offered their perspective on the case. Most of you recognized it as a nutritional deficiency, and nearly half of you totally nailed it. The diagnosis is: Wernicke’s encephalopathy due to thiamine (vitamin B1) deficiency. The very first reader to post a comment, Dr. Adrian Budhram, figured it out. His answer landed on our doorstep just five minutes after the case went up. Dr. Budhram is a second-year neurology resident at Western University in London, Ontario. He says that Wernicke’s is on the list of diseases he thinks about every time someone is brought to the hospital because they are confused. Thiamine, or vitamin B1, is a nutrient essential for the body to break down and use sugars and proteins. It is found in many foods, including beans, brown rice, pork and cereals. Although the body only stores enough of the vitamin to last three to four weeks, deficiencies are rare when a full and varied diet is available. Diseases caused by a thiamine deficiency were described in Chinese medicine as early as 2600 B.C. – well before the vitamin was identified chemically. Western medicine came to know the disease as beriberi – a Sinhalese term meaning weak (apparently from the phrase “I can’t, I can’t”) – characterized by either numbness and weakness in the legs (dry beriberi) or a weakened heart leading to hugely swollen legs (wet beriberi). © 2015 The New York Times Company
Keyword: Learning & Memory
Link ID: 21469 - Posted: 10.03.2015
James Hamblin

Mental exercises to build (or rebuild) attention span have shown promise recently as adjuncts or alternatives to amphetamines in addressing symptoms common to Attention Deficit Hyperactivity Disorder (ADHD). Building cognitive control, to be better able to focus on just one thing, or single-task, might involve regular practice with a specialized video game that reinforces "top-down" cognitive modulation, as was the case in a popular paper in Nature last year. Cool but still notional. More insipid but also more clearly critical to addressing what's being called the ADHD epidemic is plain old physical activity. This morning the medical journal Pediatrics published research that found kids who took part in a regular physical activity program showed important enhancement of cognitive performance and brain function. The findings, according to University of Illinois professor Charles Hillman and colleagues, "demonstrate a causal effect of a physical program on executive control, and provide support for physical activity for improving childhood cognition and brain health." If it seems odd that this is something that still needs support, that's because it is odd, yes. Physical activity is clearly a high, high-yield investment for all kids, but especially those attentive or hyperactive. This brand of research is still published and written about as though it were a novel finding, in part because exercise programs for kids remain underfunded and underprioritized in many school curricula, even though exercise is clearly integral to maximizing the utility of time spent in class.
Erin Wayman

Priya Rajasethupathy’s research has been called groundbreaking, compelling and beautifully executed. It’s also memorable. Rajasethupathy, a neuroscientist at Stanford University, investigates how the brain remembers. Her work probes the molecular machinery that governs memories. Her most startling — and controversial — finding: Enduring memories may leave lasting marks on DNA. Being a scientist wasn’t her first career choice. Although Rajasethupathy inherited a love of computation from her computer scientist dad, she enrolled in Cornell University as a pre-med student. After graduating in three years, she took a year off to volunteer in India, helping people with mental illness. During that year she also did neuroscience research at the National Centre for Biological Sciences in Bangalore. While there, she began to wonder whether microRNAs, tiny molecules that put protein production on pause, could play a role in regulating memory. She pursued that question as an M.D. and Ph.D. student at Columbia University (while intending, at least initially, to become a physician). She found some answers in the California sea slug (Aplysia californica). In 2009, she and colleagues discovered a microRNA in the slug’s nerve cells that helps orchestrate the formation of memories that linger for at least 24 hours. © Society for Science & the Public 2000 - 2015.
Keyword: Learning & Memory
Link ID: 21434 - Posted: 09.23.2015
Rachel Ehrenberg

If not for a broken piece of lab equipment and a college crush, Steve Ramirez might never have gone into neuroscience. As an undergraduate at Boston University his interests were all over the place: He was taking a humanities course and classes in philosophy and biochemistry while working several hours a week in a biology lab. When the lab’s centrifuge, a device that spins liquids, broke, Ramirez had to use one in another lab. “I was trying to make small talk with this girl who was using the centrifuge, ‘What’s your major?’ kind of thing,” Ramirez recalls. Hearing of his myriad interests, the student suggested that Ramirez talk with neuroscientist Paul Lipton. That led to a conversation with Howard Eichenbaum, a leading memory researcher. Eichenbaum told him that everything Ramirez was interested in was about the brain. “Everything from the pyramids to putting a man on the moon, it’s all the product of the human brain, which is kind of crazy when you think about it,” Ramirez says. Studying “the most interdisciplinary organ in existence,” as Ramirez calls it, was a natural fit. While working in Eichenbaum’s lab, Ramirez got turned on to how the brain forms memories. Those explorations led to a Ph.D. program at MIT in the lab of Nobel laureate Susumu Tonegawa, where Ramirez focused on the individual brain cells that hold specific memories. © Society for Science & the Public 2000 - 2015.
By Michael Balter Are some animals smarter than others? It’s hard to say, because you can’t sit a chimpanzee or a mouse down at a table for an IQ test. But a new study, in which scientists tested wild robins on a variety of skills, concludes that they do differ in the kind of “general intelligence” that IQ tests are supposed to measure. General intelligence is usually defined as the ability to do well on multiple cognitive tasks, from math skills to problem solving. For years, researchers have questioned whether measurable differences exist in humans and nonhumans alike. In humans, factors like education and socioeconomic status can affect performance. When it comes to animals, the problem is compounded for two main reasons: First, it is very difficult to design and administer tests that pick up on overall smarts instead of specific skills, such as the keen memories of food-hoarding birds or the fine motor skills of chimpanzees that make tools for finding insects in trees. Second, differences in animal test scores can depend on how motivated they are to perform. Because most experiments award would-be test-takers with food, an empty (or a full) stomach might be all it takes to skew the results. Thus, even studies that suggest variations in intelligence among mice, birds, and apes all carry the caveat that alternative explanations could be at play. To get around some of these limitations, a team led by Rachael Shaw, an animal behavior researcher at Victoria University of Wellington, turned to a population of New Zealand North Island robins for a new round of experiments. The robins live at the Zealandia wildlife sanctuary, a 225-hectare nature paradise in Wellington where more than 700 of the birds live wild and protected from predators in the middle of the city. © 2015 American Association for the Advancement of Science.
We all have our favourite movie moments, ones we love to watch again from time to time. Now it seems chimpanzees and bonobos, too, have the nous to recall thrilling scenes in movies they have previously seen and anticipate when they are about to come up. The results suggest apes can readily recall and anticipate significant recent events, just by watching those events once. Rather than use hidden food as a memory test, Japanese researchers made short movies and showed them to apes on two consecutive days. “We showed a movie instead, and asked whether they remember it when they only watch an event once, and an event totally new to them,” says Fumihiro Kano of Kyoto University in Japan. “Their anticipatory glances told us that they did.” Plot moment Kano and his colleague Satoshi Hirata made and starred in two short films. Another of the characters was a human dressed up as an ape in a King Kong costume who carried out attacks on people, providing the key plot moment in the first movie (see video). Both films were designed to contain memorable dramatic events, and the researchers deployed laser eye-tracking technology to see if the animals preferentially noticed and remembered these moments. © Copyright Reed Business Information Ltd.
Keyword: Learning & Memory
Link ID: 21421 - Posted: 09.20.2015
By AMY HARMON Some neuroscientists believe it may be possible, within a century or so, for our minds to continue to function after death — in a computer or some other kind of simulation. Others say it’s theoretically impossible, or impossibly far off in the future. A lot of pieces have to fall into place before we can even begin to start thinking about testing the idea. But new high-tech efforts to understand the brain are also generating methods that make those pieces seem, if not exactly imminent, then at least a bit more plausible. Here’s a look at how close, and far, we are to some requirements for this version of “mind uploading.” The hope of mind uploading rests on the premise that much of the key information about who we are is stored in the unique pattern of connections between our neurons, the cells that carry electrical and chemical signals through living brains. You wouldn't know it from the outside, but there are more of those connections — individually called synapses, collectively known as the connectome — in a cubic centimeter of the human brain than there are stars in the Milky Way galaxy. The basic blueprint is dictated by our genes, but everything we do and experience alters it, creating a physical record of all the things that make us US — our habits, tastes, memories, and so on. It is exceedingly tricky to get that pattern of connections into a state where it is both safe from decay and can be verified as intact. But in recent months, two sets of scientists said they had devised separate ways to do that for the brains of smaller mammals. If either is scaled up to work for human brains — still a big if — then theoretically your brain could sit on a shelf or in a freezer for centuries while scientists work on the rest of these steps. © 2015 The New York Times Company
Mo Costandi

In an infamous set of experiments performed in the 1960s, psychologist Walter Mischel sat pre-school kids at a table, one by one, and placed a sweet treat – a small marshmallow, a biscuit, or a pretzel – in front of them. Each of the young participants was told that they would be left alone in the room, and that if they could resist the temptation to eat the sweet on the table in front of them, they would be rewarded with more sweets when the experimenter returned. The so-called Marshmallow Test was designed to test self-control and delayed gratification. Mischel and his colleagues tracked some of the children as they grew up, and then claimed that those who managed to hold out for longer in the original experiment performed better at school, and went on to become more successful in life, than those who couldn’t resist the temptation to eat the treat before the researcher returned to the room. The ability to exercise willpower and inhibit impulsive behaviours is considered to be a core feature of the brain’s executive functions, a set of neural processes – including attention, reasoning, and working memory – that regulate our behaviour and thoughts, and enable us to adapt them according to the changing demands of the task at hand. Executive function is a rather vague term, and we still don’t know much about its underlying brain mechanisms, or about how different components of this control system are related to one another. New research shows that self-control and memory share, and compete with each other for, the same brain mechanisms, such that exercising willpower saps these common resources and impairs our ability to encode memories. © 2015 Guardian News and Media Limited
Aftab Ali

People who were born prematurely are less intelligent later on in life and earn less money as a result, according to a new study by the University of Warwick. Researchers at the Coventry-based institution said they found a link connecting pre-term birth with low reading and, in particular, maths skills, which affect the amount of wealth accumulated in adulthood. Funded by the Nuffield Foundation, the researchers examined data from two other large studies, following children born more than a decade apart, with one group from 1958 and the other from 1970. In total, more than 15,000 individuals were surveyed; each study recruited all children born in a single week in England, Scotland, and Wales. Data were examined for all individuals who were born at between 28 and 42 weeks gestational age, and who had available wealth information at the age of 42. Those participants who were born pre-term – at less than 37 weeks – were compared with those who were born full-term; for both groups, mathematical ability in childhood had a direct effect on how much they earned as adults, regardless of later educational qualifications. In order to measure adult wealth, the researchers looked at factors including family income and social class, housing and employment status, and participants’ own perceptions of their financial situation. As for academic abilities, they examined validated measures of mathematics, reading, and intelligence, along with ratings from teachers and parents. © independent.co.uk
By Gretchen Reynolds At the age of 93, Olga Kotelko — one of the most successful and acclaimed nonagenarian track-and-field athletes in history — traveled to the University of Illinois to let scientists study her brain. Ms. Kotelko held a number of world records and had won hundreds of gold medals in masters events. But she was of particular interest to the scientific community because she hadn’t begun serious athletic training until age 77. So scanning her brain could potentially show scientists what late-life exercise might do for brains. Ms. Kotelko died last year at the age of 95, but the results of that summer brain scan were published last month in Neurocase. And indeed, Ms. Kotelko’s brain looked quite different from those of other volunteers aged 90-plus who participated in the study, the scans showed. The white matter of her brain — the cells that connect neurons and help to transmit messages from one part of the brain to another — showed fewer abnormalities than the brains of other people her age. And her hippocampus, a portion of the brain involved in memory, was larger than that of similarly aged volunteers (although it was somewhat shrunken in comparison to the brains of volunteers decades younger than her). Over all, her brain seemed younger than her age. But because the scientists didn’t have a scan showing Ms. Kotelko’s brain before she began training, it’s impossible to know whether becoming an athlete late in life improved her brain’s health or whether her naturally healthy brain allowed her to become a stellar masters athlete. © 2015 The New York Times Company