Chapter 13. Memory, Learning, and Development
Eating a Mediterranean diet has been linked to less brain shrinkage in older adults. Human brains naturally shrink with age. But a study that followed 401 people in their 70s found that the brains of those who adhered more closely to a Mediterranean-style diet shrank significantly less over a period of three years. A typical Mediterranean diet contains a high amount of vegetables, fruits, olive oil, beans and cereal grains, moderate amounts of fish, dairy products, and wine, and only a small amount of red meat and poultry. “As we age, the brain shrinks and we lose brain cells, which can affect learning and memory,” says Michelle Luciano, at the University of Edinburgh, UK, who led the study. “This study adds to the body of evidence that suggests the Mediterranean diet has a positive impact on brain health.” The differences in brain shrinkage were measured using brain scans. Statistical analysis of diet data found that simply eating more fish and less meat was not associated with reduced shrinkage. “While the study points to diet having a small effect on changes in brain size, it didn’t look at the effect on risk of dementia,” says David Reynolds, at the charity Alzheimer’s Research UK. “We would need to see follow-up studies in order to investigate any potential protective effects against problems with memory and thinking.” Other studies have found that being overweight seems to accelerate shrinking of the brain’s white matter. © Copyright Reed Business Information Ltd.
By LISA FELDMAN BARRETT Think about the people in your life who are 65 or older. Some of them are experiencing the usual mental difficulties of old age, like forgetfulness or a dwindling attention span. Yet others somehow manage to remain mentally sharp. My father-in-law, a retired doctor, is 83 and he still edits books and runs several medical websites. Why do some older people remain mentally nimble while others decline? “Superagers” (a term coined by the neurologist Marsel Mesulam) are those whose memory and attention aren’t merely above average for their age, but are actually on par with healthy, active 25-year-olds. My colleagues and I at Massachusetts General Hospital recently studied superagers to understand what made them tick. Our lab used functional magnetic resonance imaging to scan and compare the brains of 17 superagers with those of other people of similar age. We succeeded in identifying a set of brain regions that distinguished the two groups. These regions were thinner for regular agers, a result of age-related atrophy, but in superagers they were indistinguishable from those of young adults, seemingly untouched by the ravages of time. What are these crucial brain regions? If you asked most scientists to guess, they might nominate regions that are thought of as “cognitive” or dedicated to thinking, such as the lateral prefrontal cortex. However, that’s not what we found. Nearly all the action was in “emotional” regions, such as the midcingulate cortex and the anterior insula. My lab was not surprised by this discovery, because we’ve seen modern neuroscience debunk the notion that there is a distinction between “cognitive” and “emotional” brain regions. © 2017 The New York Times Company
Bret Stetka With a president-elect who has publicly supported the debunked claim that vaccines cause autism, suggested that climate change is a hoax dreamed up by the Chinese, and appointed to his Cabinet a retired neurosurgeon who doesn't buy the theory of evolution, things might look grim for science. Yet watching Patti Smith sing "A Hard Rain's a-Gonna Fall" live streamed from the Nobel Prize ceremony in early December to a room full of physicists, chemists and physicians — watching her twice choke up, each time stopping the song altogether, only to push on through all seven wordy minutes of one of Bob Dylan's most beloved songs — left me optimistic. Taking nothing away from the very real anxieties about future funding and support for science, neuroscience in particular has had plenty of promising leads that could help fulfill Alfred Nobel's mission to better humanity. In the spirit of optimism, and with input from the Society for Neuroscience, here are a few of the noteworthy neuroscientific achievements of 2016. One of the more fascinating fields of neuroscience of late entails mapping the crosstalk between our biomes, brains and immune systems. In July, a group from the University of Virginia published a study in Nature showing that the immune system, in addition to protecting us from a daily barrage of potentially infectious microbes, can also influence social behavior. The researchers had previously shown that a type of white blood cells called T cells influence learning behavior in mice by communicating with the brain. Now they've shown that blocking T cell access to the brain influences rodent social preferences. © 2016 npr
Alan Yu Being overweight can raise your blood pressure, cholesterol and risk for developing diabetes. It could be bad for your brain, too. A diet high in saturated fats and sugars, the so-called Western diet, actually affects the parts of the brain that are important to memory and make people more likely to crave the unhealthful food, says psychologist Terry Davidson, director of the Center for Behavioral Neuroscience at American University in Washington, D.C. He didn't start out studying what people ate. Instead, he was interested in learning more about the hippocampus, a part of the brain that's heavily involved in memory. He was trying to figure out which parts of the hippocampus do what. He did that by studying rats that had very specific types of hippocampal damage and seeing what happened to them. In the process, Davidson noticed something strange. The rats with the hippocampal damage would go to pick up food more often than the other rats, but they would eat a little bit, then drop it. Davidson realized these rats didn't know they were full. He says something similar may happen in human brains when people eat a diet high in fat and sugar. Davidson says there's a vicious cycle of bad diets and brain changes. He points to a 2015 study in the Journal of Pediatrics that found obese children performed more poorly on memory tasks that test the hippocampus compared with kids who weren't overweight. He says if our brain system is impaired by that kind of diet, "that makes it more difficult for us to stop eating that diet. ... I think the evidence is fairly substantial that you have an effect of these diets and obesity on brain function and cognitive function." © 2016 npr
By Heather M. Snyder For more than 25 years, Mary Read was a successful nurse in Lititz, Pennsylvania. But in 2010, at the age of 50, she started having trouble with her memory and thinking, making it difficult for her to complete routine tasks and follow instructions at work. The problems worsened, bringing her career to an abrupt end. In 2011, her doctor conducted a comprehensive evaluation, including a cognitive assessment, and found that she was in the early stages of younger-onset Alzheimer’s, which affects hundreds of thousands of people under 65. A year earlier, Elizabeth Wolf faced another sort of upheaval. The 36-year-old community health program director was forced to abandon her own career, home and community in Vermont when both of her parents were diagnosed with Alzheimer’s three months apart. Wolf made the difficult decision to move back into her childhood home in Mount Laurel, New Jersey in order to become their primary caregiver. These stories are not unusual. Alzheimer’s dementia disproportionately affects women in a variety of ways. 2.5 times as many women as men provide 24-hour care for an affected relative. Nearly 19 percent of these wives, sisters and daughters have had to quit work to do so. In addition, women make up nearly two-thirds of the more than 5 million Americans living with Alzheimer’s today. According to the Alzheimer’s Association 2016 Alzheimer’s Disease Facts and Figures, an estimated 3.3 million women aged 65 and older in the United States have the disease. To put that number in perspective, a woman in her sixties is now about twice as likely to develop Alzheimer’s as breast cancer within her lifetime. © 2016 Scientific American
Ian Sample Science editor The first subtle hints of cognitive decline may reveal themselves in an artist’s brush strokes many years before dementia is diagnosed, researchers believe. The controversial claim is made by psychologists who studied renowned artists, from the founder of French impressionism, Claude Monet, to the abstract expressionist Willem de Kooning. While Monet aged without obvious mental decline, de Kooning was diagnosed with Alzheimer’s disease more than a decade before his death in 1997. Alex Forsythe at the University of Liverpool analysed more than 2,000 paintings from seven famous artists and found what she believes are progressive changes in the works of those who went on to develop Alzheimer’s. The changes became noticeable when the artists were in their 40s. Though intriguing, the small number of artists involved in the study means the findings are highly tentative. While Forsythe said the work does not point to an early test for dementia, she hopes it may open up fresh avenues for investigating the disease. The research provoked mixed reactions from other scientists. Richard Taylor, a physicist at the University of Oregon, described the work as a “magnificent demonstration of art and science coming together”. But Kate Brown, a physicist at Hamilton College in New York, was less enthusiastic and dismissed the research as “complete and utter nonsense”. © 2016 Guardian News and Media Limited
Anna Gorman Rosemary Navarro was living in Mexico when her brother called from California. Something wasn't right with their mom, then in her early 40s. She was having trouble paying bills and keeping jobs as a food preparer in convalescent homes. Navarro, then 22, sold her furniture to pay for a trip back to the U.S. for herself and her two young children. Almost as soon as she arrived, she knew her mother wasn't the same person. "She was there but sometimes she wasn't there," she said. "I thought, 'Oh man this isn't going to be good.' " Before long, Navarro was feeding her mom, then changing her diapers. She put a special lock on the door to keep her from straying outside. Unable to continue caring for her, Navarro eventually moved her mom to a nursing home, where she spent eight years. Near the end, her mom, a quiet woman who had immigrated to the U.S. as a teenager and loved telenovelas, could communicate only by laughing or crying. Navarro was there when she took her last breath in 2009, at age 53. "What I went through with my mom I wouldn't wish on anyone," she said. It has happened again and again in her family — relatives struck by the same terrible disease, most without any clue what it was. An aunt, an uncle, a cousin, a grandfather, a great grandfather. "Too many have died," Navarro said. All in their early 50s. © 2016 npr
By Drake Baer Convergent evolution is what happens when nature takes different courses from different starting points to arrive at similar results. Consider bats, birds, and butterflies developing wings; sharks and dolphins finding fins; and echidnas and porcupines sporting spines. Or, if you want to annoy a traditionalist scientist, talk about humans and octopuses — and how they may both have consciousness. This is the thrust of Other Minds: The Octopus, the Sea, and the Deep Origins of Consciousness, a new book by the scuba-diving, biology-specializing philosopher Peter Godfrey-Smith, originally of Australia and now a distinguished professor at the City University of New York’s graduate center. The book was written up by Olivia Judson in The Atlantic, and you should read the whole thing, but what I find mesmerizing is how categorically other the eight-tentacled ink-squirters are, and how their very nature challenges our conceptualizations of intelligence. “If we can make contact with cephalopods as sentient beings, it is not because of a shared history, not because of kinship, but because evolution built minds twice over,” Godfrey-Smith is quoted as saying. “This is probably the closest we will come to meeting an intelligent alien.” (He’s not the first to think so: The Hawaiian creation myth holds that octopuses are the only creatures left over from an earlier incarnation of the Earth, making them more proto-terrestrials than extraterrestrials.) © 2016, New York Media LLC.
By Laurence O’Dwyer Daniel Tammet correctly recited the first 22,514 digits of Pi over the course of five hours and nine minutes. Less well-known, but similarly impressive, is the ability of a Clark’s nutcracker (Nucifraga columbiana)—a bird commonly found along the western flanks of North America—to remember where it stores thousands of separate caches of food. Tammet, who has autism spectrum disorder, is a savant. Some researchers have proposed that Clark’s nutcrackers might also represent a type of autistic savant. However, the unique abilities of a person with an autism spectrum disorder and savant syndrome usually come at the price of social deficits. Experts in animal cognition who have examined similar abilities in birds and other creatures maintain that nonhuman animals that exhibit savant-like behavior do not display any equivalent dysfunction. The prodigious memory of the Clark’s nutcracker seems to be accompanied by an enlarged hippocampus compared with related species of birds that have not developed caching abilities, but in all other respects the bird seems to function normally. The hippocampus is a brain structure that is crucial for memory formation. In other words, its hyper-performance in one domain does not appear to come at a cost in another. (Admittedly, it is difficult to determine whether Clark’s nutcrackers are socially competent birds.) The “gift at a price” idea stems in part from the left hemisphere dysfunction and right hemisphere compensation that is often associated with savant syndrome. © 1986-2016 The Scientist
Carl Zimmer Leah H. Somerville, a Harvard neuroscientist, sometimes finds herself in front of an audience of judges. They come to hear her speak about how the brain develops. It’s a subject on which many legal questions depend. How old does someone have to be to be sentenced to death? When should someone get to vote? Can an 18-year-old give informed consent? Scientists like Dr. Somerville have learned a great deal in recent years. But the complex picture that’s emerging lacks the bright lines that policy makers would like. “Oftentimes, the very first question I get at the end of a presentation is, ‘O.K., that’s all very nice, but when is the brain finished? When is it done developing?’” Dr. Somerville said. “And I give a very nonsatisfying answer.” Dr. Somerville laid out the conundrum in detail in a commentary published on Wednesday in the journal Neuron. The human brain reaches its adult volume by age 10, but the neurons that make it up continue to change for years after that. The connections between neighboring neurons get pruned back, as new links emerge between more widely separated areas of the brain. Eventually this reshaping slows, a sign that the brain is maturing. But it happens at different rates in different parts of the brain. The pruning in the occipital lobe, at the back of the brain, tapers off by age 20. In the frontal lobe, in the front of the brain, new links are still forming at age 30, if not beyond. “It challenges the notion of what ‘done’ really means,” Dr. Somerville said. As the anatomy of the brain changes, its activity changes as well. In a child’s brain, neighboring regions tend to work together. By adulthood, distant regions start acting in concert. Neuroscientists have speculated that this long-distance harmony lets the adult brain work more efficiently and process more information. © 2016 The New York Times Company
By Kate Baggaley In American schools, bullying is like the dark cousin to prom, student elections, or football practice: Maybe you weren’t involved, but you knew that someone, somewhere was. Five years ago, President Obama spoke against this inevitability at the White House Conference on Bullying Prevention. “With big ears and the name that I have, I wasn’t immune. I didn’t emerge unscathed,” he said. “But because it’s something that happens a lot, and it’s something that’s always been around, sometimes we’ve turned a blind eye to the problem.” We know that we shouldn’t turn a blind eye: Research shows that bullying is corrosive to children’s mental health and well-being, with consequences ranging from trouble sleeping and skipping school to psychiatric problems, such as depression or psychosis, self-harm, and suicide. But the damage doesn’t stop there. You can’t just close the door on these experiences, says Ellen Walser deLara, a family therapist and professor of social work at Syracuse University, who has interviewed more than 800 people age 18 to 65 about the lasting effects of bullying. Over the years, deLara has seen a distinctive pattern emerge in adults who were intensely bullied. In her new book, Bullying Scars, she introduces a name for the set of symptoms she often encounters: adult post-bullying syndrome, or APBS. DeLara estimates that more than a third of the adults she’s spoken to who were bullied have this syndrome. She stresses that APBS is a description, not a diagnosis—she isn’t seeking to have APBS classified as a psychiatric disorder. “It needs considerably more research and other researchers to look at it to make sure that this is what we’re seeing,” deLara says.
By Sarah DeWeerdt Toddlers with autism are oblivious to the social information in the eyes, but don’t actively avoid meeting another person’s gaze, according to a new study. The findings support one side of a long-standing debate: Do children with autism tend not to look others in the eye because they are uninterested or because they find eye contact unpleasant? “This question about why do we see reduced eye contact in autism has been around for a long time,” says study leader Warren Jones, director of research at the Marcus Autism Center in Atlanta, Georgia. “It’s important for how we understand autism, and it’s important for how we treat autism.” If children with autism dislike making eye contact, treatments could incorporate ways to alleviate the discomfort. But if eye contact is merely unimportant to the children, parents and therapists could help them understand why it is important in typical social interactions. The work also has implications for whether scientists who study eye contact should focus on social brain regions rather than those involved in fear and anxiety. Lack of eye contact is among the earliest signs of autism, and its assessment is part of autism screening and diagnostic tools. Yet researchers have long debated the underlying mechanism. The lack-of-interest hypothesis is consistent with the social motivation theory, which holds that a broad disinterest in social information underlies autism features. On the other hand, anecdotal reports from people with autism suggest that they find eye contact unpleasant. Studies that track eye movements as people view faces have provided support for both hypotheses. © 2016 Scientific American
Older folks tend not to engage as much in risky behavior as teenagers and young adults do. You might call that wisdom or learned experience. But this also may be a result of older brains having less gray matter in a certain spot, according to a new study. Researchers found that adults who were less inclined to take risks had less gray matter in the right posterior parietal cortex, which is involved in decisions that entail risk. In the study, the researchers asked volunteers ranging in age from 18 to 88 to play a game involving risk. The participants were allowed to choose between a guaranteed gain, such as pocketing $5, or an uncertain gain, such as a lottery to earn between $5 and $120 with varying chances of winning or losing. As the researchers expected, those participants who chose the guaranteed gain — that is, no risk — tended to be older than those who opted for the lottery. It wasn’t a perfect correlation, but it was close. One could call this old-age wisdom. Yet when the researchers analyzed brain scans of these volunteers obtained through an MRI technique called voxel-based morphometry (VBM), they found that lower levels of gray matter, even more than age, best accounted for risk aversion. These results suggest that the brain changes that occur in healthy aging people may be behind more decision-making patterns and preferences than previously thought, the researchers noted in their findings, published Dec. 13 in the journal Nature Communications. © 1996-2016 The Washington Post
By DONALD G. McNEIL Jr. and PAM BELLUCK Babies born to Zika-infected mothers are highly likely to have brain damage, even in the absence of obvious abnormalities like small heads, and the virus may go on replicating in their brains well after birth, according to three studies published Tuesday. Many types of brain damage were seen in the studies, including dead spots and empty spaces in the brain, cataracts and congenital deafness. There were, however, large differences among these studies in how likely it was that a child would be hurt by the infection. One study, published by The Journal of the American Medical Association, assessed 442 pregnancies registered with the Centers for Disease Control and Prevention between January and September in the continental United States and Hawaii, most of them in returning travelers. That report found that 6 percent had birth defects. None of those birth defects occurred in infants born to women infected in the second or third trimester. By contrast, in a study of 125 Zika-infected women in Rio de Janeiro done by Brazilian and American scientists and released by The New England Journal of Medicine, almost half of pregnancies had “adverse outcomes,” ranging from fetal deaths to serious brain damage. Of the 117 infants born alive, 42 percent had “grossly abnormal” brain scans or physical symptoms, the authors said. Other studies from Colombia, Brazil and French Polynesia have suggested that brain damage rates are between 1 and 13 percent. But each one uses different measurements of brain damage and different definitions of which mothers to include, so the question remains unanswered. © 2016 The New York Times Company
By James Gallagher Health and science reporter, BBC News website Detailed MRI scans should be offered to some women in pregnancy to help spot brain defects in the developing baby, say researchers. Ultrasounds are already used to look inside the womb and check that the baby is growing properly. However, the study on 570 women published in The Lancet showed doctors were able to make a much better diagnosis using MRI scans. Experts called for the scans to become routine practice. Pregnant women are offered an ultrasound scan at about 20 weeks that can spot abnormalities in the brain. They are detected in three in every 1,000 pregnancies. If the brain fails to develop properly it can result in miscarriage or stillbirth. Couples are generally offered counselling and some choose to have an abortion. More certainty The study, carried out across 16 centres in the UK, analysed the impact of using MRI scans - which use magnetic fields and radio waves to image the body - to confirm any diagnoses. Overall, it showed ultrasound gave the correct diagnosis 68% of the time. But combining that with MRI increased the accuracy to 93%. The extra tests were most useful in borderline cases where doctors were uncertain of the outcome. The number of pregnant women who were given an "unknown" diagnosis was more than halved by the extra scans, increasing confidence about whether the developing baby's brain was healthy. © 2016 BBC.
The important role vitamin D plays in early life is back in the spotlight after Australian researchers noticed a link between a deficiency during pregnancy and autism. The study found pregnant women with low vitamin D levels at 20 weeks’ gestation were more likely to have a child with autistic traits by the age of six. The finding has led to calls for the widespread use of vitamin D supplements during pregnancy, just as taking folate has reduced the incidence of spina bifida in the community. “This study provides further evidence that low vitamin D is associated with neurodevelopmental disorders,” said Professor John McGrath from the University of Queensland’s Brain Institute, who led the research alongside Dr Henning Tiemeier from the Erasmus Medical Centre in the Netherlands. McGrath said supplements might reduce the incidence of autism, a lifelong developmental condition that affects, among other things, how an individual relates to their environment and other people. “We would not recommend more sun exposure, because of the increased risk of skin cancer in countries like Australia,” he said. “Instead, it’s feasible that a safe, inexpensive, and publicly accessible vitamin D supplement in at-risk groups may reduce the prevalence of this risk factor.” Vitamin D usually comes from exposure to the sun, but it can also be found in some foods and supplements. While it’s widely known vitamin D is vital for maintaining healthy bones, there’s also a solid body of evidence linking it to brain growth. © 2016 Guardian News and Media Limited
By CATHERINE SAINT LOUIS As the opioid epidemic sweeps through rural America, an ever-greater number of drug-dependent newborns are straining hospital neonatal units and draining precious medical resources. The problem has grown more quickly than realized and shows no signs of abating, researchers reported on Monday. Their study, published in JAMA Pediatrics, concludes for the first time that the increase in drug-dependent newborns has been disproportionately larger in rural areas. The rising rates are due largely to widening use of opioids among pregnant women, the researchers found. From 2004 to 2013, the proportion of newborns born dependent on drugs increased nearly sevenfold in hospitals in rural counties, to 7.5 per 1,000 from 1.2 per 1,000. By contrast, the uptick among urban infants was nearly fourfold, to 4.8 per 1,000 from 1.4 per 1,000. “The problem is accelerating in rural areas to a greater degree than in urban areas,” said Dr. Veeral Tolia, a neonatologist who works at Baylor University Medical Center in Dallas and was not involved in the new report. Other recent studies have underscored the breadth of the problem. The hospital costs associated with treating addicted newborns rose to $1.5 billion in 2013, from $732 million in 2009, according to a study in the Journal of Perinatology. Some neonatal intensive care units, called NICUs, now devote 10 percent of their hours to caring for infants who have withdrawal symptoms. Hospitals in the eye of this storm are commonly underresourced, experts said. “Typically, rural hospitals that deliver babies have traditionally focused on the lower-risk population in areas they serve,” said Dr. Alison V. Holmes, an associate professor of pediatrics at Geisel School of Medicine at Dartmouth. © 2016 The New York Times Company
By Veronique Greenwood Baffling grammar, strange vowels, quirky idioms and so many new words—all of this makes learning a new language hard work. Luckily, researchers have discovered a number of helpful tricks, ranging from exposing your ears to a variety of native speakers to going to sleep soon after a practice session. A pair of recent papers suggests that even when you are not actively studying, what you hear can affect your learning and that sometimes listening without speaking works best. In one study, published in 2015 in the Journal of the Acoustical Society of America, linguists found that people who took breaks from learning new sounds performed just as well as those who took no breaks, as long as the sounds continued to play in the background. The researchers trained two groups of people to distinguish among trios of similar sounds—for instance, Hindi has “p,” “b” and a third sound English speakers mistake for “b.” One group practiced telling these apart one hour a day for two days. Another group alternated between 10 minutes of the task and 10 minutes of a “distractor” task that involved matching symbols on a worksheet while the sounds continued to play in the background. Remarkably, the group that switched between tasks improved just as much as the one that focused on the distinguishing task the entire time. “There's something about our brains that makes it possible to take advantage of the things you've already paid attention to and to keep paying attention to them,” even when you are focused on something else, suggests Melissa Baese-Berk, a linguist at the University of Oregon and a co-author of the study. In a 2016 study published in the Journal of Memory and Language, Baese-Berk and another colleague found that it is better to listen to new sounds silently rather than practice saying them yourself at the same time. 
Spanish speakers learning to distinguish among sounds in the Basque language performed more poorly when they were asked to repeat one of the sounds during training. The findings square with what many teachers have intuited—that a combination of focused practice and passive exposure to a language is the best approach. “You need to come to class and pay attention,” Baese-Berk says, “but when you go home, turn on the TV or turn on the radio in that language while you're cooking dinner, and even if you're not paying total attention to it, it's going to help you.” © 2016 Scientific American
By Meredith Wadman There have been few happy endings when it comes to spinal muscular atrophy (SMA), the most common genetic cause of death in childhood. The disease inexorably destroys the motor neurons of the spinal cord and brainstem that control movement, including swallowing and breathing. In its most severe form, SMA kills those afflicted at about age 2, most commonly by suffocating them. There are no Food and Drug Administration (FDA)–approved drugs for the disease. That is almost certainly about to change. An innovative drug that helps cells bypass the genetic flaw responsible for SMA may be approved as soon as this month, on the heels of strongly positive results from late-stage clinical trials. On 7 November, a trial of the drug, nusinersen, in wheelchair-bound children aged 2 to 12, was stopped on the grounds that it was unethical to deny the drug to children in the control arm, given the positive results in the treated children. In August, a similar trial in infants was stopped for the same reason, allowing the untreated infants in a control arm to begin receiving the drug. And today, a paper appearing in The Lancet provides compelling biological evidence that nusinersen is having its desired effect in the cells of the brain and spinal cord. “These [infant-onset] SMA kids are going to die. And not only are they now not dying, you are essentially on the path to a true cure of a degenerative [neurological] disease, which is unheard of,” says Jeffrey Rothstein, a neurologist at the Johns Hopkins School of Medicine in Baltimore, Maryland, who was not affiliated with the trials of the drug and is not connected with either of the two companies involved in its development: Ionis of Carlsbad, California, and Biogen of Cambridge, Massachusetts. © 2016 American Association for the Advancement of Science
Laura Sanders Flickering light kicks off brain waves that clean a protein related to Alzheimer’s disease out of mice’s brains, a new study shows. The results, described online December 7 in Nature, suggest a fundamentally new approach to counteracting Alzheimer’s. Many potential therapies involve drugs that target amyloid-beta, the sticky protein that accumulates in the brains of Alzheimer’s patients. In contrast, the new method used on mice causes certain nerve cells to fire at a specific rhythm, generating brain waves that researchers believe may clear A-beta. “This is a very creative and innovative new approach to targeting brain amyloid load in Alzheimer’s,” says geriatric psychiatrist Paul Rosenberg of Johns Hopkins Medicine. But he cautions that the mouse results are preliminary. Neuroscientist Li-Huei Tsai of MIT and colleagues saw that mice engineered to produce lots of A-beta don’t produce as many gamma waves in the hippocampus, a brain structure important for memory. Using a method called optogenetics, the researchers genetically designed certain nerve cells in the hippocampus to fire off signals in response to light. In this way, the researchers induced gamma waves — rhythmic firings 40 times per second. After just an hour of forced gamma waves, the mice had less A-beta in the hippocampus, the researchers found. Further experiments revealed that gamma waves packed a double whammy — they lowered A-beta by both reducing production and enhancing the brain’s ability to clear it. © Society for Science & the Public 2000 - 2016