Chapter 13. Memory and Learning


By Michael S. Rosenwald

In early February, Vishvaa Rajakumar, a 20-year-old Indian college student, won the Memory League World Championship, an online competition pitting people against one another with challenges like memorizing the order of 80 random numbers faster than most people can tie a shoelace. The renowned neuroscientist Eleanor Maguire, who died in January, studied mental athletes like Mr. Rajakumar and found that many of them used the ancient Roman “method of loci,” a memorization trick also known as the “memory palace.” The technique takes several forms, but it generally involves visualizing a large house and assigning memories to rooms. Mentally walking through the house fires up the hippocampus, the seahorse-shaped engine of memory deep in the brain that consumed Dr. Maguire’s career. We asked Mr. Rajakumar about his strategies of memorization. His answers, lightly edited and condensed for clarity, are below.

Q. How do you prepare for the Memory League World Championship?

A. Hydration is very important because it helps your brain. When you memorize things, you usually sub-vocalize, and it helps to have a clear throat. Let’s say you’re reading a book. You’re not reading it out loud, but you are vocalizing within yourself. If you don’t drink a lot of water, your speed will be a bit low. If you drink a lot of water, it will be more and more clear and you can read it faster.

Q. What does your memory palace look like?

A. Let’s say my first location is my room where I sleep. My second location is the kitchen. And the third location is my hall. The fourth location is my veranda. Another location is my bathroom. Let’s say I am memorizing a list of words. Let’s say 10 words. What I do is, I take a pair of words, make a story out of them and place them in a location. And I take the next two words. I make a story out of them. I place them in the second location. The memory palace will help you to remember the sequence.

© 2025 The New York Times Company
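
The pairing strategy he describes is mechanical enough to sketch in code. Below is a minimal Python illustration of chunking a word list into pairs and attaching each pair, as a "story," to the next stop on a fixed walk through the palace. The location names, word list, and story format are invented placeholders, not anything from the interview.

```python
# Minimal sketch of the strategy described above: pair up the words,
# then assign one pair (one imagined "story") to each location along
# a fixed walk through the palace. All names here are illustrative.

LOCATIONS = ["bedroom", "kitchen", "hall", "veranda", "bathroom"]

def build_palace(words, locations=LOCATIONS):
    """Map consecutive word pairs onto an ordered walk through the palace."""
    pairs = [words[i:i + 2] for i in range(0, len(words), 2)]
    if len(pairs) > len(locations):
        raise ValueError("not enough locations for this word list")
    # Recalling the walk through the locations recalls the word order.
    return {loc: " + ".join(pair) for loc, pair in zip(locations, pairs)}

words = ["anchor", "piano", "tiger", "candle", "river",
         "mirror", "rocket", "garlic", "violin", "cloud"]
for location, story in build_palace(words).items():
    print(f"{location}: {story}")
```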

Keyword: Learning & Memory; Attention
Link ID: 29673 - Posted: 02.15.2025

By Angie Voyles Askham

Identifying what a particular neuromodulator does in the brain—let alone how such molecules interact—has vexed researchers for decades. Dopamine agonists increase reward-seeking, whereas serotonin agonists decrease it, for example, suggesting that the two neuromodulators act in opposition. And yet, neurons in the brain’s limbic regions release both chemicals in response to a reward (and also to a punishment), albeit on different timescales, electrophysiological recordings have revealed, pointing to a complementary relationship.

This dual response suggests that the interplay between dopamine and serotonin may be important for learning. But no tools existed to simultaneously manipulate the neuromodulators and test their respective roles in a particular area of the brain—at least, not until now—says Robert Malenka, professor of psychiatry and behavioral sciences at Stanford University.

As it turns out, serotonin and dopamine join forces in the nucleus accumbens during reinforcement learning, according to a new study Malenka led, yet they act in opposition: dopamine as a gas pedal and serotonin as a brake on signaling that a stimulus is rewarding. The mice he and his colleagues studied learned faster and performed more reliably when the team optogenetically pressed on the animals’ dopamine “gas” as they simultaneously eased off the serotonin “brake.”

“It adds a very rich and beguiling picture of the interaction between dopamine and serotonin,” says Peter Dayan, director of computational neuroscience at the Max Planck Institute for Biological Cybernetics. In 2002, Dayan proposed a different framework for how dopamine and serotonin might work in opposition, but he was not involved in the new study. The new work “partially recapitulates” that 2002 proposal, Dayan adds, “but also poses many more questions.”

© 2025 Simons Foundation
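
The gas-pedal/brake framing maps naturally onto a standard value-learning update. The toy Python sketch below treats dopamine as a multiplicative gain on the reward signal and serotonin as a damping term. This is a cartoon of the article's framing only; the function, learning rate, and gain values are invented, and this is not the model used in the study.

```python
# Toy value-learning step with opposing neuromodulator gains: the
# effective reward is scaled up by a dopamine term ("gas") and damped
# by a serotonin term ("brake"). All parameters are invented.

def update_value(value, reward, dopamine=1.0, serotonin=1.0, lr=0.1):
    """One delta-rule step toward a neuromodulator-scaled reward signal."""
    effective = reward * dopamine / serotonin  # gas scales up, brake scales down
    return value + lr * (effective - value)

v_balanced = 0.0
v_boosted = 0.0
for _ in range(10):
    v_balanced = update_value(v_balanced, reward=1.0)
    # Press the dopamine "gas" while easing the serotonin "brake",
    # loosely mirroring the optogenetic manipulation described above
    # (the numbers are arbitrary):
    v_boosted = update_value(v_boosted, reward=1.0, dopamine=2.0, serotonin=0.5)

print(round(v_balanced, 3), round(v_boosted, 3))  # the boosted value climbs faster
```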

Keyword: Learning & Memory
Link ID: 29672 - Posted: 02.15.2025

By Michael S. Rosenwald

Eleanor Maguire, a cognitive neuroscientist whose research on the human hippocampus — especially the hippocampi of London taxi drivers — transformed the understanding of memory, revealing that a key structure in the brain can be strengthened like a muscle, died on Jan. 4 in London. She was 54.

Her death, at a hospice facility, was confirmed by Cathy Price, her colleague at the U.C.L. Queen Square Institute of Neurology. Dr. Maguire was diagnosed with spinal cancer in 2022 and had recently developed pneumonia.

Working for 30 years in a small, tight-knit lab, Dr. Maguire obsessed over the hippocampus — a seahorse-shaped engine of memory deep in the brain — like a meticulous, relentless detective trying to solve a cold case. An early pioneer of using functional magnetic resonance imaging (f.M.R.I.) on living subjects, Dr. Maguire was able to look inside human brains as they processed information. Her studies revealed that the hippocampus can grow, and that memory is not a replay of the past but rather an active reconstructive process that shapes how people imagine the future.

“She was absolutely one of the leading researchers of her generation in the world on memory,” Chris Frith, an emeritus professor of neuropsychology at University College London, said in an interview. “She changed our understanding of memory, and I think she also gave us important new ways of studying it.”

In 1995, while she was a postdoctoral fellow in Dr. Frith’s lab, she was watching television one evening when she stumbled on “The Knowledge,” a quirky film about prospective London taxi drivers memorizing the city’s 25,000 streets to prepare for a three-year-long series of licensing tests. Dr. Maguire, who said she rarely drove because she feared never arriving at her destination, was mesmerized. “I am absolutely appalling at finding my way around,” she once told The Daily Telegraph. “I wondered, ‘How are some people so bloody good and I am so terrible?’”

In the first of a series of studies, Dr. Maguire and her colleagues scanned the brains of taxi drivers while quizzing them about the shortest routes between various destinations in London.

© 2025 The New York Times Company

Keyword: Learning & Memory
Link ID: 29671 - Posted: 02.15.2025

By Sara Reardon

A man who seemed genetically destined to develop Alzheimer’s disease while still young has reached his mid-70s without any cognitive decline — in only the third recorded case of such resistance to the disease. The findings, published today in Nature Medicine, raise questions about the role of the proteins that ravage the brain during the disease and the drugs that target them.

Since 2011, a study called the Dominantly Inherited Alzheimer Network (DIAN) has been following a family in which many members have a mutation in a gene called PSEN2. The mutation causes the brain to produce versions of the amyloid protein that are prone to clumping into the sticky plaques thought to drive neurodegeneration. Family members with the mutation invariably develop Alzheimer’s at around age 50.

Then, a 61-year-old man from this family showed up at the DIAN study’s clinic with full cognitive function, and the researchers were shocked to discover that he had the fateful PSEN2 mutation. The man’s mother had had the same mutation, as had 11 of her 13 siblings; all had developed dementia around age 50. The researchers were even more shocked when scans revealed that his brain looked like that of someone with Alzheimer’s. “His brain was full of amyloid,” says behavioural neurologist and study co-author Jorge Llibre-Guerra at Washington University in St. Louis, Missouri.

What the man’s brain didn’t contain, however, were clusters of tau — another protein that forms tangled threads inside neurons. Positron emission tomography (PET) scans revealed that he had a small amount of abnormal tau and that it was only in the occipital lobe, a brain region involved in visual perception that is not usually affected in Alzheimer’s disease.

© 2025 Springer Nature Limited

Keyword: Alzheimers; Genes & Behavior
Link ID: 29667 - Posted: 02.12.2025

By Felicity Nelson

Mice immediately bolt for shelter when they see the looming shadow of a bird, just as humans jump when they see a spider. But these instinctive reactions, which are controlled by the brainstem, can be suppressed if animals learn that a scary stimulus is harmless. In Science today, neuroscientists reveal the precise regions of the brain that suppress fear responses in mice — a finding that might help scientists to develop strategies for treating post-traumatic stress disorder and anxiety in people.

The study showed that two parts of the brain work together to learn to suppress fear. But, surprisingly, only one of these regions is involved in later recalling the learnt behaviour. “This is the first evidence of that mechanism,” says neuroscientist Pascal Carrive at the University of New South Wales in Sydney, Australia.

In the study, an expanding dark circle was used to imitate a swooping bird, and caused naive mice to run to a shelter. To teach the mice that this looming stimulus was not dangerous, a barrier was added to prevent the animals from hiding. “I like their behavioural model,” says Christina Perry, a behavioural neuroscientist at Macquarie University in Sydney. “It’s very simple,” she adds. The mice “don’t get eaten, so they learn that this fake predator is not, in fact, a threat”.

As the mice were learning to be bolder, the researchers switched specific types of neurons on or off using optogenetics — a well-established technique that allows neurons to be controlled with light. When researchers silenced the parts of the cerebral cortex that analyse visual stimuli (called the posterolateral higher visual areas), the mice did not learn to suppress fear and continued to try to escape from the fake bird — suggesting that this area of the brain is necessary for learning to suppress this fear reaction.

© 2025 Springer Nature Limited

Keyword: Emotions; Stress
Link ID: 29664 - Posted: 02.08.2025

By Laura Hercher, edited by Gary Stix

It is impossible, of course, to identify the precise moment we first suspected the changes in my mother were something other than normal aging. In my own imperfect memory, what rises up is the first morning of a weeklong trip to Rome, when my mother woke up at 2 A.M., got dressed and went down for breakfast. A hotel employee found her wandering from room to room, looking for toast and coffee. She was jet-lagged, my brother and I told each other uneasily. It could happen to anyone. But weren’t there cues? Didn’t she notice the darkened lobby, the stillness, the clock?

If we had known then, would it have helped? To date, no U.S. Food and Drug Administration–approved therapy exists for asymptomatic people at risk of Alzheimer’s disease (AD). My mother was not a smoker, drank in moderation, read books, took classes, and spent that week in Italy soaking up everything the tour guide told her about Caravaggio and Bernini like she was prepping for a quiz. Five years passed after that trip before my mother received a diagnosis of dementia.

Today, a simple blood test can detect changes in the brain that predict AD up to 15 years before the first symptoms emerge. For researchers, tools for early detection give a peek at the full spectrum of AD, pinpointing early seeds of pathology deep inside the brain. Cognitive decline—what we typically think of as the disease itself—is merely the illness’s denouement. “Dementia is a result. Dementia is a symptom,” explains Clifford R. Jack, Jr., a neuroradiologist at the Mayo Clinic in Rochester, Minn., and chair of the Alzheimer’s Association (AA) working group responsible for recent, controversial guidelines for the diagnosis of AD based on underlying biology, not clinical presentation.

Keyword: Alzheimers
Link ID: 29657 - Posted: 02.05.2025

By Catherine Offord

In scientists’ search to understand the causes of autism, a spotlight has fallen on maternal health during pregnancy. Based partly on association studies, researchers have proposed that conditions including obesity and depression during pregnancy could lead to autism in a child by affecting fetal neurodevelopment. But a study of more than 1 million Danish children and their families, published today in Nature Medicine, pushes back against this view. Researchers analyzed more than 200 health conditions that occurred in these children’s mothers before or during pregnancy. They conclude that many of the supposed links to a child’s autism diagnosis may not be causal, and instead reflect inherited genetic variants or environmental factors shared within families.

“It’s a very comprehensive and well-done study,” says Håkan Karlsson, a neuroscientist at the Karolinska Institute who was not involved in the work. It suggests “conditions [pregnant people] suffered from during pregnancy are probably not the cause of autism in their kid.”

The findings dovetail with a growing view in the field that shared genetics could explain a lot of the apparent connections between maternal health and autism, adds Drexel University epidemiologist Brian Lee. However, he and others caution the study doesn’t rule out that some conditions during pregnancy could have a causative role, nor does it identify factors that do influence the likelihood of autism.

Previous research has linked conditions such as maternal obesity, psychiatric disorders, and pregnancy or birth complications to an increased likelihood of autism diagnoses in children. Such findings can lead some pregnant people to feel that “if they get this or that condition, their [child’s] chance of autism may increase,” says Magdalena Janecka, an epidemiologist at New York University’s Grossman School of Medicine and a co-author on the new paper.

© 2025 American Association for the Advancement of Science.

Keyword: Autism
Link ID: 29652 - Posted: 02.01.2025

By Katharine Gammon

Today more than 55 million people around the world have Alzheimer’s disease and other dementias, which ravage the minds of those who suffer from them and have devastating impacts on their family members. In spite of decades of research, the precise origins of these diseases continue to elude scientists, though numerous factors have been found to be associated with higher risk, including genetics and various lifestyle and environmental factors.

The quest has recently taken a turn to a newer model for studying the brain: brain organoids. These three-dimensional clumps of neuronal tissue derived from human stem cells have been used to study everything from epilepsy to the origins of consciousness. And now, researchers in Massachusetts are slamming them with miniature metal pistons to test out whether they can lend credence to a controversial hypothesis: that concussions might reactivate a common virus in the brain, increasing dementia risk.

A decade of research suggests traumatic brain injury, whether from accidents or high-contact sports, is a standout risk factor for Alzheimer’s and other forms of neurodegenerative decline. Some estimates suggest that up to 10 percent of cases could be attributed to at least one prior head injury, but why is not fully understood. Separately, a growing body of research proposes that viral infection, including a common virus known as herpes simplex one, can also increase susceptibility to these diseases. But all three things—head trauma, viral infection, and dementia—have not been directly connected in experimental research, until now.

One of the challenges in getting to the roots of dementia is that humans lead complex, messy lives. In the soup of risk factors—from high blood pressure to loneliness to genetic inheritance—it can be hard to filter out the most impactful forces that have contributed to the onset of any one dementia case. There are no ethical ways to test these questions on humans, of course, while using lab animals presents its own ethical and cost challenges. Animals are never a perfect match for humans anyway, and dementia-related findings in animals have so far not translated well to human patients.

© 2025 NautilusNext Inc.

Keyword: Alzheimers; Brain Injury/Concussion
Link ID: 29646 - Posted: 01.29.2025

By Yasemin Saplakoglu

Imagine you’re on a first date, sipping a martini at a bar. You eat an olive and patiently listen to your date tell you about his job at a bank. Your brain is processing this scene, in part, by breaking it down into concepts. Bar. Date. Martini. Olive. Bank.

Deep in your brain, neurons known as concept cells are firing. You might have concept cells that fire for martinis but not for olives. Or ones that fire for bars — perhaps even that specific bar, if you’ve been there before. The idea of a “bank” also has its own set of concept cells, maybe millions of them. And there, in that dimly lit bar, you’re starting to form concept cells for your date, whether you like him or not. Those cells will fire when something reminds you of him.

Concept neurons fire for their concept no matter how it is presented: in real life or a photo, in text or speech, on television or in a podcast. “It’s more abstract, really different from what you’re seeing,” said Elizabeth Buffalo, a neuroscientist at the University of Washington.

For decades, neuroscientists mocked the idea that the brain could have such intense selectivity, down to the level of an individual neuron: How could there be one or more neurons for each of the seemingly countless concepts we engage with over a lifetime? “It’s inefficient. It’s not economic,” people broadly agreed, according to the neurobiologist Florian Mormann at the University of Bonn. But when researchers identified concept cells in the early 2000s, the laughter started to fade. Over the past 20 years, they have established that concept cells not only exist but are critical to the way the brain abstracts and stores information. New studies, including one recently published in Nature Communications, have suggested that they may be central to how we form and retrieve memory.

© 2025 Simons Foundation

Keyword: Learning & Memory; Attention
Link ID: 29639 - Posted: 01.22.2025

By Holly Barker

Previously unrecognized genetic changes on the X chromosome of autistic people could explain the higher prevalence of the condition among men and boys than among women and girls, according to two new studies. About 60 variants are more common in people with autism than in those without the condition, an analysis of roughly 15,000 X chromosomes revealed. Several of the variants are in Xp22.11, a region of the X chromosome linked to autism in boys and men. In the second study, the team pinpointed 27 autism-linked variants in DDX53, one of the genes in the vulnerable region that had not been tied to the condition in past research.

Those findings could help explain why autism is diagnosed three to four times more often in boys than girls, according to the study investigators, led by Stephen Scherer, chief of research at SickKids Research Institute. Although that disparity is likely influenced by social factors—male-only studies could lead to autism being less recognizable in women and girls, and girls may be conditioned to mask their autism traits—there is also a clear biological component. The X chromosome plays an outsized role in brain development, and many genes on the chromosome are strongly linked to autism, previous studies have found.

Still, the sex chromosomes have been mostly ignored in genetic searches of autism variants, says Aaron Besterman, associate clinical professor of psychiatry at the University of California, San Diego, who was not involved in the work. “It’s been a dirty little secret that for a long time the X chromosome has not been well interrogated from a genetics perspective,” he says.

Sex chromosomes are often sidelined because of difficulties interpreting data, given that men possess half as many X-linked genes as women. What’s more, random inactivation of X chromosomes makes it hard to tell how a single variant is expressed in female tissues. And the existence of pseudoautosomal regions—stretches of DNA that behave like regular chromosomes and escape inactivation—complicates matters further.

© 2025 Simons Foundation

Keyword: Autism; Sexual Behavior
Link ID: 29638 - Posted: 01.22.2025

By Rachael Elward and Lauren Ford

Severance, which imagines a world where a person’s work and personal lives are surgically separated, will soon return to Apple TV+ for a second season. While the concept of this gripping piece of science fiction is far-fetched, it touches on some interesting neuroscience. Can a person’s mind really be surgically split in two?

Remarkably, “split-brain” patients have existed since the 1940s. To control epilepsy symptoms, these patients underwent a surgery to separate the left and right hemispheres. Similar surgeries still happen today. Later research on this type of surgery showed that the separated hemispheres of split-brain patients could process information independently. This raises the uncomfortable possibility that the procedure creates two separate minds living in one brain.

In season one of Severance, Helly R (Britt Lower) experienced a conflict between her “innie” (the side of her mind that remembered her work life) and her “outie” (the side outside of work). Similarly, there is evidence of a conflict between the two hemispheres of real split-brain patients.

When speaking with split-brain patients, you are usually communicating with the left hemisphere of the brain, which controls speech. However, some patients can communicate from their right hemisphere by writing, for example, or arranging Scrabble letters. A young patient was asked what job he would like in the future. His left hemisphere chose an office job making technical drawings. His right hemisphere, however, arranged letters to spell “automobile racer”. Split-brain patients have also reported “alien hand syndrome”, where one of their hands is perceived to be moving of its own volition. These observations suggest that two separate conscious “people” may coexist in one brain and may have conflicting goals.

In Severance, however, both the innie and the outie have access to speech. This is one indicator that the fictional “severance procedure” must involve a more complex separation of the brain’s networks.

© 2010–2025, The Conversation US, Inc.

Keyword: Learning & Memory; Consciousness
Link ID: 29635 - Posted: 01.18.2025

By Phie Jacobs

For more than 30 years, scientists have known the genetic culprit behind Huntington disease, a devastating neurodegenerative disorder that causes cells deep in the brain to sicken and die. But they couldn’t account for why people who inherit the faulty gene variant take so long to develop symptoms, or why disease progression varies so widely from person to person. A study published today in Cell helps explain: In the brain cells that die off in Huntington, a repetitive stretch of a gene’s DNA gets longer and longer over a person’s life, and this accelerating expansion turns deadly to the cell—and ultimately to the person.

The findings represent “a really remarkable insight,” says Leslie Thompson, a neuroscientist at the University of California, Irvine who wasn’t involved in the new research. “This study and some others are changing the way that we’re thinking about the disease.”

People who develop Huntington inherit a flawed version of the HTT gene, which produces a protein called huntingtin. This gene contains an unusual stretch of DNA, where a sequence of three of its nucleotide bases—cytosine, adenine, and guanine, or CAG in genetic parlance—is repeated multiple times in a row. And although most people inherit versions of HTT with about 15 to 30 consecutive CAG repeats and never develop Huntington, those with 40 or more in the gene almost always have symptoms later in life, including psychological and cognitive problems and uncontrolled, jerking movements known as chorea. The genetic stutter produces an abnormally large, unstable version of the huntingtin protein, which forms clumps inside brain cells. The condition usually leads to early death, often from issues related to difficulty swallowing, injuries from falls, or suicide. The longer a person’s stretch of repeats, the earlier the disorder rears its head.

Scientists originally thought the number of CAG repeats only increased as the HTT gene was passed down through generations; a child of a parent with Huntington might themselves develop the condition at an earlier age. But it turns out the length of this genetic “stutter” can change over a person’s life in at least some of their cells. A 2003 study analyzed brain samples donated by people who had died of Huntington and found shockingly large CAG expansions in a part of the brain known as the striatum.
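
The repeat arithmetic here is concrete enough for a short worked example. The Python sketch below counts the longest uninterrupted run of CAG triplets in a sequence and applies the rough thresholds reported above (about 15 to 30 repeats typical, 40 or more almost always symptomatic); the helper names and the "between the ranges" label are illustrative, not standard genetics tooling.

```python
# Count the longest consecutive CAG tract and apply the article's
# rough thresholds. Thresholds are as reported; labels are illustrative.

import re

def longest_cag_run(seq):
    """Number of repeats in the longest uninterrupted CAG tract."""
    runs = re.findall(r"(?:CAG)+", seq.upper())
    return max((len(run) // 3 for run in runs), default=0)

def classify(n_repeats):
    if n_repeats >= 40:
        return "40 or more: symptoms almost always develop"
    if n_repeats <= 30:
        return "typical range (about 15 to 30)"
    return "between the reported ranges"

seq = "TT" + "CAG" * 42 + "GGA"  # a made-up sequence with 42 repeats
n = longest_cag_run(seq)
print(n, "->", classify(n))  # 42 -> 40 or more: symptoms almost always develop
```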

Keyword: Huntingtons; Genes & Behavior
Link ID: 29634 - Posted: 01.18.2025

By Anna Victoria Molofsky

Twenty years ago, a remarkable discovery upended our understanding of the range of elements that can shape neuronal function: A team in Europe demonstrated that enzymatic digestion of the extracellular matrix (ECM)—a latticework of proteins that surrounds all brain cells—could restore plasticity to the visual cortex even after the region’s “critical period” had ended. Other studies followed, showing that ECM digestion could also alter learning in the hippocampus and other brain circuits. These observations established that proteins outside neurons can control synaptic plasticity.

We now know that up to 20 percent of the brain is extracellular space, filled with hundreds of ECM proteins—a “matrisome” that plays multiple roles, including modulating synaptic function and myelin formation. ECM genes in the human brain are different from those in other species, suggesting that the proteins they encode could be part of what makes our brains unique and keeps them healthy. In a large population study that examined blood protein biomarkers of organ aging, posted as a preprint on bioRxiv last year, for example, the presence of ECM proteins was most highly correlated with a youthful brain. Matrisome proteins are also dysregulated in astrocytes from people at high risk for Alzheimer’s disease, another study showed.

Despite the influence of these proteins and the ongoing work of a few dedicated researchers, however, the ECM field has not caught on. I would challenge a room full of neuroscientists to name one protein in the extracellular matrix. To this day, the only ECM components most neuroscientists have heard of are “perineuronal nets”—structures that play an important role in stabilizing synapses but make up just a tiny fraction of the matrisome. A respectable scientific journal, covering its own paper that identified a critical impact of ECM, called it “brain goop.”

© 2025 Simons Foundation

Keyword: Learning & Memory; Glia
Link ID: 29633 - Posted: 01.18.2025

By Meghan Rosen

Baby Boomers may drive a bigger-than-expected boom in dementia cases. By 2060, 1 million U.S. adults per year will develop dementia, scientists predict January 13 in Nature Medicine.

Dementia is a broad term encompassing many symptoms, including memory, reasoning and language difficulties that interfere with people’s daily lives. Researchers estimate that it currently affects more than 6 million people in the United States. “This is a huge problem,” says Josef Coresh, an epidemiologist at New York University’s Grossman School of Medicine. A rise in the projected number of dementia cases is not surprising, given the aging U.S. population ­­— but the extent of the rise stands out, he says. His team predicts that 42 percent of people in the United States who are over 55 years old will develop dementia sometime during their lifetime. That’s about double the percentage estimated by previous researchers.

Coresh’s new estimate is based on a study population that’s larger — more than 15,000 people — and more diverse than earlier work. His team followed participants for years, in some cases decades, using several methods to identify dementia cases. They pored over hospital and death records, evaluated participants in person and screened them by phone. For the last decade, the researchers have been calling participants twice a year, Coresh says. That gave the team a window into people’s lives, revealing dementia cases that might otherwise have gone unreported.

Though the team focused on dementia in people over age 55, risk doesn’t typically start ticking up for decades. And some populations were at greater risk than others, including women, Black people and those with a particular gene variant linked to Alzheimer’s disease.

© Society for Science & the Public 2000–2025.

Keyword: Alzheimers
Link ID: 29627 - Posted: 01.15.2025

By Roni Caryn Rabin

Water fluoridation is widely seen as one of the great public health achievements of the 20th century, credited with substantially reducing tooth decay. But there has been growing controversy among scientists about whether fluoride may be linked to lower I.Q. scores in children. A comprehensive federal analysis of scores of previous studies, published this week in JAMA Pediatrics, has added to those concerns. It found a significant inverse relationship between exposure levels and cognitive function in children. Higher fluoride exposures were linked to lower I.Q. scores, concluded researchers working for the National Institute of Environmental Health Sciences.

None of the studies included in the analysis were conducted in the United States, where recommended fluoridation levels in drinking water are very low. At those amounts, evidence was too limited to draw definitive conclusions. Observational studies cannot prove a cause-and-effect relationship. Yet in countries with much higher levels of fluoridation, the analysis also found evidence of what scientists call a dose-response relationship, with I.Q. scores falling in lock step with increasing fluoride exposure.

Children are exposed to fluoride through many sources other than drinking water: toothpaste, dental treatments and some mouthwashes, as well as black tea, coffee and certain foods, such as shrimp and raisins. Some drugs and industrial emissions also contain fluoride.

For every one part per million increase in fluoride in urinary samples, which reflect total exposures from water and other sources, I.Q. points in children decreased by 1.63, the analysis found. “There is concern that pregnant women and children are getting fluoride from many sources,” said Kyla Taylor, an epidemiologist at the institute and the report’s lead author, “and that their total fluoride exposure is too high and may affect fetal, infant and child neurodevelopment.”

© 2025 The New York Times Company
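
The reported dose-response figure can be made concrete with one line of arithmetic. The Python sketch below treats the association as a simple linear effect, 1.63 I.Q. points per 1 part-per-million rise in urinary fluoride; the helper name is invented, and treating the association as linear and causal is an assumption of this sketch, not a claim of the analysis.

```python
# Back-of-envelope use of the reported association: 1.63 I.Q. points
# per 1 ppm increase in urinary fluoride. Linearity (and causality)
# are simplifying assumptions made here for illustration only.

IQ_POINTS_PER_PPM = 1.63  # association reported by the JAMA Pediatrics analysis

def estimated_iq_decrement(delta_urinary_fluoride_ppm):
    """I.Q.-point decrease implied by a rise in urinary fluoride, if linear."""
    return IQ_POINTS_PER_PPM * delta_urinary_fluoride_ppm

print(estimated_iq_decrement(1.0))  # 1.63, as reported
print(estimated_iq_decrement(0.5))  # 0.815 for a 0.5 ppm increase
```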

Keyword: Intelligence; Development of the Brain
Link ID: 29625 - Posted: 01.11.2025

By Laura Sanders

Recovery from PTSD comes with key changes in the brain’s memory system, a new study finds. These differences were found in the brains of 19 people who developed post-traumatic stress disorder after the 2015 terrorist attacks in Paris — and then recovered over the following years. The results, published January 8 in Science Advances, point to the complexity of PTSD, but also to ways that brains can reshape themselves as they recover.

With memory tasks and brain scans, the study provides a cohesive look at the recovering brain, says cognitive neuroscientist Vishnu Murty of the University of Oregon in Eugene. “It’s pulled together a lot of pieces that were floating around in the field.”

On the night of November 13, 2015, terrorists attacked a crowded stadium, a theater and restaurants in Paris. In the years after, PTSD researchers were able to study some of the people who endured that trauma. Just over half the 100 people who volunteered for the study had PTSD initially. Of those, 34 still had the disorder two to three years later; 19 had recovered by then.

People who developed PTSD showed differences in how their brains handled intrusive memories, laboratory-based tests of memory revealed. Participants learned pairs of random words and pictures — a box of tissues with the word “work,” for example. PTSD involves pairs of associated stimuli too, though in much more complicated ways. A certain smell or sound, for instance, can be linked with the memory of trauma.

© Society for Science & the Public 2000–2025.

Keyword: Learning & Memory; Stress
Link ID: 29622 - Posted: 01.11.2025

By Angie Voyles Askham

Old age is the best predictor of Alzheimer’s disease, Parkinson’s disease and many other neurodegenerative conditions. And yet, as deeply studied as those conditions are, the process of healthy brain aging is not well understood. Without that knowledge, “how can we possibly fix something that goes wrong because of it?” asks Courtney Glavis-Bloom, senior staff scientist at the Salk Institute for Biological Studies. “We don’t have the basics. It’s like running before we walk.”

That said, mounting evidence suggests that aging takes a particular toll on non-neuronal and white-matter cells in mice. For example, white-matter cells display more differentially expressed genes in aged mice than in younger ones, according to a 2023 single-cell analysis of the frontal cortex and striatum. And glia present in white matter show accelerated aging when compared with cells in the cortex across 15 different brain regions, another 2023 mouse study revealed. “Different brain regions show totally different trajectories regarding aging,” says Andreas Keller, head of the Department of Clinical Bioinformatics at the Helmholtz Institute for Pharmaceutical Research Saarland, who worked on the latter study.

Some of the cell types with the most extensive aging-related changes in gene expression occur in a small region of the hypothalamus, according to a new single-cell mouse atlas, the largest and broadest to date. Rare neuronal and non-neuronal cell populations within this “hot spot” are particularly vulnerable to the aging process, says Hongkui Zeng, executive vice president and director of the Allen Institute for Brain Science, who led the work. “This demonstrates the power of using the cell-type-specific approach that will identify highly susceptible, rare populations of interest in the brain,” she says.

© 2025 Simons Foundation

Keyword: Alzheimers
Link ID: 29620 - Posted: 01.08.2025

By Kat Lay, Global health correspondent

Pills that prevent Alzheimer’s disease or blunt its effects are on the horizon, as the fight against dementia enters a “new era”, experts have said. Scientific advances were on the cusp of producing medicines that could be used even in the most remote and under-resourced parts of the world, thereby “democratising” care, said Jeff Cummings, professor of brain science and health at the University of Nevada. An estimated 50 million people live with dementia globally, more than two-thirds of them in low- and middle-income countries.

In 2024, the first drugs that can change the course of Alzheimer’s disease entered the market. Eisai and Biogen’s lecanemab and Eli Lilly’s donanemab were approved by medicine watchdogs in many western countries, including the UK and US. “I’m just so excited about this,” said Cummings. “We are truly in a new era. We have opened the door to understanding and manipulating the biology of Alzheimer’s disease for the benefit of our patients.”

Cummings conceded that high prices, complicated administration techniques and requirements for advanced technology to monitor patients meant that those newly approved drugs were “not going to be made widely available in the world”. Neither is yet available on the NHS in the UK because of the high cost – about £20,000 to £25,000 a year for each patient. They require additional tests and scans that would probably double that figure. But Cummings said they offered evidence of how to target dementia and “this learning is going to open the door to new therapies of many types, and those drugs can be exported around the world”. There are currently 127 drugs in trials for Alzheimer’s disease.

© 2025 Guardian News & Media Limited

Keyword: Alzheimers
Link ID: 29619 - Posted: 01.08.2025

By Joshua Cohen

For decades, scientists have been trying to develop therapeutics for people living with Alzheimer’s disease, a progressive neurodegenerative disease that is characterized by cognitive decline. Given the global rise in cases, the stakes are high. A study published in The Lancet Public Health reports that the number of adults living with dementia worldwide is expected to nearly triple, to 153 million in 2050. Alzheimer’s disease is a dominant form of dementia, representing 60 to 70 percent of cases.

Recent approvals by the Food and Drug Administration have focused on medications that shrink the sticky brain deposits of a protein called amyloid beta. The errant growth of this protein is responsible for triggering an increase in tangled threads of another protein called tau and the development of Alzheimer’s disease — at least according to the dominant amyloid cascade hypothesis, which was first proposed in 1991. Over the past few years, however, data and drugs associated with the hypothesis have been mired in various controversies relating to data integrity, regulatory approval, and drug safety.

Nevertheless, the hypothesis still dominates research and drug development. According to Science, in fiscal year 2021 to 2022, the National Institutes of Health spent some $1.6 billion on projects that mention amyloids, about 50 percent of the agency’s overall Alzheimer’s funding. And a close look at the data for recently approved drugs suggests the hypothesis is not wrong, so much as incomplete.

A few years ago, Matthew Schrag, a neurologist at Vanderbilt University, discovered possible image tampering in papers that supported the hypothesis, including in an influential 2006 Nature study that was eventually retracted. At roughly the same time, the FDA had been greenlighting medications that target amyloid beta.

Keyword: Alzheimers
Link ID: 29618 - Posted: 01.08.2025

By McKenzie Prillaman

A peek into living tissue from human hippocampi, a brain region crucial for memory and learning, revealed relatively few cell-to-cell connections for the vast number of nerve cells. But signals sent via those sparse connections proved extremely reliable and precise, researchers report December 11 in Cell.

One seahorse-shaped hippocampus sits deep within each hemisphere of the mammalian brain. In each hippocampus’s CA3 area, humans have about 1.7 million nerve cells called pyramidal cells. This subregion is thought to be the most internally connected part of the brain in mammals. But much information about nerve cells in this structure has come from studies in mice, which have only 110,000 pyramidal cells in each CA3 subregion.

Previously discovered differences between mouse and human hippocampi hinted that animals with more nerve cells may have fewer connections — or synapses — between them, says cellular neuroscientist Peter Jonas of the Institute of Science and Technology Austria in Klosterneuburg. To see if this held true, he and his colleagues examined tissue taken with consent from eight patients who underwent brain surgery to treat epilepsy.

Recording electrical activity from human pyramidal cells in the CA3 area suggested that about 10 synapses existed for every 800 cell pairs tested. In mice, that concentration roughly tripled. Despite the relatively scant nerve cell connections in humans, those cells showed steady and robust activity when sending signals to one another — unlike mouse pyramidal cells.

© Society for Science & the Public 2000–2025
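
The connectivity numbers reported here reduce to simple proportions, worked out in the short Python snippet below. The human figure comes straight from the text (10 connections per 800 tested pairs); the mouse figure is derived from the "roughly tripled" phrasing rather than a number given directly in the article.

```python
# Back-of-envelope version of the reported CA3 connectivity figures.

human_connections = 10   # synaptic connections observed...
human_pairs = 800        # ...per cell pairs tested, per the article

human_p = human_connections / human_pairs  # 0.0125, i.e. ~1.25% of pairs
mouse_p = 3 * human_p                      # "roughly tripled" -> ~3.75%

print(f"human CA3: {human_p:.2%} of tested pairs connected")
print(f"mouse CA3: {mouse_p:.2%} (inferred from 'roughly tripled')")
```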

Keyword: Learning & Memory
Link ID: 29616 - Posted: 01.08.2025