Chapter 13. Memory and Learning



By Veronique Greenwood

In the dappled sunlit waters of Caribbean mangrove forests, tiny box jellyfish bob in and out of the shade. Box jellies are distinguished from true jellyfish in part by their complex visual system — the grape-size predators have 24 eyes. But like other jellyfish, they are brainless, controlling their cube-shaped bodies with a distributed network of neurons. That network, it turns out, is more sophisticated than you might assume. On Friday, researchers published a report in the journal Current Biology indicating that the box jellyfish species Tripedalia cystophora has the ability to learn. Because box jellyfish diverged from our part of the animal kingdom long ago, understanding their cognitive abilities could help scientists trace the evolution of learning.

The tricky part about studying learning in box jellies was finding an everyday behavior that scientists could train the creatures to perform in the lab. Anders Garm, a biologist at the University of Copenhagen and an author of the new paper, said his team decided to focus on a swift about-face that box jellies execute when they are about to hit a mangrove root. These roots rise through the water like black towers, while the water around them appears pale by comparison. But the contrast between the two can change from day to day, as silt clouds the water and makes it more difficult to tell how far away a root is. How do box jellies tell when they are getting too close? “The hypothesis was, they need to learn this,” Dr. Garm said. “When they come back to these habitats, they have to learn, how is today’s water quality? How is the contrast changing today?”

In the lab, researchers produced images of alternating dark and light stripes, representing the mangrove roots and water, and used them to line the insides of buckets about six inches wide. When the stripes were a stark black and white, representing optimum water clarity, box jellies never got close to the bucket walls. With less contrast between the stripes, however, box jellies immediately began to run into them. This was the scientists’ chance to see whether they would learn. © 2023 The New York Times Company

Keyword: Learning & Memory; Evolution
Link ID: 28925 - Posted: 09.23.2023

COMIC: When, why and how did neurons first evolve? Scientists are piecing together the ancient story. By Tim Vernimmen Illustrated by Maki Naro 09.14.2023 © 2023 Annual Reviews

Keyword: Evolution; Development of the Brain
Link ID: 28920 - Posted: 09.21.2023

By Janet Lee

Doing puzzles, playing memory-boosting games, taking classes and reading are activities that we often turn to for help keeping our brains sharp. But research is showing that what you eat, how often you exercise and the type of exercise you do can help lower your risk of dementia to a greater extent than previously thought. Although more studies are needed, “there’s a lot of data that suggests exercise and diet are good for the brain and can prevent or help slow down” cognitive changes, says Jeffrey Burns, co-director of the University of Kansas Alzheimer’s Disease Research Center in Fairway. And living a healthy lifestyle can produce brain benefits no matter what your age.

The big diet picture: If you’re already eating in a way that protects your heart — plenty of whole grains, vegetables and fruit, and little saturated fat, sodium and ultra-processed “junk” foods — there’s good news: You’re also protecting your brain. A healthy cardiovascular system keeps blood vessels open, allowing good blood flow to the brain and reducing the risk of high blood pressure, stroke and dementia.

Research suggests that two specific dietary approaches — the Mediterranean diet and the MIND diet (the Mediterranean-DASH Intervention for Neurodegenerative Delay, essentially a combination of two heart-healthy eating plans) — may help stave off cognitive decline. Both diets rely on eating mostly plant foods (fruits, vegetables, whole grains, beans, nuts), olive oil, fish and poultry. The main difference between the two is that the MIND diet emphasizes specific fruits and vegetables, such as berries and leafy greens. Studies show that people who most closely follow either diet have a reduced risk of dementia compared with those who don’t. For example, people eating the Mediterranean way had a 23 percent lower risk of dementia in a nine-year study of more than 60,000 men and women published this year in BMC Medicine.

Keyword: Alzheimers
Link ID: 28915 - Posted: 09.21.2023

By Jim Davies

Think of what you want to eat for dinner this weekend. What popped into mind? Pizza? Sushi? Clam chowder? Why did those foods (or whatever foods you imagined) appear in your consciousness and not something else? Psychologists have long held that when we are making a decision about a particular category of thing, we tend to bring to mind items that are typical or common in our culture or everyday lives, or ones we value the most. On this view, whatever foods you conjured up are likely ones that you eat often, or love to eat. Sounds intuitive. But a recent paper published in Cognition suggests it’s more complicated than that.

Tracey Mills, a research assistant working at MIT, led the study along with Jonathan Phillips, a cognitive scientist and philosopher at Dartmouth College. They put over 2,000 subjects, recruited online, through a series of seven experiments that allowed them to test a novel approach for understanding which ideas within a category will pop into our consciousness—and which won’t. In this case, they had subjects think about zoo animals, holidays, jobs, kitchen appliances, chain restaurants, sports, and vegetables. What they found is that what makes a particular thing come to mind—such as a lion when one is considering zoo animals—is determined not by how valuable or familiar it is, but by where it lies in a multidimensional idea grid that could be said to resemble a kind of word cloud.

“Under the hypothesis we argue for,” Mills and Phillips write, “the process of calling members of a category to mind might be modeled as a search through feature space, weighted toward certain features that are relevant for that category.” Historical “value” just happens to be one dimension that is particularly relevant when one is talking about dinner, but is less relevant for categories such as zoo animals or, say, crimes, they write. © 2023 NautilusNext Inc., All rights reserved.

Keyword: Attention; Learning & Memory
Link ID: 28910 - Posted: 09.16.2023

By Joanna Thompson

Like many people, Mary Ann Raghanti enjoys potatoes loaded with butter. Unlike most people, however, she actually asked the question of why we love stuffing ourselves with fatty carbohydrates. Raghanti, a biological anthropologist at Kent State University, has researched the neurochemical mechanism behind that savory craving. As it turns out, a specific brain chemical may be one of the things that not only developed our tendency to overindulge in food, alcohol and drugs but also helped the human brain evolve to be unique from the brains of closely related species.

A new study, led by Raghanti and published on September 11 in the Proceedings of the National Academy of Sciences USA, examined the activity of a particular neurotransmitter in a region of the brain that is associated with reward and motivation across several species of primates. The researchers found higher levels of that brain chemical—neuropeptide Y (NPY)—in humans, compared with our closest living relatives. That boost in the reward peptide could explain our love of high-fat foods, from pizza to poutine. The impulse to stuff ourselves with fats and sugars may have given our ancestors an evolutionary edge, allowing them to develop a larger and more complex brain. “I think this is a first bit of neurobiological insight into one of the most interesting things about us as a species,” says Robert Sapolsky, a neuroendocrinology researcher at Stanford University, who was not directly involved in the research but helped review the new paper.

Neuropeptide Y is associated with “hedonic eating”—consuming food strictly to experience pleasure rather than to satisfy hunger. It drives individuals to seek out high-calorie foods, especially those rich in fat. Historically, though, NPY has been overlooked in favor of flashier “feel good” chemicals such as dopamine and serotonin. © 2023 Scientific American

Keyword: Obesity; Intelligence
Link ID: 28905 - Posted: 09.13.2023

By Jacqueline Howard and Deidre McPhillips

Most families of children with autism may face long wait times to diagnose their child with the disorder, and once a diagnosis is made, it sometimes may not be definitive. But now, two studies released Tuesday suggest that a recently developed eye-tracking tool could help clinicians diagnose children as young as 16 months with autism – and with more certainty.

“This is not a tool to replace expert clinicians,” said Warren Jones, director of research at the Marcus Autism Center at Children’s Healthcare of Atlanta and Nien Distinguished Chair in Autism at Emory University School of Medicine, who was an author on both studies. Rather, he said, the hope with this eye-tracking technology is that “by providing objective measurements that objectively measure the same thing in each child,” it can help inform the diagnostic process. The tool, called EarliPoint Evaluation, is cleared by the US Food and Drug Administration to help clinicians diagnose and assess autism, according to the researchers.

Traditionally, children are diagnosed with autism based on a clinician’s assessment of their developmental history, behaviors and parents’ reports. Evaluations can take hours, and some subtle behaviors associated with autism may be missed, especially among younger children. “Typically, the way we diagnose autism is by rating our impressions,” said Whitney Guthrie, a clinical psychologist and scientist at the Children’s Hospital of Philadelphia’s Center for Autism Research. She was not involved in the new studies, but her research focuses on early diagnosis of autism.

Keyword: Autism; Schizophrenia
Link ID: 28904 - Posted: 09.13.2023

By Saugat Bolakhe

Memory doesn’t represent a single scientific mystery; it’s many of them. Neuroscientists and psychologists have come to recognize varied types of memory that coexist in our brain: episodic memories of past experiences, semantic memories of facts, short- and long-term memories, and more. These often have different characteristics and even seem to be located in different parts of the brain. But it’s never been clear what feature of a memory determines how or why it should be sorted in this way.

Now, a new theory backed by experiments using artificial neural networks proposes that the brain may be sorting memories by evaluating how likely they are to be useful as guides in the future. In particular, it suggests that many memories of predictable things, ranging from facts to useful recurring experiences — like what you regularly eat for breakfast or your walk to work — are saved in the brain’s neocortex, where they can contribute to generalizations about the world. Memories less likely to be useful — like the taste of that unique drink you had at that one party — are kept in the seahorse-shaped memory bank called the hippocampus. Actively segregating memories this way on the basis of their usefulness and generalizability may optimize the reliability of memories for helping us navigate novel situations.

The authors of the new theory — the neuroscientists Weinan Sun and James Fitzgerald of the Janelia Research Campus of the Howard Hughes Medical Institute, Andrew Saxe of University College London, and their colleagues — described it in a recent paper in Nature Neuroscience. It updates and expands on the well-established idea that the brain has two linked, complementary learning systems: the hippocampus, which rapidly encodes new information, and the neocortex, which gradually integrates it for long-term storage.

James McClelland, a cognitive neuroscientist at Stanford University who pioneered the idea of complementary learning systems in memory but was not part of the new study, remarked that it “addresses aspects of generalization” that his own group had not thought about when they proposed the theory in the mid-1990s. All Rights Reserved © 2023

Keyword: Learning & Memory; Attention
Link ID: 28900 - Posted: 09.07.2023

By Astrid Landon

In June 2015, Jeffrey Thelen’s parents noticed their son was experiencing problems with his memory. In the subsequent years, he would get lost driving to his childhood home, forget his cat had died, and fail to recognize his brother and sister. His parents wondered: Was electroconvulsive therapy to blame? Thelen had been regularly receiving the treatment to help with symptoms of severe depression, which he’d struggled with since high school. At 34 years old, he had tried medications, but hadn’t had a therapy plan. His primary care physician referred him to get an evaluation for ECT, which was then prescribed by a psychiatrist.

Electroconvulsive therapy has been used to treat various mental illnesses since the late 1930s. The technique, which involves passing electrical currents through the brain to trigger a short seizure, has always had a somewhat torturous reputation. Yet it’s still in use, in a modified form of its original version. According to one commonly cited statistic, 100,000 Americans receive ECT annually — most often to ease symptoms of severe depression or bipolar disorder — although exact demographic data is scarce.

For Thelen, the treatment appeared to relieve his depression symptoms somewhat, but he reported new headaches and concentration issues, in addition to the memory loss. Those claims are central to a lawsuit Thelen filed in 2020 against Somatics, LLC and Elektrika, Inc., manufacturers and suppliers of ECT devices, alleging that the companies failed to disclose — and even intentionally hid — risks associated with ECT, including “brain damage and permanent neurocognitive injuries.” Thelen’s legal team told Undark that they have since reached a resolution with Elektrika on confidential terms. With regard to Somatics, in June a jury found that the company failed to warn about risks associated with ECT, but could not conclude that there was a legal causation between that and Thelen’s memory loss. The following month, his lawyers filed a motion for a new trial. (In response to a request for comment, Conrad Swartz, one of Somatics’ co-founders, directed Undark to the company’s attorney, Sue Cole. Cole did not respond to multiple emails. Lawyers for Elektrika declined to comment.)

Keyword: Depression; Learning & Memory
Link ID: 28899 - Posted: 09.07.2023

By Claudia López Lloreda

Cells hidden in the skull may point to a way to detect, diagnose and treat inflamed brains. A detailed look at the skull reveals that bone marrow cells there change and are recruited to the brain after injury, possibly traveling through tiny channels connecting the skull and the outer protective layer of the brain. Paired with the discovery that inflammation in the skull is disease-specific, these new findings collectively suggest the skull’s marrow could serve as a target to track and potentially treat neurological disorders involving brain inflammation, researchers report August 9 in Cell.

Immune cells that infiltrate the central nervous system during many diseases and neuronal injury can wreak havoc by flooding the brain with damaging molecules. This influx of immune cells causes inflammation in the brain and spinal cord and can contribute to diseases like multiple sclerosis (SN: 11/26/19). Detecting and dampening this reaction has been an extensive field of research.

With this new work, the skull, “something that has been considered as just protective, suddenly becomes a very active site of interaction with the brain, not only responding to brain diseases, but also changing itself in response to brain diseases,” says Gerd Meyer zu Hörste, a neurologist at University of Münster in Germany who was not involved in the study.

Ali Ertürk of the Helmholtz Center in Munich and colleagues discovered this potential role for the skull while probing the idea that the cells in skull marrow might behave differently from those in other bones. Ertürk’s team compared the genetic activity of cells in mouse skull marrow, and the proteins those cells made, with those in the rodent’s humerus, femur and four other bones, along with the meninges, the protective membranes between the skull and the brain. © Society for Science & the Public 2000–2023.

Keyword: Alzheimers; Multiple Sclerosis
Link ID: 28898 - Posted: 09.07.2023

Diana Kwon

Santiago Ramón y Cajal revolutionized neurobiology in the late nineteenth century with his exquisitely detailed illustrations of neural tissues. Created through years of meticulous microscopy work, the Spanish physician-scientist’s drawings revealed the unique cellular morphology of the brain. “With Cajal’s work, we saw that the cells of the brain don’t look like the cells of every other part of the body — they have incredible morphologies that you just don’t see elsewhere,” says Evan Macosko, a neuroscientist at the Broad Institute of MIT and Harvard in Cambridge, Massachusetts.

Ramón y Cajal’s drawings provided one of the first clues that the keys to understanding how the brain governs its many functions, from regulating blood pressure and sleep to controlling cognition and mood, might lie at the cellular level. Still, when it comes to the brain, crucial information remained — and indeed, remains — missing. “In order to have a fundamental understanding of the brain, we really need to know how many different types of cells there are, how are they organized, and how they interact with each other,” says Xiaowei Zhuang, a biophysicist at Harvard University in Cambridge. What neuroscientists require, Zhuang explains, is a way to systematically identify and map the many categories of brain cells.

Now researchers are closing in on such a resource, at least in mice. By combining high-throughput single-cell RNA sequencing with spatial transcriptomics — methods for determining which genes are expressed in individual cells, and where those cells are located — they are creating some of the most comprehensive atlases of the mouse brain so far. The crucial next steps will be working out what these molecularly defined cell types do, and bringing the various brain maps together to create a unified resource that the broader neuroscience community can use. © 2023 Springer Nature Limited

Keyword: Brain imaging; Development of the Brain
Link ID: 28880 - Posted: 08.24.2023

By Lauren Leffer

When a nematode wriggles around a petri dish, what’s going on inside a tiny roundworm’s even tinier brain? Neuroscientists now have a more detailed answer to that question than ever before. As with any experimental animal, from a mouse to a monkey, the answers may hold clues about the contents of more complex creatures’ noggins, including what resides in the neural circuitry of our own head.

A new brain “atlas” and computer model, published in Cell on Monday, lays out the connections between the actions of the nematode species Caenorhabditis elegans and this model organism’s individual brain cells. With the findings, researchers can now observe a C. elegans worm feeding or moving in a particular way and infer activity patterns for many of the animal’s behaviors in its specific neurons. Through establishing those brain-behavior links in a humble roundworm, neuroscientists are one step closer to understanding how all sorts of animal brains, even potentially human ones, encode action.

“I think this is really nice work,” says Andrew Leifer, a neuroscientist and physicist who studies nematode brains at Princeton University and was not involved in the new research. “One of the most exciting reasons to study how a worm brain works is because it holds the promise of being able to understand how any brain generates behavior,” he says. “What we find in the worm forms hypotheses to look for in other organisms.”

Biologists have been drawn to the elegant simplicity of nematode biology for many decades. South African biologist Sydney Brenner received a Nobel Prize in Physiology or Medicine in 2002 for pioneering work that enabled C. elegans to become an experimental animal for the study of cell maturation and organ development. C. elegans was the first multicellular organism to have its entire genome and nervous system mapped. The first neural map, or “connectome,” of a C. elegans brain was published in 1986. In that research, scientists hand-drew connections using colored pencils and charted each of the 302 neurons and approximately 5,000 synapses inside the one-millimeter-long animal’s transparent body.

Since then a subdiscipline of neuroscience has emerged—one dedicated to plotting out the brains of increasingly complex organisms. Scientists have compiled many more nematode connectomes, as well as brain maps of a marine annelid worm, a tadpole, a maggot and an adult fruit fly. Yet these maps simply serve as a snapshot in time of a single animal. They can tell us a lot about brain structure but little about how behaviors relate to that structure. © 2023 Scientific American

Keyword: Brain imaging; Development of the Brain
Link ID: 28879 - Posted: 08.24.2023

by Calli McMurray

One of the co-directors of a now-shuttered Maryland psychology clinic implicated in 18 paper retractions has retired, Spectrum has learned. Prior to her retirement, Clara Hill was professor of psychology at the University of Maryland in College Park.

Starting on 1 June, the American Psychological Association (APA) retracted 11 papers by Hill and her university colleagues Dennis Kivlighan, Jr. and Charles Gelso over issues with obtaining participant consent. The publisher plans to retract six more papers by the end of the year, according to an APA representative. On 13 August, Taylor & Francis retracted an additional paper led solely by Hill. The research was conducted at the Maryland Psychotherapy Clinic and Research Lab, where Hill, Kivlighan and Gelso were co-directors. The clinic had shut down as of 1 June.

When asked about the circumstances surrounding Hill’s retirement, a university spokesperson told Spectrum in an email, “Dr. Clara Hill retired from UMD effective July 1, 2023.” After Spectrum asked again about the circumstances, a spokesperson replied, “This is all we’ll have for you on the faculty member’s retirement — thanks!” Hill worked at the university for 49 years.

As of 1 August, Hill’s faculty page did not mention her retirement. By 14 August, her position had been amended to “Professor (Retired),” and a notice of her retirement had been added to the beginning of her biography. Spectrum left two voicemails on Hill’s university office phone and emailed her university address with requests for comment but did not hear back.

The 11 papers retracted by the APA appeared in the Journal of Counseling Psychology, Dreaming and Psychotherapy. The additional retractions will come from the same titles, according to an APA representative. Hill conducted all 11 studies, whereas Kivlighan and Gelso conducted 10 and 6, respectively. © 2023 Simons Foundation

Keyword: Autism
Link ID: 28877 - Posted: 08.24.2023

Saima May Sidik

A protein involved in wound healing can improve learning and memory in ageing mice. Platelet factor 4 (PF4) has long been known for its role in promoting blood clotting and sealing broken blood vessels. Now, researchers are wondering whether this signalling molecule could be used to treat age-related cognitive disorders such as Alzheimer’s disease. “The therapeutic possibilities are very exciting,” says geneticist and anti-ageing scientist David Sinclair at Harvard University in Boston, Massachusetts, who was not involved in the research. The study was published on 16 August in Nature.

Young blood, old brains: About a decade ago, scientists discovered that blood from young mice could restore youthful properties, including learning abilities, in older mice. The idea captivated Saul Villeda, a neuroscientist at the University of California, San Francisco, and a co-author of the new study. He and his colleagues have since been trying to identify the components of blood that cause this rejuvenation.

Several lines of evidence suggested that PF4 might be one of these components, including the fact that young mice have higher levels of this molecule in their blood than do older mice. Villeda and his colleagues tried injecting PF4 into aged mice without including other blood components. The researchers found that the ratios of various types of immune cell shifted to become more similar to what is typically seen in younger mice. Some immune cells also reverted to a more youthful pattern of gene expression.

Although PF4 was not able to cross the blood–brain barrier, its effects on the immune system also led to changes in the brain, probably through indirect mechanisms. Old mice that received doses of PF4 showed decreases in damaging inflammation in the hippocampus — a part of the brain that’s particularly vulnerable to the effects of ageing. They also showed increases in the levels of molecules that promote synaptic plasticity (the capacity to alter the strength of connections between nerve cells). © 2023 Springer Nature Limited

Keyword: Development of the Brain
Link ID: 28874 - Posted: 08.19.2023

By Alla Katsnelson

Our understanding of animal minds is undergoing a remarkable transformation. Just three decades ago, the idea that a broad array of creatures have individual personalities was highly suspect in the eyes of serious animal scientists — as were such seemingly fanciful notions as fish feeling pain, bees appreciating playtime and cockatoos having culture. Today, though, scientists are rethinking the very definition of what it means to be sentient and seeing capacity for complex cognition and subjective experience in a great variety of creatures — even if their inner worlds differ greatly from our own.

Such discoveries are thrilling, but they probably wouldn’t have surprised Charles Henry Turner, who died a century ago, in 1923. An American zoologist and comparative psychologist, he was one of the first scientists to systematically probe complex cognition in animals considered least likely to possess it. Turner primarily studied arthropods such as spiders and bees, closely observing them and setting up trailblazing experiments that hinted at cognitive abilities more complex than most scientists at the time suspected. Turner also explored differences in how individuals within a species behaved — a precursor of research today on what some scientists refer to as personality.

Most of Turner’s contemporaries believed that “lowly” critters such as insects and spiders were tiny automatons, preprogrammed to perform well-defined functions. “Turner was one of the first, and you might say should be given the lion’s share of credit, for changing that perception,” says Charles Abramson, a comparative psychologist at Oklahoma State University in Stillwater who has done extensive biographical research on Turner and has been petitioning the US Postal Service for years to issue a stamp commemorating him.

Turner also challenged the views that animals lacked the capacity for intelligent problem-solving, that they behaved based on instinct or, at best, learned associations, and that individual differences were just noisy data. But just as the scientific establishment of the time lacked the imagination to believe that animals other than human beings can have complex intelligence and subjectivity of experience, it also lacked the collective imagination to envision Turner, a Black scientist, as an equal among them. The hundredth anniversary of Turner’s death offers an opportunity to consider what we may have missed out on because of that oversight. © 2023 Annual Reviews

Keyword: Learning & Memory; Evolution
Link ID: 28869 - Posted: 08.09.2023

By Yasemin Saplakoglu

On warm summer nights, green lacewings flutter around bright lanterns in backyards and at campsites. The insects, with their veil-like wings, are easily distracted from their natural preoccupation with sipping on flower nectar, avoiding predatory bats and reproducing. Small clutches of the eggs they lay hang from long stalks on the underside of leaves and sway like fairy lights in the wind.

The dangling ensembles of eggs are beautiful but also practical: They keep the hatching larvae from immediately eating their unhatched siblings. With sickle-like jaws that pierce their prey and suck them dry, lacewing larvae are “vicious,” said James Truman, a professor emeritus of development, cell and molecular biology at the University of Washington. “It’s like ‘Beauty and the Beast’ in one animal.”

This Jekyll-and-Hyde dichotomy is made possible by metamorphosis, the phenomenon best known for transforming caterpillars into butterflies. In its most extreme version, complete metamorphosis, the juvenile and adult forms look and act like totally different species. Metamorphosis is not an exception in the animal kingdom; it’s almost a rule. More than 80% of the known animal species today, mainly insects, amphibians and marine invertebrates, undergo some form of metamorphosis or have complex, multistage life cycles.

The process of metamorphosis presents many mysteries, but some of the most deeply puzzling ones center on the nervous system. At the center of this phenomenon is the brain, which must code for not one but multiple different identities. After all, the life of a flying, mate-seeking insect is very different from the life of a hungry caterpillar. For the past half-century, researchers have probed the question of how a network of neurons that encodes one identity — that of a hungry caterpillar or a murderous lacewing larva — shifts to encode an adult identity that encompasses a completely different set of behaviors and needs.

Truman and his team have now learned how much metamorphosis reshuffles parts of the brain. In a recent study published in the journal eLife, they traced dozens of neurons in the brains of fruit flies going through metamorphosis. They found that, unlike the tormented protagonist of Franz Kafka’s short story “The Metamorphosis,” who awakes one day as a monstrous insect, adult insects likely can’t remember much of their larval life. Although many of the larval neurons in the study endured, the part of the insect brain that Truman’s group examined was dramatically rewired. That overhaul of neural connections mirrored a similarly dramatic shift in the behavior of the insects as they changed from crawling, hungry larvae to flying, mate-seeking adults. All Rights Reserved © 2023

Keyword: Learning & Memory
Link ID: 28860 - Posted: 07.27.2023

by Giorgia Guglielmi

Mice with a mutation that boosts the activity of the autism-linked protein UBE3A show an array of behaviors reminiscent of the condition, a new study finds. The behaviors differ depending on whether the animals inherit the mutation from their mother or their father, the work also reveals. The results add to mounting evidence that hyperactive UBE3A leads to autism.

Duplications of the chromosomal region that includes UBE3A have been associated with autism, whereas deletions and mutations that destroy the gene’s function are known to cause Angelman syndrome, which is characterized by developmental delay, seizures, lack of speech, a cheerful demeanor and, often, autism. “UBE3A is on a lot of clinicians’ radar because it is well known to be causative for Angelman syndrome when mutated or deleted,” says lead investigator Mark Zylka, professor of cell biology and physiology at the University of North Carolina at Chapel Hill. “What our study shows is that just because you have a mutation in UBE3A, it doesn’t mean that it’s going to be Angelman syndrome.”

In the cell, UBE3A is involved in the degradation of proteins, and “gain-of-function” mutations — which send the UBE3A protein into overdrive — result in enhanced degradation of its targets, including UBE3A itself. Studying the effects of these mutations could provide insight into how they affect brain development and suggest targets for therapies, says study investigator Jason Yi, assistant professor of neuroscience at Washington University in St. Louis, Missouri.

Gain-of-function mutations in UBE3A can disrupt early brain development and may contribute to neurodevelopmental conditions that are distinct from Angelman syndrome, Yi and Zylka have shown in previous studies. One of the mutations they analyzed had been found in an autistic child, so the team used CRISPR to create mice with this mutation. © 2023 Simons Foundation

Keyword: Autism
Link ID: 28857 - Posted: 07.27.2023

Lilly Tozer A study that followed thousands of people over 25 years has identified proteins whose unbalanced levels during middle age are linked to the later development of dementia. The findings, published in Science Translational Medicine on 19 July, could contribute to the development of new diagnostic tests, or even treatments, for dementia-causing diseases. Most of the proteins have functions unrelated to the brain. “We’re seeing so much involvement of the peripheral biology decades before the typical onset of dementia,” says study author Keenan Walker, a neuroscientist at the US National Institute on Aging in Bethesda, Maryland. Equipped with blood samples from more than 10,000 participants, Walker and his colleagues asked whether they could find predictors of dementia years before its onset by looking at a person’s proteome — the collection of all the proteins expressed throughout the body. They searched for any signs of dysregulation — when proteins are at levels much higher or lower than normal. The samples were collected as part of an ongoing study that began in 1987. Participants returned for examination six times over three decades, and during this time, around 1 in 5 of them developed dementia. The researchers found 32 proteins that, if dysregulated in people aged 45 to 60, were strongly associated with an elevated chance of developing dementia in later life. It is unclear how exactly these proteins might be involved in the disease, but the link is “highly unlikely to be due to just chance alone”, says Walker. © 2023 Springer Nature Limited

Keyword: Alzheimers
Link ID: 28856 - Posted: 07.22.2023

Geneva Abdul The so-called “brain fog” associated with long Covid has an effect on cognition comparable to ageing 10 years, researchers have suggested. In a study by King’s College London, researchers investigated the impact of Covid-19 on memory and found cognitive impairment was highest in individuals who had tested positive and had more than three months of symptoms. The study, published on Friday in a clinical journal published by The Lancet, also found that the symptoms in affected individuals persisted for almost two years after initial infection. “The fact remains that two years on from their first infection, some people don’t feel fully recovered and their lives continue to be impacted by the long-term effects of the coronavirus,” said Claire Steves, a professor of ageing and health at King’s College. “We need more work to understand why this is the case and what can be done to help.” An estimated two million people living in the UK were experiencing self-reported long Covid – symptoms continuing for more than four weeks since infection – as of January 2023, according to the 2023 government census. Commonly reported symptoms included fatigue, difficulty concentrating, shortness of breath and muscle aches. The study included more than 5,100 participants from the Covid Symptom Study Biobank, recruited through a smartphone app. Through 12 cognitive tests measuring speed and accuracy, researchers examined working memory, attention, reasoning and motor control at two time points, in 2021 and 2022. © 2023 Guardian News & Media Limited

Keyword: Learning & Memory; Attention
Link ID: 28854 - Posted: 07.22.2023

By Pam Belluck Treating Alzheimer’s patients as early as possible — when symptoms and brain pathology are mildest — provides a better chance of slowing cognitive decline, a large study of an experimental Alzheimer’s drug presented Monday suggests. The study of 1,736 patients reported that the drug, donanemab, made by Eli Lilly, can modestly slow the progression of memory and thinking problems in early stages of Alzheimer’s, and that the slowing was greatest for early-stage patients when they had less of a protein that creates tangles in the brain. For people at that earlier stage, donanemab appeared to slow decline in memory and thinking by about four and a half to seven and a half months over an 18-month period compared with those taking a placebo, according to the study, published in the journal JAMA. Among people with less of the protein, called tau, slowing was most pronounced in those younger than 75 and those who did not yet have Alzheimer’s but had a pre-Alzheimer’s condition called mild cognitive impairment, according to data presented Monday at the Alzheimer’s Association International Conference in Amsterdam. “The earlier you can get in there, the more you can impact it before they’ve already declined and they’re on this fast slope,” Dr. Daniel Skovronsky, Eli Lilly’s chief medical and scientific officer, said in an interview. “No matter how you cut the data — earlier, younger, milder, less pathology — every time, it just looks like early diagnosis and early intervention are the key to managing this disease,” he added. The findings and the recent approval of another drug that modestly slows decline in the early stages of Alzheimer’s, Leqembi, signal a potentially promising turn in the long, rocky path toward finding effective medications for Alzheimer’s, a brutal disease that plagues more than six million Americans. Donanemab is currently being considered for approval by the Food and Drug Administration. © 2023 The New York Times Company

Keyword: Alzheimers
Link ID: 28852 - Posted: 07.19.2023

Nicola Davis Science correspondent Taking part in activities such as chess, writing a journal, or educational classes in older age may help to reduce the risk of dementia, a study has suggested. According to the World Health Organization, more than 55 million people have the disease worldwide, most of them older people. However, experts have long emphasised that dementia is not an inevitable part of ageing, with being active, eating well and avoiding smoking among the lifestyle choices that can reduce risk. Now researchers have revealed fresh evidence that challenging the brain could also be beneficial. Writing in the journal JAMA Network Open, researchers in the US and Australia report how they used data from the Australian ASPREE Longitudinal Study of Older Persons covering the period from 1 March 2010 to 30 November 2020. Participants in the study were over the age of 70, did not have a major cognitive impairment or cardiovascular disease when recruited between 2010 and 2014, and were assessed for dementia through regular study visits. In the first year, participants were asked about their social networks. They were also questioned on whether they undertook certain leisure activities or trips out to venues such as galleries or restaurants, and how frequently: never, rarely, sometimes, often or always. The team analysed data from 10,318 participants, taking into account factors such as age, sex, smoking status, education, socioeconomic status, and whether participants had other diseases such as diabetes. The results reveal that for activities such as writing letters or journals, taking educational classes or using a computer, increasing the frequency of participation by one category, for example from “sometimes” to “often”, was associated with an 11% drop in the risk of developing dementia over a 10-year period. Similarly, increased frequency of activities such as card games, chess or puzzle-solving was associated with a 9% reduction in dementia risk.
© 2023 Guardian News & Media Limited

Keyword: Alzheimers; Learning & Memory
Link ID: 28851 - Posted: 07.19.2023