Chapter 13. Memory and Learning




By Kate Laskowski In the age-old debate about nature versus nurture — whether our characteristics are forged by our genes or our upbringing — I have an answer for you. It is both. And it is neither. I’m a behavioral ecologist who seeks to answer this question by studying a particular kind of fish. The Amazon molly (Poecilia formosa) is an experimental goldmine for these types of questions. She naturally clones herself by giving birth to offspring with identical genomes to her own and to each other’s. A second quirk of this little fish is that her offspring are born live and are completely independent from birth. This means I can control their experiences from the earliest possible age. Essentially, this fish gives me and my colleagues the opportunity to perform “twin studies” to understand how and why individuality develops. And what we’ve found may surprise you. As humans, we know the critical importance of our personalities. These persistent differences among us shape how we navigate our worlds and respond to major life events; whether we are bold or shy; whether we ask someone on a second date or not. Given the obvious importance of personality, it’s perhaps a bit surprising that scientists generally overlooked these kinds of differences in other species for a long time. Up until about 30 years ago, these differences (what I prefer to call “individuality,” as it avoids the human connotation of “personality”) were typically viewed as cute anecdotes with little evolutionary importance. Instead, researchers focused on the typical behavior of a given population. With guppies, for example — a classic workhorse of behavioral ecology research — researchers found that fish will, on average, swim more tightly together if they live among lots of predatory fish, whereas fish from areas with fewer predators spend less time schooling and more time fighting one another, as they don’t have to worry so much about being eaten. © 2023 Annual Reviews

Keyword: Development of the Brain; Genes & Behavior
Link ID: 28815 - Posted: 06.07.2023

Sara Reardon Vaccination against shingles might also prevent dementia, such as that caused by Alzheimer’s disease, according to a study of health records from around 300,000 people in Wales. The analysis found that getting the vaccine lowers the risk of dementia by 20%. But some puzzling aspects of the analysis have stirred debate about the work’s robustness. The study was published on the medRxiv preprint server on 25 May and has not yet been peer reviewed. “If it is true, it’s huge,” says Alberto Ascherio, an epidemiologist at Harvard University in Cambridge, Massachusetts, who was not involved in the study. “Even a modest reduction in risk is a tremendous impact.”

Dementia–infection link

The idea that viral infection can play a part in at least some dementia cases dates back to the 1990s, when biophysicist Ruth Itzhaki at the University of Manchester, UK, and her colleagues found herpesviruses in the brains of deceased people with dementia [2]. The theory has been controversial among Alzheimer’s researchers. But recent work has suggested that people infected with viruses that affect the brain have higher rates of neurodegenerative diseases [3]. Research has also suggested that those vaccinated against certain viral diseases are less likely to develop dementia [4]. But all these epidemiological studies have shared a key problem: people who get any type of vaccination tend to have healthier lifestyles than those who don’t [5], meaning that other factors could account for their lowered risk of diseases such as Alzheimer’s. With that in mind, epidemiologist Pascal Geldsetzer at Stanford University in California and his colleagues turned to a natural experiment: a shingles vaccination programme in Wales, which began on 1 September 2013. Shingles is caused by the reawakening of inactive varicella zoster virus (VZV), the herpesvirus that causes chickenpox and which is present in most people. Shingles is most common in older adults and can cause severe pain and rashes. 
© 2023 Springer Nature Limited

Keyword: Alzheimers; Neuroimmunology
Link ID: 28814 - Posted: 06.07.2023

Davide Castelvecchi The wrinkles that give the human brain its familiar walnut-like appearance have a large effect on brain activity, in much the same way that the shape of a bell determines the quality of its sound, a study suggests [1]. The findings run counter to a commonly held theory about which aspect of brain anatomy drives function. The study’s authors compared the influence of two components of the brain’s physical structure: the outer folds of the cerebral cortex — the area where most higher-level brain activity occurs — and the connectome, the web of nerves that links distinct regions of the cerebral cortex. The team found that the shape of the outer surface was a better predictor of brainwave data than was the connectome, contrary to the paradigm that the connectome has the dominant role in driving brain activity. “We use concepts from physics and engineering to study how anatomy determines function,” says study co-author James Pang, a physicist at Monash University in Melbourne, Australia. The results were published in Nature on 31 May [1]. ‘Exciting’ a neuron makes it fire, which sends a message zipping to other neurons. Excited neurons in the cerebral cortex can communicate their state of excitation to their immediate neighbours on the surface. But each neuron also has a long filament called an axon that connects it to a faraway region within or beyond the cortex, allowing neurons to send excitatory messages to distant brain cells. In the past two decades, neuroscientists have painstakingly mapped this web of connections — the connectome — in a raft of organisms, including humans. The authors wanted to understand how brain activity is affected by each of the ways in which neuronal excitation can spread: across the brain’s surface or through distant interconnections. To do so, the researchers — who have backgrounds in physics and neuroscience — tapped into the mathematical theory of waves.
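The bell analogy has a precise mathematical core: an object's resonant modes are eigenmodes of a wave equation on its geometry, so shape sets the spectrum. A toy one-dimensional sketch of that idea (a stand-in for the study's actual cortical-surface eigenmodes; the lengths and grid size here are arbitrary illustrative choices):

```python
import numpy as np

def membrane_modes(length, n_points=200):
    """Resonant frequencies of an idealized 1D string with fixed ends,
    computed from the eigenvalues of a discrete Laplacian."""
    h = length / (n_points + 1)          # grid spacing
    lap = (np.diag(-2.0 * np.ones(n_points))
           + np.diag(np.ones(n_points - 1), 1)
           + np.diag(np.ones(n_points - 1), -1)) / h**2
    eigvals = np.linalg.eigvalsh(-lap)   # positive, in ascending order
    return np.sqrt(eigvals)              # mode frequencies

short = membrane_modes(1.0)   # a "small bell"
long_ = membrane_modes(2.0)   # the same shape, twice the size

# Doubling the size halves every resonant frequency, just as a
# larger bell rings at a lower pitch.
print(short[0] / long_[0])  # ≈ 2
```

The study applied this logic in three dimensions, comparing how well geometric eigenmodes of the cortical surface versus connectome-based modes predicted measured brain activity.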

Keyword: Brain imaging; Development of the Brain
Link ID: 28811 - Posted: 06.03.2023

by Adam Kirsch Giraffes will eat courgettes if they have to, but they really prefer carrots. A team of researchers from Spain and Germany recently took advantage of this preference to investigate whether the animals are capable of statistical reasoning. In the experiment, a giraffe was shown two transparent containers holding a mixture of carrot and courgette slices. One container held mostly carrots, the other mostly courgettes. A researcher then took one slice from each container and offered them to the giraffe with closed hands, so it couldn’t see which vegetable had been selected. In repeated trials, the four test giraffes reliably chose the hand that had reached into the container with more carrots, showing they understood that the more carrots were in the container, the more likely it was that a carrot had been picked. Monkeys have passed similar tests, and human babies can do it at 12 months old. But giraffes’ brains are much smaller than primates’ relative to body size, so it was notable to see how well they grasped the concept. Such discoveries are becoming less surprising every year, however, as a flood of new research overturns longstanding assumptions about what animal minds are and aren’t capable of. A recent wave of popular books on animal cognition argue that skills long assumed to be humanity’s prerogative, from planning for the future to a sense of fairness, actually exist throughout the animal kingdom – and not just in primates or other mammals, but in birds, octopuses and beyond. In 2018, for instance, a team at the University of Buenos Aires found evidence that zebra finches, whose brains weigh half a gram, have dreams. Monitors attached to the birds’ throats found that when they were asleep, their muscles sometimes moved in exactly the same pattern as when they were singing out loud; in other words, they seemed to be dreaming about singing. © 2023 Guardian News & Media Limited
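The giraffes' inference, that a draw from the carrot-rich container is more likely to yield a carrot, is simple sampling probability. A quick Python sketch makes the payoff concrete (the 18:2 split is an illustrative assumption, not the paper's reported proportions):

```python
import random

random.seed(0)  # reproducible toy run

# Hypothetical 18:2 mixtures; the article only says "mostly carrots"
# versus "mostly courgettes".
mostly_carrots = ["carrot"] * 18 + ["courgette"] * 2
mostly_courgettes = ["carrot"] * 2 + ["courgette"] * 18

trials = 10_000
p_a = sum(random.choice(mostly_carrots) == "carrot" for _ in range(trials)) / trials
p_b = sum(random.choice(mostly_courgettes) == "carrot" for _ in range(trials)) / trials

# A giraffe that always takes the hand drawn from the carrot-rich
# container gets a carrot roughly 90% of the time instead of 10%.
print(p_a, p_b)
```

Tracking relative proportions like this, rather than absolute counts, is exactly the kind of statistical reasoning the trials were designed to detect.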

Keyword: Evolution; Learning & Memory
Link ID: 28808 - Posted: 05.31.2023

Emily Waltz After years of debate over whether non-invasively zapping the brain with electrical current can improve a person’s mental functioning, a massive analysis of past studies offers an answer: probably. But some question that conclusion, saying that the analysis spans experiments that are too disparate to offer a solid answer. In the past six years, the number of studies testing the therapeutic effects of a class of techniques called transcranial electrical stimulation has skyrocketed. These therapies deliver a painless, weak electrical current to the brain through electrodes placed externally on the scalp. The goal is to excite, disrupt or synchronize signals in the brain to improve function. Researchers have tested transcranial alternating current stimulation (tACS) and its sister technology, tDCS (transcranial direct current stimulation), on both healthy volunteers and those with neuropsychiatric conditions, such as depression, Parkinson’s disease or addiction. But study results have been conflicting or couldn’t be replicated, leading researchers to question the efficacy of the tools. The authors of the new analysis, led by Robert Reinhart, director of the cognitive and clinical neuroscience laboratory at Boston University in Massachusetts, say they compiled the report to quantify whether tACS shows promise, by comparing more than 100 studies of the technique, which applies an oscillating current to the brain. “We have to address whether or not this technique is actually working, because in the literature, you have a lot of conflicting findings,” says Shrey Grover, a cognitive neuroscientist at Boston University and an author on the paper. © 2023 Springer Nature Limited

Keyword: Learning & Memory
Link ID: 28807 - Posted: 05.31.2023

By Robert Martone Neurological conditions can release a torrent of new creativity in a few people as if opening some mysterious floodgate. Auras of migraine and epilepsy may have influenced a long list of artists, including Pablo Picasso, Vincent van Gogh, Edvard Munch, Giorgio de Chirico, Claude Monet and Georges Seurat. Traumatic brain injury (TBI) can result in original thinking and newfound artistic drive. Emergent creativity is also a rare feature of Parkinson’s disease. But this burst of creative ability is especially characteristic of frontotemporal dementia (FTD). Although a few rare cases of FTD are linked to improvements in verbal creativity, such as greater poetic gifts and increased wordplay and punning, enhanced creativity in the visual arts is an especially notable feature of the condition. Fascinatingly, this burst of creativity indicates that the potential to create may rest dormant in some of us, only to be unleashed by a disease that also causes a loss of verbal abilities. The emergence of a vibrant creative spark in the face of devastating neurological disease speaks to the human brain’s remarkable potential and resilience. A new study published in JAMA Neurology examines the roots of this phenomenon and provides insight into a possible cause. As specific brain areas diminish in FTD, the researchers find, they release their inhibition, or control, of other regions that support artistic expression. Frontotemporal dementia is relatively rare—affecting about 60,000 people in the U.S.—and distinct from the far more common Alzheimer’s disease, a form of dementia in which memory deficits predominate. FTD is named for the two brain regions that can degenerate in this disease, specifically the frontal and temporal lobes.

Keyword: Alzheimers; Attention
Link ID: 28797 - Posted: 05.27.2023

By Jennie Erin Smith José Echeverría spends restless days in a metal chair reinforced with boards and padded with a piece of foam that his mother, Nohora Vásquez, adjusts constantly for his comfort. The chair is coming loose and will soon fall apart. Huntington’s disease, which causes José to move his head and limbs uncontrollably, has already left one bed frame destroyed. At 42, he is still strong. José’s sister Nohora Esther Echeverría, 37, lives with her mother and brother. Just two years into her illness, her symptoms are milder than his, but she is afraid to walk around her town’s steep streets, knowing she could fall. A sign on the front door advertises rum for sale that does not exist. The family’s scarce resources now go to food — José and Nohora Esther must eat frequently or they will rapidly lose weight — and medical supplies, like a costly cream for José’s skin. Huntington’s is a hereditary neurodegenerative disease caused by excess repetitions of three building blocks of DNA — cytosine, adenine, and guanine — on a gene called huntingtin. The mutation results in a toxic version of a key brain protein, and a person’s age at the onset of symptoms relates, roughly, to the number of repetitions the person carries. Early symptoms can include mood disturbances — Ms. Vásquez remembers how her late husband had chased the children out of their beds, forcing her to sleep with them in the woods — and subtle involuntary movements, like the rotations of Nohora Esther’s delicate wrists. The disease is relatively rare, but in the late 1980s a Colombian neurologist, Jorge Daza, began observing a striking number of cases in the region where Ms. Vásquez lives, a cluster of seaside and mountain towns near Barranquilla. Around the same time, American scientists led by Nancy Wexler were working with an even larger family with Huntington’s in neighboring Venezuela, gathering and studying thousands of tissue samples from them to identify the genetic mutation responsible. 
© 2023 The New York Times Company

Keyword: Huntingtons; Genes & Behavior
Link ID: 28796 - Posted: 05.23.2023

By Yasemin Saplakoglu Memories are shadows of the past but also flashlights for the future. Our recollections guide us through the world, tune our attention and shape what we learn later in life. Human and animal studies have shown that memories can alter our perceptions of future events and the attention we give them. “We know that past experience changes stuff,” said Loren Frank, a neuroscientist at the University of California, San Francisco. “How exactly that happens isn’t always clear.” A new study published in the journal Science Advances now offers part of the answer. Working with snails, researchers examined how established memories made the animals more likely to form new long-term memories of related future events that they might otherwise have ignored. The simple mechanism that they discovered did this by altering a snail’s perception of those events. The researchers took the phenomenon of how past learning influences future learning “down to a single cell,” said David Glanzman, a cell biologist at the University of California, Los Angeles who was not involved in the study. He called it an attractive example “of using a simple organism to try to get understanding of behavioral phenomena that are fairly complex.” Although snails are fairly simple creatures, the new insight brings scientists a step closer to understanding the neural basis of long-term memory in higher-order animals like humans. Though we often aren’t aware of the challenge, long-term memory formation is “an incredibly energetic process,” said Michael Crossley, a senior research fellow at the University of Sussex and the lead author of the new study. Such memories depend on our forging more durable synaptic connections between neurons, and brain cells need to recruit a lot of molecules to do that. To conserve resources, a brain must therefore be able to distinguish when it’s worth the cost to form a memory and when it’s not. 
That’s true whether it’s the brain of a human or the brain of a “little snail on a tight energetic budget,” he said. All Rights Reserved © 2023

Keyword: Learning & Memory; Attention
Link ID: 28787 - Posted: 05.18.2023

Sara Reardon Researchers have identified a man with a rare genetic mutation that protected him from developing dementia at an early age. The finding, published on 15 May in Nature Medicine [1], could help researchers to better understand the causes of Alzheimer’s disease and potentially lead to new treatments. For nearly 40 years, neurologist Francisco Lopera at the University of Antioquia in Medellín, Colombia, has been following an extended family whose members develop Alzheimer’s in their forties or earlier. Many of the approximately 6,000 family members carry a genetic variant called the paisa mutation that inevitably leads to early-onset dementia. But now, Lopera and his collaborators have identified a family member with a second genetic mutation — one that protected him from dementia until age 67. “Reading that paper made the hair on my arms stand up,” says neuroscientist Catherine Kaczorowski at the University of Michigan in Ann Arbor. “It’s just such an important new avenue to pursue new therapies for Alzheimer’s disease.” Lopera and his colleagues analysed the genomes and medical histories of 1,200 Colombians with the paisa mutation, which causes dementia around ages 45–50. They identified the man with the second mutation when he was 67 and had only mild cognitive impairment. When the researchers scanned his brain, they found high levels of the sticky protein complexes known as amyloid plaques, which are thought to kill neurons and cause dementia, as well as a protein called tau that accumulates as the disease progresses. The brain looked like that of a person with severe dementia, says study co-author Joseph Arboleda, an ophthalmologist at Harvard Medical School in Boston, Massachusetts. But a small brain area called the entorhinal cortex, which coordinates skills such as memory and navigation, had low levels of tau. © 2023 Springer Nature Limited

Keyword: Alzheimers; Genes & Behavior
Link ID: 28786 - Posted: 05.18.2023

By Cordula Hölig, Brigitte Röder, Ramesh Kekunnaya Growing up in poverty or experiencing any adversity, such as abuse or neglect, during early childhood can put a person at risk for poor health, including mental disorders, later in life. Although the underlying mechanisms are poorly understood, some studies have shown that adverse early childhood experience leaves persisting (and possibly irreversible) traces in brain structure. As neuroscientists who are investigating sensitive periods of human brain development, we agree: safe and nurturing environments are a prerequisite for healthy brain development and lifelong well-being. Thus, preventing early childhood adversity undoubtedly leads to healthier lives. Poverty and adversity can cause changes in brain development. Harms can come from exposure to violence or toxins or a lack of nutrition, caregiving, perceptual and cognitive stimulation or language interaction. Neuroscientists have demonstrated that these factors crucially influence human brain development. We don’t know whether these changes are reversed by more favorable circumstances later in life, however. Investigating this question in humans is extremely difficult. For one, the multiple biological and psychological factors through which poverty and adversity affect brain development are hard to disentangle. That’s because they often occur together: a neglected child often experiences a lack of caregiving simultaneously with malnutrition and exposure to physical violence. Secondly, a clear beginning and end of an adverse experience is hard to define. Finally, it is almost impossible to fully reverse harsh environments in natural settings because most of the time children cannot be moved out of their families or communities. © 2023 Scientific American

Keyword: Development of the Brain; Learning & Memory
Link ID: 28783 - Posted: 05.13.2023

Heidi Ledford When Naomi Rance first started studying menopause and the brain, she pretty much had the field to herself. And what she was discovering surprised her. In studies of post-mortem brains, she had found neurons in a region called the hypothalamus that roughly doubled in size in women after menopause [1]. “This was changing so much in postmenopausal women,” says Rance, a neuropathologist at the University of Arizona in Tucson. “It had to be important.” This was the 1990s, and few other researchers were interested. Rance forged ahead on her own, painstakingly unravelling what the neurons were doing and finessing a way to study menopause symptoms in rats by tracking tiny temperature changes in their tails as a measure of hot flushes, a common symptom of menopause that is thought to be triggered in the hypothalamus. Thirty years later, a drug called fezolinetant, based on Rance’s discoveries, is being evaluated by the US Food and Drug Administration, with an approval decision expected in the first half of this year. If approved, fezolinetant could be a landmark: the first non-hormonal therapy to treat the source of hot flushes, a symptom that has become nearly synonymous with menopause and one that is experienced by about 80% of women going through the transition. (This article uses ‘women’ to describe people who experience menopause, while recognizing that not all people who identify as women go through menopause, and not all people who go through menopause identify as women.) For Rance and others in the field, fezolinetant’s progress to this point is a sign that research into the causes and effects of menopausal symptoms is finally being taken seriously. In the next few years, the global number of postmenopausal women is expected to surpass one billion. But many women still struggle to access care related to menopause, and research into how best to manage such symptoms has lagged behind. That is slowly changing. 
Armed with improved animal models and a growing literature on the effects of existing treatments, more researchers are coming into the field to fill that gap. © 2023 Springer Nature Limited

Keyword: Hormones & Behavior; Learning & Memory
Link ID: 28778 - Posted: 05.10.2023

John Katsaras, Charles Patrick Collier and Dima Bolmatov Your brain is responsible for controlling most of your body’s activities. Its information processing capabilities are what allow you to learn, and it is the central repository of your memories. But how is memory formed, and where is it located in the brain? Although neuroscientists have identified different regions of the brain where memories are stored, such as the hippocampus in the middle of the brain, the neocortex in the top layer of the brain and the cerebellum at the base of the skull, they have yet to identify the specific molecular structures within those areas involved in memory and learning. Research from our team of biophysicists, physical chemists and materials scientists suggests that memory might be located in the membranes of neurons. Neurons are the fundamental working units of the brain. They are designed to transmit information to other cells, enabling the body to function. The junction between two neurons, called a synapse, and the chemistry that takes place between synapses, in the space called the synaptic cleft, are responsible for learning and memory. At a more fundamental level, the synapse is made of two membranes: one associated with the presynaptic neuron that transmits information, and one associated with the postsynaptic neuron that receives information. Each membrane is made up of a lipid bilayer containing proteins and other biomolecules. The changes taking place between these two membranes, commonly known as synaptic plasticity, are the primary mechanism for learning and memory. These include changes to the amounts of different proteins in the membranes, as well as the structure of the membranes themselves.

Keyword: Learning & Memory
Link ID: 28777 - Posted: 05.10.2023

Scientists at the National Institutes of Health have identified new genetic risk factors for two types of non-Alzheimer’s dementia. These findings were published in Cell Genomics and detail how researchers identified large-scale DNA changes, known as structural variants, by analyzing thousands of DNA samples. The team discovered several structural variants that could be risk factors for Lewy body dementia (LBD) and frontotemporal dementia (FTD). The project was a collaborative effort between scientists at the National Institute of Neurological Disorders and Stroke (NINDS) and the National Institute on Aging (NIA) at NIH. Structural variants have been implicated in a variety of neurological disorders. Unlike more commonly studied mutations, which often affect one or a few DNA building blocks called nucleotides, structural variants represent at least 50 but often hundreds, or even thousands, of nucleotides at once, making them more challenging to study. “If you imagine that our entire genetic code is a book, a structural variant would be a paragraph, page, or even an entire chapter that has been removed, duplicated, or inserted in the wrong place,” said Sonja W. Scholz, M.D., Ph.D., investigator in the neurogenetics branch of NINDS and senior author of this study. By combining cutting-edge computer algorithms capable of mapping structural variations across the whole genome with machine learning, the research team analyzed whole-genome data from thousands of patient samples and several thousand unaffected controls. A previously unknown variant in the gene TCPN1 was found in samples from patients with LBD, a disease that, like Parkinson’s disease, is associated with abnormal deposits of the protein alpha-synuclein in the brain. This variant, in which more than 300 nucleotides are deleted from the gene, is associated with a higher risk for developing LBD. 
While this finding is new for LBD, TCPN1 is a known risk factor for Alzheimer’s disease, which could mean that this structural variant plays a role in the broader dementia population.
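The book analogy maps directly onto sequence data: a point mutation changes a single nucleotide, while a structural variant removes, duplicates, or relocates an entire block. A toy sketch of the distinction (the sequence and the 50-nucleotide deletion are invented for illustration; real genes are far longer):

```python
# Invented 100-nt reference sequence (hypothetical).
reference = "ACGT" * 25

# Point mutation: one nucleotide changes (position 10, 'G' -> 'A').
snv = reference[:10] + "A" + reference[11:]

# Structural variant: a 50-nt block is deleted outright,
# analogous to tearing a whole page out of the book.
deletion = reference[:25] + reference[75:]

assert len(snv) == len(reference)             # same length, one letter differs
assert len(deletion) == len(reference) - 50   # a whole block is missing
```

Detecting the second kind of change is harder because short sequencing reads must be pieced together across the missing or duplicated region, which is why mapping structural variants genome-wide required the specialized algorithms described above.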

Keyword: Alzheimers; Genes & Behavior
Link ID: 28775 - Posted: 05.10.2023

Sara Reardon For the second time, an experimental drug has been shown to reduce the cognitive decline associated with Alzheimer’s disease. On 3 May, pharmaceutical company Eli Lilly announced in a press release that its monoclonal antibody donanemab slowed mental decline by 35% for some participants in a 1,736-person trial — a rate comparable to that for competitor drug lecanemab. But researchers warn that until the full results are published, questions remain as to the drug’s clinical usefulness, as well as whether the modest benefit outweighs the risk of harmful side effects. Like lecanemab, donanemab targets amyloid protein, which is thought to cause dementia by accumulating in the brain and damaging neurons. The trial results provide strong evidence that amyloid is a key driver of Alzheimer’s, says Jeffrey Cummings, a neuroscientist at the University of Nevada, Las Vegas. “These are transformative in an enormously important way from a scientific point of view,” he adds. “They’re terrific.” But Marsel Mesulam, a neurologist at Northwestern University in Chicago, is more cautious. “The results that are described are extremely significant and impressive, but clinically their significance is doubtful,” he says, adding that the modest effect suggests that factors other than amyloid contribute to Alzheimer’s disease progression. “We’re heading to a new era — there’s room to cheer, but it’s an era that should make us all very sober, realizing that there will be no single magic bullet.” In the press release, Eli Lilly said that people with mild Alzheimer’s who received donanemab showed 35% less clinical decline over 18 months than did those who received a placebo, and 40% less decline in their ability to perform daily tasks. The company, based in Indianapolis, Indiana, says that it will present the full results at a conference in July and publish them in a peer-reviewed journal. 
It plans to apply for approval by the US Food and Drug Administration (FDA) in the next two months. © 2023 Springer Nature Limited

Keyword: Alzheimers
Link ID: 28773 - Posted: 05.06.2023

By Sofia Quaglia With a large blade resembling a bread knife—but without the jagged edges—Stephanie Forkel slices through the human brain lying in front of her on the dissection table. A first-year university student, Forkel is clad in an apron and protective gear. It’s her first day working in the morgue at a university hospital in Munich, Germany, where the brains of people who’ve donated their bodies to science are examined for research. Her contact lenses feel dry because of the dense formaldehyde hanging in the air. But that’s not the only reason she squints a little harder. When she looks down at the annotated brain diagram in the textbook she’s supposed to use for reference, the real human brain in front of her looks nothing like the illustrated one. That was Forkel’s first eureka moment: The standard reference shape of the brain and real brains were actually vastly divergent. As she continued her studies, she confirmed that, indeed, “every individual brain looked very different,” she recounts decades later. A growing body of research now confirms there are plenty of physical dissimilarities between individual brains, particularly when it comes to white matter—the material nestled beneath the much-prized gray matter. And it’s not just anatomical. White matter hosts connections between the brain’s sections, like a city’s streets and avenues. So behavioral patterns can arise from even small physical differences in white matter, according to a late 2022 Science paper penned by Forkel and a colleague [1]. Forkel is now one of a host of researchers probing subtle differences in white matter to better understand the extent of its role in making us who we are—including how much white matter dictates variations between people’s everyday behavior, and whether it’s implicated in how some patients recover better than others from life-threatening brain injuries. © 2023 NautilusNext Inc.

Keyword: Development of the Brain
Link ID: 28762 - Posted: 05.03.2023

By Jaya Padmanabhan Speaking two languages provides the enviable ability to make friends in unusual places. A new study suggests that bilingualism may also come with another benefit: improved memory in later life. Studying hundreds of older patients, researchers in Germany found that those who reported using two languages daily from a young age scored higher on tests of learning, memory, language and self-control than patients who spoke only one language. The findings, published in the April issue of the journal Neurobiology of Aging, add to two decades of work suggesting that bilingualism protects against dementia and cognitive decline in older people. “It’s promising that they report that early and middle-life bilingualism has a beneficial effect on cognitive health in later life,” said Miguel Arce Rentería, a neuropsychologist at Columbia University who was not involved in the study. “This would line up with the existing literature.” In recent years, scientists have gained a greater understanding of bilingualism and the aging brain, though not all their findings have aligned. Some have found that if people who have fluency in two languages develop dementia, they’ll develop it at a later age than people who speak one language. But other research has shown no clear benefit from bilingualism. Neuroscientists hypothesize that because bilingual people switch fluidly between two languages, they may be able to deploy similar strategies in other skills — such as multitasking, managing emotions and self-control — that help delay dementia later on. The new study tested 746 people age 59 to 76. Roughly 40 percent of the volunteers had no memory problems, while the others were patients at memory clinics and had experienced confusion or memory loss. © 2023 The New York Times Company

Keyword: Alzheimers; Language
Link ID: 28761 - Posted: 04.29.2023

By Nicola Davis From loud snores to twitching paws, dogs often appear to have a penchant for a good snooze. But researchers have said elderly canines with dementia appear to spend less time slumbering than those with healthy brains – mirroring patterns seen in humans. It has long been known that people with dementia can experience sleep problems, including finding it harder to get to sleep. Researchers have also found changes in the brainwaves of people with dementia during sleep – including decreased slow brain waves that occur during non-rapid eye movement deep sleep. These are important in memory consolidation and appear to be linked to the activity of the brain’s system for clearing away waste. Now it seems sleep impairment may occur in dogs experiencing a condition similar to dementia in humans. “Changes in sleep habits should be expected in older dogs, and could be a harbinger of decline in cognition,” said Prof Natasha Olby, senior author of a study at North Carolina State University. Writing in the journal Frontiers in Veterinary Science, Olby and colleagues reported on their study of 28 dogs between 10 and 16 years old. The canines’ brainwaves were recorded by electroencephalogram (EEG) while the dogs took a two-hour afternoon nap. The researchers also assessed owners’ answers to a questionnaire and each dog’s performance on a range of problem-solving, memory and attention tasks, to provide a score indicating whether the dog had, or was at risk of, canine dementia. Twenty of the dogs were deemed to have cognitive impairment, with this judged to be severe in eight of them. Combining their data, the team found dogs with higher dementia scores took longer to fall asleep and spent less time sleeping. © 2023 Guardian News & Media Limited

Keyword: Alzheimers; Sleep
Link ID: 28760 - Posted: 04.29.2023

By Kate Golembiewski On the one hand, this headgear looks like something a cyberfish would wear. On the other, it’s not far from a fashion statement someone at the Kentucky Derby might make. But scientists didn’t just affix this device for laughs: They are curious about the underlying brain mechanisms that allow fish to navigate their world, and how such mechanisms relate to the evolutionary roots of navigation for all creatures with brain circuitry. “Navigation is an extremely important aspect of behavior because we navigate to find food, to find shelter, to escape predators,” said Ronen Segev, a neuroscientist at Ben-Gurion University of the Negev in Israel who was part of a team that fitted 15 fish with cybernetic headgear for a study published on Tuesday in the journal PLOS Biology. Putting a computer on a goldfish to study how the neurons fire in its brain while navigating wasn’t easy. It takes a careful hand because a goldfish’s brain, which looks a bit like a small cluster of lentils, is only half an inch long. “Under a microscope, we exposed the brain and put the electrodes inside,” said Lear Cohen, a neuroscientist and doctoral candidate at Ben-Gurion who performed the surgeries to attach the devices. Each of those electrodes was the diameter of a strand of human hair. It was also tricky to find a way to perform the procedure on dry land without harming the test subject. “The fish needs water and you need him not to move,” he said. He and his colleagues solved both problems by pumping water and anesthetics into the fish’s mouth. Once the electrodes were in the brain, they were connected to a small recording device, which could monitor neuronal activity and which was sealed in a waterproof case, mounted on the fish’s forehead. To keep the computer from weighing the fish down and impeding its ability to swim, the researchers attached buoyant plastic foam to the device. © 2023 The New York Times Company

Keyword: Learning & Memory
Link ID: 28756 - Posted: 04.26.2023

By Laurie McGinley — When Rebecca Chopp was diagnosed with early-stage Alzheimer’s disease, she and her husband did the only thing that seemed to make sense: They went to their favorite Mexican restaurant, held each other in a back booth and drank margaritas. And cried. After a while, they helped each other back across the street to their home. Chopp, at 67, was chancellor of the University of Denver, at the pinnacle of a career powered by a daunting intellect and relentless work. She was also an ordained minister, prolific author and former president of Swarthmore College and Colgate University. Sometimes, Chopp thought of herself as a brain with a body attached. The changes were subtle: Chopp was sleeping more. She got lost on the way to the doctor. Then came the diagnosis. Now, she was crushed, facing the loss of that beautiful mind. She worried she would soon be an empty shell, drooling and unkempt, a burden to the people she loved. “There is a sense that when you are diagnosed, you are immediately going to descend into madness,” Chopp said. When she relinquished the job she loved, Chopp fell into deep despair, confounded by the prescription given to her by an empathetic doctor: “Live with joy!” She had nightmares about going insane. But, eventually, she began to push back against the darkness. Chopp has mild cognitive impairment, a condition that involves subtle changes in thinking and memory and that, in most cases, leads to Alzheimer’s dementia, a fatal neurodegenerative disease that affects more than 6.7 million Americans. For years, there was little doctors could do for people with Alzheimer’s, even at a very early stage. Now, changes are coming in how the disease is diagnosed and treated, and patients with mild cognitive impairment are at the center of the efforts. Lacking a cure, scientists are trying desperately to delay the worst phase of the illness.

Keyword: Alzheimers
Link ID: 28752 - Posted: 04.26.2023

By Emily Underwood The ability to set a goal and pursue it without getting derailed by temptations or distractions is essential to nearly everything we do in life, from finishing homework to driving safely in traffic. It also places complex demands on the brain, requiring skills like working memory — the ability to keep small amounts of information in mind to perform a task — as well as impulse control and being able to rapidly adapt when rules or circumstances change. Taken together, these elements add up to something researchers call executive function. We all struggle with executive function sometimes, for example when we’re stressed or don’t get enough sleep. But in teenagers, these powers are still a work in progress, contributing to some of the contradictory behaviors and lapses in judgment — “My honor roll student did what on TikTok?” — that baffle many parents. This erratic control can be dangerous, especially when teens make impulsive choices. But that doesn’t mean the teen brain is broken, says Beatriz Luna, a developmental cognitive neuroscientist at the University of Pittsburgh and coauthor of a review on the maturation of one aspect of executive function, called cognitive control, in the 2015 Annual Review of Neuroscience. Adolescents have all the basic neural circuitry needed for executive function and cognitive control, Luna says. In fact, they have more than they need — what’s lacking is experience, which over time will strengthen some neural pathways and weaken or eliminate others. This winnowing serves an important purpose: It tailors the brain to help teens handle the demands of their unique, ever-changing environments and to navigate situations their parents may never have encountered. Luna’s research suggests that teens’ inconsistent cognitive control is key to becoming independent, because it encourages them to seek out and learn from experiences that go beyond what they’ve been actively taught. © 2023 Annual Reviews

Keyword: Development of the Brain; Attention
Link ID: 28751 - Posted: 04.26.2023