Chapter 13. Memory and Learning




by Laura Dattaro Some genomic areas that help determine cerebellar size are associated with autism, schizophrenia and bipolar disorder, according to a new study. But heritable genetic variants across the genome that also influence cerebellar size are not. The cerebellum sits at the base of the skull, below and behind the much larger cerebrum. It coordinates movement and may also play roles in social cognition and autism, according to previous research. The new work analyzed genetic information and structural brain scans from more than 33,000 people in the UK Biobank, a biomedical and genetic database of adults aged 40 to 69 living in the United Kingdom. A total of 33 genetic sequence variants, known as single nucleotide polymorphisms (SNPs), were associated with differences in cerebellar volume. Only one SNP overlapped with those linked to autism, but the association should be explored further in other cohorts, says lead investigator Richard Anney, senior lecturer in bioinformatics at Cardiff University in Wales. “There’s lots of caveats to say why it might be worth following up on,” Anney says. “But from this data alone, it’s not telling us there’s a major link between [autism] and cerebellar volume.” So far, cognitive neuroscientists have largely ignored the cerebellum, says Jesse Gomez, assistant professor of neuroscience at Princeton University, who was not involved in the work. The new study represents a first step in better understanding genetic influences on the brain region and its role in neurodevelopmental conditions, he says. “It’s a fun paper,” Gomez says. “It’s the beginning of what’s an exciting revolution in the field.” Of the 33 inherited variants Anney’s team found, 5 had not previously been significantly associated with cerebellar volume. They estimated that the 33 variants account for about 50 percent of the differences in cerebellar volume seen across participants. © 2022 Simons Foundation

Keyword: Autism; Genes & Behavior
Link ID: 28215 - Posted: 02.23.2022

by Angie Voyles Askham Mice chemically coaxed to produce high levels of an autism-linked gut molecule have anxiety-like behavior and unusual patterns of brain connectivity, according to a study published today in Nature. The findings present a direct mechanism by which the gut could send signals to the brain and alter development, the researchers say. “It’s a true mechanistic paper, [like] the field has been asking for,” says Jane Foster, professor of psychiatry and behavioral neurosciences at McMaster University in Hamilton, Canada, who was not involved in the study. Although it’s not clear that this exact signaling pathway is happening in people, she says, “this is the sort of work that’s going to get us that answer.” The molecule, 4-ethylphenol (4EP), is produced by gut microbes in mice and people. An enzyme in the colon and liver converts 4EP to 4-ethylphenyl sulfate (4EPS), which then circulates in the blood. Mice exposed to a maternal immune response in the womb have atypically high blood levels of 4EPS, as do some autistic people, previous research shows. And injecting mice with the molecule increases behaviors indicative of anxiety. But it wasn’t clear how the molecule could contribute to those traits. In the new work, researchers show that 4EPS can enter the brain and that its presence is associated with altered brain connectivity and a decrease in myelin — the insulation around axons that helps conduct electrical signals. Boosting the function of myelin-producing cells, the team found, eases the animals’ anxiety. “This is one of the first — maybe, arguably, the first — demonstrations of a specific microbe molecule that has such a profound impact on a complex behavior,” says lead researcher Sarkis Mazmanian, professor of microbiology at the California Institute of Technology in Pasadena. “How it’s doing it, we still need to understand.” © 2022 Simons Foundation

Keyword: Development of the Brain; Neuroimmunology
Link ID: 28205 - Posted: 02.16.2022

By Linda Searing Health-care workers and others who are exposed on the job to formaldehyde, even in low amounts, face a 17 percent increased likelihood of developing memory and thinking problems later on, according to research published in the journal Neurology. The finding adds cognitive impairment to already established health risks associated with formaldehyde. As the level of exposure increases, those risks range from eye, nose and throat irritation to skin rashes and breathing problems. At high levels of exposure, the chemical is considered a carcinogen, linked to leukemia and some types of nose and throat cancer. A strong-smelling gas, formaldehyde is used in making building materials and plastics and often as a component of disinfectants and preservatives. Materials containing formaldehyde can release it into the air as a vapor that can be inhaled, which is the main way people are exposed to it. The study, which included data from more than 75,000 people, found that the majority of those exposed were workers in the health-care sector — nurses, caregivers, medical technicians and those working in labs and funeral homes. Other study participants who had been exposed to formaldehyde included workers in textile, chemistry and metal industries; carpenters; and cleaners. At highest risk were those whose work had exposed them to formaldehyde for 22 years or more, giving them a 21 percent higher risk for cognitive problems than those who had not been exposed. Using a battery of standardized tests, the researchers found that formaldehyde exposure created higher risk for every type of cognitive function that was tested, including memory, attention, reasoning, word recall and other thinking skills.

Keyword: Learning & Memory; Neurotoxins
Link ID: 28203 - Posted: 02.16.2022

Jordana Cepelewicz We often think of memory as a rerun of the past — a mental duplication of events and sensations that we’ve experienced. In the brain, that would be akin to the same patterns of neural activity getting expressed again: Remembering a person’s face, for instance, might activate the same neural patterns as the ones for seeing their face. And indeed, in some memory processes, something like this does occur. But in recent years, researchers have repeatedly found subtle yet significant differences between visual and memory representations, with the latter showing up consistently in slightly different locations in the brain. Scientists weren’t sure what to make of this transformation: What function did it serve, and what did it mean for the nature of memory itself? Now, they may have found an answer — in research focused on language rather than memory. A team of neuroscientists created a semantic map of the brain that showed in remarkable detail which areas of the cortex respond to linguistic information about a wide range of concepts, from faces and places to social relationships and weather phenomena. When they compared that map to one they made showing where the brain represents categories of visual information, they observed meaningful differences between the patterns. And those differences looked exactly like the ones reported in the studies on vision and memory. The finding, published last October in Nature Neuroscience, suggests that in many cases, a memory isn’t a facsimile of past perceptions that gets replayed. Instead, it is more like a reconstruction of the original experience, based on its semantic content. All Rights Reserved © 2022

Keyword: Learning & Memory; Language
Link ID: 28202 - Posted: 02.12.2022

Ian Sample, science editor People who develop Alzheimer’s disease can experience sleep disturbances years before the condition takes hold, but whether one causes the other, or something more complex is afoot, has always proved hard for scientists to determine. Now, researchers in the US have shed light on the mystery, in work that raises hopes for new therapies and suggests how “good sleep hygiene” could help to tackle the disease and its symptoms. The findings show that humans’ 24-hour circadian clock controls the brain’s ability to mop up wayward proteins linked to Alzheimer’s disease. If the scientists are right, the work would explain, at least in part, how disruption to circadian rhythms and sleep disturbances might feed into the onset and progression of Alzheimer’s disease, and how preventing such disruption might stave off the condition. “Circadian disruption is correlated with Alzheimer’s diagnosis and it has been suggested that sleep disruptions could be an early warning sign of Alzheimer’s disease,” said Dr Jennifer Hurley, who led the research at Rensselaer Polytechnic Institute, in New York. Alzheimer’s takes hold when connections are lost between nerve cells in the brain. The disease is progressive and linked to abnormal plaques and tangles of proteins that steadily build up in the brain. The disease is the most common cause of dementia and affects more than half a million people in the UK, a figure that is set to rise. To keep the brain healthy, immune cells called microglia seek out and destroy troublesome proteins that threaten to accumulate in the brain. One type of protein targeted by the cells is called amyloid beta, a hallmark of Alzheimer’s. © 2022 Guardian News & Media Limited

Keyword: Alzheimers; Sleep
Link ID: 28197 - Posted: 02.12.2022

By Elizabeth Landau My grandmother was in the advanced stages of Alzheimer’s disease when she died in 2007, not long after I graduated from journalism school. As a budding health reporter, I tried to learn everything I could about Alzheimer’s and wrote about new research on preventions and treatments that everyone wanted to believe had potential. It is demoralizing and infuriating to think about how, nearly 15 years later, no breakthrough cure or proven prevention strategy has panned out. But neurologist Sara Manning Peskin argues in “A Molecule Away from Madness: Tales of the Hijacked Brain” that we could be on the brink of a revolution in confronting diseases like this because scientists have a better handle on how molecules work in the brain. Molecular research has transformed our understanding and treatment of cancer in recent years, and now it is beginning to do the same for brain diseases. In fact, it has already been key to solving several mysteries of why seemingly healthy people appear to suddenly fall into a mental inferno. While the shadow of Alzheimer’s looms over the book, representing an intractable condition that Peskin routinely confronts in her clinical practice, “A Molecule Away from Madness” is a fascinating tour of different kinds of ways that the brain can lead to the breakdown of mental life. The book is organized according to how different molecules interact with our brains to wreak havoc — Peskin calls them “mutants, rebels, invaders, and evaders.” Some have helped scientists solve longstanding puzzles, while others, like the molecules associated with Alzheimer’s, continue to leave millions of people waiting for a cure.

Keyword: Alzheimers
Link ID: 28196 - Posted: 02.12.2022

By Pallab Ghosh A paralysed man with a severed spinal cord has been able to walk again, thanks to an implant developed by a team of Swiss researchers. It is the first time someone who has had a complete cut to their spinal cord has been able to walk freely. The same technology has improved the health of another paralysed patient to the extent that he has been able to become a father. The research has been published in the journal Nature Medicine. Michel Roccati was paralysed after a motorbike accident five years ago. His spinal cord was completely severed - and he has no feeling at all in his legs. But he can now walk - because of an electrical implant that has been surgically attached to his spine. Someone this injured has never been able to walk like this before. The researchers stress that it isn't a cure for spinal injury and that the technology is still too complicated to be used in everyday life, but hail it nonetheless as a major step to improving quality of life. I met Michel at the lab where the implant was created. He told me that the technology "is a gift to me". "I stand up, walk where I want to, I can walk the stairs - it's almost a normal life." It was not the technology alone that drove Michel's recovery. The young Italian has a steely resolve. He told me that from the moment of his accident, he was determined to make as much progress as he could. "I used to box, run and do fitness training in the gym. But after the accident, I could not do the things that I loved to do, but I did not let my mood go down. I never stopped my rehabilitation. I wanted to solve this problem." The speed of Michel's recovery amazed the neurosurgeon who inserted the implant and expertly attached electrodes to individual nerve fibres, Prof Jocelyne Bloch at Lausanne University Hospital. "I was extremely surprised," she told me. "Michel is absolutely incredible. He should be able to use this technology to progress and be better and better." © 2022 BBC.

Keyword: Robotics; Regeneration
Link ID: 28194 - Posted: 02.09.2022

By Laura Sanders A tussle with COVID-19 can leave people’s brains fuzzy. SARS-CoV-2, the virus behind COVID-19, doesn’t usually make it into the brain directly. But the immune system’s response to even mild cases can affect the brain, new preliminary studies suggest. These reverberating effects may lead to fatigue, trouble thinking, difficulty remembering and even pain, months after the infection is gone. It’s not a new idea. Immune systems gone awry have been implicated in cognitive problems that come with other viral infections such as HIV and influenza, with disorders such as myalgic encephalomyelitis/chronic fatigue syndrome, or ME/CFS, and even from the damaging effects of chemotherapy. What’s different with COVID-19 is the scope of the problem. Millions of people have been infected, says neurologist Avindra Nath of the National Institutes of Health in Bethesda, Md. “We are now faced with a public health crisis,” he says. To figure out ways to treat people for the fuzzy thinking, headaches and fatigue that hang around after a bout with COVID-19, scientists are racing to figure out what’s causing these symptoms (SN: 4/27/21). Cognitive neurologist Joanna Hellmuth at the University of California, San Francisco had a head start. As someone who had studied the effects of HIV on the brain, she quickly noted similarities in the neurological symptoms of HIV and COVID-19. The infections paint “the same exact clinical picture,” she says. HIV-related cognitive symptoms have been linked to immune activation in the body, including the brain. “Maybe the same thing is happening in COVID,” Hellmuth says. © Society for Science & the Public 2000–2022.

Keyword: Neuroimmunology; Learning & Memory
Link ID: 28189 - Posted: 02.05.2022

Anastasia Brodovskaya and Jaideep Kapur Epilepsy is a disease marked by recurrent seizures, or sudden periods of abnormal, excessive or synchronous neuronal activity in the brain. One in 26 people in the U.S. will develop epilepsy at some point in their life. While people with mild seizures might experience a brief loss of awareness and muscle twitches, more severe seizures could last for several minutes and lead to injury from falling down and losing control of their limbs. Many people with epilepsy also experience memory problems. Patients often experience retrograde amnesia, where they cannot remember what happened immediately before their seizure. Electroconvulsive therapy, a form of treatment for major depression that intentionally triggers small seizures, can also cause retrograde amnesia. So why do seizures often cause memory loss? We are neurology researchers who study the mechanisms behind how seizures affect the brain. Our brain-mapping study found that seizures affect the same circuits of the brain responsible for memory formation. One of the earliest descriptions of seizures was written on a Babylonian tablet over 3,000 years ago. Seizures can be caused by a number of factors, ranging from abnormalities in brain structure and genetic mutations to infections and autoimmune conditions. Often, the root cause of a seizure isn’t known. The most common type of epilepsy involves seizures that originate in the brain region located behind the ears, the temporal lobe. Some patients with temporal lobe epilepsy experience retrograde amnesia and are unable to recall events immediately before their seizure. © 2010–2022, The Conversation US, Inc.

Keyword: Epilepsy; Learning & Memory
Link ID: 28187 - Posted: 02.05.2022

by Holly Barker New software uses machine learning to automatically detect and quantify gait and posture from videos of mice moving around their cage. The tool could accelerate research on how autism-linked mutations or drug treatments affect motor skills, says lead researcher Vivek Kumar, associate professor of mammalian genetics at The Jackson Laboratory in Bar Harbor, Maine. Most efforts to analyze motor behavior involve placing a mouse on a treadmill or training it to walk through a maze. These assays are a simple way of testing speed, but they restrict the animals’ movement and force mice to walk in an unnatural way. The algorithm processes footage from an overhead camera and tracks 12 key points on a mouse’s body as it freely explores its surroundings. As the animal wanders, the software detects the position of its limbs and other body parts, automatically generating data on its gait and posture. The researchers described their method in January in Cell Reports. Kumar’s group trained the software by feeding it about 8,000 video frames that had been manually annotated to tag key points on the animal’s body, such as the nose, ears and tip of the tail. They repeated the process with a variety of strains to teach the algorithm to recognize mice of all shapes and sizes. The trained software learned to read the rodent’s pose, which was further analyzed to extract more detailed information, such as the speed and length of each stride and the width of the mouse’s stance. © 2022 Simons Foundation
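The second stage the article describes, turning tracked key points into gait metrics such as stride length and speed, can be sketched in a few lines. This is a hypothetical illustration, not Kumar's software: the frame rate, the stance-detection rule and all numbers below are invented for the sketch.

```python
import numpy as np

FPS = 30.0  # assumed overhead-camera frame rate (illustrative)

def stride_metrics(paw_xy, fps=FPS):
    """Estimate gait metrics from one paw's (n_frames, 2) track in cm.

    A stride is approximated as the interval between successive stance
    onsets, detected as moments when the paw becomes nearly stationary."""
    paw_xy = np.asarray(paw_xy, dtype=float)
    step = np.diff(paw_xy, axis=0)               # per-frame displacement
    speed = np.linalg.norm(step, axis=1) * fps   # instantaneous speed, cm/s
    stationary = speed < 0.5 * speed.mean()      # crude stance mask
    # rising edges of the mask mark stance onsets
    onsets = np.flatnonzero(np.diff(stationary.astype(int)) == 1) + 1
    if len(onsets) < 2:
        return None  # not enough strides to measure
    stride_lengths = [np.linalg.norm(paw_xy[b] - paw_xy[a])
                      for a, b in zip(onsets[:-1], onsets[1:])]
    stride_times = np.diff(onsets) / fps
    return {
        "mean_stride_length_cm": float(np.mean(stride_lengths)),
        "mean_stride_time_s": float(np.mean(stride_times)),
        "mean_speed_cm_s": float(np.mean(stride_lengths) / np.mean(stride_times)),
    }
```

A real pipeline would apply this per limb and add stance-width and posture measures, but the idea is the same: once key points are tracked, gait statistics fall out of simple geometry on the trajectories.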

Keyword: Autism; Movement Disorders
Link ID: 28186 - Posted: 02.05.2022

By Rodrigo Pérez Ortega A good workout doesn’t just boost your mood—it also boosts the brain’s ability to create new neurons. But exactly how this happens has puzzled researchers for years. “It’s been a bit of a black box,” says Tara Walker, a neuroscientist at the University of Queensland’s Brain Institute. Now, Walker and her colleagues think they have found a key: the chemical element selenium. During exercise, mice produce a protein containing selenium that helps their brains grow new neurons, the team reports today. Scientists may also be able to harness the element to help reverse cognitive decline due to old age and brain injury, the authors say. It’s a “fantastic” study, says Bárbara Cardoso, a nutritional biochemist at Monash University’s Victorian Heart Institute. Her own research has shown selenium—which is found in Brazil nuts, grains, and some legumes—improves verbal fluency and the ability to copy drawings correctly in older adults. “We could start thinking about selenium as a strategy” to treat or prevent cognitive decline in those who cannot exercise or are more vulnerable to selenium deficiency, she says, such as older adults, and stroke and Alzheimer’s disease patients. In 1999, researchers reported that running stimulates the brain to make new neurons in the hippocampus, a region involved in learning and memory. But which molecules were released into the bloodstream to spark this “neurogenesis” remained unclear. So 7 years ago, Walker and her colleagues screened the blood plasma of mice that had exercised on a running wheel in their cages for 4 days, versus mice that had no wheel. The team identified 38 proteins whose levels increased after the workout. © 2022 American Association for the Advancement of Science.

Keyword: Learning & Memory; Obesity
Link ID: 28185 - Posted: 02.05.2022

Bret Stetka It all started with genetic data. A gene here, a gene there. Eventually the story became clearer: If scientists are to one day find a cure for Alzheimer's disease, they should look to the immune system. Over the past couple decades, researchers have identified numerous genes involved in various immune system functions that may also contribute to Alzheimer's. Some of the prime suspects are genes that control humble little immune cells called microglia, now the focus of intense research in developing new Alzheimer's drugs. Microglia are amoeba-like cells that scour the brain for injuries and invaders. They help clear dead or impaired brain cells and literally gobble up invading microbes. Without them, we'd be in trouble. In a normal brain, a protein called beta-amyloid is cleared away through our lymphatic system by microglia as molecular junk. But sometimes it builds up. Certain gene mutations are one culprit in this toxic accumulation. Traumatic brain injury is another, and, perhaps, impaired microglial function. One thing everyone agrees on is that in people with Alzheimer's, too much amyloid accumulates between their brain cells and in the vessels that supply the brain with blood. Once amyloid begins to clog networks of neurons, it triggers the accumulation of another protein, called tau, inside of these brain cells. The presence of tau sends microglia and other immune mechanisms into overdrive, resulting in the inflammatory immune response that many experts believe ultimately saps brain vitality in Alzheimer's. To date, nearly a dozen genes involved in immune and microglial function have been tied to Alzheimer's. The first was CD33, identified in 2008. © 2022 npr

Keyword: Alzheimers; Neuroimmunology
Link ID: 28184 - Posted: 02.02.2022

Dan Robitzski As the coronavirus pandemic continues, scientists are racing to understand the underlying causes and implications of long COVID, the umbrella term for symptoms that persist for at least 12 weeks but often last even longer and affect roughly 30 percent of individuals who contract COVID-19. Evidence for specific risk factors such as diabetes and the presence of autoantibodies is starting to emerge, but throughout the pandemic, one assumption has been that an important indicator of whether a COVID-19 survivor is likely to develop long COVID is the severity of their acute illness. However, a preprint shared online on January 10 suggests that even mild SARS-CoV-2 infections may lead to long-term neurological symptoms associated with long COVID such as cognitive impairment and difficulties with attention and memory, a suite of symptoms often lumped together as “brain fog.” In the study, which has not yet been peer-reviewed, scientists led by Stanford University neurologist Michelle Monje identified a pathway in COVID-19–infected mice and humans that almost perfectly matches the inflammation thought to cause chemotherapy-related cognitive impairment (CRCI), also known as “chemo fog,” following cancer treatments. On top of that, the preprint shows that the neuroinflammation pathway can be triggered even without the coronavirus infecting a single brain cell. As far back as March 2020, Monje feared that cytokine storms caused by the immune response to SARS-CoV-2 would cause the same neuroinflammation and symptoms associated with CRCI, she tells The Scientist. But because her lab doesn’t study viral infections, she had no way to test her hypothesis until other researchers created the appropriate models. 
In the study, Monje and her colleagues used a mouse model for mild SARS-CoV-2 infections developed at the lab of Yale School of Medicine biologist and study coauthor Akiko Iwasaki as well as brain tissue samples taken from people who had COVID-19 when they died to demonstrate that mild infections can trigger inflammation in the brain. © 1986–2022 The Scientist.

Keyword: Learning & Memory
Link ID: 28182 - Posted: 02.02.2022

by Laura Dattaro Early in her first postdoctoral position, Hollis Cline first showed her hallmark flair for creative problem-solving. Cline, who goes by Holly, and her adviser, neuroscientist Martha Constantine-Paton, wanted to study the brain’s ‘topographical maps’ — internal representations of sensory input from the external world. These maps are thought to shape a person’s ability to process sensory information — filtering that can go awry in autism and other neurodevelopmental conditions. No one knew just how these maps formed or what could potentially disrupt them. Cline and Constantine-Paton, who was then at Yale University and is now emerita professor of brain and cognitive sciences at the Massachusetts Institute of Technology, weren’t sure how to find out. But as a first step, the pair decided to take the plunge with an unusual animal model: the frog — specifically, a spotted greenish-brown species called Rana pipiens, or the northern leopard frog. The amphibians spend two to three months as tadpoles, a span during which their brains change rapidly and visibly — unlike in mammals, which undergo similar stages of development inside of the mother’s body. These traits made it possible for Cline and Constantine-Paton to introduce changes and repeatedly watch their effects in real time. “That’s an extended period when you can actually have access to the developing brain,” Cline says. The unorthodox approach paid off. Cline, 66, now professor of neuroscience at the Scripps Research Institute in La Jolla, California, worked out that a receptor for the neurotransmitter glutamate, which had been shown to be important for learning and memory, also mediated how visual experiences influence the developing topographical map. 
She later created a novel live imaging technique to visualize frog neurons’ development over time and, sticking with frogs over the ensuing decades, went on to make fundamental discoveries about how sensory experiences shape brain development and sensory processing. © 2022 Simons Foundation

Keyword: Development of the Brain; Autism
Link ID: 28179 - Posted: 02.02.2022

By Meeri Kim Kellie Carr and her 13-year-old son, Daniel, sat in the waiting room of a pediatric neurology clinic for yet another doctor’s appointment in 2012. For years, she struggled to find out what was causing his weakened right side. It wasn’t an obvious deficit, by any means, and anyone not paying close attention would see a normal, healthy teenage boy. At that point, no one had any idea that Daniel had suffered a massive stroke as a newborn and lost large parts of his brain as a result. “It was the largest stroke I’d ever seen in a child who hadn’t died or suffered extreme physical and mental disability,” said Nico Dosenbach, the pediatric neurologist at Washington University School of Medicine in St. Louis who finally diagnosed him using a magnetic resonance imaging (MRI) scan. "If I saw the MRI first, I would have assumed this kid's probably in a wheelchair, has a feeding tube and might be on a ventilator," Dosenbach said. "Because normally, when a child is missing that much brain, it's bad." But Daniel — as an active, athletic young man who did fine in school — defied all logic. Before the discovery of the stroke, his mother had noticed some odd mannerisms, such as zipping up his coat or eating a burger using only his left hand. When engaged, his right hand often served as club-like support instead of a dexterous appendage with fingers. Daniel excelled as a left-handed pitcher in competitive baseball, but his coach found it unusual that he would always switch the glove to his left hand when catching the ball. Medical professionals tried to help — first his pediatrician, followed by an orthopedic doctor who sent him to physical therapy — but no one could figure out the root cause. They tried constraint-induced movement therapy, which forces patients to use the weaker arm by immobilizing the other in a cast, but Daniel soon rebelled and broke himself free. © 1996-2022 The Washington Post

Keyword: Development of the Brain; Stroke
Link ID: 28174 - Posted: 01.26.2022

By Jason DeParle WASHINGTON — A study that provided poor mothers with cash stipends for the first year of their children’s lives appears to have changed the babies’ brain activity in ways associated with stronger cognitive development, a finding with potential implications for safety net policy. The differences were modest — researchers likened them in statistical magnitude to moving to the 75th position in a line of 100 from the 81st — and it remains to be seen if changes in brain patterns will translate to higher skills, as other research offers reason to expect. Still, evidence that a single year of subsidies could alter something as profound as brain functioning highlights the role that money may play in child development and comes as President Biden is pushing for a much larger program of subsidies for families with children. “This is a big scientific finding,” said Martha J. Farah, a neuroscientist at the University of Pennsylvania, who conducted a review of the study for the Proceedings of the National Academy of Sciences, where it was published on Monday. “It’s proof that just giving the families more money, even a modest amount of more money, leads to better brain development.” Another researcher, Charles A. Nelson III of Harvard, reacted more cautiously, noting the full effect of the payments — $333 a month — would not be clear until the children took cognitive tests. While the brain patterns documented in the study are often associated with higher cognitive skills, he said, that is not always the case. © 2022 The New York Times Company
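The researchers' framing of the effect, moving from the 81st to the 75th position in a line of 100, is one way of expressing a standardized effect size as a percentile shift. A hedged sketch of that conversion, assuming normally distributed scores (the function, the 0.2-standard-deviation figure and the numbers are illustrative, not the study's actual computation):

```python
from statistics import NormalDist

nd = NormalDist()  # standard normal: cdf and inverse cdf

def shifted_position(position_from_front, d, n=100):
    """Position in a line of n (1 = highest score) after improving by d SDs.

    Convert the position to a fraction scoring below, move the implied
    z-score up by d, and convert back to a position in line."""
    frac_below = 1 - (position_from_front - 0.5) / n
    z = nd.inv_cdf(frac_below)          # current score in SD units
    new_frac_below = nd.cdf(z + d)      # after a d-SD improvement
    return round(n * (1 - new_frac_below) + 0.5)

# under these assumptions, a shift of roughly 0.2 standard deviations
# moves a child from 81st in a line of 100 to about 75th
pos_after = shifted_position(81, 0.2)
```

The point of the exercise is that a change which sounds small in percentile terms corresponds to a modest but conventional effect size of a fifth of a standard deviation.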

Keyword: Development of the Brain; Learning & Memory
Link ID: 28172 - Posted: 01.26.2022

by Lauren Schenkman Autism is thought to arise during prenatal development, when the brain is spinning its web of excitatory and inhibitory neurons, the main signal-generating cell types in the cerebral cortex. Though this wiring process remains mysterious, one thing seemed certain after two decades of studies in mice: Although both neuron types arise from radial glia, excitatory neurons crop up in the developing cortex, whereas inhibitory neurons, also known as interneurons, originate outside of the cortex and then later migrate into it. Not so in the human brain, according to a study published in December in Nature. A team of researchers led by Tomasz Nowakowski, assistant professor of anatomy at the University of California, San Francisco, used a new viral barcoding method to trace the descendants of radial glial cells from the developing human cortex and found that these progenitor cells can give rise to both excitatory neurons and interneurons. “This is really a paradigm-shifting finding,” Nowakowski says. “It sets up a new framework for studying, understanding and interpreting experimental models of autism mutations.” Nowakowski spoke with Spectrum about the discovery’s implications for studying the origins of autism in the developing brain. Spectrum: Why did you investigate this topic? Tomasz Nowakowski: My lab and I are interested in understanding the early neurodevelopmental events that give rise to the incredible complexity of the human cerebral cortex. We know especially little about the early stages of human development, primarily because a lot of our knowledge comes from mouse models. As we’ve begun to realize over the past decade, the processes that underlie development of the brain in humans and mice can be quite different. © 2022 Simons Foundation

Keyword: Autism; Development of the Brain
Link ID: 28165 - Posted: 01.22.2022

Veronique Greenwood In the moment between reading a phone number and punching it into your phone, you may find that the digits have mysteriously gone astray — even if you’ve seared the first ones into your memory, the last ones may still blur unaccountably. Was the 6 before the 8 or after it? Are you sure? Maintaining such scraps of information long enough to act on them draws on an ability called visual working memory. For years, scientists have debated whether working memory has space for only a few items at a time, or if it just has limited room for detail: Perhaps our mind’s capacity is spread across either a few crystal-clear recollections or a multitude of more dubious fragments. The uncertainty in working memory may be linked to a surprising way that the brain monitors and uses ambiguity, according to a recent paper in Neuron from neuroscience researchers at New York University. Using machine learning to analyze brain scans of people engaged in a memory task, they found that signals encoded an estimate of what people thought they saw — and the statistical distribution of the noise in the signals encoded the uncertainty of the memory. The uncertainty of your perceptions may be part of what your brain is representing in its recollections. And this sense of the uncertainties may help the brain make better decisions about how to use its memories. The findings suggest that “the brain is using that noise,” said Clayton Curtis, a professor of psychology and neuroscience at NYU and an author of the new paper. All Rights Reserved © 2022

Keyword: Learning & Memory
Link ID: 28163 - Posted: 01.19.2022

Nicola Davis It’s a cold winter’s day, and I’m standing in a room watching my dog stare fixedly at two flower pots. I’m about to get an answer to a burning question: is my puppy a clever girl? Dogs have been our companions for millennia, domesticated sometime between 15,000 and 30,000 years ago. And the bond endures: according to the latest figures from the Pet Food Manufacturers Association, 33% of households in the UK have a dog. But as well as fulfilling roles from Covid detection to lovable family rogue, scientists investigating how dogs think, express themselves and communicate with humans say dogs can also teach us about ourselves. And so I am here at the dog cognition centre at the University of Portsmouth with Calisto, the flat-coated retriever, and a pocket full of frankfurter sausage to find out how. We begin with a task superficially reminiscent of the cup-and-ball game favoured by small-time conmen. Amy West, a PhD student at the centre, places two flower pots a few metres in front of Calisto, and appears to pop something under each. However, only one actually contains a tasty morsel. West points at the pot under which the sausage lurks, and I drop Calisto’s lead. The puppy makes a beeline for the correct pot. But according to Dr Juliane Kaminski, reader in comparative psychology at the University of Portsmouth, this was not unexpected. “A chimpanzee is our closest living relative – they ignore gestures like these coming from humans entirely,” she says. “But dogs don’t.” © 2022 Guardian News & Media Limited

Keyword: Learning & Memory; Evolution
Link ID: 28162 - Posted: 01.19.2022

By Jane E. Brody Many people aren’t overly concerned when an octogenarian occasionally forgets the best route to a favorite store, can’t remember a friend’s name or dents the car while trying to parallel park on a crowded city street. Even healthy brains work less efficiently with age, and memory, sensory perceptions and physical abilities become less reliable. But what if the person is not in their 80s but in their 30s, 40s or 50s and forgets the way home from their own street corner? That’s far more concerning. While most of the 5.3 million Americans who are living with Alzheimer’s disease or other forms of dementia are over 65, some 200,000 are younger than 65 and develop serious memory and thinking problems far earlier in life than expected. “Young-onset dementia is a particularly disheartening diagnosis because it affects individuals in the prime years,” Dr. David S. Knopman, a neurologist at the Mayo Clinic in Rochester, Minn., wrote in a July 2021 editorial in JAMA Neurology. Many of the afflicted are in their 40s and 50s, midcareer, hardly ready to retire and perhaps still raising a family. Dementia in a younger adult is especially traumatic and challenging for families to acknowledge, and many practicing physicians fail to recognize it or even suspect it may be an underlying cause of symptoms. “Complaints about brain fog in young patients are very common and are mostly benign,” Dr. Knopman told me. “It’s hard to know when they’re not attributable to stress, depression or anxiety or the result of normal aging. Even neurologists infrequently see patients with young-onset dementia.” Yet recent studies indicate that the problem is far more common than most doctors realize. Worldwide, as many as 3.9 million people younger than 65 may be affected, a Dutch analysis of 74 studies indicated. The study, published in JAMA Neurology in September, found that for every 100,000 people aged 30 to 64, 119 had early dementia. © 2022 The New York Times Company

Keyword: Alzheimers; Genes & Behavior
Link ID: 28161 - Posted: 01.19.2022