Chapter 13. Memory and Learning

Nicola Davis Science correspondent If the taste of kale makes you screw up your face, you are not alone: researchers have observed foetuses pull a crying expression when exposed to the greens in the womb. While previous studies have suggested our food preferences may begin before birth and can be influenced by the mother’s diet, the team says the new research is the first to look directly at the response of unborn babies to different flavours. “[Previously researchers] just looked at what happens after birth in terms of what do [offspring] prefer, but actually seeing facial expressions of the foetus when they are getting hit by the bitter or by the non-bitter taste, that is something which is completely new,” said Prof Nadja Reissland, from Durham University, co-author of the research. Writing in the journal Psychological Science, the team noted that aromas from the mother’s diet were present in the amniotic fluid. Taste buds can detect taste-related chemicals from 14 weeks’ gestation, and odour molecules can be sensed from 24 weeks’ gestation. To delve into whether foetuses differentiate specific flavours, the team looked at ultrasound scans from almost 70 pregnant women, aged 18 to 40 from the north-east of England, who were split into two groups. One group was asked to take a capsule of powdered kale 20 minutes before an ultrasound scan, and the other was asked to take a capsule of powdered carrot. Vegetable consumption by the mothers did not differ between the kale and carrot group. The team also examined scans from 30 women, taken from an archive, who were not given any capsules. All the women were asked to refrain from eating anything else in the hour before their scans. The team then carried out a frame-by-frame analysis of the frequency of a host of different facial movements of the foetuses, including combinations that resembled laughing or crying. Overall, the researchers examined 180 scans from 99 foetuses, scanned at either 32 weeks, 36 weeks, or at both time points. © 2022 Guardian News & Media Limited

Keyword: Development of the Brain; Chemical Senses (Smell & Taste)
Link ID: 28493 - Posted: 09.28.2022

Terriline Porelle is puzzling over two mysteries. The first is: what’s plaguing her? For the past two years, the formerly healthy, active, 34-year-old resident of Cocagne, N.B. has been experiencing many strange and alarming symptoms, including muscle twitches and blurred vision, auditory hallucinations, brain fog and loss of balance and co-ordination. The second mystery is why health authorities no longer seem interested in finding out why she’s ill. “It’s like nobody’s really looking to see what’s going on and it doesn’t make any sense,” she said. Ms. Porelle is one of 48 people who were initially identified between late 2020 and May, 2021, as being part of a cluster of patients in New Brunswick who all had a mysterious brain illness, which the province referred to as a “potential neurological syndrome of unknown cause.” Doctors and researchers puzzled over the cases for months. Then, in a February report, the province announced that there was no mystery illness, and that its investigation into the matter had concluded. An independent oversight committee had found that the 48 patients were likely suffering from various previously known diseases that had simply been misdiagnosed, the report said. But some of the patients and their families say their suffering remains very real – and that it’s made worse by the fact that they’re no closer to getting answers about what’s causing it. The province’s report said neurologists on the oversight committee had provided potential alternative diagnoses for 41 of the 48 patients, including Alzheimer’s disease and other types of dementia, post-concussion syndrome, chronic severe anxiety disorder and cancer. It recommended that patients contact their primary caregivers for referrals to further treatment, or that they seek help from a specialized clinic in Moncton called the Moncton Interdisciplinary Neurodegenerative Diseases (MIND) Clinic.

Keyword: Alzheimers
Link ID: 28491 - Posted: 09.28.2022

by Angie Voyles Askham / Brain connectivity patterns in people with autism and other neuropsychiatric conditions are more closely related to genetics than to phenotypic traits, according to two new studies. The findings highlight why a single brain biomarker for autism has remained elusive, the researchers say. The condition’s genetic heterogeneity has hampered the search for a shared brain signature: More than 100 genes have been identified as strongly linked to autism, and multiple copy number variations (CNVs) — deleted or duplicated stretches of genetic code — can increase a person’s likelihood of the condition. Autism also often overlaps with other conditions, such as schizophrenia and attention-deficit/hyperactivity disorder (ADHD), making autism-specific markers difficult to disentangle. Common variants tied to autism overlap strongly with those linked to schizophrenia and high IQ, for example, whereas rare autism-linked variants track with low IQ. According to the new papers, however, autism’s genetic heterogeneity corresponds to similarly disparate maps of ‘functional connectivity’ — a measure of which brain areas activate in sync while the brain is at rest. “What we’re seeing is that these groups of variants have specific functional connectivity signatures,” says lead investigator Sébastien Jacquemont, associate professor of pediatrics at the University of Montreal in Canada. The findings need to be replicated, says Aaron Alexander-Bloch, assistant professor of psychiatry at the University of Pennsylvania and the Children’s Hospital of Philadelphia, who was not involved in the work, but they point to the importance of subgrouping study participants based on their underlying genetics. © 2022 Simons Foundation

Keyword: Autism; Brain imaging
Link ID: 28490 - Posted: 09.28.2022

by Charles Q. Choi Infection during pregnancy may be associated with having an autistic child simply because mothers of autistic children are prone to infections, a new study finds. The results suggest that “common infections during pregnancy do not seem to increase their children’s risk of autism,” says study investigator Martin Brynge, a psychiatrist and doctoral student of global public health at the Karolinska Institutet in Stockholm, Sweden. “Prevention of maternal infections would likely not affect the prevalence of autism in the population.” A great deal of previous research has linked maternal infection during pregnancy with autism and intellectual disability in children. Whether the former causes the latter, however, has remained uncertain. For instance, both autism and intellectual disability are linked with gene variants that may influence the immune system, so mothers of children with either condition may also just be more vulnerable to serious infections. The new study analyzed data from 549,967 children, including 267,995 girls, living in Stockholm County who were born between 1987 and 2010; about 34,000 of the children had been exposed to a maternal infection requiring specialized health care, according to data from Sweden’s National Patient Register and National Medical Birth Register. Of the exposed children, 3.3 percent have autism, compared with 2.5 percent of unexposed children — a 16 percent increase in the chance of autism. But maternal infection in the year before pregnancy was also linked with a 25 percent greater chance of autism. “Mothers who had an infection during pregnancy may not be comparable to those mothers without infections,” Brynge says. “There may be systematic differences at the group level.” © 2022 Simons Foundation

Keyword: Autism; Neuroimmunology
Link ID: 28488 - Posted: 09.24.2022

By Ed Yong On March 25, 2020, Hannah Davis was texting with two friends when she realized that she couldn’t understand one of their messages. In hindsight, that was the first sign that she had COVID-19. It was also her first experience with the phenomenon known as “brain fog,” and the moment when her old life contracted into her current one. She once worked in artificial intelligence and analyzed complex systems without hesitation, but now “runs into a mental wall” when faced with tasks as simple as filling out forms. Her memory, once vivid, feels frayed and fleeting. Former mundanities—buying food, making meals, cleaning up—can be agonizingly difficult. Her inner world—what she calls “the extras of thinking, like daydreaming, making plans, imagining”—is gone. The fog “is so encompassing,” she told me, “it affects every area of my life.” For more than 900 days, while other long-COVID symptoms have waxed and waned, her brain fog has never really lifted. Of long COVID’s many possible symptoms, brain fog “is by far one of the most disabling and destructive,” Emma Ladds, a primary-care specialist from the University of Oxford, told me. It’s also among the most misunderstood. It wasn’t even included in the list of possible COVID symptoms when the coronavirus pandemic first began. But 20 to 30 percent of patients report brain fog three months after their initial infection, as do 65 to 85 percent of the long-haulers who stay sick for much longer. It can afflict people who were never ill enough to need a ventilator—or any hospital care. And it can affect young people in the prime of their mental lives. Long-haulers with brain fog say that it’s like none of the things that people—including many medical professionals—jeeringly compare it to. It is more profound than the clouded thinking that accompanies hangovers, stress, or fatigue. For Davis, it has been distinct from and worse than her experience with ADHD. It is not psychosomatic, and involves real changes to the structure and chemistry of the brain. It is not a mood disorder: “If anyone is saying that this is due to depression and anxiety, they have no basis for that, and data suggest it might be the other direction,” Joanna Hellmuth, a neurologist at UC San Francisco, told me. (c) 2022 by The Atlantic Monthly Group. All Rights Reserved.

Keyword: Attention; Learning & Memory
Link ID: 28487 - Posted: 09.21.2022

By Mark Johnson A study using the electronic health records of more than 6 million Americans over age 65 found those who had covid-19 ran a greater risk of receiving a new diagnosis of Alzheimer’s disease within a year. The study, led by researchers at Case Western Reserve University School of Medicine and published in the Journal of Alzheimer’s Disease, does not show that covid-19 causes Alzheimer’s, but adds to a growing body of work suggesting links between the two. The results suggest researchers should be tracking older patients who recover from covid to see if they go on to show signs of memory loss, declining brain function or Alzheimer’s disease. The study found that for every 1,000 seniors with covid-19, seven will be diagnosed with Alzheimer’s within a year, slightly above the five-in-a-thousand diagnosis rate for seniors who did not have covid. “We know that covid can affect the brain, but I don’t think anyone had looked at new diagnoses of Alzheimer’s,” said Pamela Davis, one of the study’s co-authors and a research professor at Case Western Reserve University School of Medicine. Colleague Rong Xu said she had expected to see some increase among seniors sickened by covid, but was surprised “by the extent of the increase and how rapidly it occurred.” The study, though “important and useful,” was “limited,” said Gabriel de Erausquin, director of the Laboratory of Brain Development, Modulation and Repair at University of Texas Health San Antonio, who was not involved in the research. He cautioned that a diagnosis of Alzheimer’s disease is not necessarily confirmation of the disease. Doctors sometimes diagnose Alzheimer’s based on changes in behavior, or responses to a memory test. These are considered less accurate than imaging or spinal fluid tests that measure two types of proteins, beta-amyloid and phosphorylated tau, which accumulate abnormally in the brains of people with Alzheimer’s. Brain scans that look for structural changes, such as the shrinking of certain regions, are another more accurate indicator. © 1996-2022 The Washington Post

Keyword: Alzheimers
Link ID: 28479 - Posted: 09.17.2022

by Nora Bradford A well-studied brain response to sound, called the M100, appears earlier in life in autistic children than in their non-autistic peers, according to a new longitudinal study. The finding suggests that the auditory cortex in children with autism matures unusually quickly, a growth pattern seen previously in other brain regions. “It’s a demonstration that when we look for autism markers in the brain, they can be very age-specific,” says lead investigator J. Christopher Edgar, associate professor of radiology at the Children’s Hospital of Philadelphia in Pennsylvania. For that reason, longitudinal studies such as this one — in which Edgar and his colleagues assessed children at up to three different ages — are essential, he adds. “If the two populations being studied have different rates of brain maturation, then the pattern of findings changes across time.” At the time of the first magnetoencephalography (MEG) scan, when the children were 6 to 9 years old, those with autism were more likely to have an M100 response to a barely audible tone in the right hemisphere than non-autistic children were. But this difference disappeared in the next two visits, presumably because the M100 response typically appears during early adolescence. By contrast, the M50 response, which occurs throughout life, beginning in utero, showed no significant difference between the two groups at any visit. The team also evaluated ‘phase locking,’ a measure of how similar a participant’s neural activity is from scan to scan within a certain frequency band. Autistic participants demonstrated more mature phase-locking patterns at the first visit, which then diminished at the later two visits. © 2022 Simons Foundation

Keyword: Autism; Hearing
Link ID: 28478 - Posted: 09.17.2022

Sara Reardon More than 500,000 years ago, the ancestors of Neanderthals and modern humans were migrating around the world when a pivotal genetic mutation caused some of their brains to improve suddenly. This mutation, researchers report in Science [1], drastically increased the number of brain cells in the hominins that preceded modern humans, probably giving them a cognitive advantage over their Neanderthal cousins. “This is a surprisingly important gene,” says Arnold Kriegstein, a neurologist at the University of California, San Francisco. However, he expects that it will turn out to be one of many genetic tweaks that gave humans an evolutionary advantage over other hominins. “I think it sheds a whole new light on human evolution.” When researchers first reported the sequence of a complete Neanderthal genome in 2014 [2], they identified 96 amino acids — the building blocks that make up proteins — that differ between Neanderthals and modern humans, as well as some other genetic tweaks. Scientists have been studying this list to learn which of these changes helped modern humans to outcompete Neanderthals and other hominins. To neuroscientists Anneline Pinson and Wieland Huttner at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany, one gene stood out. TKTL1 encodes a protein that is made when a fetus’s brain is first developing. A mutation in the human version changed one amino acid, resulting in a protein that is different from those found in hominin ancestors, Neanderthals and non-human primates. The researchers suspected that this protein could increase the proliferation of neural progenitor cells, which become neurons, as the brain develops, specifically in an area called the neocortex — a region involved in cognitive function. This, they reasoned, could contribute to modern humans’ cognitive advantage. © 2022 Springer Nature Limited

Keyword: Evolution; Genes & Behavior
Link ID: 28477 - Posted: 09.14.2022

By Erin Garcia de Jesús Human trash can be a cockatoo’s treasure. In Sydney, the birds have learned how to open garbage bins and toss trash around in the streets as they hunt for food scraps. People are now fighting back. Bricks, pool noodles, spikes, shoes and sticks are just some of the tools Sydney residents use to keep sulphur-crested cockatoos (Cacatua galerita) from opening trash bins, researchers report September 12 in Current Biology. The goal is to stop the birds from lifting the lid while the container is upright but still allowing the lid to flop open when a trash bin is tilted to empty its contents. This interspecies battle could be a case of what’s called an innovation arms race, says Barbara Klump, a behavioral ecologist at the Max Planck Institute of Animal Behavior in Radolfzell, Germany. When cockatoos learn how to flip trash can lids, people change their behavior, using things like bricks to weigh down lids, to protect their trash from being flung about (SN Explores: 10/26/21). “That’s usually a low-level protection and then the cockatoos figure out how to defeat that,” Klump says. That’s when people beef up their efforts, and the cycle continues. Researchers are closely watching this escalation to see what the birds — and humans — do next. With the right method, the cockatoos might fly by and keep hunting for a different target. Or they might learn how to get around it. In the study, Klump and colleagues inspected more than 3,000 bins across four Sydney suburbs where cockatoos invade trash to note whether and how people were protecting their garbage. Observations coupled with an online survey showed that people living on the same street are more likely to use similar deterrents, and those efforts escalate over time. © Society for Science & the Public 2000–2022.

Keyword: Learning & Memory; Evolution
Link ID: 28476 - Posted: 09.14.2022

James Brunton Badenoch Monkeypox’s effect on the skin – the disfiguring rashes – and the flu-like symptoms have been well described, but few have investigated the neurological and psychiatric problems the virus might cause. There are historic reports of neurological complications in people infected with the related smallpox virus and in people vaccinated against smallpox with a vaccine that contains the related vaccinia virus. So my colleagues and I wanted to know whether monkeypox causes similar problems. We looked at all the evidence from before the current monkeypox pandemic of neurological or psychiatric problems in people with a monkeypox infection. The results are published in the journal eClinicalMedicine. A small but noticeable proportion of people (2% to 3%) with monkeypox became very unwell and developed serious neurological problems, including seizure and encephalitis (inflammation of the brain that can cause long-term disability). We also found that confusion occurred in a similar number of people. It’s important to note, though, that these figures are based on a few studies with few participants. Besides the severe and rare brain problems, we found evidence of a broader group of people with monkeypox who had more common neurological symptoms including headache, muscle ache and fatigue. From looking at the studies, it was unclear how severe these symptoms were and how long they lasted. It was also unclear how many people with monkeypox had psychiatric problems - such as anxiety and depression - as few studies looked into it. Of those that did, low mood was frequently reported. © 2010–2022, The Conversation US, Inc.

Keyword: Epilepsy; Learning & Memory
Link ID: 28475 - Posted: 09.14.2022

Jon Hamilton In some families, Alzheimer's disease seems inevitable. "Your grandmother has it, your mom has it, your uncle has it, your aunts have it, your cousin has it. I always assumed that I would have it," says Karen Douthitt, 57. "It was always in our peripheral vision," says Karen's sister June Ward, 61. "Our own mother started having symptoms at age 62, so it has been a part of our life." Nearly a decade ago, Karen, June, and an older sister, Susie Gilliam, 64, set out to learn why Alzheimer's was affecting so many family members. Since then, each sister has found out whether she carries a rare gene mutation that makes Alzheimer's inescapable. And all three have found ways to help scientists trying to develop treatments for the disease. I met Karen and June in 2015, at the first-ever conference for families with a particular type of genetic mutation in which Alzheimer's often appears in middle age. The annual conference is sponsored by the Alzheimer's Association and the Dominantly Inherited Alzheimer's Network Trials Unit, a research program run by Washington University School of Medicine in St. Louis. Karen and June had come to Washington, D.C., for the family conference because of something they had just learned about a cousin on their mother's side. The cousin had developed Alzheimer's in her 50s. And genetic tests showed that she carried a rare, inherited gene mutation called presenilin 1. It's one of three mutations that typically cause Alzheimer's to appear in middle age. The three gene mutations responsible for early Alzheimer's are unlike a better known gene called APOE4, which merely increases the likelihood somewhat that a person will develop Alzheimer's – and usually at age 65 or older. In contrast, the early-onset mutations, including presenilin 1, make it almost certain an individual will develop the disease, and usually before age 60. Each child of a parent who has the presenilin 1 mutation has a 50% chance of inheriting it. © 2022 npr

Keyword: Alzheimers; Genes & Behavior
Link ID: 28474 - Posted: 09.14.2022

By Rodrigo Pérez Ortega We humans are proud of our big brains, which are responsible for our ability to plan ahead, communicate, and create. Inside our skulls, we pack, on average, 86 billion neurons—up to three times more than those of our primate cousins. For years, researchers have tried to figure out how we manage to develop so many brain cells. Now, they’ve come a step closer: A new study shows a single amino acid change in a metabolic gene helps our brains develop more neurons than other mammals—and more than our extinct cousins, the Neanderthals. The finding “is really a breakthrough,” says Brigitte Malgrange, a developmental neurobiologist at the University of Liège who was not involved in the study. “A single amino acid change is really, really important and gives rise to incredible consequences regarding the brain.” What makes our brain human has been the interest of neurobiologist Wieland Huttner at the Max Planck Institute of Molecular Cell Biology and Genetics for years. In 2016, his team found that a mutation in the ARHGAP11B gene, found in humans, Neanderthals, and Denisovans but not other primates, caused more production of cells that develop into neurons. Although our brains are roughly the same size as those of Neanderthals, our brain shapes differ and we created complex technologies they never developed. So, Huttner and his team set out to find genetic differences between Neanderthals and modern humans, especially in cells that give rise to neurons of the neocortex. This region behind the forehead is the largest and most recently evolved part of our brain, where major cognitive processes happen. The team focused on TKTL1, a gene that in modern humans has a single amino acid change—from lysine to arginine—from the version in Neanderthals and other mammals. By analyzing previously published data, researchers found that TKTL1 was mainly expressed in progenitor cells called basal radial glia, which give rise to most of the cortical neurons during development. © 2022 American Association for the Advancement of Science.

Keyword: Development of the Brain; Evolution
Link ID: 28472 - Posted: 09.10.2022

Yasemin Saplakoglu You’re on the vacation of a lifetime in Kenya, traversing the savanna on safari, with the tour guide pointing out elephants to your right and lions to your left. Years later, you walk into a florist’s shop in your hometown and smell something like the flowers on the jackalberry trees that dotted the landscape. When you close your eyes, the store disappears and you’re back in the Land Rover. Inhaling deeply, you smile at the happy memory. Now let’s rewind. You’re on the vacation of a lifetime in Kenya, traversing the savanna on safari, with the tour guide pointing out elephants to your right and lions to your left. From the corner of your eye, you notice a rhino trailing the vehicle. Suddenly, it sprints toward you, and the tour guide is yelling to the driver to hit the gas. With your adrenaline spiking, you think, “This is how I am going to die.” Years later, when you walk into a florist’s shop, the sweet floral scent makes you shudder. “Your brain is essentially associating the smell with positive or negative” feelings, said Hao Li, a postdoctoral researcher at the Salk Institute for Biological Studies in California. Those feelings aren’t just linked to the memory; they are part of it: The brain assigns an emotional “valence” to information as it encodes it, locking in experiences as good or bad memories. And now we know how the brain does it. As Li and his team reported recently in Nature, the difference between memories that conjure up a smile and those that elicit a shudder is established by a small peptide molecule known as neurotensin. They found that as the brain judges new experiences in the moment, neurons adjust their release of neurotensin, and that shift sends the incoming information down different neural pathways to be encoded as either positive or negative memories. To be able to question whether to approach or to avoid a stimulus or an object, you have to know whether the thing is good or bad. All Rights Reserved © 2022

Keyword: Learning & Memory; Emotions
Link ID: 28471 - Posted: 09.10.2022

By Helen Santoro I barreled into the world — a precipitous birth, the doctors called it — at a New York City hospital in the dead of night. In my first few hours of life, after six bouts of halted breathing, the doctors rushed me to the neonatal intensive care unit. A medical intern stuck his pinky into my mouth to test the newborn reflex to suck. I didn’t suck hard enough. So they rolled my pink, 7-pound-11-ounce body into a brain scanner. Lo and behold, there was a huge hole on the left side, just above my ear. I was missing the left temporal lobe, a region of the brain involved in a wide variety of behaviors, from memory to the recognition of emotions, and considered especially crucial for language. My mother, exhausted from the labor, remembers waking up after sunrise to a neurologist, pediatrician and midwife standing at the foot of her bed. They explained that my brain had bled in her uterus, a condition called a perinatal stroke. They told her I would never speak and would need to be institutionalized. The neurologist brought her arms up to her chest and contorted her wrists to illustrate the physical disability I would be likely to develop. In those early days of my life, my parents wrung their hands wondering what my life, and theirs, would look like. Eager to find answers, they enrolled me in a research project at New York University tracking the developmental effects of perinatal strokes. But month after month, I surprised the experts, meeting all of the typical milestones of children my age. I enrolled in regular schools, excelled in sports and academics. The language skills the doctors were most worried about at my birth — speaking, reading and writing — turned out to be my professional passions. My case is highly unusual but not unique. Scientists estimate that thousands of people are, like me, living normal lives despite missing large chunks of our brains. Our myriad networks of neurons have managed to rewire themselves over time. But how? © 2022 The New York Times Company

Keyword: Development of the Brain; Language
Link ID: 28466 - Posted: 09.07.2022

Nicola Davis Regular doses of a hormone may help to boost cognitive skills in people with Down’s syndrome, a pilot study has suggested. Researchers fitted seven men who have Down’s syndrome with a pump that provided a dose of GnRH, a gonadotropin-releasing hormone, every two hours for six months. Six out of the seven men showed moderate cognitive improvements after the treatment, including in attention and being able to understand instructions, compared with a control group who were not given the hormone. However, experts raised concerns about the methods used in the study, urging caution over the findings. The team behind the work said brain scans of the participants given the hormone, who were aged between 20 and 37, suggest they underwent changes in neural connectivity in areas involved in cognition. “[People] with Down’s syndrome have cognitive decline which starts in the 30s,” said Prof Nelly Pitteloud, co-author of the study from the University of Lausanne. “I think if we can delay that, this would be great, if the therapy is well tolerated [and] without side effects.” Writing in the journal Science, Pitteloud and colleagues said they previously found mice with an extra copy of chromosome 16 experienced an age-related decline in cognition and sense of smell, similar to that seen in people with Down’s syndrome – who have an extra copy of chromosome 21. In a series of experiments, the team found regular doses of gonadotropin-releasing hormone boosted both the sense of smell and cognitive performance of these mice. Pitteloud said no side effects were seen in the participants and that the hormone is already used to induce puberty in patients with certain disorders. “I think these data are of course very exciting, but we have to remain cautious,” said Pitteloud. She said larger, randomised controlled studies are now needed to confirm that the improvements were not driven by patients becoming less stressed during assessments and thus performing better. Prof Michael Thomas of Birkbeck, University of London, who studies cognitive development across the lifespan in Down’s syndrome, said the results were exciting. “For parents, this is good news: interventions can still yield benefits across the lifespan,” he said, although he noted it is not clear how applicable the hormone therapy would be for children. © 2022 Guardian News & Media Limited

Keyword: Hormones & Behavior; Development of the Brain
Link ID: 28462 - Posted: 09.03.2022

By Rebecca Sohn Distinctive bursts of sleeping-brain activity, known as sleep spindles, have long been generally associated with strengthening recently formed memories. But new research has managed to link such surges to specific acts of learning while awake. These electrical flurries, which can be observed as sharp spikes on an electroencephalogram (EEG), tend to happen in early sleep stages when brain activity is otherwise low. A study published in Current Biology shows that sleep spindles appear prominently in particular brain areas that had been active in study participants earlier, while they were awake and learning an assigned task. Stronger spindles in these areas correlated with better recall after sleep. “We were able to link, within [each] participant, exactly the brain areas used for learning to spindle activity during sleep,” says University of Oxford cognitive neuroscientist Bernhard Staresina, senior author on the study. Staresina, Marit Petzka of the University of Birmingham in England and their colleagues devised a set of tasks they called the “memory arena,” which required each participant to memorize a sequence of images appearing inside a circle. While the subjects did so, researchers measured their brain activity with an EEG, which uses electrodes placed on the head. Participants then took a two-hour nap, after which they memorized a new image set—but then had to re-create the original image sequence learned before sleeping. During naps, the researchers recorded stronger sleep spindles in the specific brain areas that had been active during the pre-sleep-memorization task, and these areas differed for each participant. This suggested that the spindle pattern was not “hardwired” in default parts of the human brain; rather it was tied to an individual's thought patterns. The researchers also observed that participants who experienced stronger sleep spindles in brain areas used during memorization did a better job re-creating the images' positions after the nap. © 2022 Scientific American

Keyword: Sleep; Learning & Memory
Link ID: 28460 - Posted: 09.03.2022

By Elizabeth Landau Ken Ono gets excited when he talks about a particular formula for pi, the famous and enigmatic ratio of a circle’s circumference to its diameter. He shows me a clip from a National Geographic show where Neil deGrasse Tyson asked him how he would convey the beauty of math to the average person on the street. In reply, Ono showed Tyson, and later me, a so-called continued fraction for pi, which is a little bit like a mathematical fun house hallway of mirrors. Instead of a single number in the numerator and one in the denominator, the denominator of the fraction also contains a fraction, and the denominator of that fraction has a fraction in it, too, and so on and so forth, ad infinitum. Written out, the formula looks like a staircase that narrows as you descend its rungs in pursuit of the elusive pi. The calculation—credited independently to British mathematician Leonard Jay Rogers and self-taught Indian mathematician Srinivasa Ramanujan—doesn’t involve anything more complicated than adding, dividing, and squaring numbers. “How could you not say that’s amazing?” Ono, chair of the mathematics department at the University of Virginia, asks me over Zoom. As a fellow pi enthusiast—I am well known among friends for hosting Pi Day pie parties—I had to agree with him that it’s a dazzling formula. But not everyone sees beauty in fractions, or in math generally. In fact, here in the United States, math often inspires more dread than awe. In the 1950s, some educators began to observe a phenomenon they called mathemaphobia in students [1], though this was just one of a long list of academic phobias they saw in students. Today, nearly 1 in 5 U.S. adults suffers from high levels of math anxiety, according to some estimates [2], and a 2016 study found that 11 percent of university students experienced “high enough levels of mathematics anxiety to be in need of counseling.” [3] Math anxiety seems generally correlated with worse math performance worldwide, according to one 2020 study from Stanford and the University of Chicago [4]. While many questions remain about the underlying reasons, high school math scores in the U.S. tend to rank significantly lower than those in many other countries. In 2018, for example, American students ranked 30th in the world in their math scores on the PISA exam, an international assessment given every three years. © 2022 NautilusThink Inc.
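To make the "staircase" structure concrete, here is one classic continued fraction for pi, Lord Brouncker's formula, written in LaTeX. It is offered only as an illustration of the nested form the article describes (a fraction whose denominator contains another fraction, and so on without end), built from nothing more than adding, dividing, and squaring odd numbers; it is not necessarily the exact Rogers-Ramanujan formula Ono showed in the clip.

% Brouncker's continued fraction for pi, an illustrative example of the
% nested "staircase" form (not necessarily the formula from the clip):
\[
  \frac{4}{\pi} \;=\; 1 + \cfrac{1^{2}}{2 + \cfrac{3^{2}}{2 + \cfrac{5^{2}}{2 + \cfrac{7^{2}}{2 + \ddots}}}}
\]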

Keyword: Attention; Learning & Memory
Link ID: 28459 - Posted: 09.03.2022

By Kurt Kleiner The human brain is an amazing computing machine. Weighing only three pounds or so, it can process information a thousand times faster than the fastest supercomputer, store a thousand times more information than a powerful laptop, and do it all using no more energy than a 20-watt lightbulb. Researchers are trying to replicate this success using soft, flexible organic materials that can operate like biological neurons and someday might even be able to interconnect with them. Eventually, soft “neuromorphic” computer chips could be implanted directly into the brain, allowing people to control an artificial arm or a computer monitor simply by thinking about it. Like real neurons — but unlike conventional computer chips — these new devices can send and receive both chemical and electrical signals. “Your brain works with chemicals, with neurotransmitters like dopamine and serotonin. Our materials are able to interact electrochemically with them,” says Alberto Salleo, a materials scientist at Stanford University who wrote about the potential for organic neuromorphic devices in the 2021 Annual Review of Materials Research. Salleo and other researchers have created electronic devices using these soft organic materials that can act like transistors (which amplify and switch electrical signals) and memory cells (which store information) and other basic electronic components. The work grows out of an increasing interest in neuromorphic computer circuits that mimic how human neural connections, or synapses, work. These circuits, whether made of silicon, metal or organic materials, work less like those in digital computers and more like the networks of neurons in the human brain. © 2022 Annual Reviews

Keyword: Robotics; Learning & Memory
Link ID: 28449 - Posted: 08.27.2022

Diana Kwon People’s ability to remember fades with age — but one day, researchers might be able to use a simple, drug-free method to buck this trend. In a study published on 22 August in Nature Neuroscience [1], Robert Reinhart, a cognitive neuroscientist at Boston University in Massachusetts, and his colleagues demonstrate that zapping the brains of adults aged over 65 with weak electrical currents repeatedly over several days led to memory improvements that persisted for up to a month. Previous studies have suggested that long-term memory and ‘working’ memory, which allows the brain to store information temporarily, are controlled by distinct mechanisms and parts of the brain. Drawing on this research, the team showed that stimulating the dorsolateral prefrontal cortex — a region near the front of the brain — with high-frequency electrical currents improved long-term memory, whereas stimulating the inferior parietal lobe, which is further back in the brain, with low-frequency electrical currents boosted working memory. “Their results look very promising,” says Ines Violante, a neuroscientist at the University of Surrey in Guildford, UK. “They really took advantage of the cumulative knowledge within the field.” Using a non-invasive method of stimulating the brain known as transcranial alternating current stimulation (tACS), which delivers electrical currents through electrodes on the surface of the scalp, Reinhart’s team conducted a series of experiments on 150 people aged between 65 and 88. Participants carried out a memory task in which they were asked to recall lists of 20 words that were read aloud by an experimenter. The participants underwent tACS for the entire duration of the task, which took 20 minutes. © 2022 Springer Nature Limited

Keyword: Learning & Memory
Link ID: 28445 - Posted: 08.24.2022

By Diana Kwon During an embryo's development, a piece of the still-growing brain branches off to form the retina, a sliver of tissue in the back of the eye. This makes the retina, which is composed of several layers of neurons, a piece of the central nervous system. As evidence builds that changes in the brain can manifest in this region, scientists are turning to retinas as a potential screening target for early signs of Alzheimer's, an incurable neurodegenerative disease that affects an estimated six million people in the U.S. alone. Initially clinicians could diagnose Alzheimer's only through brain autopsies after patients died. Since the early 2000s, however, research advances have made it possible to pinpoint signs of the disease—and to begin to investigate treatment—years before symptoms first appear. Today positron emission tomography (PET) brain imaging and tests of cerebrospinal fluid (CSF), the clear liquid surrounding the brain and spinal cord, aid Alzheimer's diagnosis at its early stages. “There have been tremendous improvements in our ability to detect early disease,” says Peter J. Snyder, a neuropsychologist and neuroscientist at the University of Rhode Island. But these diagnostic methods are not always readily available, and they can be expensive and invasive. PET imaging requires injecting a radioactive tracer molecule into the bloodstream, and spinal fluid must be extracted with a needle inserted between vertebrae in the back. “We need ways of funneling the right high-risk individuals into the diagnostic process with low-cost screening tools that are noninvasive and simple to administer,” Snyder says. The retina is a particularly attractive target, he adds, because it is closely related to brain tissue and can be examined noninvasively through the pupil, including with methods routinely used to check for eye diseases. © 2022 Scientific American,

Keyword: Alzheimers; Vision
Link ID: 28442 - Posted: 08.24.2022