Links for Keyword: Learning & Memory



Links 1 - 20 of 1135

In a study of healthy volunteers, National Institutes of Health researchers found that our brains may solidify the memories of new skills we just practiced a few seconds earlier by taking a short rest. The results highlight the critically important role rest may play in learning. “Everyone thinks you need to ‘practice, practice, practice’ when learning something new. Instead, we found that resting, early and often, may be just as critical to learning as practice,” said Leonardo G. Cohen, M.D., Ph.D., senior investigator at NIH’s National Institute of Neurological Disorders and Stroke and a senior author of the paper published in the journal Current Biology. “Our ultimate hope is that the results of our experiments will help patients recover from the paralyzing effects caused by strokes and other neurological injuries by informing the strategies they use to ‘relearn’ lost skills.” The study was led by Marlene Bönstrup, M.D., a postdoctoral fellow in Dr. Cohen’s lab. Like many scientists, she held the general belief that our brains needed long periods of rest, such as a good night’s sleep, to strengthen the memories formed while practicing a newly learned skill. But after looking at brain waves recorded from healthy volunteers in learning and memory experiments at the NIH Clinical Center, she started to question the idea. The waves were recorded from right-handed volunteers with a highly sensitive scanning technique called magnetoencephalography. The subjects sat in a chair facing a computer screen and under a long cone-shaped brain scanning cap. The experiment began when they were shown a series of numbers on a screen and asked to type the numbers as many times as possible with their left hands for 10 seconds; take a 10-second break; and then repeat this trial cycle of alternating practice and rest 35 more times. This strategy is typically used to reduce any complications that could arise from fatigue or other factors.

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 26137 - Posted: 04.13.2019

By Benedict Carey Anyone above a certain age who has drawn a blank on the name of a favorite uncle, a friend’s phone number or the location of a house key understands how fragile memory is. Its speed and accuracy begin to slip in one’s 20s and keep slipping. This is particularly true for working memory, the mental sketch pad that holds numbers, names and other facts temporarily in mind, allowing decisions to be made throughout the day. On Monday, scientists reported that brief sessions of specialized brain stimulation could reverse this steady decline in working memory, at least temporarily. The stimulation targeted key regions in the brain and synchronized neural circuits in those areas, effectively tuning them to one another, as an orchestra conductor might tune the wind section to the strings. The findings, reported in the journal Nature Neuroscience, provide the strongest support yet for a method called transcranial alternating current stimulation, or tACS, as a potential therapy for memory deficits, whether from age-related decline, brain injury or, perhaps, creeping dementia. In recent years, neuroscientists have shown that memory calls on a widely distributed network in the brain, and it coordinates those interactions through slow-frequency, thrumming rhythms called theta waves, akin to the pulsing songs shared among humpback whales. The tACS technology is thought to enable clearer communication by tuning distant circuits to one another. The tACS approach is appealing for several reasons, perhaps most of all because it is noninvasive; unlike other forms of memory support, it involves no implant, which requires brain surgery. The stimulation passes through the skull with little sensation. Still, a widely available therapy is likely years away, as the risks and benefits are not fully understood, experts said. © 2019 The New York Times Company

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 13: Memory, Learning, and Development
Link ID: 26123 - Posted: 04.09.2019

Laura Sanders Brains have long been star subjects for neuroscientists. But the typical “brain in a jar” experiments that focus on one subject in isolation may be missing a huge part of what makes us human — our social ties. “There’s this assumption that we can understand how the mind works by just looking at individual minds, and not looking at them in interactions,” says social neuroscientist Thalia Wheatley of Dartmouth College. “I think that’s wrong.” To answer some of the thorniest questions about the human brain, scientists will have to study the mind as it actually exists: steeped in social connections that involve rich interplay among family, friends and strangers, Wheatley argues. To illustrate her point, she asked the audience at a symposium in San Francisco on March 26, during the annual meeting of the Cognitive Neuroscience Society, how many had talked to another person that morning. Nearly everybody in the crowd of about 100 raised a hand. Everyday social interactions may seem inconsequential. But recent work on those who have been isolated, such as elderly people and prisoners in solitary confinement, suggests otherwise: Brains deprived of social interaction stop working well (SN: 12/8/18, p. 11). “That’s a hint that it’s not just that we like interaction,” Wheatley says. “It’s important to keep us healthy and sane.” © Society for Science & the Public 2000 - 2019

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 19: Language and Lateralization
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 15: Brain Asymmetry, Spatial Cognition, and Language
Link ID: 26122 - Posted: 04.09.2019

By Carl Zimmer In 2011, Dr. Dena Dubal was hired by the University of California, San Francisco, as an assistant professor of neurology. She set up a new lab with one chief goal: to understand a mysterious hormone called Klotho. Dr. Dubal wondered if it might be the key to finding effective treatments for dementia and other disorders of the aging brain. At the time, scientists only knew enough about Klotho to be fascinated by it. Mice bred to make extra Klotho lived 30 percent longer, for instance. But scientists also had found Klotho in the brain, and so Dr. Dubal launched experiments to see whether it had any effect on how mice learn and remember. The results were startling. In one study, she and her colleagues found that extra Klotho protects mice with symptoms of Alzheimer’s disease from cognitive decline. “Their thinking, in every way that we could measure them, was preserved,” said Dr. Dubal. She and her colleagues also bred healthy mice to make extra Klotho. They did better than their fellow rodents on learning mazes and other cognitive tests. Klotho didn’t just protect their brains, the researchers concluded — it enhanced them. Experiments on more mice turned up similar results. “I just couldn’t believe it — was it true, or was it just a false positive?” Dr. Dubal recalled. “But here it is. It enhances cognition even in a young mouse. It makes them smarter.” Five years have passed since Dr. Dubal and her colleagues began publishing these extraordinary results. Other researchers have discovered tantalizing findings of their own, suggesting that Klotho may protect against other neurological disorders, including multiple sclerosis and Parkinson’s disease. © 2019 The New York Times Company

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 13: Memory, Learning, and Development
Link ID: 26105 - Posted: 04.02.2019

Emma Yasinski In the 1970s, scientists discovered that certain neurons in the hippocampus—an area of the brain involved in learning and memory—would fire in response to particular locations. They were called “place cells,” explains Charlotte Boccara, a researcher at the University of Oslo. “They were deemed important for spatial representation . . . a bit like the ‘You Are Here’ signal on a map.” But it wasn’t until 2005 that researchers discovered the brain’s grid cells, which they believed function as that map. These cells, found adjacent to the hippocampus in the medial entorhinal cortex (MEC), self-organize into a pattern of hexagons that serve as coordinates to help animals make sense of their surroundings and the signals from our place cells. A pair of studies published today (March 28) in Science suggests that this map may not be as rigid as once thought. The experiments demonstrated that, in rats at least, the cellular activity within these grids changes as the animals learn and remember where they can find food rewards. “These are wonderful studies,” says György Buzsáki, a neuroscientist at New York University who was not involved in either of them. “When ideas converge from multiple, different directions, and they converge and come to the same conclusion, the result is always stronger.” In the first study, Boccara, then a researcher at the Institute of Science and Technology Austria, and her team placed rats one by one in a cheeseboard maze, a flat board drilled full of holes. They hid three food rewards in different holes, then scattered food dust over the entire surface so the rats would not be able to sniff their way to the reward. The rats explored the maze until they found the prizes and repeated the task until they learned to go straight to the food instead of foraging. The next day, the researchers conducted the same experiment but changed the locations of the rewards. © 1986 - 2019 The Scientist.

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 26094 - Posted: 03.30.2019

By Benedict Carey Whatever its other properties, memory is a reliable troublemaker, especially when navigating its stockpile of embarrassments and moral stumbles. Ten minutes into an important job interview and here come screenshots from a past disaster: the spilled latte, the painful attempt at humor. Two dates into a warming relationship and up come flashbacks of an earlier, abusive partner. The bad timing is one thing. But why can’t those events be somehow submerged amid the brain’s many other dimming bad memories? Emotions play a role. Scenes, sounds and sensations leave a deeper neural trace if they stir a strong emotional response; this helps you avoid those same experiences in the future. Memory is protective, holding on to red flags so they can be waved at you later, to guide your future behavior. But forgetting is protective too. Most people find a way to bury, or at least reshape, the vast majority of their worst moments. Could that process be harnessed or somehow optimized? Perhaps. In the past decade or so, brain scientists have begun to piece together how memory degrades and forgetting happens. A new study, published this month in the Journal of Neuroscience, suggests that some things can be intentionally relegated to oblivion, although the method for doing so is slightly counterintuitive. For the longest time, forgetting was seen as a passive process of decay and the enemy of learning. But as it turns out, forgetting is a dynamic ability, crucial to memory retrieval, mental stability and maintaining one’s sense of identity. © 2019 The New York Times Company

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 26070 - Posted: 03.23.2019

Liam Drew A mouse scurries down a hallway, past walls lined with shifting monochrome stripes and checks. But the hallway isn’t real. It’s part of a simulation that the mouse is driving as it runs on a foam wheel, mounted inside a domed projection screen. While the mouse explores its virtual world, neuroscientist Aman Saleem watches its brain cells at work. Light striking the mouse’s retinas triggers electrical pulses that travel to neurons in its primary visual cortex, where Saleem has implanted electrodes. Textbooks say that these neurons each respond to a specific stimulus, such as a horizontal or vertical line, so that identical patterns of inputs should induce an identical response. But that’s not what happens. When the mouse encounters a repeat of an earlier scene, its neurons fire in a different pattern. “Five years ago, if you’d told me that, I’d have been like, ‘No, that’s not true. That’s not possible’,” says Saleem, in whose laboratory at University College London we are standing. His results, published last September, show that cells in the hippocampus that track where the mouse has run along the hallway are somehow changing how cells in the visual cortex fire. In other words, the mouse’s neural representation of two identical scenes differs, depending on where it perceives itself to be. It’s no surprise that an animal’s experiences change how it sees the world: all brains learn from experience and combine multiple streams of information to construct perceptions of reality. But researchers once thought that at least some areas in the brain — those that are the first to process inputs from the sense organs — create relatively faithful representations of the outside world. According to this model, these representations then travel to ‘association’ areas, where they combine with memories and expectations to produce perceptions.

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 7: Vision: From Eye to Brain
Link ID: 26039 - Posted: 03.15.2019

Laura Sanders Fast waves of activity ripple in the brain a half second before a person calls up a memory. The finding, published in the March 1 Science, hints that these brain waves might be a key part of a person’s ability to remember. The results come from a study of 14 people with epilepsy who had electrodes placed on their brains as part of their treatment. Those electrodes also allowed scientists to monitor neural activity while the people learned pairs of words. One to three minutes after learning the pairs, people were given one word and asked to name its partner. As participants remembered the missing word, neuroscientist and neurosurgeon Kareem Zaghloul and his colleagues caught glimpses of fast brain waves rippling across parts of the brain at a rate of around 100 per second. These ripples appeared nearly simultaneously in two brain regions — the medial temporal lobe, which is known to be important for memory, and the temporal association cortex, which has a role in language. When a person got the answer wrong, or didn’t answer at all, these coordinated ripples were less likely to be present, the researchers found. “We see this happening, and then we see people remember,” says Zaghloul, of the National Institutes of Health in Bethesda, Md. While recalling a memory, “you mentally jump back in time and re-experience it,” Zaghloul says. Just after the ripples, the researchers saw telltale signs of that mental time travel — an echo of brain activity similar to the brain activity when the memory of the word pair was first formed. © Society for Science & the Public 2000 - 2019

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 26003 - Posted: 03.05.2019

Laura Sanders People often fret about television time for children. A new study examines the habit at the other end of life. The more television older people watched, the worse they recalled a list of words, researchers report online February 28 in Scientific Reports. But the study describes only a correlation; it can’t say that lots of TV time actually causes the memory slips. Researchers examined data on 3,590 people collected as part of the English Longitudinal Study of Aging, a long-running study of English people aged 50 and older. In 2008 and 2009, participants reported how many hours a day, on average, they spent watching television. In addition to the surveys, participants listened to a recording of 10 common words, one word every two seconds. Then, people tried to remember as many words as they could, both immediately after hearing the words and after a short delay. Six years later, people took the same tests. People who watched more than 3.5 hours of TV daily back in 2008 or 2009 were more likely to have worse verbal memory scores six years later, the researchers found. Television “dose” seemed to matter: Beyond that 3.5-hour threshold, the more TV people watched, the more their verbal memory scores declined. It’s not known whether television time actually causes verbal memory problems. The reverse could be true: People who have worse memories might be more likely to watch more television. Still, the researchers suggest that TV might cause a certain kind of mental stress that might contribute to memory trouble. © Society for Science & the Public 2000 - 2019

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 26000 - Posted: 03.02.2019

By Agata Boxe Police officers investigating a crime may hesitate to interview drunk witnesses. But waiting until they sober up may not be the best strategy; people remember more while they are still inebriated than they do a week later, a new study finds. Malin Hildebrand Karlén, a senior psychology lecturer at Sweden’s University of Gothenburg, and her colleagues recruited 136 people and gave half of them vodka mixed with orange juice. The others drank only juice. In 15 minutes women in the alcohol group consumed 0.75 gram of alcohol per kilogram of body weight, and men drank 0.8 gram (that is equivalent to 3.75 glasses of wine for a 70-kilogram woman or four glasses for a man of the same weight, Hildebrand Karlén says). All participants then watched a short film depicting a verbal and physical altercation between a man and a woman. The researchers next asked half the people in each group to freely recall what they remembered from the film. The remaining participants were sent home and interviewed a week later. The investigators found that both the inebriated and sober people who were interviewed immediately demonstrated better recollection of the film events than their drunk or sober counterparts who were questioned later. The effect held even for people with blood alcohol concentrations of 0.08 or higher—the legal limit for driving in most of the U.S. (Intoxication levels varied because different people metabolize alcohol at different speeds.) The results suggest that intoxicated witnesses should be interviewed sooner rather than later, according to the study, which was published online last October in Psychology, Crime & Law. © 2019 Scientific American
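The glasses-of-wine conversion above can be checked with back-of-envelope arithmetic. A quick sketch (assuming roughly 14 grams of pure alcohol per glass of wine, a figure chosen here because it reproduces the article's numbers, not one stated in the article):

```python
# Dose figures from the study: grams of alcohol per kilogram of body weight.
GRAMS_PER_KG_WOMEN = 0.75
GRAMS_PER_KG_MEN = 0.8
GRAMS_PER_GLASS = 14.0  # assumed: ~14 g of pure alcohol per glass of wine

body_kg = 70  # reference body weight used in the article

glasses_women = GRAMS_PER_KG_WOMEN * body_kg / GRAMS_PER_GLASS
glasses_men = GRAMS_PER_KG_MEN * body_kg / GRAMS_PER_GLASS

# Matches the article's "3.75 glasses ... or four glasses" for 70 kg.
print(glasses_women, glasses_men)
```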

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Link ID: 25947 - Posted: 02.11.2019

By Rachel Hartigan Shea When Steve Ramirez was in college, he was fascinated by all kinds of subjects—from Shakespeare to piano, astronauts to medicine. That made choosing a major difficult, so he decided to “cheat,” as he puts it. He would study “the thing that achieved everything that’s ever been achieved”: the brain. After he joined a lab researching the neuroscience of memory, he learned that every experience leaves physical traces throughout the brain. Those are memories, and they can be examined or even altered. “That idea enchanted me,” he says. Now Ramirez leads his own lab at Boston University, and he’s figured out how to suppress bad memories by activating good ones. He and his team genetically engineer brain cells associated with memory in mice to respond to light. Then they create a bad memory—a mild electric shock—and watch the activated cells light up. Deactivating those cells would make the bad memory inaccessible or allow it to be overwritten by a good memory, such as social time with other mice. Ramirez does not propose using this sort of “genetic trickery” to manipulate memories in humans. Instead, his discoveries about memory could inform how patients with post-traumatic stress disorder, anxiety, or depression are treated. “We want to understand how the brain works; we want to understand how memory works,” he says. “It’s like, the more we know how a car works, the better equipped we are to figure out what happens when it breaks down.”

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 25944 - Posted: 02.09.2019

John Bergeron During the first weeks of the new year, resolutions are often accompanied by attempts to learn new behaviours that improve health. We hope that old bad habits will disappear and new healthy habits will become automatic. But how can our brain be reprogrammed to assure that a new health habit can be learned and retained? In 1949, Canadian psychologist Donald Hebb proposed the theory of Hebbian learning to explain how a learning task is transformed into a long-term memory. In this way, healthy habits become automatically retained after their continual repetition. Learning and memory are a consequence of how our brain cells (neurons) communicate with each other. When we learn, neurons communicate through molecular transmissions that hop across synapses, producing a memory circuit. In a process known as long-term potentiation (LTP), the more often a learning task is repeated, the more often transmission continues and the stronger the memory circuit becomes. It is this unique ability of neurons to create and strengthen synaptic connections by repeated activation that leads to Hebbian learning. Understanding the brain requires investigation through different approaches and from a variety of specialities. The field of cognitive neuroscience initially developed through a small number of pioneers. Their experimental designs and observations led to the foundation for how we understand learning and memory today. © 2010–2019, The Conversation US, Inc.
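Hebb's principle is often summarized as "cells that fire together wire together." A minimal sketch of the classic Hebbian weight update (an illustrative toy model, not taken from the article; the learning rate and activity patterns are made up):

```python
import numpy as np

def hebbian_update(w, pre, post, lr=0.1):
    """One Hebbian step: each connection grows in proportion to the
    coincidence of pre- and postsynaptic activity (dw = lr * post * pre)."""
    return w + lr * np.outer(post, pre)

# Repeated co-activation of the same pattern strengthens the same synapses,
# loosely analogous to how repetition strengthens a memory circuit via LTP.
w = np.zeros((2, 3))               # connection strengths, all silent at first
pre = np.array([1.0, 0.0, 1.0])    # presynaptic activity pattern
post = np.array([1.0, 0.0])        # postsynaptic activity pattern
for _ in range(5):                 # "practice" the association five times
    w = hebbian_update(w, pre, post)

print(w)  # only the co-active pre/post pairs have grown
```

Note the design choice: weights change only where both sides are active, so repetition selectively reinforces one circuit rather than all connections at once.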

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 25890 - Posted: 01.22.2019

By: Brenna Hassinger-Das, Ph.D., and Kathryn Hirsh-Pasek, Ph.D. In 1954, Walt Disney was the first to envision a new form of entertainment that melded traditional fun and education—a form that he dubbed “edutainment.” By the latter part of the 20th century, this form had morphed into educational toys and games, a multi-billion-dollar industry that is projected to capture a full 36 percent of the global toy market share by 2022. Nowhere is this trend more apparent than in the explosion of digital apps: of the 2.2 million apps available in the Apple Store, roughly 176,000—8.5 percent—are loosely designated as “educational.” Their growth continues, with annual increases of 10 percent expected through 2021. Whether called edutainment, educational toys, or the digital learning revolution, this trend shares the implicit philosophy that mixing fun and learning will offer a kind of “brain training” that will enhance children’s thinking and amplify their learning potential. But there are many questions before us. What do manufacturers and marketers mean when they designate a product “educational”? What relevant research in the science of learning has been done? Is there a standard definition of educational value that guides the field? Indeed, a framework we use highlights when toys might sculpt mental muscle and when products are likely to be total imposters. This framework helps us elucidate which educational and digital toys are likely to confer benefits for children.

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 13: Memory, Learning, and Development
Link ID: 25877 - Posted: 01.18.2019

By Bryan Clark You slide the key into the door and hear a clunk as the tumblers engage. You rotate the key, twist the doorknob and walk inside. The house is familiar, but the contents foreign. At your left, there’s a map of Minnesota, dangling precariously from the wall. You’re certain it wasn’t there this morning. Below it, you find a plush M&M candy. To the right, a dog, a shiba inu you’ve never seen before. In its mouth, a pair of your expensive socks. And then it comes to you, 323-3607, a phone number. If none of this makes sense, stick with us; by the end of this piece you’ll be using the same techniques to memorize just about anything you’ve ever wanted to remember. The “memory athlete” Munkhshur Narmandakh once employed a similar combination of mnemonics to commit more than 6,000 binary digits to memory in just 30 minutes. Alex Mullen, a three-time World Memory Champion, used them to memorize the order of a deck of cards in just 15 seconds, a record at the time. It was later broken by Shijir-Erdene Bat-Enkh, who did it in 12. We’re going to aim lower, applying these strategies to real-world scenarios, like remembering the things we often forget at dinner parties or work-related mixers. At the start of this piece, we employed two mnemonic strategies to remember the seven digits of a phone number. The first, called the “Major System,” was developed in 1648 by historian Johann Winkelmann. In his book “Moonwalking With Einstein,” the author Joshua Foer described this system as a simple cipher that transforms numbers to letters or phonetic sounds. From there we can craft words and, ultimately, images. Some will, no doubt, be crude or enigmatic. Others may contain misspellings and factual errors. It doesn’t matter. This system is designed to create rich imagery, not accurate representations. © 2019 The New York Times Company
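The Major System described above is essentially a substitution cipher. A rough sketch using the conventional digit-to-consonant-sound table (this is the commonly cited mapping for the system, assumed here rather than quoted from the article):

```python
# Conventional Major System table: each digit maps to a set of consonant
# sounds; vowels are "free," so any digit string can be padded into words.
MAJOR = {
    "0": "s/z", "1": "t/d", "2": "n", "3": "m", "4": "r",
    "5": "l", "6": "j/sh/ch", "7": "k/g", "8": "f/v", "9": "p/b",
}

def encode(number: str) -> list:
    """Return the consonant-sound options for each digit, skipping non-digits."""
    return [MAJOR[ch] for ch in number if ch.isdigit()]

# The phone number from the article's opening scene:
sounds = encode("323-3607")
print(sounds)
# From here the mnemonist pads the sounds with vowels to build vivid
# images (a map of Minnesota, an M&M, a shiba inu with socks, ...).
```

The design point is that the cipher is deliberately lossy in reverse: many words fit one digit string, so you pick whichever image is most memorable.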

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 25865 - Posted: 01.15.2019

By Sandra E. Garcia Imagine being held up at gunpoint. Do you trust you could remember the perpetrator’s face? The gun? Or would you have a better recollection of how loud the birds were chirping at that moment? “The memory does not operate like a videotape machine faithfully recording every single detail,” said Richard J. McNally, a professor of psychology at Harvard University and the author of “Remembering Trauma.” “The thing that is happening is that you’re focusing on the most dangerous thing,” he said. “That is the function of fear: to alert you to imminent threats.” Stress can play a role in eyewitness cases of mistaken identity, experts said, and it could be a reason there were such conflicting accounts of the suspects in the shooting death of Jazmine Barnes, the 7-year-old Texas girl who was fired upon in a car with her mother and three sisters on Dec. 30. A gunman pulled up alongside them and opened fire. Jazmine’s mother, LaPorsha Washington, 30, was injured. Ms. Washington and her daughters met with investigators to help them create a composite sketch of the gunman, who attacked them before sunrise. The man was described as white, thin and in his 30s or 40s and driving a red pickup truck. On Sunday, the authorities announced they had charged a 20-year-old black man with capital murder in connection with the shooting. In a CNN interview, Ms. Washington said her teenage daughter told her that the man was white and that his hoodie was black. “That’s all she could see at the time because the sun hadn’t really even came out yet,” Ms. Washington said in the interview. © 2019 The New York Times Company

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 25844 - Posted: 01.07.2019

By Ryan Dalton You might be forgiven for having never heard of the NotPetya cyberattack. It didn’t clear out your bank account, or share your social media passwords, or influence an election. But it was one of the most costly and damaging cyberattacks in history, for what it did target: shipping through ports. By the time the engineers at Maersk realized that their computers were infected with a virus, it was too late: worldwide shipping would grind to a halt for days. Imagine a similar situation, in which the target was another port: the synapse, the specialized port of communication between neurons. Much of our ability to learn and remember comes down to the behavior of synapses. What would happen then, if one neuron infected another with malware? Ports and synapses both run on rules, meant to ensure that their cargo can be exchanged not only quickly and reliably, but also adaptably, so that they can quickly adjust to current conditions and demands. This ‘synaptic plasticity’ is fundamental to the ability of animals to learn, and without it we would no more be able to tie our shoes than to remember our own names. Just as shipping rules are determined by treaties and laws, the rules of synaptic plasticity are written into a multitude of genes in our DNA. For example, one gene might be involved in turning up the volume on one side of the synapse, while another gene might ask the other side of the synapse to turn up the gain. Studying the function of these genes has been one of the core approaches to understanding what it is, at the microscopic level, to learn and to remember. © 2018 Scientific American

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 25782 - Posted: 12.12.2018

In his enthralling 2009 collection of parables, Sum: Forty Tales from the Afterlives, the neuroscientist David Eagleman describes a world in which a person only truly dies when they are forgotten. After their bodies have crumbled and they leave Earth, all deceased must wait in a lobby and are allowed to pass on only after someone says their name for the last time. “The whole place looks like an infinite airport waiting area,” Eagleman writes. “But the company is terrific.” Most people leave just as their loved ones arrive — for it was only the loved ones who were still remembering. But the truly famous have to hang around for centuries; some, keen to be off, are with an “aching heart waiting for statues to fall”. Eagleman’s tale is an interpretation of what psychologists and social scientists call collective memory. Continued and shared attention to people and events is important because it can help to shape identity — how individuals see themselves as part of a group — and because the choice of what to commemorate, and so remember, influences the structures and priorities of society. This week in Nature Human Behaviour, researchers report a surprising discovery about collective memory: the pattern of its decay follows a mathematical law (C. Candia et al. Nature Hum. Behav. http://doi.org/cxq2; 2018). The attention we pay to academic papers, films, pop songs and tennis players decays in two distinct stages. In theory, the findings could help those who compete for society’s continued attention — from politicians and companies to environmental campaigners — to find ways to stay in the public eye, or at least in the public’s head. © 2018 Springer Nature Publishing AG
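Two-stage decay of this kind is commonly modeled as the sum of a fast and a slow exponential. A hedged illustration of that generic shape (the functional form and all parameter values below are illustrative assumptions, not the fitted model from Candia et al.):

```python
import math

def attention(t, a=0.7, fast=0.5, slow=0.05):
    """Generic two-stage decay: a fast early term (roughly, word-of-mouth
    'communicative memory') plus a slower long-lived term (recorded
    'cultural memory'). Parameters are illustrative only."""
    return a * math.exp(-fast * t) + (1 - a) * math.exp(-slow * t)

# Early on, the fast term dominates and attention falls steeply;
# later, only the slow term remains and the decline flattens out.
early_drop = attention(0) - attention(5)
late_drop = attention(20) - attention(25)
print(early_drop, late_drop)
```

The qualitative point matches the article: the same five-unit time interval loses far more attention early in an item's life than late.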

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 25780 - Posted: 12.12.2018

By Dan Falk When the ghost of King Hamlet commands his son to “remember me,” the prince takes the message to heart, vowing to “wipe away” all that is trivial in his accumulated memory, so that “thy commandment alone shall live / Within the book and volume of my brain.” Of course, it’s not quite that simple, and we often find ourselves doing battle with our memories — struggling to recall something that we’ve forgotten, or wishing to forget something that nonetheless intrudes into consciousness. Humans are masters at leaping through time, vividly imagining the past while making richly detailed plans for the future. A long-forgotten memory can surface at any time. In Marcel Proust’s “In Search of Lost Time,” the narrator bites into a French pastry known as a madeleine and is instantly transported back in time. Suddenly a childhood memory “revealed itself” — it was the recollection of the snack his aunt used to share with him in her bedroom on Sunday mornings before mass. Poets and novelists got a head start, but for some 140 years now scientists, too, have been wrestling with memory. It’s this struggle that two Norwegian sisters, the novelist Hilde Østby and the neuropsychologist Ylva Østby, tackle in their engrossing book, “Adventures in Memory: The Science and Secrets of Remembering and Forgetting.” Copyright 2018 Undark

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 25762 - Posted: 12.08.2018

Laura Sanders The uterus is best known for its baby-growing job. But the female organ may also have an unexpected role in memory, a study in rats suggests. The results, published online December 6 in Endocrinology, counter the idea that the nonpregnant uterus is an extraneous organ. That may have implications for the estimated 20 million women in the United States who have had hysterectomies. In the study, female rats either underwent removal of the uterus, ovaries, both organs or neither. Six weeks after surgery, researchers led by behavioral neuroscientist Heather Bimonte-Nelson of Arizona State University in Tempe began testing the rats on water mazes with platforms that were hidden just below the surface. Compared with the other groups, rats that lacked only a uterus were worse at remembering where to find the platforms as the tests turned progressively harder. The results suggest that signals that go from the uterus to the brain are somehow involved in remembering multiple bits of information at the same time. Rats lacking just a uterus had differences in their hormone levels, too, even though these rats kept their hormone-producing ovaries. Researchers have known for decades that hormones released by the ovaries can influence the brain. But finding that the uterus on its own can influence memory is a surprise, says neuroendocrinologist Victoria Luine of Hunter College of the City University of New York. Because many women have their uteruses removed but keep their ovaries, “this revelation brings up some interesting questions to explore.” |© Society for Science & the Public 2000 - 2018

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 8: Hormones and Sex
Link ID: 25757 - Posted: 12.07.2018

By Neuroskeptic The science story of the past week was the claim from Chinese scientist He Jiankui that he has created gene-edited human babies. Prof. He reports that two twin girls have been born carrying modifications of the gene CCR5, which are intended to protect them against future HIV risk. It’s far from clear yet whether the gene-editing that He described has actually taken place — no data have yet been presented. The very prospect of genetically modifying human beings has, however, led to widespread concern, with He’s claims being described as “monstrous“, “crazy” and “unethical”. All of which got me wondering: could there ever be a neuroscience experiment which attracted the same level of condemnation? What I’m asking here is whether there are neuroscience advances that would be considered inherently unethical. It would, of course, be possible to carry out any neuroscience experiment in an unethical way, by forcing or tricking people into participation. But are there experiments which would be unethical even if all the participants gave full, informed consent at every stage? Here are a couple of possibilities: Intelligence enhancement: Suppose it were possible to substantially boost human intelligence through some kind of technological means, perhaps a drug, or through brain stimulation. I suspect that many people would see this prospect as an ethical problem, because it would give users a definite advantage over non-users and thus, in effect, force people to use the technology in order to keep up. It would be a similar situation to the problem of doping in sports: if doping were widespread, it would be very difficult for non-dopers to compete.

Related chapters from BN8e: Chapter 1: Biological Psychology: Scope and Outlook; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 1: An Introduction to Brain and Behavior; Chapter 13: Memory, Learning, and Development
Link ID: 25738 - Posted: 12.01.2018