Chapter 17. Learning and Memory




Researchers at the National Institutes of Health have discovered in mice what they believe is the first known genetic mutation to improve cognitive flexibility—the ability to adapt to changing situations. The gene, KCND2, codes for a protein that regulates potassium channels, which control electrical signals that travel along neurons. The electrical signals stimulate chemical messengers that jump from neuron to neuron. The researchers were led by Dax Hoffman, Ph.D., chief of the Section on Neurophysiology at NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). The study appears in Nature Communications. The KCND2 protein, when modified by an enzyme, slows the generation of electrical impulses in neurons. The researchers found that altering a single base pair in the KCND2 gene enhanced the ability of the protein to dampen nerve impulses. Mice with this mutation performed better than mice without the mutation in a cognitive task. The task involved finding and swimming to a slightly submerged platform that had been moved to a new location. Mice with the mutation found the relocated platform much faster than their counterparts without the mutation. The researchers plan to investigate how the mutation affects neural networks in the animals’ brains. They added that studying the gene and its protein may ultimately lead to insights on the nature of cognitive flexibility in people. It also may help improve understanding of epilepsy, schizophrenia, Fragile X syndrome, and autism spectrum disorder, all of which have been associated with other mutations in KCND2.

Keyword: Learning & Memory; Genes & Behavior
Link ID: 27148 - Posted: 03.30.2020

May-Britt Moser & Edvard Moser There was something of the Viking about Per Andersen. The intrepid and steadfast Norwegian was renowned for his attacks on the deepest puzzle of the brain: how its wiring and electrical activity give rise to behaviour and experience. When he was a student in the 1950s, most neuroscientists studied accessible parts of the mammalian nervous system — the junctions between nerves and muscles, say. Andersen worked on the cerebral cortex, which processes higher-level functions: perception, voluntary movement, planning and abstract thinking. His pioneering recordings of electrical activity in the hippocampus — a part of the cortex involved in memory — launched a new era in physiological understanding of the brain and laid the foundations of modern systems neuroscience. He died on 17 February, aged 90. In 1949, it was predicted that learning might depend on repeated activity strengthening the connections — synapses — in networks of neurons. Andersen saw that this was the case in the hippocampus. As the effect was too fleeting to account directly for memory storage, he encouraged his student Terje Lømo to investigate. In 1973, in one of the greatest discoveries of twentieth-century neuroscience, Lømo and British visiting scholar Tim Bliss reported from Andersen’s laboratory that many bursts of electrical stimulation at certain frequencies enhanced connectivity for hours or days. This phenomenon — long-term potentiation (LTP) — remains the main explanation for how we form and store memories (T. V. P. Bliss and T. Lømo J. Physiol. 232, 331–356; 1973). We met Andersen as students, in the late 1980s. Our work with him on LTP and animal learning found differences in function between regions of the hippocampus and demonstrated changes in connectivity related to behaviour. His hunch that we should record activity from single cells led to our discovery of specialized neurons in the cortex that support the sense of where the body is in space. 
The work was a direct result of his insight. © 2020 Springer Nature Limited

Keyword: Learning & Memory
Link ID: 27130 - Posted: 03.21.2020

By Adrienne Raphel Let me tell you a tale of two grandfathers, Irv and Murray. For decades, Irv, an introverted, quiet, retired bartender and former military engineer, had the same morning routine: coffee and cream; a roll; and the puzzle page of the Press of Atlantic City. He methodically and religiously worked his way through each one, from the crossword to the jumble to the cryptoquip, a substitution cipher that asks solvers to decode clues and figure out the pun. Extroverted and spontaneous Murray, a successful businessman and local politician, also had his morning routine: coffee with lots of sugar; oatmeal; and tinkering on one of his many writing projects, such as a loosely autobiographical musical about a traveling salesman. Murray swam a few times a week, devoured books and loved to travel. But he never did crosswords. Irv died at age 94, and he barely experienced any cognitive loss before the last six months of his life, when he exhibited rapid mental decline. Murray lived to be 91, but the last several years of his life were marked with severe dementia. When I was researching my book Thinking Inside the Box: Adventures with Crosswords and the Puzzling People Who Can’t Live Without Them, I was fascinated by my family’s case study. The evidence, it seemed, couldn’t be clearer: doing crosswords late in life prevents dementia. And at first, all the studies I found seemed to bear this hypothesis out. “Regular crosswords and number puzzles linked to sharper brain in later life,” a May 2019 Science Daily headline proclaims. According to a University of Exeter study, older adults who regularly did word and number puzzles had increased mental acuity. A 2011 experiment with members of the Bronx Aging Study found that a regular regimen of crosswords might delay the onset of cognitive decline. Belief in puzzle power has fueled a multimillion-dollar industry of brain-training games like Lumosity and Dakim. © 2020 Scientific American

Keyword: Alzheimers; Learning & Memory
Link ID: 27124 - Posted: 03.17.2020

By Judson A. Brewer, M.D. Anxiety is a strange beast. As a psychiatrist, I have learned that anxiety and its close cousin, panic, are both born from fear. As a behavioral neuroscientist, I know that fear’s main evolutionary function is helping us survive. In fact, fear is the oldest survival mechanism we have. Fear helps us learn to avoid dangerous situations in the future through a process called negative reinforcement. For example, if we step out into a busy street, turn our head and see a car coming right at us, we instinctively jump back onto the safety of the sidewalk. Evolution made this really simple for us. So simple that we only need three elements in situations like this to learn: an environmental cue, a behavior and a result. In this case, walking up to a busy street cues us to look both ways before crossing. The result of not getting killed helps us remember to repeat the action again in the future. Sometime in the last million years, humans evolved a new layer on top of our more primitive survival brain, called the prefrontal cortex. Involved in creativity and planning, the prefrontal cortex helps us think and plan for the future. It predicts what will happen in the future based on past experience. If information is lacking, our prefrontal cortex lays out different scenarios about what might happen, and guesses which will be most likely. It does this by running simulations based on previous events that are most similar. Defined as “a feeling of worry, nervousness or unease, typically about an imminent event or something with an uncertain outcome,” anxiety comes up when our prefrontal cortexes don’t have enough information to accurately predict the future. We see this right now with coronavirus. Without accurate information, it is easy for our brains to spin stories of fear and dread. © 2020 The New York Times Company

Keyword: Emotions; Stress
Link ID: 27117 - Posted: 03.14.2020

By R. Douglas Fields Our concepts of how the two and a half pounds of flabby flesh between our ears accomplish learning date to Ivan Pavlov’s classic experiments, in which he found that dogs could learn to salivate at the sound of a bell. In 1949 psychologist Donald Hebb adapted Pavlov’s “associative learning rule” to explain how brain cells might acquire knowledge. Hebb proposed that when two neurons fire together, sending off impulses simultaneously, the connections between them—the synapses—grow stronger. When this happens, learning has taken place. In the dogs’ case, it would mean the brain now knows that the sound of a bell is followed immediately by the presence of food. This idea gave rise to an oft-quoted axiom: “Synapses that fire together wire together.” The theory proved sound, and the molecular details of how synapses change during learning have been described in detail. But not everything we remember results from reward or punishment, and in fact, most experiences are forgotten. Even when synapses do fire together, they sometimes do not wire together. What we retain depends on our emotional response to an experience, how novel it is, where and when the event occurred, our level of attention and motivation during the event, and how we process these thoughts and feelings while asleep. A narrow focus on the synapse has given us a mere stick-figure conception of how learning and the memories it engenders work. It turns out that strengthening a synapse cannot produce a memory on its own, except for the most elementary reflexes in simple circuits. Vast changes throughout the expanse of the brain are necessary to create a coherent memory.
Whether you are recalling last night’s conversation with dinner guests or using an acquired skill such as riding a bike, the activity of millions of neurons in many different regions of your brain must become linked to produce a coherent memory that interweaves emotions, sights, sounds, smells, event sequences and other stored experiences. Because learning encompasses so many elements of our experiences, it must incorporate different cellular mechanisms beyond the changes that occur in synapses. This recognition has led to a search for new ways to understand how information is transmitted, processed and stored in the brain to bring about learning. In the past 10 years neuroscientists have come to realize that the iconic “gray matter” that makes up the brain’s outer surface—familiar from graphic illustrations found everywhere, from textbooks to children’s cartoons—is not the only part of the organ involved in the inscription of a permanent record of facts and events for later recall and replay. It turns out that areas below the deeply folded, gray-colored surface also play a pivotal role in learning. © 2020 Scientific American
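Hebb's rule quoted above ("synapses that fire together wire together") is often written as the weight update dw = eta * pre * post. The toy sketch below is purely illustrative (the learning rate and activity values are assumptions, not anything measured in the article): it shows repeated co-activation strengthening a model synapse while unpaired firing leaves it unchanged.

```python
def hebbian_update(w, pre, post, eta=0.1):
    """Hebb's rule: strengthen weight w in proportion to the
    joint activity of the pre- and postsynaptic neurons."""
    return w + eta * pre * post

w = 0.0
# Repeated co-activation (both neurons firing) strengthens the synapse...
for _ in range(10):
    w = hebbian_update(w, pre=1, post=1)

# ...while firing in only one neuron leaves the weight unchanged,
# which is exactly why "fire together" is required to "wire together".
w_unpaired = hebbian_update(w, pre=1, post=0)

print(round(w, 2))        # weight after ten paired activations
print(w_unpaired == w)    # unpaired firing: no change
```

Note that this simple rule only ever strengthens synapses; the article's point is that real memory also requires forgetting and brain-wide changes that a bare Hebbian update does not capture.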

Keyword: Learning & Memory; Glia
Link ID: 27114 - Posted: 03.12.2020

Dori Grijseels In 2016, three neuroscientists wrote a commentary article arguing that, to truly understand the brain, neuroscience needed to change. From that paper, the International Brain Laboratory (IBL) was born. The IBL, now a collaboration between 22 labs across the world, is unique in biology. The IBL is modeled on physics collaborations, like the ATLAS experiment at CERN, where thousands of scientists work together on a common problem, sharing data and resources during the process. This was in response to the main criticism that the paper’s authors, Zachary Mainen, Michael Häusser and Alexandre Pouget, had about existing neuroscience collaborations: labs came together to discuss generalities, but all the experiments were done separately. They wanted to create a collaboration in which scientists worked together throughout the process, even though their labs may be distributed all over the globe. The IBL decided to focus on one brain function only: decision-making. Decision-making engages the whole brain, since it requires using both input from the senses and information about previous experiences. If someone is thinking about bringing a sweater when they go out, they will use their senses to determine whether it looks and feels cold outside, but they might also remember that, yesterday, they were cold without a sweater. For its first experiment, published in preprint form, seven of the 22 labs collaborating in the IBL tested 101 mice on their decision-making ability. The mice saw a black and white grating either to their right or to their left. They then had to twist a little Lego wheel to move the grating to the middle. By rewarding them with sugary water whenever they did the task correctly, the mice gradually learned. It is easy for them to decide which way to twist the wheel if the grating has a high contrast, because it stands out compared to the background of their visual field.
However, the mice were also presented with a more ambiguously patterned grating not easily distinguishable from the background, so the decision of which way to turn the wheel was more difficult. In some cases, the grating was even indistinguishable from the background. Across all seven labs – which were spread across three countries – the mice completed this task three million times. © 2017–2019 Massive Science Inc.

Keyword: Attention; Learning & Memory
Link ID: 27102 - Posted: 03.07.2020

In a study of epilepsy patients, researchers at the National Institutes of Health monitored the electrical activity of thousands of individual brain cells, called neurons, as patients took memory tests. They found that the firing patterns of the cells that occurred when patients learned a word pair were replayed fractions of a second before they successfully remembered the pair. The study was part of an NIH Clinical Center trial for patients with drug-resistant epilepsy, whose seizures cannot be controlled with medication. “Memory plays a crucial role in our lives. Just as musical notes are recorded as grooves on a record, it appears that our brains store memories in neural firing patterns that can be replayed over and over again,” said Kareem Zaghloul, M.D., Ph.D., a neurosurgeon-researcher at the NIH’s National Institute of Neurological Disorders and Stroke (NINDS) and senior author of the study published in Science. Dr. Zaghloul’s team has been recording electrical currents of drug-resistant epilepsy patients temporarily living with surgically implanted electrodes designed to monitor brain activity in the hopes of identifying the source of a patient’s seizures. This period also provides an opportunity to study neural activity during memory. In this study, his team examined the activity used to store memories of our past experiences, which scientists call episodic memories. In 1957, the case of epilepsy patient H.M. provided a breakthrough in memory research. H.M. could not remember new experiences after part of his brain was surgically removed to stop his seizures. Since then, research has pointed to the idea that episodic memories are stored, or encoded, as neural activity patterns that our brains replay when triggered by such things as the whiff of a familiar scent or the riff of a catchy tune. But exactly how this happens was unknown.

Keyword: Learning & Memory
Link ID: 27098 - Posted: 03.06.2020

By Cindi May Music makes life better in so many ways. It elevates mood, reduces stress and eases pain. Music is heart-healthy, because it can lower blood pressure, reduce heart rate and decrease stress hormones in the blood. It also connects us with others and enhances social bonds. Music can even improve workout endurance and increase our enjoyment of challenging activities. The fact that music can make a difficult task more tolerable may be why students often choose to listen to it while doing their homework or studying for exams. But is listening to music the smart choice for students who want to optimize their learning? A new study by Manuel Gonzalez of Baruch College and John Aiello of Rutgers University suggests that for some students, listening to music is indeed a wise strategy, but for others, it is not. The effect of music on cognitive functioning appears not to be “one-size-fits-all” but to instead depend, in part, on your personality—specifically, on your need for external stimulation. People with a high requirement for such stimulation tend to get bored easily and to seek out external input. Those individuals often do worse, paradoxically, when listening to music while engaging in a mental task. People with a low need for external stimulation, on the other hand, tend to improve their mental performance with music. But other factors play a role as well. Gonzalez and Aiello took a fairly sophisticated approach to understanding the influence of music on intellectual performance, assessing not only listener personality but also manipulating the difficulty of the task and the complexity of the music. Whether students experience a perk or a penalty from music depends on the interplay of the personality of the learner, the mental task, and the music. © 2020 Scientific American

Keyword: Learning & Memory; Attention
Link ID: 27093 - Posted: 03.05.2020

Ian Sample Science editor It’s the sort of sneaky trick only a gull would learn: by watching how people handle their food, the birds can work out when there are snacks to be had. Researchers found that herring gulls were more likely to peck at items left on the ground if humans had pretended to eat them first. The study suggests that gulls take cues from human behaviour to help them home in on tasty scraps in the rubbish people leave behind. “People don’t tend to think of wild animals as using cues from humans like this,” said Madeleine Goumas, a researcher at the University of Exeter. “It’s the kind of behaviour that’s more often associated with domesticated animals or those kept in captivity.” Goumas, who has become one of the more prominent gull researchers in Britain, reported last year that maintaining eye contact can deter seagulls from snatching food. In tests with bags of chips in seaside towns, she found that staring the birds out put them off their daring raids. To follow up that work, Goumas wanted to see whether gulls pick up on subtle human cues to help them find their next meal. And so she set off to the Cornish towns of Falmouth, St Ives, Newquay and Penzance, and Plymouth in Devon, armed with shop-bought flapjacks in shiny blue wrappers, a supply of blue sponges, and a pair of dark glasses. For the first experiment, Goumas donned the sunglasses and walked towards her chosen bird, carrying a bucket with a flapjack in each hand. When she was about eight metres from the gull, she sat down, flipped the buckets over so they concealed the snacks, and pushed them out to her sides. She then lifted off the buckets, picked up one of the flapjacks, stood up and pretended to eat it. After 20 seconds, she put the flapjack back and retreated a safe distance. © 2020 Guardian News & Media Limited

Keyword: Learning & Memory
Link ID: 27078 - Posted: 02.27.2020

By James Gorman There’s something about a really smart dog that makes it seem as if there might be hope for the world. China is in the midst of a frightening disease outbreak and nobody knows how far it will spread. The warming of the planet shows no signs of stopping; it reached a record 70 degrees in Antarctica last week. Not to mention international tensions and domestic politics. But there’s a dog in Norway that knows not only the names of her toys, but also the names of different categories of toys, and she learned all this just by hanging out with her owners and playing her favorite game. So who knows what other good things could be possible? Right? This dog’s name is Whisky. She is a Border collie that lives with her owners and almost 100 toys, so it seems like things are going pretty well for her. Even though I don’t have that many toys myself, I’m happy for her. You can’t be jealous of a dog. Or at least you shouldn’t be. Whisky’s toys have names. Most are dog-appropriate like “the colorful rope” or “the small Frisbee.” However, her owner, Helge O. Svela said on Thursday that since the research was done, her toys have grown in number from 59 to 91, and he has had to give some toys “people” names, like Daisy or Wenger. “That’s for the plushy toys that resemble animals like ducks or elephants (because the names Duck and Elephant were already taken),” he said. During the research, Whisky proved in tests that she knew the names for at least 54 of her 59 toys. That’s not just the claim of a proud owner, and Mr. Svela is quite proud of Whisky, but the finding of Claudia Fugazza, an animal behavior researcher from Eötvös Loránd University in Budapest, who tested her. That alone makes Whisky part of a very select group, although not a champion. You may recall Chaser, another Border collie that knew the names of more than 1,000 objects and also knew words for categories of objects. And there are a few other dogs with shockingly large vocabularies, Dr. 
Fugazza said, including mixed breeds, and a Yorkie. These canine verbal prodigies are, however, few and far between. “It is really, really unusual, and it is really difficult to teach object names to dogs,” Dr. Fugazza said. © 2020 The New York Times Company

Keyword: Language; Learning & Memory
Link ID: 27063 - Posted: 02.21.2020

Amy Schleunes New Zealand’s North Island robins (Petroica longipes), known as toutouwai in Maori, are capable of remembering a foraging task taught to them by researchers for up to 22 months in the wild, according to a study published on February 12 in Biology Letters. These results echo the findings of a number of laboratory studies of long-term memory in animals, but offer a rare example of a wild animal retaining a learned behavior with no additional training. The study also has implications for conservation and wildlife management: given the birds’ memory skills, researchers might be able to teach them about novel threats and resources in their constantly changing habitat. “This is the first study to show [memory] longevity in the wild,” says Vladimir Pravosudov, an animal behavior researcher at the University of Nevada, Reno, who was not involved in the study. Rachael Shaw, a coauthor and behavioral ecologist at Victoria University in New Zealand, says she was surprised that the birds remembered the new skill she had taught them. “Wild birds have so much that they have to contend with in their daily lives,” she says. “You don’t really expect that it’s worth their while to retain this learned task they hardly had the opportunity to do, and they can’t predict that they will have an opportunity to do again.” Shaw is generally interested in the cognitive abilities of animals and the evolution of intelligence, and the toutouwai, trainable food caching birds that can live up to roughly 10 years, make perfect subjects for her behavioral investigations. “They’ve got this kind of boldness and curiosity that a lot of island bird species share,” says Shaw. These qualities make them vulnerable to predation by invasive cats, rats, and ermines (also known as stoats), but also inquisitive and relatively unafraid of humans, an ideal disposition for testing memory retention in the field. © 1986–2020 The Scientist

Keyword: Learning & Memory; Evolution
Link ID: 27053 - Posted: 02.20.2020

Ian Sample Science editor Consuming a western diet for as little as one week can subtly impair brain function and encourage slim and otherwise healthy young people to overeat, scientists claim. Researchers found that after seven days on a high fat, high added sugar diet, volunteers in their 20s scored worse on memory tests and found junk food more desirable immediately after they had finished a meal. The finding suggests that a western diet makes it harder for people to regulate their appetite, and points to disruption in a brain region called the hippocampus as the possible cause. “After a week on a western-style diet, palatable food such as snacks and chocolate becomes more desirable when you are full,” said Richard Stevenson, a professor of psychology at Macquarie University in Sydney. “This will make it harder to resist, leading you to eat more, which in turn generates more damage to the hippocampus and a vicious cycle of overeating.” Previous work in animals has shown that junk food impairs the hippocampus, a brain region involved in memory and appetite control. It is unclear why, but one idea is that the hippocampus normally blocks or weakens memories about food when we are full, so looking at a cake does not flood the mind with memories of how nice cake can be. “When the hippocampus functions less efficiently, you do get this flood of memories, and so food is more appealing,” Stevenson said. To investigate how the western diet affects humans, the scientists recruited 110 lean and healthy students, aged 20 to 23, who generally ate a good diet. Half were randomly assigned to a control group who ate their normal diet for a week. The other half were put on a high energy western-style diet, which featured a generous intake of Belgian waffles and fast food. © 2020 Guardian News & Media Limited

Keyword: Learning & Memory; Obesity
Link ID: 27050 - Posted: 02.19.2020

Blake Richards Despite billions of dollars spent and decades of research, computation in the human brain remains largely a mystery. Meanwhile, we have made great strides in the development of artificial neural networks, which are designed to loosely mimic how brains compute. We have learned a lot about the nature of neural computation from these artificial brains and it’s time to take what we’ve learned and apply it back to the biological ones. Neurological diseases are on the rise worldwide, making a better understanding of computation in the brain a pressing problem. Given the ability of modern artificial neural networks to solve complex problems, a framework for neuroscience guided by machine learning insights may unlock valuable secrets about our own brains and how they can malfunction. Our thoughts and behaviours are generated by computations that take place in our brains. To effectively treat neurological disorders that alter our thoughts and behaviours, like schizophrenia or depression, we likely have to understand how the computations in the brain go wrong. However, understanding neural computation has proven to be an immensely difficult challenge. When neuroscientists record activity in the brain, it is often indecipherable. In a paper published in Nature Neuroscience, my co-authors and I argue that the lessons we have learned from artificial neural networks can guide us down the right path of understanding the brain as a computational system rather than as a collection of indecipherable cells. © 2010–2020, The Conversation US, Inc.

Keyword: Brain imaging; Robotics
Link ID: 27042 - Posted: 02.14.2020

By Laura Sanders Immune cells in the brain chew up memories, a new study in mice shows. The finding, published in the Feb. 7 Science, points to a completely new way that the brain forgets, says neuroscientist Paul Frankland of the Hospital for Sick Children Research Institute in Toronto, who wasn’t involved in the study. That may sound like a bad thing, but forgetting is just as important as remembering. “The world constantly changes,” Frankland says, and getting rid of unimportant memories — such as a breakfast menu from two months ago — allows the brain to collect newer, more useful information. Exactly how the brain stores memories is still debated, but many scientists suspect that connections between large groups of nerve cells are important (SN: 1/24/18). Forgetting likely involves destroying or changing these large webs of precise connections, called synapses, other lines of research have suggested. The new result shows that microglia, immune cells that can clear debris from the brain, “do exactly that,” Frankland says. Microglia are master brain gardeners that trim extra synapses away early in life, says Yan Gu, a neuroscientist at Zhejiang University School of Medicine in Hangzhou, China. Because synapses have a big role in memory storage, “we started to wonder whether microglia may induce forgetting by eliminating synapses,” Gu says. Gu’s team first gave mice an unpleasant memory: mild foot shocks, delivered in a particular cage. Five days after the shocks, the mice would still freeze in fear when they were placed in the cage. But 35 days later, they had begun to forget and froze less often in the room. © Society for Science & the Public 2000–2020

Keyword: Learning & Memory; Neuroimmunology
Link ID: 27026 - Posted: 02.07.2020

By Charles Zanor We all know people who say they have “no sense of direction,” and our tendency is almost always to minimize such claims rather than take them at full force. Yet for some people that description is literally true, and true in all circumstances: If they take a single wrong turn on an established route they often become totally lost. This happens even when they are just a few miles from where they live. Ellen Rose had been a patient of mine for years before I realized that she had this life-long learning disability. I was made aware of it not long after I moved my psychology office from Agawam, Massachusetts to Suffield, Connecticut, just five miles away. I gave Ellen a fresh set of directions from the Springfield, Massachusetts area that took her south on Interstate 91 to Exit 47W, then across the Connecticut River to Rte 159 in Suffield. I thought it would pose no problem at all for her. A few minutes past her scheduled appointment time she called to say that she was lost. She had come south on Route 91 and had taken the correct exit, but she got confused and almost immediately hooked a right onto a road going directly north, bringing her back over the Massachusetts line to the town of Longmeadow. She knew this was wrong but did not know how to correct it, so I repeated the directions to get on 91 South and so on. Minutes passed, and then more minutes passed, and she called again to say that somehow she had driven by the exit she was supposed to take and was in Windsor, Connecticut. I kept her on the phone and guided her turn by turn to my office. When I asked her why she hadn’t taken Exit 47W, she said that she saw it but it came up sooner than she expected so she kept going. This condition—developmental topographic disorientation—didn’t even have a formal name until 2009, when Giuseppe Iaria reported his first case in the journal Neuropsychologia.
To understand DTD it is best to begin by saying that there are two main ways successful travelers navigate their environment. © 2020 Scientific American

Keyword: Learning & Memory; Development of the Brain
Link ID: 27021 - Posted: 02.05.2020

Scott Grafton When people ask me about the “mind-body connection,” I typically suggest walking on an icy sidewalk. Skip the yoga, mindfulness, or meditation, and head to the corner on a cold, windy, snowy day. Every winter, much of North America becomes exceedingly slippery with ice. Emergency rooms across the continent see a sharp uptick in fractured limbs and hips as people confidently trudge outside in such conditions, unveiling a profound disconnection between what people believe and what they can actually do with their bodies. One might think that a person could call on experience from years past to adjust their movement or provide a little insight or caution. But the truth is that the body forgets what it takes to stay upright in these perilous conditions. Why is there so much forgetting and relearning on an annual basis? We remember how to ride a bike. Why can’t we remember how to walk on ice? I attempt to answer this and other questions concerning the connection (or lack thereof) between motion in the mind and motion by the body in my new book, Physical Intelligence: The Science of How the Body and the Mind Guide Each Other Through Life (Pantheon, January 2020). Falling on ice reveals a delicate tradeoff that the brain must reconcile as it pilots the body. On the one hand, it needs to build refined motor programs to execute skills such as walking, running, and throwing. On the other hand, those programs can’t be too specific. There is a constant need to tweak motor plans to account for dynamic conditions. When I throw a backpack on, my legs don’t walk in the same way as they do without the pack: my stance widens, my stride shortens. Often, the tweaking needs to happen in moments. As I pick the pack up, I need to lean in or I could tip myself over. Just as importantly, as soon as I put it down, I need to forget I ever held it in the first place. © 1986–2020 The Scientist

Keyword: Learning & Memory
Link ID: 27001 - Posted: 01.28.2020

By Betsy Mason

Despite weighing less than half an ounce, mountain chickadees are able to survive harsh winters complete with subzero temperatures, howling winds and heavy snowfall. How do they do it? By spending the fall hiding as many as 80,000 individual seeds, which they then retrieve — by memory — during the winter. Their astounding ability to keep track of that many locations puts their memory among the most impressive in the animal kingdom. It also makes chickadees an intriguing subject for animal behavior researchers.

Cognitive ecologist Vladimir Pravosudov of the University of Nevada, Reno, has dedicated his career to studying this tough little bird’s amazing memory. Writing in 2013 on the cognitive ecology of food caching in the Annual Review of Ecology, Evolution, and Systematics, he and coauthor Timothy Roth argued that answers to big questions about the evolution of cognition may lie in the brains of these little birds.

In July, at a meeting of the Animal Behavior Society in Chicago, Pravosudov presented his group’s latest research on the wild chickadees that live in the Sierra Nevada mountains. He and his graduate students were able to show for the first time that an individual bird’s spatial memory has a direct impact on its survival. The team did this by building an experimental contraption that uses radio-frequency identification (RFID) technology and electronic leg bands to test individual birds’ memory in the wild and then track their longevity. The researchers found that the birds with the best memory were most likely to survive the winter.

What are some of the big ideas driving your work on chickadees?

If some species are smart, or not smart, the question is: Why? Cognitive ecologists like me are specifically trying to figure out which ecological factors may have shaped the evolution of these differences in cognition. In other words, the idea is to understand the ecological and evolutionary reasons for variation in cognition.
© 2020 Annual Reviews, Inc

Keyword: Learning & Memory
Link ID: 26968 - Posted: 01.17.2020

By Daniel J. Levitin

I’m 62 years old as I write this. Like many of my friends, I forget names that I used to be able to conjure up effortlessly. When packing my suitcase for a trip, I walk to the hall closet and by the time I get there, I don’t remember what I came for. And yet my long-term memories are fully intact. I remember the names of my third-grade classmates, the first record album I bought, my wedding day. This is widely understood to be a classic problem of aging. But as a neuroscientist, I know that the problem is not necessarily age-related.

Short-term memory contains the contents of your thoughts right now, including what you intend to do in the next few seconds. It’s doing some mental arithmetic, thinking about what you’ll say next in a conversation or walking to the hall closet with the intention of getting a pair of gloves. Short-term memory is easily disturbed or disrupted. It depends on your actively paying attention to the items that are in the “next thing to do” file in your mind. You do this by thinking about them, perhaps repeating them over and over again (“I’m going to the closet to get gloves”). But any distraction — a new thought, someone asking you a question, the telephone ringing — can disrupt short-term memory.

Our ability to automatically restore the contents of short-term memory declines slightly with every decade after 30. But age is not the major factor so commonly assumed. I’ve been teaching undergraduates for my entire career and I can attest that even 20-year-olds make short-term memory errors — loads of them. They walk into the wrong classroom; they show up to exams without the requisite No. 2 pencil; they forget something I just said two minutes before. These are similar to the kinds of things 70-year-olds do.

© 2020 The New York Times Company

Keyword: Learning & Memory; Alzheimers
Link ID: 26952 - Posted: 01.13.2020

By Matthew Hutson

When you are stuck on a problem, sometimes it is best to stop thinking about it—consciously, anyway. Research has shown that taking a break or a nap can help the brain create pathways to a solution. Now a new study expands on the effect of this so-called incubation by using sound cues to focus the sleeping mind on a targeted problem.

When humans sleep, parts of the brain replay certain memories, strengthening and transforming them. About a decade ago researchers developed a technique, called targeted memory reactivation (TMR), aimed at further reinforcing selected memories: when a sound becomes associated with a memory and is later played during sleep, that memory gets reactivated.

In a study published last November in Psychological Science, scientists tested whether revisiting the memory of a puzzle during sleep might also improve problem-solving. About 60 participants visited the laboratory before and after a night of sleep. In an evening session, they attempted spatial, verbal and conceptual puzzles, with a distinct music clip repeating in the background for each, until they had worked on six puzzles they could not solve. Overnight they wore electrodes to detect slow-wave sleep—slumber's deepest phase, which may be important for memory consolidation—and a device played the sounds assigned to three of the six unsolved puzzles.

The next day, back at the lab, the participants attempted the six puzzles again. (Each repeated the experiment with a different set of puzzles the following night.) All told, the subjects solved 32 percent of the sound-prompted puzzles versus 21 percent of the untargeted puzzles—a boost of more than 50 percent.

© 2020 Scientific American
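The "more than 50 percent" figure is a relative improvement, which is easy to verify from the two solve rates reported above. A minimal back-of-the-envelope check (the rates come from the study summary; the helper function itself is just illustrative):

```python
def relative_boost(treated_rate: float, control_rate: float) -> float:
    """Relative improvement of the treated condition over the control."""
    return (treated_rate - control_rate) / control_rate

# Solve rates reported in the study: 32% for sound-prompted puzzles,
# 21% for untargeted puzzles.
boost = relative_boost(0.32, 0.21)
print(f"{boost:.1%}")  # prints "52.4%", i.e. "a boost of more than 50 percent"
```

Note the distinction: the absolute difference is 11 percentage points, while the relative boost of roughly 52 percent is what the article reports.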

Keyword: Sleep; Learning & Memory
Link ID: 26938 - Posted: 01.07.2020

Natalie C. Tronson, Ph.D.

We all have a strong intuitive sense of what memory is: it’s the conscious recollection of events, people, and places from our past. And it’s something we often wish we were better at, so that we didn’t continuously lose our keys or forget where the car was parked, and so that we could remember more facts for exams, people’s birthdays, or what we came all the way upstairs to grab.

But memory is so much more. Memory is also how I can find my way around the town I live in now, and how I can still find my way around the town I grew up in, despite the many changes over the 25 years since I left. It’s how I know how to drive the car, and how I can sing four verses of Mary Had a Little Lamb to my child sitting in the back seat demanding that I sing. It’s why I know to stop at the red light, go at the green, and avoid the stretch of road that has been under construction for the past six months. It’s also one reason why I feel anxious when pedestrians run across the street randomly, and why our cats come running home when they hear the front door of our house open.

That’s a lot of different types of memory just for a quick drive home: memory for spatial learning, verbal memory for songs, motor learning for driving, and episodic memory, among others, are in there too. Not only are there a lot of different types of memory, but learning and memory processes also take up a lot of real estate and energy in our brains (and in the brains of many other species).

© 2020 Sussex Publishers, LLC

Keyword: Learning & Memory
Link ID: 26937 - Posted: 01.07.2020