Chapter 17. Learning and Memory

By Victoria Davis Some people can trace their traditions back decades; the swamp sparrow has passed its songs down for more than 1,500 years. The findings, published today in Nature Communications, suggest humans are not alone in keeping practices alive for long periods of time. To conduct the study, researchers recorded a collection of songs from 615 adult male swamp sparrows from six densely populated areas across the northeastern United States. They dissected each bird’s song repertoire, identifying only 160 different syllable types across all the recorded samples. Most swamp sparrows sang the same tunes, using the same common syllables, but there were a few rare types in each population, just as there are variations in human oral histories over time. Using a statistical method called approximate Bayesian computation and models that measure the diversity of syllable types present in each population, the scientists were able to calculate how the songs of each male would have changed over time. They also found that all but two of the most common syllables used during their sampling in 2009 were also the most common during an earlier study of the species when recordings were made in the 1970s. Overall, the analysis indicated that the oldest tunes date back, on average, about 1,537 years. © 2018 American Association for the Advancement of Science
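
The paper's statistical machinery is more elaborate than this summary can convey, but the core of approximate Bayesian computation is plain rejection sampling: draw a candidate parameter from a prior, simulate data under it, and keep the draw only if a summary statistic of the simulation lands close to the observed one. The Python sketch below is a minimal, hypothetical illustration; the prior, simulator and summary statistic are placeholder stand-ins, not the authors' model of song transmission.

    import random

    def abc_rejection(observed_stat, prior_sample, simulate, distance,
                      epsilon=0.05, n_draws=10000):
        # Keep only the parameter draws whose simulated summary statistic
        # falls within epsilon of the observed statistic.
        accepted = []
        for _ in range(n_draws):
            theta = prior_sample()                     # candidate parameter
            if distance(simulate(theta), observed_stat) < epsilon:
                accepted.append(theta)
        return accepted                                # approximate posterior sample

    # Hypothetical toy ingredients (not the study's transmission model):
    prior_sample = lambda: random.uniform(0.0, 0.2)    # per-generation copying-error rate
    simulate = lambda err: 1.0 - err                   # toy "shared-syllable" statistic
    distance = lambda a, b: abs(a - b)

    posterior = abc_rejection(0.98, prior_sample, simulate, distance)
    print(len(posterior), "accepted draws")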

Keyword: Animal Communication; Language
Link ID: 25112 - Posted: 06.21.2018

Maria Temming Getting robots to do what we want would be a lot easier if they could read our minds. That sci-fi dream might not be so far off. With a new robot control system, a human can stop a bot from making a mistake and get the machine back on track using brain waves and simple hand gestures. People who oversee robots in factories, homes or hospitals could use this setup, to be presented at the Robotics: Science and Systems conference on June 28, to ensure bots operate safely and efficiently. Electrodes worn on the head and forearm allow a person to control the robot. The head-worn electrodes detect electrical signals called error-related potentials — which people’s brains unconsciously generate when they see someone goof up — and send an alert to the robot. When the robot receives an error signal, it stops what it is doing. The person can then make hand gestures — detected by arm-worn electrodes that monitor electrical muscle signals — to show the bot what it should do instead. MIT roboticist Daniela Rus and colleagues tested the system with seven volunteers. Each user supervised a robot that moved a drill toward one of three possible targets, each marked by an LED bulb, on a mock airplane fuselage. Whenever the robot zeroed in on the wrong target, the user’s mental error-alert halted the bot. And when the user flicked his or her wrist left or right to redirect the robot, the machine moved toward the proper target. In more than 1,000 trials, the robot initially aimed for the correct target about 70 percent of the time, and with human intervention chose the right target more than 97 percent of the time. The team plans to build a version of the system that recognizes a wider variety of user movements. That way, “you can gesture how the robot should move, and your motion can be more fluidly interpreted,” says study coauthor Joseph DelPreto, also a roboticist at MIT. |© Society for Science & the Public 2000 - 2018
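
The article gives only the outline of the supervision loop; a rough sketch of how such a loop could be wired together appears below. Every name here (the stream readers, the detectors and the robot interface) is a hypothetical placeholder, not the MIT team's actual code.

    def supervise(robot, eeg_stream, emg_stream, detect_errp, classify_gesture):
        # Watch the EEG for error-related potentials; on detection, stop the
        # robot and redirect it according to a classified EMG wrist gesture.
        while robot.is_running():
            eeg_window = eeg_stream.read()                     # head-worn electrodes
            if detect_errp(eeg_window):                        # brain flags a mistake
                robot.stop()
                gesture = classify_gesture(emg_stream.read())  # 'left', 'right' or None
                if gesture == 'left':
                    robot.move_toward(robot.targets['left'])
                elif gesture == 'right':
                    robot.move_toward(robot.targets['right'])
            else:
                robot.continue_action()

In the experiment described above, the redirect step would correspond to swinging the drill toward a different LED-marked target.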

Keyword: Brain imaging; Robotics
Link ID: 25111 - Posted: 06.20.2018

Siobhan Roberts In May 2013, the mathematician Carina Curto attended a workshop in Arlington, Virginia, on “Physical and Mathematical Principles of Brain Structure and Function” — a brainstorming session about the brain, essentially. The month before, President Obama had issued one of his “Grand Challenges” to the scientific community in announcing the BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies), aimed at spurring a long-overdue revolution in understanding our three-pound organ upstairs. In advance of the workshop, the hundred or so attendees each contributed to a white paper addressing the question of what they felt was the most significant obstacle to progress in brain science. Answers ran the gamut — some probed more generally, citing the brain’s “utter complexity,” while others delved into details about the experimental technology. Curto, an associate professor at Pennsylvania State University, took a different approach in her entry, offering an overview of the mathematical and theoretical technology: A major obstacle impeding progress in brain science is the lack of beautiful models. Let me explain. … Many will agree that the existing (and impending) deluge of data in neuroscience needs to be accompanied by advances in computational and theoretical approaches — for how else are we to “make sense” of these data? What such advances should look like, however, is very much up to debate. … How much detail should we be including in our models? … How well can we defend the biological realism of our theories? All Rights Reserved © 2018

Keyword: Robotics; Learning & Memory
Link ID: 25108 - Posted: 06.20.2018

Ed Yong Peter, aged 3, was scared of rabbits. So Mary Cover Jones kept bringing him rabbits. She’d take a caged rabbit up to Peter while he ate some candy and played with other children. At first, Peter was terrified by the mere presence of a rabbit in the same room. But soon, he allowed the animal to get closer—12 feet, then four, then three. Eventually, Peter was happy for rabbits to nibble his fingers. “The case of Peter illustrates how a fear may be removed under laboratory conditions,” Cover Jones wrote in 1924. Cover Jones is now recognized as the “mother of behavioral therapy.” Her observations laid the groundwork for what would become known as exposure therapy—the practice of getting people to overcome their fears by facing them in controlled settings. A century later, neuroscientists can watch how the act of facing one’s fears actually plays out inside the brain. Using gene-engineering tools, they can label the exact neurons in a mouse’s brain that store a specific fearful memory. Then, they can watch what happens when the rodent recalls those experiences. By doing this, Ossama Khalaf from the EPFL in Lausanne showed that the extinction of fear depends on reactivating the neurons that encode it. A mouse has to re-experience a deep-rooted fear if it is to lose it. When someone encounters a new experience—say, a terrifying rabbit—groups of neurons in their brain fire together, the connections between them become stronger, and molecules accumulate at the places where neurons meet. Many scientists believe that these preserved patterns of strengthened connections are the literal stuff of memories—the physical representations of the things we remember. These connected neuron groups are called engrams.

Keyword: Emotions; Learning & Memory
Link ID: 25101 - Posted: 06.18.2018

by Sarah DiGiulio Why is it that you can perfectly recite the words to *NSYNC’s “Bye Bye Bye,” but can’t remember the title of the new TV show you started watching on Netflix and wanted to tell your coworker about? We remember things because they stand out, because they relate to and can easily be integrated into our existing knowledge base, or because we retrieve, recount or use them repeatedly over time, explains Sean Kang, PhD, assistant professor in the Department of Education at Dartmouth College, whose research focuses on the cognitive psychology of learning and memory. “The average layperson trying to learn nuclear physics for the first time, for example, will probably find it very difficult to retain that information.” That’s because he or she likely doesn’t have existing knowledge in their brain to connect that new information to. And on a molecular level, neuroscientists suspect that there’s actually a physical process that needs to be completed to form a memory — and our not remembering something is a result of that process not happening, explains Blake Richards, DPhil, assistant professor in the Department of Biological Sciences and Fellow at the Canadian Institute for Advanced Research. Just as you make a physical change to a piece of paper when you write down a grocery list, or to the magnetization of part of your hard drive when you save a file, a physical change happens in your brain when you store a memory or new information. © 2018 NBC UNIVERSAL

Keyword: Learning & Memory
Link ID: 25085 - Posted: 06.14.2018

By Douglas Woods It often starts with a simple, subtle behavior like a rapid eye blink. Sometimes it’s a nose-scrunch or a sniff that is confused with a lingering cold or an allergy. Often, these habits go away on their own, but in about 1 percent of children (boys more so than girls), these blinks, twitches, and coughs become the persistent tic disorder known as Tourette syndrome (TS), a misunderstood and stigmatizing neurological condition. Media portrayals of TS often overemphasize the rare (fewer than 15 percent of cases) symptoms, in which people with TS shout obscene words—a symptom known as coprolalia—but most patients have a wide range of movements and sounds, ranging from simple tics to more complex ones that often look intentional but are not. Hidden beneath the tics are “premonitory urges”—unpleasant sensations that build until the tic occurs. Ticcing brings a brief sense of relief, but the urges soon return. We know that TS is a genetically based neurological disorder that is strongly influenced by a person’s surroundings. The disorder stems from a problem within the basal ganglia, a series of structures in the brain that are responsible for selecting and inhibiting our movements. When neurons fire, signaling us to move, the basal ganglia serve as a filter, allowing some of these signals to pass through and become movements. Other movement signals that are not needed in a particular situation are held back. © 2018 Scientific American
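
As a very loose way to picture that filtering role, the toy function below releases only the movement commands whose drive exceeds a tonic inhibition level and holds the rest back. The threshold and the candidate movements are illustrative assumptions, not a biophysical model of basal ganglia circuitry.

    def gate_movements(candidate_signals, tonic_inhibition=0.6):
        # Toy "filter": release only the movement commands whose drive
        # exceeds the tonic inhibition level; suppress the rest.
        return {name: drive for name, drive in candidate_signals.items()
                if drive > tonic_inhibition}

    # Example: the blink command is released; the shrug and sniff are held back.
    print(gate_movements({'blink': 0.9, 'shrug': 0.3, 'sniff': 0.5}))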

Keyword: Tourettes; Learning & Memory
Link ID: 25062 - Posted: 06.06.2018

By Matthew Hutson It's a Saturday morning in February, and Chloe, a curious 3-year-old in a striped shirt and leggings, is exploring the possibilities of a new toy. Her father, Gary Marcus, a developmental cognitive scientist at New York University (NYU) in New York City, has brought home some strips of tape designed to adhere Lego bricks to surfaces. Chloe, well-versed in Lego, is intrigued. But she has always built upward. Could she use the tape to build sideways or upside down? Marcus suggests building out from the side of a table. Ten minutes later, Chloe starts sticking the tape to the wall. "We better do it before Mama comes back," Marcus says in a singsong voice. "She won't be happy." (Spoiler: The wall paint suffers.) Implicit in Marcus's endeavor is an experiment. Could Chloe apply what she had learned about an activity to a new context? Within minutes, she has a Lego sculpture sticking out from the wall. "Papa, I did it!" she exclaims. In her adaptability, Chloe is demonstrating common sense, a kind of intelligence that, so far, computer scientists have struggled to reproduce. Marcus believes the field of artificial intelligence (AI) would do well to learn lessons from young thinkers like her. Researchers in machine learning argue that computers trained on mountains of data can learn just about anything—including common sense—with few, if any, programmed rules. These experts "have a blind spot, in my opinion," Marcus says. "It's a sociological thing, a form of physics envy, where people think that simpler is better." He says computer scientists are ignoring decades of work in the cognitive sciences and developmental psychology showing that humans have innate abilities—programmed instincts that appear at birth or in early childhood—that help us think abstractly and flexibly, like Chloe. He believes AI researchers ought to include such instincts in their programs. © 2018 American Association for the Advancement of Science.

Keyword: Learning & Memory; Development of the Brain
Link ID: 25026 - Posted: 05.26.2018

By Ruth Williams The sun’s ultraviolet (UV) radiation is a major cause of skin cancer, but it offers some health benefits too, such as boosting production of essential vitamin D and improving mood. Today (May 17), a report in Cell adds enhanced learning and memory to UV’s unexpected benefits. Researchers have discovered that, in mice, exposure to UV light activates a molecular pathway that increases production of the brain chemical glutamate, heightening the animals’ ability to learn and remember. “The subject is of strong interest, because it provides additional support for the recently proposed theory of ultraviolet light’s regulation of the brain and central neuroendocrine system,” dermatologist Andrzej Slominski of the University of Alabama, who was not involved in the research, writes in an email to The Scientist. “It’s an interesting and timely paper investigating the skin-brain connection,” notes skin scientist Martin Steinhoff of University College Dublin’s Center for Biomedical Engineering, who also did not participate in the research. “The authors make an interesting observation linking moderate UV exposure to . . . [production of] the molecule urocanic acid. They hypothesize that this molecule enters the brain, activates glutaminergic neurons through glutamate release, and that memory and learning are increased.” While the work is “fascinating, very meticulous, and extremely detailed,” says dermatologist David Fisher of Massachusetts General Hospital and Harvard Medical School, “it does not imply that UV is actually good for you. . . . Across the board, for humanity, UV really is dangerous.” © 1986-2018 The Scientist

Keyword: Intelligence; Vision
Link ID: 24993 - Posted: 05.18.2018

By Shawn Hayward Brenda Milner has collected her share of awards, prizes, honorary degrees and other recognitions throughout her amazing career, but there is something special about being recognized with top honours from the city and province she has called home since 1944, all within one week. On May 8, the Speaker of the National Assembly of Quebec, Jacques Chagnon, presented Milner with its Medal of Honour, along with seven other Quebecers including McGill alumna Dr. Joanne Liu. The Medal of Honour is awarded to public figures from all walks of life who, through their career, their work or their social commitment, have earned the recognition of the Members of the National Assembly and the people of Quebec. Milner added to that recognition the title of Commander of the Order of Montreal, given to her by Mayor Valérie Plante during a ceremony at City Hall on May 14. The Order of Montreal was created on the city’s 375th anniversary to recognize women and men who have contributed in a remarkable way to the city’s development and reputation. There are three ranks in the Order, Commander being the highest. A celebrated researcher at the Montreal Neurological Institute and Hospital (The Neuro), Milner turns 100 years old on July 15. She is the Dorothy J. Killam Professor at The Neuro, and a professor in the Department of Neurology and Neurosurgery at McGill University. © 2018 McGill Reporter

Keyword: Learning & Memory
Link ID: 24984 - Posted: 05.17.2018

By Usha Lee McFarling, STAT UCLA neuroscientists reported Monday that they have transferred a memory from one animal to another via injections of RNA, a startling result that challenges the widely held view of where and how memories are stored in the brain. The finding from the lab of David Glanzman hints at the potential for new RNA-based treatments to one day restore lost memories and, if correct, could shake up the field of memory and learning. “It’s pretty shocking,” said Dr. Todd Sacktor, a neurologist and memory researcher at SUNY Downstate Medical Center in Brooklyn, N.Y. “The big picture is we’re working out the basic alphabet of how memories are stored for the first time.” He was not involved in the research, which was published in eNeuro, the online journal of the Society for Neuroscience. Many scientists are expected to view the research more cautiously. The work is in snails, animals that have proven a powerful model organism for neuroscience but whose simple brains work far differently than those of humans. The experiments will need to be replicated, including in animals with more complex brains. And the results fly in the face of a massive amount of evidence supporting the deeply entrenched idea that memories are stored through changes in the strength of connections, or synapses, between neurons. © 2018 Scientific American

Keyword: Learning & Memory
Link ID: 24979 - Posted: 05.15.2018

Laurel Hamers Sluggish memories might be captured via RNA. The molecule, when taken from one sea slug and injected into another, appeared to transfer a rudimentary memory between the two, a new study suggests. Most neuroscientists believe long-term memories are stored by strengthening connections between nerve cells in the brain (SN: 2/3/18, p. 22). But these results, reported May 14 in eNeuro, buoy a competing argument: that some types of RNA molecules, and not linkages between nerve cells, are key to long-term memory storage. “It’s a very controversial idea,” admits study coauthor David Glanzman, a neuroscientist at UCLA. When poked or prodded, some sea slugs (Aplysia californica) will reflexively pull their siphon, a water-filtering appendage, into their bodies. Using electric shocks, Glanzman and his colleagues sensitized sea slugs to have a longer-lasting siphon-withdrawal response — a very basic form of memory. The team extracted RNA from those slugs and injected it into slugs that hadn’t been sensitized. These critters then showed the same long-lasting response to touch as their shocked companions. RNA molecules come in a variety of flavors that carry out specialized jobs, so it’s not yet clear what kind of RNA may be responsible for the effect, Glanzman says. But he suspects that it’s one of the handful of RNA varieties that don’t carry instructions to make proteins, the typical job of most RNA. (Called noncoding RNAs, these molecules are often involved in manipulating genes’ activity.) |© Society for Science & the Public 2000 - 2018.

Keyword: Learning & Memory
Link ID: 24978 - Posted: 05.15.2018

By Gretchen Reynolds Call them tip-of-the-tongue moments: those times we can’t quite call up the name or word that we know we know. These frustrating lapses are thought to be caused by a brief disruption in the brain’s ability to access a word’s sounds. We haven’t forgotten the word, and we know its meaning, but its formulation dances teasingly just beyond our grasp. Though these mental glitches are common throughout life, they become more frequent with age. Whether this is an inevitable part of growing older or somehow lifestyle-dependent is unknown. But because evidence already shows that physically fit older people have reduced risks for a variety of cognitive deficits, researchers recently looked into the relationship between aerobic fitness and word recall. For the study, whose results appeared last month in Scientific Reports, researchers at the University of Birmingham tested the lungs and tongues, figuratively speaking, of 28 older men and women at the school’s human-performance lab. Volunteers were between 60 and 80 and healthy, with no clinical signs of cognitive problems. Their aerobic capacities were measured by having them ride a specialized stationary bicycle to exhaustion; fitness levels among the subjects varied greatly. This group and a second set of volunteers in their 20s then sat at computers as word definitions flashed on the screens, prompting them to indicate whether they knew and could say the implied word. The vocabulary tended to be obscure — “decanter,” for example — because words rarely used are the hardest to summon quickly. To no one’s surprise, the young subjects experienced far fewer tip-of-the-tongue failures than the seniors, even though they had smaller vocabularies over all, according to other tests. Within the older group, the inability to identify and say the right words was strongly linked to fitness. The more fit someone was, the less likely he or she was to go through a “what’s that word again?” moment of mental choking. © 2018 The New York Times Company

Keyword: Learning & Memory
Link ID: 24974 - Posted: 05.15.2018

By Alexandra Sacks, M.D. A new mother finally gets her fussy baby to sleep and steps into a relaxing hot shower — with her glasses on. At a family barbecue she can’t recall the name of a relative she rarely sees. It’s easy to laugh off such lapses as “mommy brain,” but there remains a cultural belief that pregnancy and child care impact a woman’s cognition and mental life, long after a baby is born. Women have often chalked up these changes to hormones, fatigue and the intoxicating love for a new baby. Hormones do affect cognition, and, as anyone who has ever done shift work or had jet lag knows, sleep deprivation saps our mental abilities. And the current evidence in scientific literature suggests that pregnancy changes the brain on a physical, cellular level in ways that we are only beginning to understand. However, there is no convincing scientific evidence that pregnancy causes an overall decline in cognitive performance or memory. Instead, most experts believe that pregnant women’s brain changes are an example of neuroplasticity, the process in which the brain changes throughout life by reorganizing connections in response to the stimulation of new experiences, and neurogenesis, the process of growth that allows for new learning. A 2016 study in Nature Neuroscience found that even two years after pregnancy, women had gray matter brain changes in regions involved in social cognition or the ability to empathically understand what is going on in the mind of another person, to put yourself in their shoes. It may be that some subtle aspects of memory are sacrificed to enhance other areas of cognition. A 2010 study in Psychoneuroendocrinology showed that pregnant women experienced some impairment in the ability to remember words, but did not show changes in other memory functions such as recognition or working memory. This means that these women might forget the name of a character in their favorite TV show, for example, but would have no trouble in the type of memory that involves learning, reasoning and comprehension. © 2018 The New York Times Company

Keyword: Sexual Behavior; Learning & Memory
Link ID: 24969 - Posted: 05.12.2018

By Neuroskeptic A new paper in ACS Chemical Neuroscience pulls no punches in claiming that most of what we know about the neuroscience of learning is wrong: Dendritic Learning as a Paradigm Shift in Brain Learning. According to authors Shira Sardi and colleagues, the prevailing view—that learning takes place in the synapses—is mistaken. Instead, they say, ‘dendritic learning’ is how brain cells really store information. If a neuron is a tree, the dendrites are the branches, while the synapses are the leaves on the ends of those branches. Here’s how Sardi et al. contrast the two ideas: in synaptic learning, each synapse can independently adjust its strength; in dendritic learning, each neuron has only a small number of adjustable units, corresponding to the main dendritic branches. The evidence for dendritic learning, Sardi et al. say, comes from experiments using cultured neurons in which they found that a) some neurons are more likely to fire when stimulated in certain places, suggesting that dendrites can vary in their excitability, and b) these (presumed) dendritic excitability levels are plastic (they can ‘learn’).
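
To make the contrast concrete, here is a toy numerical sketch (my own illustration, not a model from the paper): the 'synaptic' unit carries one adjustable weight per input line, while the 'dendritic' unit pools its inputs onto a few branches and adapts only the per-branch gains.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.random(12)                       # 12 input lines (the "leaves")

    # Synaptic learning: one adjustable weight per input -> 12 free parameters.
    w = rng.random(12)
    y_synaptic = float(w @ x)

    # Dendritic learning: inputs are pooled onto 3 branches and only the
    # 3 branch gains adapt -> 3 free parameters.
    branches = x.reshape(3, 4).sum(axis=1)   # each branch pools 4 inputs
    g = rng.random(3)
    y_dendritic = float(g @ branches)

    # A toy error-driven update touches 12 numbers in the first scheme
    # but only 3 in the second.
    error = 1.0 - y_dendritic
    g += 0.1 * error * branches              # adjust branch excitabilities only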

Keyword: Learning & Memory
Link ID: 24968 - Posted: 05.12.2018

Maria Temming An artificial intelligence that navigates its environment much like mammals do could help solve a mystery about our own internal GPS. Equipped with virtual versions of specialized brain nerve cells called grid cells, the AI could easily solve and plan new routes through virtual mazes. That performance, described online May 9 in Nature, suggests the grid cells in animal brains play a critical role in path planning. “This is a big step forward” in understanding our own navigational neural circuitry, says Ingmar Kanitscheider, a computational neuroscientist at the University of Texas at Austin not involved in the work. The discovery that rats track their location with the help of grid cells, which project an imaginary hexagonal lattice onto an animal’s surroundings, earned a Norwegian research team the 2014 Nobel Prize in physiology or medicine (SN Online: 10/6/14). Neuroscientists suspected these cells, which have also been found in humans, might help not only give mammals an internal coordinate system, but also plan direct paths between points (SN Online: 8/5/13). To test that idea, neuroscientist Caswell Barry at University College London, along with colleagues at Google DeepMind, created an AI that contained virtual nerve cells, or neurons, whose activity resembled that of real grid cells. The researchers trained this AI to navigate virtual mazes by giving the system reward signals when it reached its destination. |© Society for Science & the Public 2000 - 2018
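
An idealized grid cell's hexagonal firing pattern is commonly modeled as the sum of three cosine gratings whose orientations are 60 degrees apart. The sketch below generates such a rate map; it is the standard textbook idealization, not the learned representation inside DeepMind's network, and the spacing and phase values are arbitrary.

    import numpy as np

    def grid_cell_rate(x, y, spacing=0.5, phase=(0.0, 0.0)):
        # Sum of three cosine gratings 60 degrees apart; peaks form a
        # hexagonal lattice with the requested spacing between field centers.
        k = 4 * np.pi / (np.sqrt(3) * spacing)
        angles = np.deg2rad([0.0, 60.0, 120.0])
        total = sum(np.cos(k * ((x - phase[0]) * np.cos(a) +
                                (y - phase[1]) * np.sin(a)))
                    for a in angles)
        return (total + 1.5) / 4.5           # rescale from [-1.5, 3] to [0, 1]

    # Evaluate over a 2 m x 2 m arena; high-rate spots tile it hexagonally.
    xs, ys = np.meshgrid(np.linspace(0, 2, 200), np.linspace(0, 2, 200))
    rate_map = grid_cell_rate(xs, ys)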

Keyword: Learning & Memory; Robotics
Link ID: 24958 - Posted: 05.10.2018

Alison Abbott Scientists have used artificial intelligence (AI) to recreate the complex neural codes that the brain uses to navigate through space. The feat demonstrates how powerful AI algorithms can assist conventional neuroscience research to test theories about the brain’s workings — but the approach is not going to put neuroscientists out of work just yet, say the researchers. “It really was a very striking and remarkable convergence of form and function.” The computer program, details of which were published in Nature on 9 May, was developed by neuroscientists at University College London (UCL) and AI researchers at the London-based Google company DeepMind. It used a technique called deep learning — a type of AI inspired by the structures in the brain — to train a computer-simulated rat to track its position in a virtual environment. The program surprised the scientists by spontaneously generating hexagonal-shaped patterns of activity akin to those generated by navigational cells in the mammalian brain called grid cells. Grid cells have been shown in experiments with real rats to be fundamental to how an animal tracks its own position in space. What’s more, the simulated rat was able to use the grid-cell-like coding to navigate a virtual maze so well that it even learnt to take shortcuts. © 2018 Macmillan Publishers Limited,

Keyword: Learning & Memory; Robotics
Link ID: 24957 - Posted: 05.10.2018

By Gretchen Reynolds Exercise changes the brains and sperm of male animals in ways that later affect the brains and thinking skills of their offspring, according to a fascinating new study involving mice. The findings indicate that some of the brain benefits of physical activity may be passed along to children, even if a father does not begin to exercise until adulthood. We already have plenty of scientific evidence showing that exercise is good for our brains, whether we are mice or people. Among other effects, physical activity can strengthen the connections between neurons in the hippocampus, a crucial part of the brain involved in memory and learning. Stronger neuronal connections there generally mean sharper thinking. Studies also indicate that exercise, like other aspects of lifestyle, can alter how genes work — whether and when they get turned on or off, for instance — and those changes can get passed on to children. This process is known as epigenetics. But it had not been clear whether structural changes in the brain caused by exercise might also have epigenetic effects that would result in meaningful changes in the brains of the next generation. In other words, would exercise by a parent help to produce smarter babies? And, in particular, would this process occur in males, who contribute sperm but not a womb and its multitude of hormones, cells and tissues to their children? To find out, researchers at the German Center for Neurodegenerative Diseases in Göttingen, Germany, and other institutions gathered a large group of genetically identical male mice. Because the animals were genetically the same at the start, any differences in their bodies and behavior that cropped up later should be a result of lifestyle. © 2018 The New York Times Company

Keyword: Epigenetics; Learning & Memory
Link ID: 24951 - Posted: 05.09.2018

By Diana Kwon Scientists have long attempted to understand where, and how, the brain stores memories. At the beginning of the 20th century, German scientist Richard Semon coined the term “engram” to describe the hypothetical physical representations of memories in the brain. Then, in the 1940s, Canadian psychologist Donald Hebb proposed that, when neurons encoded memories, connections, called synapses, between coactivated memory, or engram, cells were strengthened—a theory that was famously paraphrased as neurons that “fire together, wire together.” These two ideas have become the cornerstone of memory research—and in the decades since they first emerged, scientists have amassed evidence supporting them. “Donald Hebb suggested that it’s not engram cells that are the critical part of storing the memory, it’s the synapse between engram cells,” says Bong-Kiun Kaang, a neuroscientist at Seoul National University in South Korea. However, he adds, while there has been much indirect evidence that such a process underlies memory formation, such as studies showing long-term potentiation (the process through which two simultaneously activated neurons show enhanced connectivity), direct evidence has been lacking. One of the key issues has been the lack of tools to directly observe this process, Kaang says. To overcome this limitation, he and his colleagues injected a virus containing recombinant DNA—coding for different colors of fluorescent proteins for engram and non-engram cells—into the brains of mice. Using this technique, the team was able to pinpoint which type of cell had connected with a postsynaptic neuron. The endeavor wasn’t easy. Developing the method and getting it to work experimentally was a painstaking process that took almost a decade, Kaang tells The Scientist. So when his team finally managed to get promising results around two years ago, “we were very excited,” he says. © 1986-2018 The Scientist
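
The Hebbian slogan quoted above is usually written as a weight change proportional to the product of pre- and postsynaptic activity. The snippet below is a generic textbook sketch of that rule, not the fluorescence-labeling method the Seoul team developed.

    import numpy as np

    def hebbian_update(weights, pre, post, lr=0.01):
        # Hebb's rule: strengthen each synapse in proportion to the
        # coincident activity of its pre- and postsynaptic neurons.
        return weights + lr * np.outer(post, pre)

    # Repeated coactivation of the same pattern keeps strengthening the
    # same synapses -- "fire together, wire together."
    pre = np.array([1.0, 0.0, 1.0])     # presynaptic firing
    post = np.array([0.0, 1.0])         # postsynaptic firing
    w = np.zeros((2, 3))
    for _ in range(2):
        w = hebbian_update(w, pre, post)
    print(w)                            # only co-active pairs gained weight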

Keyword: Learning & Memory
Link ID: 24916 - Posted: 04.28.2018

by Lauren Neergaard It’s pretty extraordinary for people in their 80s and 90s to keep the same sharp memory as someone several decades younger, so scientists are peeking into the brains of “superagers” who do, to uncover their secret. The work is the flip side of the disappointing hunt for new drugs to fight or prevent Alzheimer’s disease. Instead of tackling that problem, “why don’t we figure out what it is we might need to do to maximize our memory?” said neuroscientist Emily Rogalski, who leads the SuperAging study at Northwestern University in Chicago. Parts of the brain shrink with age, one of the reasons that most people experience a gradual slowing of at least some types of memory late in life. But it turns out that superagers’ brains aren’t shrinking nearly as fast as their peers’. And autopsies of the first superagers to die during the study show they harbor a lot more of a special kind of nerve cell in a deep-brain region that’s important for attention, Rogalski told a recent meeting of the American Association for the Advancement of Science. These elite elders are “more than just an oddity or a rarity,” said neuroscientist Molly Wagster of the National Institute on Aging, which helps fund the research. “There’s the potential for learning an enormous amount and applying it to the rest of us, and even to those who may be on a trajectory for some type of neurodegenerative disease.” © 1996-2018 The Washington Post

Keyword: Alzheimers; Learning & Memory
Link ID: 24874 - Posted: 04.17.2018