Chapter 17. Learning and Memory



By Perri Klass, M.D. A major international study provides new reassurance around the question of whether young children who have anesthesia are more likely to develop learning disabilities. The issue has troubled pediatric anesthesiologists and parents for well over a decade, after research on animals suggested that there was a connection. Do the drugs that make it possible to perform vital surgical procedures without pain cause lasting damage to the developing human brain? Several large studies have found ways to tease out the effects of actual surgeries and anesthetic exposures on children. The new study, in the British journal The Lancet, is a randomized controlled trial involving more than 700 infants who needed hernia repairs. The babies, at 28 hospitals in seven countries, were randomly assigned to receive either general anesthesia or regional (spinal) anesthesia for these short operations — the mean duration of general anesthesia was 54 minutes. The study, called the GAS study — for general anesthesia compared to spinal — compared neurodevelopmental outcomes at 5 years of age, and found no significant difference in the children’s performance in the two groups. Dr. Andrew Davidson, a professor in the department of anesthesia at the Royal Children’s Hospital of Melbourne and one of the two lead investigators on the trial, said that this prospective, randomized design allows researchers to avoid many confounding factors that have complicated previous studies, and answer a very specific question. Preliminary data from testing the children at age 2 had shown no significant differences between the groups, and the children were then evaluated at the age of school entry. “If you have an hour of anesthesia as a child, then you are at no greater risk of deficits of cognition at the age of 5,” Dr. Davidson said. “It doesn’t increase the risk of poor neurodevelopmental outcome.” © 2019 The New York Times Company

Keyword: Development of the Brain; Learning & Memory
Link ID: 25972 - Posted: 02.18.2019

Bruce Bower WASHINGTON — Beliefs among some university professors that intelligence is fixed, rather than capable of growth, contribute to a racial achievement gap in STEM courses, a new study suggests. Those professors may subtly communicate stereotypes about blacks, Hispanics and Native Americans allegedly being less intelligent than Asians and whites, say psychologist Elizabeth Canning of Indiana University in Bloomington and her colleagues. In turn, black, Hispanic and Native American undergraduates may respond by becoming less academically motivated and more anxious about their studies, leading to lower grades. Even small dips in STEM grades — especially for students near pass/fail cutoffs — can accumulate across the 15 or more science, technology, engineering and math classes needed to become a physician or an engineer, Canning says. That could jeopardize access to financial aid and acceptance to graduate programs. “Our work suggests that academic benefits could accrue over time if all students, and particularly underrepresented minority students, took STEM classes with faculty who endorse a growth mind-set,” Canning says. Underrepresented minority students’ reactions to professors with fixed or flexible beliefs about intelligence have yet to be studied. But over a two-year period, the disparity in grade point averages separating Asian and white STEM students from black, Hispanic and Native American peers was nearly twice as large in courses taught by professors who regarded intelligence as set in stone, versus malleable, Canning’s team reports online February 15 in Science Advances. |© Society for Science & the Public 2000 - 2019.

Keyword: Attention; Learning & Memory
Link ID: 25970 - Posted: 02.18.2019

By Pallab Ghosh Science correspondent, BBC News, Washington DC New results suggest ageing brains can potentially be rejuvenated, at least in mice, according to researchers. Very early-stage experiments indicate that drugs can be developed to stop or even reverse mental decline. The results were presented at the 2019 meeting of the American Association for the Advancement of Science. The US and Canadian researchers took two new approaches to trying to prevent the loss of memory and cognitive decline that can come with old age. One team, from the University of California, Berkeley, showed MRI scans which indicated that mental decline may be caused by molecules leaking into the brain. Blood vessels in the brain are different from those in other parts of the body. They protect the organ by allowing only nutrients, oxygen and some drugs to flow through into the brain, but block larger, potentially damaging molecules. This is known as the blood-brain barrier. The scans revealed that this barrier becomes increasingly leaky as we get older. For example, 30-40% of people in their 40s have some disruption to their blood-brain barrier, compared with 60% of 60-year-olds. The scans also showed that the brain was inflamed in the leaky areas. Prof Daniela Kaufer, who leads the Berkeley group, said that young mice altered to have leaky blood-brain barriers showed many signs of ageing. She discovered a chemical that stops the damage to the barrier from causing inflammation to the brain. Prof Kaufer told BBC News that not only did the chemical stop the genetically altered young mice from showing signs of ageing, it reversed the signs of ageing in older mice. © 2019 BBC

Keyword: Alzheimers; Learning & Memory
Link ID: 25966 - Posted: 02.15.2019

Ian Sample Science editor An experimental drug that bolsters ailing brain cells has raised hopes of a treatment for memory loss, poor decision making and other mental impairments that often strike in old age. The drug could be taken as a daily pill by over-55s if clinical trials, which are expected to start within two years, show that the medicine is safe and effective at preventing memory lapses. Tests in the lab showed that old animals had far better memory skills half an hour after receiving the drug. After two months on the treatment, brain cells which had shrunk in the animals had grown back, scientists found. Etienne Sibille, at the Centre for Addiction and Mental Health in Toronto, said the treatment was aimed not only at the “normal” cognitive decline that leads to senior moments, but at memory loss and mental impairments that commonly afflict people with depression, schizophrenia and Alzheimer’s disease. If the drug did well in human trials, Sibille said it was possible that “anybody over the age of 55-60 who may be at risk of cognitive problems later on could benefit from this treatment”. “Our findings have direct implications for poor cognition in normal ageing,” he said, with the drug potentially improving learning, memory, decision making and essential life planning. “But we see this deficiency across disorders from depression to schizophrenia and Alzheimer’s.” © 2019 Guardian News & Media Limited

Keyword: Alzheimers; Learning & Memory
Link ID: 25964 - Posted: 02.14.2019

Shawna Williams Watch a bacterium chase down the source of an enticing molecular trail using chemotaxis, and it’s clear that its sensory and navigation abilities are tightly linked. But could the same be true for humans? In 2014, Louisa Dahmani, then a graduate student at McGill University in Montreal, set out to answer that question. After having reviewed the literature on studies of spatial memory and olfaction in people, “I realized that the two functions seemed to rely on similar brain regions,” she explains. “But no one had actually looked at it directly and tested the same sample of participants on an olfaction and on a spatial memory task.” Dahmani, her advisor Véronique Bohbot, and their colleagues set out to rectify that. The group recruited 60 volunteers and tested their ability to identify 40 odors, from menthol to cucumber to lavender. The researchers also had the subjects do a computer-based task in which they moved through a virtual town. After their exploration, the subjects navigated through the virtual town from one of its eight landmarks to a different destination via the shortest route possible. “People who are better at finding their way are also better at identifying smells,” Dahmani says, summing up the study’s biggest takeaway. The scientists also imaged participants’ brains using MRI and found that a larger medial orbitofrontal cortex—a brain region known to be associated, along with the hippocampus, with spatial navigation—correlated with both better smell identification and fewer errors on the navigation task (Nat Comm, 9:4162, 2018). © 1986 - 2019 The Scientist.

Keyword: Chemical Senses (Smell & Taste); Learning & Memory
Link ID: 25963 - Posted: 02.14.2019

By Nicholas Bakalar Chronic inflammation in middle age may lead to memory and thinking problems later in life. Unlike acute inflammation, which arises in response to injury, chronic inflammation persists over months or years. Autoimmune disease, lingering infection, exposure to polluted air, psychological stress and other conditions can all promote chronic inflammation. Researchers did blood tests on 12,336 men and women, average age 57, assigning them an “inflammation composite score” based on white blood cell count, clotting factors and other tests. They also assessed their cognition with standardized tests of memory, processing speed and verbal fluency. The study is in Neurology. After controlling for age, education, blood pressure, cholesterol, heart disease and many other factors, they found that the greater the number of inflammatory factors, the steeper the cognitive decline over 20 years of follow-up. Inflammation was most strongly associated with declines in memory. “We know that dementia starts earlier than the appearance of symptoms,” said the lead author, Keenan A. Walker, a postdoctoral researcher at Johns Hopkins, “and we’ve shown that levels of inflammation matter for dementia risk. Reducing chronic inflammation involves the same health behaviors that we already know are important for other reasons — regular exercise, healthy diet, avoiding excessive weight gain and so on.” © 2019 The New York Times Company

Keyword: Alzheimers; Neuroimmunology
Link ID: 25962 - Posted: 02.14.2019

By Agata Boxe Police officers investigating a crime may hesitate to interview drunk witnesses. But waiting until they sober up may not be the best strategy; people remember more while they are still inebriated than they do a week later, a new study finds. Malin Hildebrand Karlén, a senior psychology lecturer at Sweden’s University of Gothenburg, and her colleagues recruited 136 people and gave half of them vodka mixed with orange juice. The others drank only juice. In 15 minutes women in the alcohol group consumed 0.75 gram of alcohol per kilogram of body weight, and men drank 0.8 gram (that is equivalent to 3.75 glasses of wine for a 70-kilogram woman or four glasses for a man of the same weight, Hildebrand Karlén says). All participants then watched a short film depicting a verbal and physical altercation between a man and a woman. The researchers next asked half the people in each group to freely recall what they remembered from the film. The remaining participants were sent home and interviewed a week later. The investigators found that both the inebriated and sober people who were interviewed immediately demonstrated better recollection of the film events than their drunk or sober counterparts who were questioned later. The effect held even for people with blood alcohol concentrations of 0.08 or higher—the legal limit for driving in most of the U.S. (Intoxication levels varied because different people metabolize alcohol at different speeds.) The results suggest that intoxicated witnesses should be interviewed sooner rather than later, according to the study, which was published online last October in Psychology, Crime & Law. © 2019 Scientific American
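The dosing arithmetic in the study can be checked directly; a minimal sketch, assuming roughly 14 g of pure alcohol per glass of wine (an assumption — the article gives only the per-kilogram doses and the resulting glass counts):

```python
# Check the alcohol-dose arithmetic reported in the study.
# Assumption: one glass of wine contains ~14 g of pure alcohol
# (not stated in the article; inferred so the numbers line up).

GRAMS_PER_GLASS = 14.0

def glasses_of_wine(dose_g_per_kg: float, body_weight_kg: float) -> float:
    """Total alcohol consumed, expressed as glasses of wine."""
    total_grams = dose_g_per_kg * body_weight_kg
    return total_grams / GRAMS_PER_GLASS

# Doses from the study, for a 70-kilogram participant:
print(glasses_of_wine(0.75, 70))  # women: 52.5 g of alcohol -> 3.75 glasses
print(glasses_of_wine(0.8, 70))   # men: 56 g of alcohol -> 4.0 glasses
```

Under that assumption the figures match the article: 0.75 g/kg for a 70 kg woman is 52.5 g, or 3.75 glasses, and 0.8 g/kg for a 70 kg man is 56 g, or four glasses.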

Keyword: Learning & Memory; Drug Abuse
Link ID: 25947 - Posted: 02.11.2019

By Rachel Hartigan Shea When Steve Ramirez was in college, he was fascinated by all kinds of subjects—from Shakespeare to piano, astronauts to medicine. That made choosing a major difficult, so he decided to “cheat,” as he puts it. He would study “the thing that achieved everything that’s ever been achieved”: the brain. After he joined a lab researching the neuroscience of memory, he learned that every experience leaves physical traces throughout the brain. Those are memories, and they can be examined or even altered. “That idea enchanted me,” he says. Now Ramirez leads his own lab at Boston University, and he’s figured out how to suppress bad memories by activating good ones. He and his team genetically engineer brain cells associated with memory in mice to respond to light. Then they create a bad memory—a mild electric shock—and watch the activated cells light up. Deactivating those cells would make the bad memory inaccessible or allow it to be overwritten by a good memory, such as social time with other mice. Ramirez does not propose using this sort of “genetic trickery” to manipulate memories in humans. Instead, his discoveries about memory could inform how patients with post-traumatic stress disorder, anxiety, or depression are treated. “We want to understand how the brain works; we want to understand how memory works,” he says. “It’s like, the more we know how a car works, the better equipped we are to figure out what happens when it breaks down.”

Keyword: Learning & Memory
Link ID: 25944 - Posted: 02.09.2019

By Jacob Appel One of the most upsetting illnesses any psychiatrist encounters is Wernicke-Korsakoff Syndrome (WKS). Caused by a deficiency of thiamine (vitamin B1), it results in devastating impairment of muscle control and memory. Able-bodied men and women develop a severe and irreversible amnesia that wipes clean their pasts and prevents them from forming new memories. Those who survive — and many patients don’t — are often relegated to nursing homes. Yet the neurological damage WKS causes is only part of what makes it so upsetting to emergency room psychiatrists; after all, many neurological and psychiatric illnesses inflict irreversible cognitive harm. The tragedy of WKS is that, with appropriate public health measures, it could be easily prevented. Historically, thiamine deficiency afflicted the indigent, prisoners of war, and societies with rice-based diets. Its most serious chronic manifestation, beriberi, can present in a “wet” form that results in cardiac overload and massive edema, or in a “dry” variant — of which WKS is a subset — that affects the peripheral nervous system, the brain, or both. In addition to amnesia, victims of WKS often display striking degrees of spontaneous confabulation, in which they volunteer personal stories that they believe to be true but are not. Prevalence rates for WKS at autopsy have been found to run as high as 2.8 percent in Australia, and between 0.1 to 2.2 percent in the U.S. Copyright 2019 Undark

Keyword: Drug Abuse; Learning & Memory
Link ID: 25923 - Posted: 02.01.2019

John Bergeron During the first weeks of the new year, resolutions are often accompanied by attempts to learn new behaviours that improve health. We hope that old bad habits will disappear and new healthy habits will become automatic. But how can our brain be reprogrammed to assure that a new health habit can be learned and retained? In 1949, Canadian psychologist Donald Hebb proposed the theory of Hebbian learning to explain how a learning task is transformed into a long-term memory. In this way, healthy habits become automatically retained after their continual repetition. Learning and memory are a consequence of how our brain cells (neurons) communicate with each other. When we learn, neurons communicate through molecular transmissions which hop across synapses producing a memory circuit. Known as long-term potentiation (LTP), the more often a learning task is repeated, the more often transmission continues and the stronger a memory circuit becomes. It is this unique ability of neurons to create and strengthen synaptic connections by repeated activation that leads to Hebbian learning. Understanding the brain requires investigation through different approaches and from a variety of specialities. The field of cognitive neuroscience initially developed through a small number of pioneers. Their experimental designs and observations led to the foundation for how we understand learning and memory today. © 2010–2019, The Conversation US, Inc.
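The Hebbian principle described above — connections strengthen when the neurons on both sides are active together — can be written as a simple weight-update rule. A minimal sketch, with the learning rate and toy activity values chosen purely for illustration:

```python
# Toy sketch of Hebbian learning: a synaptic weight grows when the
# pre- and postsynaptic neurons are active at the same time, so
# repeated co-activation strengthens the connection (a crude
# analogue of long-term potentiation).

def hebbian_update(weight: float, pre: float, post: float,
                   learning_rate: float = 0.1) -> float:
    """Classic Hebbian rule: delta-w = eta * pre * post."""
    return weight + learning_rate * pre * post

weight = 0.0
for _ in range(10):  # repeated co-activation of both neurons
    weight = hebbian_update(weight, pre=1.0, post=1.0)
print(weight)  # the synapse is stronger after each repetition
```

Note the asymmetry built into the product: if either neuron is silent (`pre` or `post` equal to 0), the weight does not change — only paired activity strengthens the circuit, which is why repetition matters for turning a new habit into a durable memory.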

Keyword: Learning & Memory
Link ID: 25890 - Posted: 01.22.2019

By: Brenna Hassinger-Das, Ph.D., and Kathryn Hirsh-Pasek, Ph.D. In 1954, Walt Disney was the first to envision a new form of entertainment that melded traditional fun and education—a form that he dubbed “edutainment.” By the latter part of the 20th century, this form had morphed into educational toys and games, a multi-billion-dollar industry that is projected to capture a full 36 percent of the global toy market share by 2022. Nowhere is this trend more apparent than in the explosion of digital apps: of the 2.2 million apps available in the Apple Store, roughly 176,000—8.5 percent—are loosely designated as “educational.” Their growth continues, with annual increases of 10 percent expected through 2021. Whether called edutainment, educational toys, or the digital learning revolution, this trend shares the implicit philosophy that mixing fun and learning will offer a kind of “brain training” that will enhance children’s thinking and amplify their learning potential. But there are many questions before us. What do manufacturers and marketers mean when they designate a product “educational”? What relevant research in the science of learning has been done? Is there a standard definition of educational value that guides the field? Indeed, a framework we use highlights when toys might sculpt mental muscle and when products are likely to be total imposters. This framework helps us elucidate which educational and digital toys are likely to confer benefits for children.

Keyword: Learning & Memory; Development of the Brain
Link ID: 25877 - Posted: 01.18.2019

By Gretchen Reynolds A hormone that is released during exercise may improve brain health and lessen the damage and memory loss that occur during dementia, a new study finds. The study, which was published this month in Nature Medicine, involved mice, but its findings could help to explain how, at a molecular level, exercise protects our brains and possibly preserves memory and thinking skills, even in people whose pasts are fading. Considerable scientific evidence already demonstrates that exercise remodels brains and affects thinking. Researchers have shown in rats and mice that running ramps up the creation of new brain cells in the hippocampus, a portion of the brain devoted to memory formation and storage. Exercise also can improve the health and function of the synapses between neurons there, allowing brain cells to better communicate. In people, epidemiological research indicates that being physically active reduces the risk for Alzheimer’s disease and other dementias and may also slow disease progression. But many questions remain about just how exercise alters the inner workings of the brain and whether the effects are a result of changes elsewhere in the body that also happen to be good for the brain or whether the changes actually occur within the brain itself. Those issues attracted the attention of an international consortium of scientists, some neuroscientists, others cell biologists, all of whom were focused on preventing, treating and understanding Alzheimer’s disease. Those concerns had brought a hormone called irisin into their sphere of interest. Irisin, first identified in 2012 and named for Iris, the gods’ messenger in Greek mythology, is produced by muscles during exercise. The hormone jump-starts multiple biochemical reactions throughout the body, most of them related to energy metabolism. © 2019 The New York Times Company

Keyword: Alzheimers; Hormones & Behavior
Link ID: 25868 - Posted: 01.16.2019

Marise Parent Of course you know that eating is vital to your survival, but have you ever thought about how your brain controls how much you eat, when you eat and what you eat? This is not a trivial question, because two-thirds of Americans are either overweight or obese and overeating is a major cause of this epidemic. To date, the scientific effort to understand how the brain controls eating has focused primarily on brain areas involved in hunger, fullness and pleasure. To be better armed in the fight against obesity, neuroscientists, including me, are starting to expand our investigation to other parts of the brain associated with different functions. My lab’s recent research focuses on one that’s been relatively overlooked: memory. For many people, decisions about whether to eat now, what to eat and how much to eat are often influenced by memories of what they ate recently. For instance, in addition to my scale and tight clothes, my memory of overeating pizza yesterday played a pivotal role in my decision to eat salad for lunch today. Memories of recently eaten foods can serve as a powerful mechanism for controlling eating behavior because they provide you with a record of your recent intake that likely outlasts most of the hormonal and brain signals generated by your meal. But surprisingly, the brain regions that allow memory to control future eating behavior are largely unknown. Studies done in people support the idea that meal-related memory can control future eating behavior. © 2010–2019, The Conversation US, Inc.

Keyword: Obesity; Learning & Memory
Link ID: 25866 - Posted: 01.15.2019

By Bryan Clark You slide the key into the door and hear a clunk as the tumblers engage. You rotate the key, twist the doorknob and walk inside. The house is familiar, but the contents foreign. At your left, there’s a map of Minnesota, dangling precariously from the wall. You’re certain it wasn’t there this morning. Below it, you find a plush M&M candy. To the right, a dog, a shiba inu you’ve never seen before. In its mouth, a pair of your expensive socks. And then it comes to you, 323-3607, a phone number. If none of this makes sense, stick with us; by the end of this piece you’ll be using the same techniques to memorize just about anything you’ve ever wanted to remember. The “memory athlete” Munkhshur Narmandakh once employed a similar combination of mnemonics to commit more than 6,000 binary digits to memory in just 30 minutes. Alex Mullen, a three-time World Memory Champion, used them to memorize the order of a deck of cards in just 15 seconds, a record at the time. It was later broken by Shijir-Erdene Bat-Enkh, who did it in 12. We’re going to aim lower, applying these strategies to real-world scenarios, like remembering the things we often forget at dinner parties or work-related mixers. At the start of this piece, we employed two mnemonic strategies to remember the seven digits of a phone number. The first, called the “Major System,” was developed in 1648 by historian Johann Winkelmann. In his book “Moonwalking With Einstein,” the author Joshua Foer described this system as a simple cipher that transforms numbers to letters or phonetic sounds. From there we can craft words and, ultimately, images. Some will, no doubt, be crude or enigmatic. Others may contain misspellings and factual errors. It doesn’t matter. This system is designed to create rich imagery, not accurate representations. © 2019 The New York Times Company
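The Major System the article describes is, at bottom, a digit-to-consonant cipher: each digit maps to one or more consonant sounds, vowels are free, and the memorizer dresses the resulting consonant skeleton in vivid words and images. A minimal sketch, using the mapping commonly published in mnemonics guides (the specific word choices are up to the memorizer):

```python
# Sketch of the Major System cipher: digits -> consonant sounds.
# Mapping as commonly given in mnemonics texts (e.g. discussions of
# the system in popular memory books); variants exist.

MAJOR_SYSTEM = {
    "0": "s/z", "1": "t/d",  "2": "n",   "3": "m",   "4": "r",
    "5": "l",   "6": "j/sh", "7": "k/g", "8": "f/v", "9": "p/b",
}

def consonant_skeleton(number: str) -> list[str]:
    """Return the consonant sound for each digit in the number."""
    return [MAJOR_SYSTEM[d] for d in number if d in MAJOR_SYSTEM]

# The phone number from the article's opening scene, 323-3607:
print(consonant_skeleton("3233607"))
# m, n, m, m, j/sh, s/z, k/g -- sounds that the opening images
# (MiNnesota, M&M, SHiba, SocKs) appear designed to encode.
```

Turning the skeleton back into a number is just the cipher run in reverse: recall the images, strip the vowels, and read off the digits.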

Keyword: Learning & Memory
Link ID: 25865 - Posted: 01.15.2019

By Sandra E. Garcia Imagine being held up at gunpoint. Do you trust you could remember the perpetrator’s face? The gun? Or would you have a better recollection of how loud the birds were chirping at that moment? “The memory does not operate like a videotape machine faithfully recording every single detail,” said Richard J. McNally, a professor of psychology at Harvard University and the author of “Remembering Trauma.” “The thing that is happening is that you’re focusing on the most dangerous thing,” he said. “That is the function of fear: to alert you to imminent threats.” Stress can play a role in eyewitness cases of mistaken identity, experts said, and it could be a reason there were such conflicting accounts of the suspects in the shooting death of Jazmine Barnes, the 7-year-old Texas girl who was fired upon in a car with her mother and three sisters on Dec. 30. A gunman pulled up alongside them and opened fire. Jazmine’s mother, LaPorsha Washington, 30, was injured. Ms. Washington and her daughters met with investigators to help them create a composite sketch of the gunman, who attacked them before sunrise. The man was described as white, thin and in his 30s or 40s and driving a red pickup truck. On Sunday, the authorities announced they had charged a 20-year-old black man with capital murder in connection with the shooting. In a CNN interview, Ms. Washington said her teenage daughter told her that the man was white and that his hoodie was black. “That’s all she could see at the time because the sun hadn’t really even came out yet,” Ms. Washington said in the interview. © 2019 The New York Times Company

Keyword: Learning & Memory
Link ID: 25844 - Posted: 01.07.2019

By Paula Span He was a retired factory worker, living with his wife outside a small town in Wales, in the United Kingdom. Once outgoing and sociable, engaged in local activities including a community choir, he’d been jolted by a diagnosis of early dementia. A few months later, at 70, he wouldn’t leave the house alone, fearful that if he needed help, he couldn’t manage to use a cellphone to call his wife. He avoided household chores he’d previously undertaken, such as doing laundry. When his frustrated wife tried to show him how to use the washer, he couldn’t remember her instructions. “He’d lost a lot of confidence,” said Linda Clare, a clinical psychologist at the University of Exeter. “He was actually capable, but he was frightened of making a mistake, getting it wrong.” Dr. Clare directed a recent trial of cognitive rehabilitation in England and Wales in which the patient was enrolled. Cognitive rehabilitation, which Dr. Clare has been researching for 20 years, evolved from methods used to help people with brain injuries. The practice brings occupational and other therapists into the homes of dementia patients to learn which everyday activities they’re struggling with and which abilities they want to preserve or improve upon. Organizing a visit with a friend, perhaps. Keeping track of the day’s appointments and plans. Heating a prepared lunch without burning it. In weekly sessions over several months, the therapists devise individual strategies that can help, at least in the early and moderate stages of the disease. The therapists show patients how to compensate for memory problems and to practice new techniques. Cognitive rehab has its limitations. “We never suggest this can reverse the effects of dementia,” Dr. Clare said. It will not raise participants’ scores on tests of mental ability. © 2019 The New York Times Company

Keyword: Alzheimers; Learning & Memory
Link ID: 25840 - Posted: 01.05.2019

By Amy Harmon It has been more than a decade since James D. Watson, a founder of modern genetics, landed in a kind of professional exile by suggesting that black people are intrinsically less intelligent than whites. In 2007, Dr. Watson, who shared a 1962 Nobel Prize for describing the double-helix structure of DNA, told a British journalist that he was “inherently gloomy about the prospect of Africa” because “all our social policies are based on the fact that their intelligence is the same as ours, whereas all the testing says, not really.” Moreover, he added, although he wished everyone were equal, “people who have to deal with black employees find this not true.” Dr. Watson’s comments reverberated around the world, and he was forced to retire from his job as chancellor of the Cold Spring Harbor Laboratory on Long Island, although he retains an office there. He apologized publicly and “unreservedly,’’ and in later interviews he sometimes suggested that he had been playing the provocateur — his trademark role — or had not understood that his comments would be made public. Ever since, Dr. Watson, 90, has been largely absent from the public eye. His speaking invitations evaporated. In 2014, he became the first living Nobelist to sell his medal, citing a depleted income from having been designated a “nonperson.’’ But his remarks have lingered. They have been invoked to support white supremacist views, and scientists routinely excoriate Dr. Watson when his name surfaces on social media. © 2019 The New York Times Company

Keyword: Genes & Behavior; Intelligence
Link ID: 25831 - Posted: 01.01.2019

By Scott Barry Kaufman In his classic 1923 essay, “Intelligence as the Tests Test It,” Edwin Boring wrote, “Intelligence is what the tests test.” Almost a century of research later, we know that this definition is far too narrow. As long as a test is sufficiently cognitively complex and taps into enough diverse content, you can get a rough snapshot of a person's general cognitive ability, and general cognitive ability predicts a wide range of important outcomes in life, including academic achievement, occupational performance, health, and longevity. But what about happiness? Prior studies have been mixed about this, with some studies showing no relationship between individual IQ and happiness, and other studies showing that those in the lowest IQ range report the lowest levels of happiness compared to those in the highest IQ group. In one study, however, the unhappiness of the lowest IQ range was reduced by 50% once income and mental health issues were taken into account. The authors concluded that "interventions that target modifiable variables such as income (e.g., through enhancing education and employment opportunities) and neurotic symptoms (e.g., through better detection of mental health problems) may improve levels of happiness in the lower IQ groups." One major limitation of these prior studies, however, is that they all rely on a single measure of happiness, notably life satisfaction. Modern day researchers now have measures to assess a much wider array of indicators of well-being, including autonomy, personal growth, positive relationships, self-acceptance, mastery, and purpose and meaning in life.

Keyword: Intelligence; Learning & Memory
Link ID: 25819 - Posted: 12.23.2018

By Ryan Dalton You might be forgiven for having never heard of the NotPetya cyberattack. It didn’t clear out your bank account, or share your social media passwords, or influence an election. But it was one of the most costly and damaging cyberattacks in history, for what it did target: shipping through ports. By the time the engineers at Maersk realized that their computers were infected with a virus, it was too late: worldwide shipping would grind to a halt for days. Imagine a similar situation, in which the target was another port: the synapse, the specialized port of communication between neurons. Much of our ability to learn and remember comes down to the behavior of synapses. What would happen then, if one neuron infected another with malware? Ports and synapses both run on rules, meant to ensure that their cargo can be exchanged not only quickly and reliably, but also adaptably, so that they can quickly adjust to current conditions and demands. This ‘synaptic plasticity’ is fundamental to the ability of animals to learn, and without it we would no more be able to tie our shoes than to remember our own names. Just as shipping rules are determined by treaties and laws, the rules of synaptic plasticity are written into a multitude of genes in our DNA. For example, one gene might be involved in turning up the volume on one side of the synapse, while another gene might ask the other side of the synapse to turn up the gain. Studying the function of these genes has been one of the core approaches to understanding what it is, at the microscopic level, to learn and to remember. © 2018 Scientific American

Keyword: Learning & Memory; Intelligence
Link ID: 25782 - Posted: 12.12.2018

In his enthralling 2009 collection of parables, Sum: Forty Tales from the Afterlives, the neuroscientist David Eagleman describes a world in which a person only truly dies when they are forgotten. After their bodies have crumbled and they leave Earth, all deceased must wait in a lobby and are allowed to pass on only after someone says their name for the last time. “The whole place looks like an infinite airport waiting area,” Eagleman writes. “But the company is terrific.” Most people leave just as their loved ones arrive — for it was only the loved ones who were still remembering. But the truly famous have to hang around for centuries; some, keen to be off, are with an “aching heart waiting for statues to fall”. Eagleman’s tale is an interpretation of what psychologists and social scientists call collective memory. Continued and shared attention to people and events is important because it can help to shape identity — how individuals see themselves as part of a group — and because the choice of what to commemorate, and so remember, influences the structures and priorities of society. This week in Nature Human Behaviour, researchers report a surprising discovery about collective memory: the pattern of its decay follows a mathematical law (C. Candia et al. Nature Hum. Behav.; 2018). The attention we pay to academic papers, films, pop songs and tennis players decays in two distinct stages. In theory, the findings could help those who compete for society’s continued attention — from politicians and companies to environmental campaigners — to find ways to stay in the public eye, or at least in the public’s head. © 2018 Springer Nature Publishing AG
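The two-stage decay the researchers report takes a biexponential form: a fast-decaying "communicative memory" component (word of mouth, current attention) added to a slow-decaying "cultural memory" component (records, archives). A sketch of that shape with illustrative parameters only — the actual rates in Candia et al. are fitted per cultural domain (papers, songs, athletes) and are not reproduced here:

```python
import math

# Sketch of the biexponential decay of collective attention:
# a fast communicative-memory term plus a slow cultural-memory
# term. All parameter values below are illustrative, not fitted.

def attention(t: float, a: float = 0.9, fast: float = 0.5,
              b: float = 0.1, slow: float = 0.01) -> float:
    """S(t) = a*exp(-fast*t) + b*exp(-slow*t)."""
    return a * math.exp(-fast * t) + b * math.exp(-slow * t)

# Early on the fast term dominates and attention plunges; later,
# only the slowly fading cultural-memory term remains.
for t in [0, 5, 20, 100]:
    print(t, round(attention(t), 3))
```

The qualitative signature is a steep initial drop followed by a long, shallow tail — which is why a pop song vanishes from conversation within weeks while a name can linger in archives, and in Eagleman's waiting room, for decades.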

Keyword: Learning & Memory
Link ID: 25780 - Posted: 12.12.2018