Chapter 17. Learning and Memory
Gareth Cook talks to Douwe Draaisma

Much has been written on the wonders of human memory: its astounding feats of recall, the way memories shape our identities and are shaped by them, memory as a literary theme and a historical one. But what of forgetting? This is the topic of a new book by Douwe Draaisma, author of The Nostalgia Factory: Memory, Time and Ageing (Yale University Press, 2013; 176 pages) and a professor of the history of psychology at the University of Groningen in the Netherlands. In Forgetting: Myths, Perils and Compensations (Yale University Press, 2015; 288 pages), Draaisma considers dreaming, amnesia, dementia and all the ways in which our minds—and lives—are shaped by memory’s opposite. He answered questions from contributing editor Gareth Cook.

What is your earliest memory, and why, do you suppose, have you not forgotten it?

Quite a few early memories in the Netherlands involve bicycles; mine is no exception. I was two and a half years old when my aunts walked my mother to the train station. They had taken a bike to transport her bags. I was sitting on the back of the bike. Suddenly the whole procession came to a halt when my foot got caught between the spokes of a wheel. I am pretty sure this memory is accurate because I had to see a doctor, and there is a dated medical record. It is a brief, snapshotlike memory, black-and-white. I do not remember any pain, but I do remember the consternation among my mom and her sisters. Looking back on this memory from a professional perspective, I would say that it has the flashlike character typical of first memories from before age three; “later” first memories are usually a bit longer and more elaborate. © 2015 Scientific American
Keyword: Learning & Memory
Link ID: 21474 - Posted: 10.05.2015
By Lisa Sanders, M.D. On Thursday we challenged Well readers to solve the case of a 27-year-old woman who had vomiting, weakness and confusion months after having weight loss surgery. More than 200 readers offered their perspective on the case. Most of you recognized it as a nutritional deficiency, and nearly half of you totally nailed it. The diagnosis is: Wernicke’s encephalopathy due to thiamine (vitamin B1) deficiency. The very first reader to post a comment, Dr. Adrian Budhram, figured it out. His answer landed on our doorstep just five minutes after the case went up. Dr. Budhram is a second-year neurology resident at Western University in London, Ontario. He says that Wernicke’s is on the list of diseases he thinks about every time someone is brought to the hospital because they are confused. Thiamine, or vitamin B1, is a nutrient essential for the body to break down and use sugars and proteins. It is found in many foods, including beans, brown rice, pork and cereals. Although the body only stores enough of the vitamin to last three to four weeks, deficiencies are rare when a full and varied diet is available. Diseases caused by a thiamine deficiency were described in Chinese medicine as early as 2600 B.C. – well before the vitamin was identified chemically. Western medicine came to know the disease as beriberi – a Sinhalese term meaning weak (apparently from the phrase “I can’t, I can’t”) – characterized by either numbness and weakness in the legs (dry beriberi) or a weakened heart leading to hugely swollen legs (wet beriberi). © 2015 The New York Times Company
Keyword: Learning & Memory
Link ID: 21469 - Posted: 10.03.2015
James Hamblin Mental exercises to build (or rebuild) attention span have shown promise recently as adjuncts or alternatives to amphetamines in addressing symptoms common to Attention Deficit Hyperactivity Disorder (ADHD). Building cognitive control, to be better able to focus on just one thing, or single-task, might involve regular practice with a specialized video game that reinforces "top-down" cognitive modulation, as was the case in a popular paper in Nature last year. Cool but still notional. More insipid but also more clearly critical to addressing what's being called the ADHD epidemic is plain old physical activity. This morning the medical journal Pediatrics published research that found kids who took part in a regular physical activity program showed important enhancement of cognitive performance and brain function. The findings, according to University of Illinois professor Charles Hillman and colleagues, "demonstrate a causal effect of a physical program on executive control, and provide support for physical activity for improving childhood cognition and brain health." If it seems odd that this is something that still needs support, that's because it is odd, yes. Physical activity is clearly a high, high-yield investment for all kids, but especially those attentive or hyperactive. This brand of research is still published and written about as though it were a novel finding, in part because exercise programs for kids remain underfunded and underprioritized in many school curricula, even though exercise is clearly integral to maximizing the utility of time spent in class.
Erin Wayman Priya Rajasethupathy’s research has been called groundbreaking, compelling and beautifully executed. It’s also memorable. Rajasethupathy, a neuroscientist at Stanford University, investigates how the brain remembers. Her work probes the molecular machinery that governs memories. Her most startling — and controversial — finding: Enduring memories may leave lasting marks on DNA. Being a scientist wasn’t her first career choice. Although Rajasethupathy inherited a love of computation from her computer scientist dad, she enrolled in Cornell University as a pre-med student. After graduating in three years, she took a year off to volunteer in India, helping people with mental illness. During that year she also did neuroscience research at the National Centre for Biological Sciences in Bangalore. While there, she began to wonder whether microRNAs, tiny molecules that put protein production on pause, could play a role in regulating memory. She pursued that question as an M.D. and Ph.D. student at Columbia University (while intending, at least initially, to become a physician). She found some answers in the California sea slug (Aplysia californica). In 2009, she and colleagues discovered a microRNA in the slug’s nerve cells that helps orchestrate the formation of memories that linger for at least 24 hours. © Society for Science & the Public 2000 - 2015.
Keyword: Learning & Memory
Link ID: 21434 - Posted: 09.23.2015
Rachel Ehrenberg If not for a broken piece of lab equipment and a college crush, Steve Ramirez might never have gone into neuroscience. As an undergraduate at Boston University his interests were all over the place: He was taking a humanities course and classes in philosophy and biochemistry while working several hours a week in a biology lab. When the lab’s centrifuge, a device that spins liquids, broke, Ramirez had to use one in another lab. “I was trying to make small talk with this girl who was using the centrifuge, ‘What’s your major?’ kind of thing,” Ramirez recalls. Hearing of his myriad interests, the student suggested that Ramirez talk with neuroscientist Paul Lipton. That led to a conversation with Howard Eichenbaum, a leading memory researcher. Eichenbaum told him that everything Ramirez was interested in was about the brain. “Everything from the pyramids to putting a man on the moon, it’s all the product of the human brain, which is kind of crazy when you think about it,” Ramirez says. Studying “the most interdisciplinary organ in existence,” as Ramirez calls it, was a natural fit. While working in Eichenbaum’s lab, Ramirez got turned on to how the brain forms memories. Those explorations led to a Ph.D. program at MIT in the lab of Nobel laureate Susumu Tonegawa, where Ramirez focused on the individual brain cells that hold specific memories. © Society for Science & the Public 2000 - 2015.
By Michael Balter Are some animals smarter than others? It’s hard to say, because you can’t sit a chimpanzee or a mouse down at a table for an IQ test. But a new study, in which scientists tested wild robins on a variety of skills, concludes that they do differ in the kind of “general intelligence” that IQ tests are supposed to measure. General intelligence is usually defined as the ability to do well on multiple cognitive tasks, from math skills to problem solving. For years, researchers have questioned whether measurable differences exist in humans and nonhumans alike. In humans, factors like education and socioeconomic status can affect performance. When it comes to animals, the problem is compounded for two main reasons: First, it is very difficult to design and administer tests that pick up on overall smarts instead of specific skills, such as the keen memories of food-hoarding birds or the fine motor skills of chimpanzees that make tools for finding insects in trees. Second, differences in animal test scores can depend on how motivated the animals are to perform. Because most experiments reward would-be test-takers with food, an empty (or a full) stomach might be all it takes to skew the results. Thus, even studies that suggest variations in intelligence among mice, birds, and apes all carry the caveat that alternative explanations could be at play. To get around some of these limitations, a team led by Rachael Shaw, an animal behavior researcher at Victoria University of Wellington, turned to a population of New Zealand North Island robins for a new round of experiments. The robins live at the Zealandia wildlife sanctuary, a 225-hectare nature paradise in Wellington where more than 700 of the birds live wild and protected from predators in the middle of the city. © 2015 American Association for the Advancement of Science.
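The “general intelligence” factor that studies like this probe is conventionally estimated as the first principal component of a battery of task scores: one axis that captures the variance the tasks share. A minimal sketch of that computation follows; the score matrix and variable names are invented for illustration and are not from the robin study.

```python
import numpy as np

# Hypothetical scores: rows = individual birds, columns = cognitive tasks
# (e.g., a motor task, a discrimination task, a spatial-memory task).
scores = np.array([
    [8.0, 7.5, 9.0],
    [5.0, 5.5, 6.0],
    [9.0, 8.0, 8.5],
    [4.0, 4.5, 5.0],
    [7.0, 6.5, 7.5],
])

# Standardize each task so no single task dominates the variance.
z = (scores - scores.mean(axis=0)) / scores.std(axis=0)

# The first principal component of the standardized scores is a classic
# estimate of a general factor "g".
cov = np.cov(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
g_loadings = eigvecs[:, -1]              # eigenvector with the largest eigenvalue
g_scores = z @ g_loadings                # each bird's position on the g axis

# Fraction of total variance the general factor explains; a high value
# is what "doing well on multiple tasks" looks like statistically.
explained = eigvals[-1] / eigvals.sum()
print(f"general factor explains {explained:.0%} of variance")
```

With these made-up, strongly correlated scores the first component absorbs nearly all the variance; real animal batteries typically yield a much weaker first factor, which is exactly what makes the question contentious.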
We all have our favourite movie moments, ones we love to watch again from time to time. Now it seems chimpanzees and bonobos, too, have the nous to recall thrilling scenes in movies they have previously seen and anticipate when they are about to come up. The results suggest apes can readily recall and anticipate significant recent events, just by watching those events once. Rather than use hidden food as a memory test, Japanese researchers made short movies and showed them to apes on two consecutive days. “We showed a movie instead, and asked whether they remember it when they only watch an event once, and an event totally new to them,” says Fumihiro Kano of Kyoto University in Japan. “Their anticipatory glances told us that they did.”

Plot moment

Kano and his colleague Satoshi Hirata made and starred in two short films. Another of the characters was a human dressed up as an ape in a King Kong costume who carried out attacks on people, providing the key plot moment in the first movie (see video). Both films were designed to contain memorable dramatic events, and the researchers deployed laser eye-tracking technology to see if the animals preferentially noticed and remembered these moments. © Copyright Reed Business Information Ltd.
Keyword: Learning & Memory
Link ID: 21421 - Posted: 09.20.2015
By AMY HARMON Some neuroscientists believe it may be possible, within a century or so, for our minds to continue to function after death — in a computer or some other kind of simulation. Others say it’s theoretically impossible, or impossibly far off in the future. A lot of pieces have to fall into place before we can even begin testing the idea. But new high-tech efforts to understand the brain are also generating methods that make those pieces seem, if not exactly imminent, then at least a bit more plausible. Here’s a look at how close, and far, we are to some requirements for this version of “mind uploading.” The hope of mind uploading rests on the premise that much of the key information about who we are is stored in the unique pattern of connections between our neurons, the cells that carry electrical and chemical signals through living brains. You wouldn't know it from the outside, but there are more of those connections — individually called synapses, collectively known as the connectome — in a cubic centimeter of the human brain than there are stars in the Milky Way galaxy. The basic blueprint is dictated by our genes, but everything we do and experience alters it, creating a physical record of all the things that make us US — our habits, tastes, memories, and so on. It is exceedingly tricky to preserve that pattern of connections in a state where it is both safe from decay and can be verified as intact. But in recent months, two sets of scientists said they had devised separate ways to do that for the brains of smaller mammals. If either is scaled up to work for human brains — still a big if — then theoretically your brain could sit on a shelf or in a freezer for centuries while scientists work on the rest of these steps. © 2015 The New York Times Company
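The synapses-versus-stars comparison above can be checked with back-of-the-envelope arithmetic. All of the figures below are rough, commonly cited estimates rather than numbers from the article:

```python
# Order-of-magnitude check of the synapses-vs-stars claim.
# Every figure here is an approximate, commonly cited estimate.
total_synapses = 1.5e14        # estimated synapses in an adult human brain
brain_volume_cm3 = 1200.0      # approximate adult brain volume
stars_in_milky_way = 2e11      # mid-range estimate (roughly 1e11 to 4e11)

synapses_per_cm3 = total_synapses / brain_volume_cm3
ratio = synapses_per_cm3 / stars_in_milky_way

print(f"{synapses_per_cm3:.1e} synapses per cubic centimeter")
print(f"about {ratio:.1f}x the galaxy's star count")
```

Both quantities land around 10^11, so the comparison holds as an order-of-magnitude statement, with the exact ratio depending heavily on which estimates you pick.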
Mo Costandi In an infamous set of experiments performed in the 1960s, psychologist Walter Mischel sat pre-school kids at a table, one by one, and placed a sweet treat – a small marshmallow, a biscuit, or a pretzel – in front of them. Each of the young participants was told that they would be left alone in the room, and that if they could resist the temptation to eat the sweet on the table in front of them, they would be rewarded with more sweets when the experimenter returned. The so-called Marshmallow Test was designed to test self-control and delayed gratification. Mischel and his colleagues tracked some of the children as they grew up, and then claimed that those who managed to hold out for longer in the original experiment performed better at school, and went on to become more successful in life, than those who couldn’t resist the temptation to eat the treat before the researcher returned to the room. The ability to exercise willpower and inhibit impulsive behaviours is considered to be a core feature of the brain’s executive functions, a set of neural processes – including attention, reasoning, and working memory – which regulate our behaviour and thoughts, and enable us to adapt them according to the changing demands of the task at hand. Executive function is a rather vague term, and we still don’t know much about its underlying brain mechanisms, or about how different components of this control system are related to one another. New research shows that self-control and memory share, and compete with each other for, the same brain mechanisms, such that exercising willpower saps these common resources and impairs our ability to encode memories. © 2015 Guardian News and Media Limited
Aftab Ali People who were born prematurely are less intelligent later on in life and earn less money as a result, according to a new study by the University of Warwick. Researchers at the Coventry-based institution said they found a link which connects pre-term birth with low reading and, in particular, maths skills which affect the amount of wealth accumulated as adults. Funded by the Nuffield Foundation, the researchers examined data from two other large studies, following children born more than a decade apart, with one group from 1958 and the other from 1970. The two studies, each of which recruited all children born in a single week in England, Scotland and Wales, surveyed more than 15,000 individuals in total. Data were examined for all individuals who were born at between 28 and 42 weeks gestational age, and who had available wealth information at the age of 42. Those participants who were born pre-term – at less than 37 weeks – were compared with those who were born full-term; both groups’ mathematical ability in childhood was found to have a direct effect on how much they earned as adults, regardless of later educational qualifications. In order to measure adult wealth, the researchers looked at factors including family income and social class, housing and employment status, and the participants’ own perceptions of their financial situation. As for academic abilities, they examined validated measures of mathematics, reading and intelligence, along with ratings from teachers and parents. © independent.co.uk
By Gretchen Reynolds At the age of 93, Olga Kotelko — one of the most successful and acclaimed nonagenarian track-and-field athletes in history — traveled to the University of Illinois to let scientists study her brain. Ms. Kotelko held a number of world records and had won hundreds of gold medals in masters events. But she was of particular interest to the scientific community because she hadn’t begun serious athletic training until age 77. So scanning her brain could potentially show scientists what late-life exercise might do for brains. Ms. Kotelko died last year at the age of 95, but the results of that summer brain scan were published last month in Neurocase. And indeed, Ms. Kotelko’s brain looked quite different from those of other volunteers aged 90-plus who participated in the study, the scans showed. The white matter of her brain — the cells that connect neurons and help to transmit messages from one part of the brain to another — showed fewer abnormalities than the brains of other people her age. And her hippocampus, a portion of the brain involved in memory, was larger than that of similarly aged volunteers (although it was somewhat shrunken in comparison to the brains of volunteers decades younger than her). Over all, her brain seemed younger than her age. But because the scientists didn’t have a scan showing Ms. Kotelko’s brain before she began training, it’s impossible to know whether becoming an athlete late in life improved her brain’s health or whether her naturally healthy brain allowed her to become a stellar masters athlete. © 2015 The New York Times Company
By SINDYA N. BHANOO The human eye has a blind spot, though few of us realize it. Now, a new study suggests that it is possible to reduce the spot with training. The optic nerve, which carries visual signals to the brain, passes through the retina, a light-sensitive layer of tissue. There are no light-detecting cells, called photoreceptors, at the point where the optic nerve intersects the retina. The right eye generally compensates for the left eye’s blind spot and vice versa, so the spot is hardly noticed. Researchers trained 10 people using a computer monitor and an eye patch. The participants were shown a waveform in the visual field of their blind spot day after day. After 20 days of this repeated stimulation, the blind spot shrank by about 10 percent. The researchers believe that neurons at the periphery of the blind spot became more responsive, effectively reducing the extent of functional blindness. The findings add to a growing body of research suggesting that the human eye can be trained, said Paul Miller, a psychologist at the University of Queensland in Australia and an author of the study, which appeared in the journal Current Biology. This kind of training may help researchers develop better treatments for visual impairments like macular degeneration. “This is the leading cause of blindness in the western world,” Mr. Miller said. © 2015 The New York Times Company
We all have days when we feel like our brain is going at a snail’s pace, when our neurons forgot to get out of bed. And psychologists have shown that IQ can fluctuate day to day. So if we’re in good health and don’t have a sleep deficit from last night’s shenanigans to blame, what’s the explanation? Sophie von Stumm, a psychologist at Goldsmiths, University of London, set about finding out. In particular, she wanted to know whether mood might explain the brain’s dimmer switch. Although it seems intuitively obvious that feeling low could compromise intellectual performance, von Stumm says research to date has been inconclusive, with some studies finding an effect and others not. “On bad mood days, we tend to feel that our brains are lame and work or study is particularly challenging. But scientists still don’t really know if our brains work better when we are happy compared to when we are sad.” To see if she could pin down mood’s effect on IQ more convincingly, von Stumm recruited 98 participants. Over five consecutive days they completed questionnaires to assess their mood, as well as tests to measure cognitive functions, such as short-term memory, working memory and processing speed. Surprisingly, being in a bad mood didn’t translate into worse cognitive performance. However, when people reported feeling positive, von Stumm saw a modest boost in their processing speed. © Copyright Reed Business Information Ltd.
By Laura Sanders By tweaking a single gene, scientists have turned average mice into supersmart daredevils. The findings are preliminary but hint at therapies that may one day ease the symptoms of such disorders as Alzheimer’s disease and schizophrenia, scientists report August 14 in Neuropsychopharmacology. The altered gene provides instructions for a protein called phosphodiesterase-4B, or PDE4B, which has been implicated in schizophrenia. It’s too early to say whether PDE4B will turn out to be a useful target for drugs that treat these disorders, cautions pharmacologist Ernesto Fedele of the University of Genoa in Italy. Nonetheless, the protein certainly deserves further investigation, he says. The genetic change interfered with PDE4B’s ability to do its job breaking down a molecular messenger called cAMP. Mice designed to have this disabled form of PDE4B showed a suite of curious behaviors, including signs of smarts, says study coauthor Alexander McGirr of the University of British Columbia. Compared with normal mice, these mice more quickly learned which objects in a cage had been moved to a new location, for instance, and could better recognize a familiar mouse after 24 hours. “The system is primed and ready to learn, and it doesn’t require the same kind of input as a normal mouse,” McGirr says. These mice also spent more time than usual exploring brightly lit spaces, spots that normal mice avoid. But this devil-may-care attitude sometimes made the “smart” mice blind to risky situations. The mice were happy to spend time poking around an area that had been sprinkled with bobcat urine. “Not being afraid of cat urine is not a good thing for a mouse,” McGirr says. © Society for Science & the Public 2000 - 2015
By Zoe Kleinman Technology reporter, BBC News More than 200 academics have signed an open letter criticising controversial new research suggesting a link between violent video games and aggression. The findings were released by the American Psychological Association, which set up a taskforce that reviewed hundreds of studies and papers published between 2005 and 2013. The association concluded that while there was "no single risk factor" to blame for aggression, violent video games did contribute. "The research demonstrates a consistent relation between violent video game use and increases in aggressive behaviour, aggressive cognitions and aggressive affect, and decreases in pro-social behaviour, empathy and sensitivity to aggression," said the report. "It is the accumulation of risk factors that tends to lead to aggressive or violent behaviour. The research reviewed here demonstrates that violent video game use is one such risk factor." However, a large group of academics said they felt the methodology of the research was deeply flawed, as a significant part of the material included in the study had not been subjected to peer review. "I fully acknowledge that exposure to repeated violence may have short-term effects - you would be a fool to deny that - but the long-term consequences of crime and actual violent behaviour, there is just no evidence linking violent video games with that," Dr Mark Coulson, associate professor of psychology at Middlesex University and one of the signatories of the letter, told the BBC. "If you play three hours of Call of Duty you might feel a little bit pumped, but you are not going to go out and mug someone." © 2015 BBC
Link ID: 21310 - Posted: 08.19.2015
By Kate Kelland LONDON (Reuters) - Scientists have genetically modified mice to be super-intelligent and found they are also less anxious, a discovery that may help the search for treatments for disorders such as Alzheimer's, schizophrenia and post traumatic stress disorder (PTSD). Researchers from Britain and Canada found that altering a single gene to block the phosphodiesterase-4B (PDE4B) enzyme, which is found in many organs including the brain, made mice cleverer and at the same time less fearful. "Our work using mice has identified phosphodiesterase-4B as a promising target for potential new treatments," said Steve Clapcote, a lecturer in pharmacology at Britain's Leeds University, who led the study. He said his team is now working on developing drugs that will specifically inhibit PDE4B. The drugs will be tested first in animals to see whether any of them might be suitable to go forward into clinical trials in humans. In the experiments, published on Friday in the journal Neuropsychopharmacology, the scientists ran a series of behavioral tests on the PDE4B-inhibited mice and found they tended to learn faster, remember events longer and solve complex problems better than normal mice. The "brainy" mice were better at recognizing a mouse they had seen the previous day, the researchers said, and were also quicker at learning the location of a hidden escape platform.
Alison Abbott The octopus genome offers clues to how cephalopods evolved intelligence to rival the craftiest vertebrates. With its eight prehensile arms lined with suckers, camera-like eyes, elaborate repertoire of camouflage tricks and spooky intelligence, the octopus is like no other creature on Earth. Added to those distinctions is an unusually large genome, described in Nature on 12 August, that helps to explain how a mere mollusc evolved into an otherworldly being. “It’s the first sequenced genome from something like an alien,” jokes neurobiologist Clifton Ragsdale of the University of Chicago in Illinois, who co-led the genetic analysis of the California two-spot octopus (Octopus bimaculoides). The work was carried out by researchers from the University of Chicago, the University of California, Berkeley, the University of Heidelberg in Germany and the Okinawa Institute of Science and Technology in Japan. The scientists also investigated gene expression in twelve different types of octopus tissue. “It’s important for us to know the genome, because it gives us insights into how the sophisticated cognitive skills of octopuses evolved,” says neurobiologist Benny Hochner at the Hebrew University of Jerusalem in Israel, who has studied octopus neurophysiology for 20 years. Researchers want to understand how the cephalopods, a class of free-floating molluscs, produced a creature that is clever enough to navigate highly complex mazes and open jars filled with tasty crabs. © 2015 Nature Publishing Group
Ashley Yeager A mouse scurries across a round table rimmed with Dixie cup–sized holes. Without much hesitation, the rodent heads straight for the hole that drops it into a box lined with cage litter. Any other hole would have led to a quick fall to the floor. But this mouse was more than lucky. It had an advantage — human glial cells were growing in its brain. Glia are thought of as the support staff for the brain’s nerve cells, or neurons, which transmit and receive the brain’s electrical and chemical signals. Named for the Greek term for “glue,” glia have been known for nearly 170 years as the cells that hold the brain’s bits together. Some glial cells help feed neurons. Other glia insulate nerve cell branches with myelin. Still others attack brain invaders responsible for infection or injury. Glial cells perform many of the brain’s most important maintenance jobs. But recent studies suggest they do a lot more. Glia can shape the conversation between neurons, speeding or slowing the electrical signals and strengthening neuron-to-neuron connections. When scientists coaxed human glia to grow in the brains of baby mice, the mice grew up to be supersmart, navigating tabletops full of holes and mastering other tasks much faster than normal mice. This experiment and others suggest that glia may actually orchestrate learning and memory, says neuroscientist R. Douglas Fields. “Glia aren’t doing vibrato. That’s for the neurons,” says Fields, of the National Institute of Child Health and Human Development in Bethesda, Md. “Glia are the conductors.” © Society for Science & the Public 2000 - 2015
Could taking iodine pills in pregnancy help to raise children’s IQ? Some researchers suggest women in the UK should take such supplements, but others say the evidence is unclear, and that it could even harm development. Iodine is found in dairy foods and fish, and is used in the body to make thyroid hormone, which is vital for brain development in the womb. In some parts of the world, such as inland areas where little fish is consumed or the soil is low in iodine, severe deficiencies can markedly lower intelligence in some people. In most affected areas, iodine is now added to salt. The UK was not thought to need this step, but in 2013 a large study of urine samples from pregnant women found that about two-thirds had mild iodine deficiency, and that the children of those with the lowest levels had the lowest IQs. Now another team has combined data from this study with other data to calculate that if all women in the UK were given iodine supplements from three months before pregnancy until they finished breastfeeding, average IQ would increase by 1.2 points per child. And the children of mothers who were most iodine deficient would probably benefit more, says Kate Jolly of the University of Birmingham, who was involved in the study. “We are talking about very small differences but on a population basis it could mean quite a lot,” she says. The team calculated that providing these iodine supplements would be worth the cost to the UK’s National Health Service because it would boost the country’s productivity. © Copyright Reed Business Information Ltd.
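The cost-benefit reasoning described above – a small 1.2-point average IQ gain that accumulates across a whole birth cohort – can be sketched with simple arithmetic. Only the per-child IQ gain below comes from the reported study; the birth cohort size, monetary value of an IQ point, and supplement cost are illustrative placeholders, not the researchers' figures:

```python
# Illustrative population-level calculation. Apart from the 1.2-point IQ
# gain reported in the study, every number is a placeholder assumption.
births_per_year = 700_000             # assumed rough UK annual birth cohort
iq_gain_per_child = 1.2               # average gain reported in the study
value_per_iq_point = 3_000.0          # assumed lifetime productivity value (GBP)
cost_per_pregnancy = 25.0             # assumed cost of iodine supplementation (GBP)

total_benefit = births_per_year * iq_gain_per_child * value_per_iq_point
total_cost = births_per_year * cost_per_pregnancy
net_benefit = total_benefit - total_cost

print(f"benefit £{total_benefit:,.0f}, cost £{total_cost:,.0f}, net £{net_benefit:,.0f}")
```

The point of the sketch is structural rather than numerical: because the gain applies to every birth while the cost per pregnancy is small, even a modest per-child effect can dominate the cost side, which is the shape of the argument the researchers make to the NHS.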
April Dembosky Developers of a new video game for your brain say theirs is more than just another get-smarter-quick scheme. Akili, a Northern California startup, insists on taking the game through a full battery of clinical trials so it can get approval from the Food and Drug Administration — a process that will take lots of money and several years. So why would a game designer go to all that trouble when there's already a robust market of consumers ready to buy games that claim to make you smarter and improve your memory? Think about all the ads you've heard for brain games. Maybe you've even passed a store selling them. There's one at the mall in downtown San Francisco — just past the cream puff stand and across from Jamba Juice — staffed on my visit by a guy named Dominic Firpo. "I'm a brain coach here at Marbles: The Brain Store," he says. Brain coach? "Sounds better than sales person," Firpo explains. "We have to learn all 200 games in here and become great sales people so we can help enrich peoples' minds." He heads to the "Word and Memory" section of the store and points to one product that says it will improve your focus and reduce stress in just three minutes a day. "We sold out of it within the first month of when we got it," Firpo says. The market for these "brain fitness" games is worth about $1 billion and is expected to grow to $6 billion in the next five years. Game makers appeal to both the young and the older with the common claim that if you exercise your memory, you'll be able to think faster and be less forgetful. Maybe bump up your IQ a few points. "That's absurd," says psychology professor Randall Engle from the Georgia Institute of Technology. © 2015 NPR