Chapter 17. Learning and Memory
Drugs to treat Alzheimer's disease don't help patients with mild cognitive impairment and are linked to a greater risk of harm, a Canadian review concludes. People with mild cognitive impairment show symptoms of memory problems that are not severe enough to be considered dementia or to interfere with day-to-day functioning. Each year, three to 17 per cent of people with mild cognitive impairment progress to dementia, research suggests. It was hoped that "cognitive enhancers" used to treat dementia might delay that progression. Dr. Sharon Straus of the department of geriatric medicine at the University of Toronto and her team reviewed clinical trials and reports on the effects of four cognitive enhancers. "Cognitive enhancers did not improve cognition or function among patients with mild cognitive impairment and were associated with a greater risk of gastrointestinal harms," the reviewers concluded in Monday's issue of the Canadian Medical Association Journal. "Our findings do not support the use of cognitive enhancers for mild cognitive impairment." The medications act on different neurotransmitters in the brain, such as acetylcholine. © CBC 2013
By Melanie Tannenbaum I can remember exactly where I was twelve years ago when I learned why the sky was starting to fill with smoke about 30 miles to the west. Though I live in Illinois now, I’m originally from Long Island. In September 2001, I was just beginning the 9th grade at Friends Academy, my new high school in Locust Valley. I had just started getting to know the people who would become my closest friends over the next four years. I was on my way to Computer Programming when I ran into Molly, a girl on my bus. “Hey, did you hear?” Molly asked, somewhat casually. “No, what’s up? Oh, is Maggie taking the bus today?!” I asked excitedly. Maggie was Molly’s adorable baby sister, whose expeditions onto our bus were rare (but exciting) events. “No…apparently something really big just happened in the city. They’re canceling class right now and calling an all-school assembly in the Dolan Center. You didn’t hear?” “Oh, no, but thank God. I didn’t finish my math homework last night and I didn’t have time to do it on the bus, this is awesome,” I said with a smile. “Do you have any idea why they’re canceling class, though?!” I had no idea at the time how much I would cringe for the rest of my life whenever I looked back and thought about my first reaction to hearing that “something big” was going on in the city. © 2013 Scientific American
By Dwayne Godwin and Jorge Cham [comic strip; not reproduced here] Dwayne Godwin is a neuroscientist at the Wake Forest University School of Medicine. Jorge Cham draws the comic strip Piled Higher and Deeper at www.phdcomics.com. © 2013 Scientific American
By RONI JACOBSON We have seven deadly sins, seven days of the week, seven seas, seven dwarfs. The recurrence of the number seven so impressed the cognitive psychologist George A. Miller that, in an oft-cited paper in 1956, he wrote, “My problem is that I have been persecuted by an integer.” Miller went on to describe several experiments where seven pieces of information — plus or minus two — appeared to be the limit of what our minds could retain in the short term. Since then, Miller’s theory — that our short-term memory can hold about seven items before we start to forget them — has been refined. It is now understood that the capacity of short-term memory depends on several factors, including age, attention and the type of information presented. For instance, long words like “onomatopoeia” and “reciprocate” take up more memory span than short words like “cat” and “ball.” Grouping smaller bits of information into a meaningful unit, like a word of many syllables or an abstract concept, is called “chunking,” and our ability to retain information decreases as the chunk becomes more complex. Psychologists now believe that we can recall about four chunks of information at a time, which works out to approximately six letters, five one-syllable words and seven digits. As for the ubiquity of the number seven, Miller came to suspect that that is just a coincidence. © 2013 The New York Times Company
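The arithmetic behind chunking is easy to demonstrate. Here is a minimal Python sketch (my own illustration, not from the article or Miller's work) showing how regrouping ten raw digits into three-digit chunks brings the item count down to roughly the four chunks psychologists describe:

```python
# Illustrative sketch of "chunking": grouping raw items into larger
# meaningful units so they occupy fewer short-term memory "slots".

def chunk(sequence, size):
    """Group a flat sequence into consecutive chunks of the given size."""
    return [sequence[i:i + size] for i in range(0, len(sequence), size)]

digits = "8675309424"        # ten raw digits: well past the ~4-chunk limit
chunks = chunk(digits, 3)    # -> ['867', '530', '942', '4']

print(f"{len(digits)} raw items vs. {len(chunks)} chunks: {chunks}")
# Ten separate digits exceed the limit; four grouped chunks fit within it.
```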
by Jon White Ever tried beetroot custard? Probably not, but your brain can imagine how it might taste by reactivating old memories in a new pattern. Helen Barron and her colleagues at University College London and Oxford University wondered if our brains combine existing memories to help us decide whether to try something new. So the team used an fMRI scanner to look at the brains of 19 volunteers who were asked to remember specific foods they had tried. Each volunteer was then given a menu of 13 unusual food combinations – including beetroot custard, tea jelly, and coffee yoghurt – and asked to imagine how good or bad they would taste, and whether or not they would eat them. "Tea jelly was popular," says Barron. "Beetroot custard not so much." When each volunteer imagined a new combination, they showed brain activity associated with each of the known ingredients at the same time. It is the first evidence to suggest that we use memory combination to make decisions, says Barron. Journal reference: Nature Neuroscience, doi: 10.1038/nn.3515 © Copyright Reed Business Information Ltd.
Kelly Servick If keeping the brain spry were as simple as pumping iron, everyone would want to own the ultimate piece of cognitive exercise equipment. But designing activities to reverse the mental effects of aging is tricky. A new video game created by neuroscientists shows promise in reversing some signs of decline. Now, the researchers behind it aim to prove that video game training can be more than the latest workout craze. Games designed to keep the brain healthy as it ages have found an eager audience. “Many, many people have gotten into the business,” says neuropsychologist Glenn Smith of the Mayo Clinic in Rochester, Minnesota. The brain does appear to be capable of changing its structure and developing new skills over the course of a lifetime. But not all the products on the market are designed using scientific knowledge of the aging brain, and their ability to make meaningful, lasting changes hasn’t been proven, says Smith, who studies games as treatment for early signs of dementia. “There’s an awful lot of skepticism out there,” he says. The heart of the issue is whether practicing a video game can strengthen skills that are useful away from a computer. Early research showed that people could improve on computerized memory and speed tasks in the lab, Smith says. But it’s not clear whether these gains translate to everyday life. A recent trend puts more value in games that target the underlying problem—the decline in ability to remember and react as people age. © 2012 American Association for the Advancement of Science.
by Jennifer Viegas Goldfish not only listen to music, but they also can distinguish one composer from another, a new study finds. The paper adds to the growing body of evidence that many different animals understand music. For the study, published in the journal Behavioural Processes, Kazutaka Shinozuka and his colleagues Haruka Ono and Shigeru Watanabe played two pieces of classical music near goldfish in a tank. The pieces were Toccata and Fugue in D minor by Johann Sebastian Bach and The Rite of Spring by Igor Stravinsky. The scientists trained the fish to gnaw on a little bead hanging on a filament in the water. Half of the fish were trained with food to gnaw whenever Bach played and the other half were taught to gnaw whenever Stravinsky played. The goldfish aced the test, easily distinguishing the two composers and getting a belly full of food in the process. The fish were more interested in the vittles than the music, but earlier studies on pigeons and songbirds suggest that Bach is the preferred choice, at least for birds. “These pieces can be classified as classical (Bach) and modern (Stravinsky) music,” Shinozuka explained. “Previously we demonstrated that Java sparrows preferred classical over modern music. Also, we demonstrated Java sparrows could discriminate between consonance and dissonance.” © 2013 Discovery Communications, LLC.
By TOM FIELDS-MEYER I was looking in my closet, choosing a shirt, when I lost my mind. Four hours later, I’m in the E.R., and I don’t know how I got here. My wife, Shawn, stands at my bedside, her expression alternating between reassuring and dismayed. Next to her, a doctor in his mid-50s calmly tells me he’s going to name three objects. “I want you to hold these in your mind,” he says. “Apple, table, penny.” I nod, noticing a semicircle of young interns behind him, listening intently. Then the doctor asks me to multiply 17 times 3. “I’m not very good at math,” I say. He waits. “Let’s see. Twenty times 3 is 60, minus 6.” I pause, correcting myself. “No, minus 9. Fifty-one?” “Good.” He smiles. “Now, what were those three objects I named?” I can’t recall the objects. I barely remember that he listed them. Flustered, I purse my lips and slowly shake my head, looking at Shawn. She fills in the blanks for me: I woke up, took a shower, and when I stepped out, I seemed disoriented. I sat down on the bed. “Wait, remind me, what are we doing today?” I asked her. “Do I need to remind you again? We’re having lunch at the Swerdlows’.” I didn’t remember that. I put a hand on my forehead, then lay on my back. “What day is it?” I asked her. Concerned by my blank stare, Shawn shot me questions: Do you know who came over last night? (I didn’t.) Do you remember what we argued about yesterday morning? (I couldn’t.) © 2013 The New York Times Company
Alison Abbott Like humans, Drosophila fruitflies become forgetful with age. But at least their memory deficits can be reversed by eating a diet rich in polyamines, according to a study published online today in Nature Neuroscience. “There’s a great need for cognitive enhancers to keep us healthy into old age — now polyamines are offering a new approach,” says learning and memory specialist Ronald Davis at the Scripps Research Institute Florida in Jupiter, who was not involved in the study. “There are reasons for optimism that this fly work will translate into human.” Polyamines — which include the graphically named putrescine, cadaverine and spermidine — are small molecules that are essential for cells to survive and grow. But their cellular levels decline with age. Some foods that are popularly considered to have health benefits — such as wheatgerm and fermented soya beans — contain high levels of polyamines. Japanese scientists have shown that natto, a fermented soya-bean product, raises the level of polyamines in the blood in humans. But there is a long way to go before anyone can say that polyamines can help to stave off memory decline in ageing people, cautions Stephan Sigrist of the Free University of Berlin, one of the study's principal investigators. “Still, the polyamine system does offer a new target for those interested in developing therapies.” © 2013 Nature Publishing Group
by Bob Holmes It's the cruel cycle of poverty. The many challenges that come with being poor can sap people's ability to think clearly, according to a new study. The findings suggest that governments should think twice before tying up social-assistance programmes in confusing red tape. Sociologists have long known that poor people are less likely to take medications, keep appointments, or be attentive parents. "Poor people make poorer decisions. They do. The question is why," says Timothy Smeeding, director of the Institute for Research on Poverty at the University of Wisconsin-Madison. But does bad decision-making help cause poverty, or does poverty interfere with decision-making? To explore this question, psychologist Eldar Shafir at Princeton University and his colleagues took advantage of a natural experiment. Small-scale sugar-cane farmers in Tamil Nadu in southern India receive most of their year's income all at once, shortly after the annual harvest. As a result, the same farmer can be poor before harvest and relatively rich after. And indeed, Shafir's team found that farmers had more loans, pawned more belongings, and reported more difficulty paying bills before the harvest than after. The researchers visited 464 farmers in 54 villages both before and after harvest. At each visit, they gave the farmers two tests of their cognitive ability: a multiple-choice pattern-matching test, and one in which they had to declare the number of digits shown rather than their value: seeing "5 5 5" but saying "three", for example. © Copyright Reed Business Information Ltd.
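To make the second test concrete, here is a small Python sketch of a digit-counting trial; the trial format and number ranges are my own guesses for illustration, not the researchers' actual materials:

```python
import random

def make_trial(rng):
    """Build one counting trial: the correct response is HOW MANY digits
    appear (e.g. "5 5 5" -> 3), not the digit's value. Resisting the pull
    of the value is what makes the task a measure of cognitive control.
    (Illustrative only; not the study's real test items.)"""
    digit = rng.choice("123456789")
    count = rng.randint(1, 5)
    stimulus = " ".join([digit] * count)
    return stimulus, count

rng = random.Random(42)
stimulus, answer = make_trial(rng)
print(f"Stimulus: {stimulus!r}  Correct response: {answer}")
```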
By Susan Milius Here’s a lesson on road trips from whooping cranes: For efficient migration, what matters is the age of the oldest crane in the group. These more experienced fliers nudge youngsters away from going off course on long flights. “The older birds get, the closer they stick to the straight line,” says ecologist Thomas Mueller of the University of Maryland in College Park, who crunched data from 73 Grus americana migrating between Wisconsin and Florida. One-year-olds traveling with other birds of the same age, the analysis says, tend to deviate about 76 kilometers from a direct route. But if they fly in a group with an 8-year-old crane, they stray 38 percent less, or about 47 kilometers, Mueller and his colleagues report in the August 30 Science. Eight years of data on these endangered cranes summering in Wisconsin’s Necedah National Wildlife Refuge offered a rare chance to parse how birds find their way. Conservationists have been rebuilding this eastern migratory population of the once widespread birds. Researchers release captive-bred cranes in Wisconsin and lead each class of newbies, just once, with an ultralight aircraft to Florida’s Chassahowitzka National Wildlife Refuge for the winter. Cranes navigate back to Wisconsin on their own. © Society for Science & the Public 2000 - 2013
Amanda Mascarelli It’s an inconvenient truth of aging: In our 30s and up, it gets increasingly harder for most of us to recall names, faces, and details from the past. Scientists have long debated whether this gradual decline is an early form of Alzheimer’s disease—a neurodegenerative condition that leads to severe dementia—or a distinct neurological process. Now, researchers have found a protein that distinguishes typical forgetfulness from Alzheimer’s and could lead to potential treatments for age-related memory loss. Previous studies have shown that Alzheimer’s disease and age-related memory loss involve different neural circuits in the hippocampus, a seahorse-shaped structure in the brain where memories are formed and organized. The hallmark signs of Alzheimer’s disease are well established—tangled proteins and plaques accumulate over time, and brain tissue atrophies. But little is known about what occurs when memory declines during normal aging, except that brain cells begin to malfunction, says Scott Small, a neurologist at Columbia University and senior author of the study. “At the molecular level, there’s been a lot of uncertainty about what is actually going wrong, and that’s what this paper isolates.” To tease apart the biological processes involved in memory loss in normal aging, Small and other researchers from Columbia University in New York examined postmortem brain tissue from eight healthy people ranging in age from 33 to 86. They looked for differences in gene expression—the proteins or other products that a gene makes—between younger and older people. They also looked for age-related changes in the brains of mice. © 2012 American Association for the Advancement of Science
By Clayton Aldern We’ve been here before. Two or three times a year, a team of neuroscientists comes along and tightropes over the chasm that is dystopian research. Across the valley lies some pinnacle of human achievement; below flows the dirty, coursing river of mind control and government-sponsored brainwashing and all things Nineteen Eighty-Four. Cliffside, maybe clutching our tinfoil caps, we bite our nails and try to keep our faith in the scientists. This time is no different. On July 26, a research team took its first step onto the tightrope. Working under Nobel laureate Susumu Tonegawa, the MIT group reported that they had created a false memory in the brain of a mouse. “Our data,” wrote the authors in Science, “demonstrate that it is possible to generate an internally represented and behaviorally expressed fear memory via artificial means.” While the sterility reserved for scientific research abstracts tends to diffuse the élan of the work, the gravity here is apparent. Which brings us to the cliff and the chasm. That devil-klaxon of a sound effect from Inception always seems appropriate for heralding reports with sci-fi undertones. In the case of the closest thing we have to an actual inception, it seems particularly apt. But the group’s work is not Inception per se, and it’s certainly not Total Recall. That’s not to say it isn’t unnerving. It’s also not to say the study isn’t remarkable. More than anything, the Science paper’s publication is a reminder that neuroscience is inching over some dangerous ethical waters, and from here, it is important to tread carefully. © 2013 Scientific American
By VASILIS K. POZIOS, PRAVEEN R. KAMBAM and H. ERIC BENDER EARLIER this summer the actor Jim Carrey, a star of the new superhero movie “Kick-Ass 2,” tweeted that he was distancing himself from the film because, in the wake of the Sandy Hook massacre, “in all good conscience I cannot support” the movie’s extensive and graphically violent scenes. Mark Millar, a creator of the “Kick-Ass” comic book series and one of the movie’s executive producers, responded that he has “never quite bought the notion that violence in fiction leads to violence in real life any more than Harry Potter casting a spell creates more boy wizards in real life.” While Mr. Carrey’s point of view has its adherents, most people reflexively agree with Mr. Millar. After all, the logic goes, millions of Americans see violent imagery in films and on TV every day, but vanishingly few become killers. But a growing body of research indicates that this reasoning may be off base. Exposure to violent imagery does not preordain violence, but it is a risk factor. We would never say: “I’ve smoked cigarettes for a long time, and I don’t have lung cancer. Therefore there’s no link between smoking cigarettes and lung cancer.” So why use such flawed reasoning when it comes to media violence? There is now consensus that exposure to media violence is linked to actual violent behavior — a link found by many scholars to be on par with the correlation of exposure to secondhand smoke and the risk of lung cancer. In a meta-analysis of 217 studies published between 1957 and 1990, the psychologists George Comstock and Haejung Paik found that the short-term effect of exposure to media violence on actual physical violence against a person was moderate to large in strength. Mr. Comstock and Ms. Paik also conducted a meta-analysis of studies that looked at the correlation between habitual viewing of violent media and aggressive behavior at a point in time. They found 200 studies showing a moderate, positive relationship between watching television violence and physical aggression against another person. © 2013 The New York Times Company
By Geoffrey Mohan If you can’t quite get that nine-note treble opening to "Für Elise," just sleep on it. The brain will rehearse, reorganize and nail the sequential motor tasks that help you play piano or type on a keyboard. How that consolidation of memory happens has remained largely a mystery, despite telling evidence that the brain’s motor cortex appears to be quite busy during sleep. Now, a team led by Brown University neuroscientists believes it has found the source of the sleeping piano lesson, and it’s not where many expected it to be. Neuroscience has been fixated since its founding on why the brain “needs” that peculiar mix of dormancy and random activity known as sleep. And it has equally wondered why we emerge from it better able to do things. Slowly, evidence accrued that we were “learning” during sleep -- consolidating memory in ways that would make waking tasks more successful. It seemed deepest sleep, not the familiar rapid-eye-movement type, had the most effect on our brain’s ability to reorganize and prepare to perform better in waking hours. “It has been very difficult to measure brain activation during sleep,” said Brown University neuroscientist Masako Tamaki, lead author of the study published online Tuesday in the Journal of Neuroscience. “So it was unclear what brain region was involved.”
Linda Carroll TODAY contributor Whether it’s “One Flew Over the Cuckoo’s Nest,” “Girl, Interrupted,” or “Homeland,” Hollywood’s portrayals of electroconvulsive therapy have never been pretty. And the images from those movies and TV shows have only added to a stigma that keeps many desperate patients from opting for a therapy that might turn their lives around, experts say. “We can’t get past the stigma of all the visuals we’ve seen from movies and the fact that it seems so antiquated when you consider modern medicine,” NBC chief medical editor Dr. Nancy Snyderman told TODAY’s Matt Lauer. “But time and time and time again if you look at patients who have severe depression who don’t respond to medications, they will tell you that ECT works.” That’s certainly true in Denise Stewart’s case. Stewart, a mother of two, suffers from schizoaffective disorder. Her hallucinations were pushing her closer and closer to suicide each day. “There would be voices in my head that would sit there and say, ‘Denise, see the knife in the kitchen? Cut your wrists. Denise, see those pills over there? Take all those pills,’” she told TODAY. After antidepressants made Stewart’s condition worse, her doctors suggested ECT. And the change was dramatic. “If it hadn’t been for the electroconvulsive therapy, I wouldn’t be alive right now,” Stewart said. These days an estimated 100,000 Americans undergo ECT each year – and the process is a lot different from what you see in the media, experts say.
Moheb Costandi In the early hours of 9 September, 1984, a stranger entered Mrs M's California home through an open living-room window. Finding Mrs M asleep, he tried to rape her, but fled when other people in the house awoke. Mrs M described her assailant to the police: he was black, weighed about 170 pounds and stood 5'7" to 5'9" tall, with small braids and a blue baseball cap. Officers cruising her neighbourhood spotted someone roughly matching that description standing beside his car a block away from the house. The man, Joseph Pacely, said that his car had broken down and he was looking for someone to jump-start it. But Mrs M identified him as her attacker and he was charged. At Pacely's trial a few months later, memory researcher Elizabeth Loftus testified on his behalf. She told the jury how memory is fallible; how stress and fear may have impaired Mrs M's ability to identify her assailant, and how people can find it difficult to identify someone of a race other than their own. Pacely was acquitted. “It's cases like this that mean the most to me,” says Loftus, “the ones in which I play a role in bringing justice to an innocent person.” In a career spanning four decades, Loftus, a psychologist at the University of California, Irvine, has done more than any other researcher to document the unreliability of memory in experimental settings. And she has used what she has learned to testify as an expert witness in hundreds of criminal cases — Pacely's was her 101st — informing juries that memories are pliable and that eyewitness accounts are far from perfect recordings of actual events. © 2013 Nature Publishing Group
Helen Shen The false mouse memories made the ethicists uneasy. By stimulating certain neurons in the hippocampus, Susumu Tonegawa and his colleagues caused mice to recall receiving foot shocks in a setting in which none had occurred. Tonegawa, a neuroscientist at the Massachusetts Institute of Technology in Cambridge, says that he has no plans to ever implant false memories into humans — the study, published last month, was designed just to offer insight into memory formation. But the experiment has nonetheless alarmed some neuroethicists. “That was a bell-ringer, the idea that you can manipulate the brain to control the mind,” says James Giordano, chief of neuroethics studies at Georgetown University in Washington DC. He says that the study is one of many raising ethical concerns, and more are sure to come as an ambitious, multi-year US effort to parse the human brain gets under way. The BRAIN (Brain Research through Advancing Innovative Neurotechnologies) Initiative will develop technologies to understand how the brain’s billions of neurons work together to produce thought, emotion, movement and memory. But, along with the discoveries, it could force scientists and society to grapple with a laundry list of ethical issues: the responsible use of cognitive-enhancement devices, the protection of personal neural data, the prediction of untreatable neurodegenerative diseases and the assessment of criminal responsibility through brain scanning. On 20 August, US President Barack Obama’s commission on bioethics will hold a meeting in Philadelphia, Pennsylvania, to begin to craft a set of ethics standards to guide the BRAIN project. There is already one major mechanism for ethical oversight in US research: institutional review boards, which must approve any studies involving human subjects. But many ethicists say that as neuroscience discoveries creep beyond laboratory walls into the marketplace and the courtroom, more comprehensive oversight is needed. © 2013 Nature Publishing Group,
by Douglas Heaven It's a cognitive leap forward. IBM can now program the experimental chips it unveiled two years ago. The chips, designed to mimic how our brains work, are set to power computers that handle many streams of input data at once – much like the sensory input we deal with all the time. IBM's TrueNorth computer chips contain memory, processors and communication channels wired up like the synapses, neurons and axons of a brain. A key idea is that the chips can be hooked up into vast grids with many thousands working together in parallel. For certain types of task, such as quickly responding to large amounts of input data from sensors, they are much faster and less power-hungry than standard chips. They could one day replace human reflexes in self-driving cars or power the sensory systems of a robot, for example. But because the chips rewrite the rulebook for how computers are normally put together, they are not easy to program. Dharmendra Modha and his colleagues at IBM Research in San Jose, California, learned this the hard way. The team's first attempts were full of errors: "The programs were very unintuitive and extremely difficult to debug," says Modha. "Things looked hopeless." So they designed a new way of programming. This involves telling the computer how to yoke together the many individual chips in play at once. The IBM team came up with a way to package the functionality of each chip inside blocks of code they call "corelets". © Copyright Reed Business Information Ltd.
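The compositional idea behind corelets — wrap each chip's function in a block, then wire blocks together into larger networks — can be sketched in ordinary Python. Everything below, including the Corelet class and its connect method, is a hypothetical illustration of that design pattern, not IBM's actual Corelet language or API:

```python
# Hypothetical sketch of the "corelet" idea: package a unit of chip
# functionality behind a small interface and compose units into
# pipelines. This is NOT IBM's real Corelet language or API.

class Corelet:
    def __init__(self, name, transform):
        self.name = name
        self.transform = transform   # what this block does to its input
        self.downstream = []         # corelets fed by this one's output

    def connect(self, other):
        """Wire this corelet's output into another corelet's input."""
        self.downstream.append(other)
        return other                 # returning it allows chaining

    def fire(self, signal):
        out = self.transform(signal)
        for target in self.downstream:
            target.fire(out)
        if not self.downstream:      # leaf of the network: report result
            print(f"{self.name} output: {out}")

# Compose a tiny pipeline: threshold a sensor reading, then label it.
sensor = Corelet("sensor", lambda x: x > 0.5)
label = Corelet("label", lambda hit: "spike" if hit else "quiet")
sensor.connect(label)
sensor.fire(0.8)   # -> label output: spike
```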
“Our primary goal is for our users to see us as a gym, where they can work out and keep mentally fit,” says Michael Scanlon, the co-founder and chief scientist of Lumos Labs. For $14.95 a month, subscribers to the firm’s Lumosity website get to play a selection of online games designed to improve their cognitive performance. There are around 40 exercises available, including “speed match”, in which players click if an image matches a previous one; “memory matrix”, which requires remembering which squares on a matrix were shaded; and “raindrops”, which involves solving arithmetic problems before the raindrops containing them hit the ground. The puzzles are varied, according to how well users perform, to ensure they are given a suitably challenging brain-training session each day. The popularity of Lumosity since its launch in 2007 has been, well, mind-blowing. Its smartphone app has been the top education app in the iTunes store at some point in 38 countries. On August 1st it launched an iPad version, which it expects to boost its existing 45m registered users in 180-plus countries. Lumos Labs has already raised almost $70m in venture capital, and is one of two firms vying to become the first public company serving the new “digital brain health” market, says Alvaro Fernandez of SharpBrains, a research firm. (The firm hoping to beat it to the punch is NeuroSky, which makes “brainwave sensors”—including some shaped like cats’ ears that will apparently wiggle if you are enjoying yourself and droop if you are relaxed.) The metaphor of workouts for the mind will set alarm bells ringing for anyone familiar with Brain Gym, a series of physical exercises for children that was adopted unquestioningly by many British schools and whose supposed cognitive benefits were debunked in “Bad Science”, a 2008 book by Ben Goldacre. However, Mr Scanlon, who quit his neuroscience PhD at Stanford University to co-found Lumos Labs, says he was inspired to do so by the mounting academic evidence of the plasticity of the brain and of the ability to improve cognitive function through simple exercises. © The Economist Newspaper Limited 2013
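The adaptive element — puzzles that get harder or easier with performance — is essentially a staircase procedure. Below is a minimal, text-only Python sketch of a "raindrops"-style arithmetic drill; the one-up/one-down rule and number ranges are my own assumptions for illustration, not Lumosity's actual algorithm:

```python
import random

def play(rounds=5, level=1, rng=None):
    """Run a short adaptive arithmetic drill at the terminal.

    A correct answer steps the difficulty up; a miss steps it down
    (a simple staircase -- assumed here, not Lumosity's real rule)."""
    rng = rng or random.Random()
    for _ in range(rounds):
        a = rng.randint(1, 5 * level)
        b = rng.randint(1, 5 * level)
        reply = input(f"Level {level}: {a} + {b} = ")
        if reply.strip() == str(a + b):
            level += 1                   # correct: harder problems next
        else:
            level = max(1, level - 1)    # miss: easier problems next
            print(f"  (the answer was {a + b})")
    print(f"Finished at level {level}")

if __name__ == "__main__":
    play()
```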