Chapter 17. Learning and Memory
Alison Abbott The fact that Edvard and May-Britt Moser have collaborated for 30 years — and been married for 28 — has done nothing to dull their passion for the brain. They talk about it at breakfast. They discuss its finer points at their morning lab meeting. And at a local restaurant on a recent summer evening, they are still deep into a back-and-forth about how their own brains know where they are and will guide them home. “Just to walk there, we have to understand where we are now, where we want to go, when to turn and when to stop,” says May-Britt. “It's incredible that we are not permanently lost.” If anyone knows how we navigate home, it is the Mosers. They shot to fame in 2005 with their discovery of grid cells deep in the brains of rats. These intriguing cells, which are also present in humans, work much like the Global Positioning System, allowing animals to understand their location. The Mosers have since carved out a niche studying how grid cells interact with other specialized neurons to form what may be a complete navigation system that tells animals where they are going and where they have been. Studies of grid cells could help to explain how memories are formed, and why recalling events so often involves re-envisioning a place, such as a room, street or landscape. While pursuing their studies, the two scientists have become a phenomenon. Tall and good-looking, they operate like a single brain in two athletic bodies in their generously funded lab in Trondheim, Norway — a remote corner of northern Europe just 350 kilometres south of the Arctic Circle. They publish together and receive prizes as a single unit — most recently, the Nobel Prize in Physiology or Medicine, which they won this week with their former supervisor, neuroscientist John O’Keefe at University College London. 
In 2007, while still only in their mid-40s, they won a competition by the Kavli Foundation of Oxnard, California, to build and direct one of only 17 Kavli Institutes around the world. The Mosers are now minor celebrities in their home country, and their institute has become a magnet for other big thinkers in neuroscience. “It is definitely intellectually stimulating to be around them,” says neurobiologist Nachum Ulanovsky from the Weizmann Institute of Science in Rehovot, Israel, who visited the Trondheim institute for the first time in September. © 2014 Nature Publishing Group
Keyword: Learning & Memory
Link ID: 20162 - Posted: 10.06.2014
By ALINA TUGEND Many workers now feel as if they’re doing the job of three people. They are on call 24 hours a day. They rush their children from tests to tournaments to tutoring. The stress is draining, both mentally and physically. At least that is the standard story about stress. It turns out, though, that many of the common beliefs about stress don’t necessarily give the complete picture.
MISCONCEPTION NO. 1: Stress is usually caused by having too much work.
While being overworked can be overwhelming, research increasingly shows that being underworked can be just as challenging. In essence, boredom is stressful. “We tend to think of stress in the original engineering way, that too much pressure or too much weight on a bridge causes it to collapse,” said Paul E. Spector, a professor of psychology at the University of South Florida. “It’s more complicated than that.” Professor Spector and others say too little to do — or underload, as he calls it — can cause many of the physical discomforts we associate with being overloaded, like muscle tension, stomachaches and headaches. A study published this year in the journal Experimental Brain Research found that measurements of people’s heart rates, hormonal levels and other factors while watching a boring movie — men hanging laundry — showed greater signs of stress than those watching a sad movie. “We tend to think of boredom as someone lazy, as a couch potato,” said James Danckert, a professor of neuroscience at the University of Waterloo in Ontario, Canada, and a co-author of the paper. “It’s actually when someone is motivated to engage with their environment and all attempts to do so fail. It’s aggressively dissatisfying.” © 2014 The New York Times Company
By Daisy Yuhas Do we live in a holographic universe? How green is your coffee? And could drinking too much water actually kill you? Before you click those links you might consider how your knowledge-hungry brain is preparing for the answers. A new study from the University of California, Davis, suggests that when our curiosity is piqued, changes in the brain ready us to learn not only about the subject at hand, but incidental information, too. Neuroscientist Charan Ranganath and his fellow researchers asked 19 participants to review more than 100 questions, rating each in terms of how curious they were about the answer. Next, each subject revisited 112 of the questions—half of which strongly intrigued them whereas the rest they found uninteresting—while the researchers scanned their brain activity using functional magnetic resonance imaging (fMRI). During the scanning session participants would view a question then wait 14 seconds and view a photograph of a face totally unrelated to the trivia before seeing the answer. Afterward the researchers tested participants to see how well they could recall and retain both the trivia answers and the faces they had seen. Ranganath and his colleagues discovered that greater interest in a question would predict not only better memory for the answer but also for the unrelated face that had preceded it. A follow-up test one day later found the same results—people could better remember a face if it had been preceded by an intriguing question. Somehow curiosity could prepare the brain for learning and long-term memory more broadly. The findings are somewhat reminiscent of the work of U.C. Irvine neuroscientist James McGaugh, who has found that emotional arousal can bolster certain memories. But, as the researchers reveal in the October 2 Neuron, curiosity involves very different pathways. © 2014 Scientific American
By John Bohannon The victim peers across the courtroom, points at a man sitting next to a defense lawyer, and confidently says, "That's him!" Such moments have a powerful sway on jurors who decide the fate of thousands of people every day in criminal cases. But how reliable is eyewitness testimony? A new report concludes that the use of eyewitness accounts needs tighter control, and among its recommendations is a call for a more scientific approach to how eyewitnesses identify suspects during the classic police lineup. For decades, researchers have been trying to nail down what influences eyewitness testimony and how much confidence to place in it. After a year of sifting through the scientific evidence, a committee of psychologists and criminologists organized by the U.S. National Research Council (NRC) has now gingerly weighed in. "This is a serious issue with major implications for our justice system," says committee member Elizabeth Phelps, a psychologist at New York University in New York City. Their 2 October report, Identifying the Culprit: Assessing Eyewitness Identification, is likely to change the way that criminal cases are prosecuted, says Elizabeth Loftus, a psychologist at the University of California, Irvine, who was an external reviewer of the report. As Loftus puts it, "just because someone says something confidently doesn't mean it's true." Jurors can't help but find an eyewitness’s confidence compelling, even though experiments have shown that a person's confidence in their own memory is sometimes undiminished even in the face of evidence that their memory of an event is false. © 2014 American Association for the Advancement of Science.
Keyword: Learning & Memory
Link ID: 20157 - Posted: 10.04.2014
Helen Thomson You'll have heard of Pavlov's dogs, conditioned to expect food at the sound of a bell. You might not have heard that a scarier experiment – arguably one of psychology's most unethical – was once performed on a baby. In it, a 9-month-old, at first unfazed by the presence of animals, was conditioned to feel fear at the sight of a rat. The infant was presented with the animal as someone struck a metal pole with a hammer above his head. This was repeated until he cried at merely the sight of any furry object – animate or inanimate. The "Little Albert" experiment, performed in 1919 by John Watson of Johns Hopkins University Hospital in Baltimore, Maryland, was the first to show that a human could be classically conditioned. The fate of Albert B has intrigued researchers ever since. Hall Beck at Appalachian State University in Boone, North Carolina, has been one of the most tenacious researchers on the case. Watson's papers stated that Albert B was the son of a wet nurse who worked at the hospital. Beck spent seven years exploring potential candidates and used facial analysis to conclude in 2009 that Little Albert was Douglas Merritte, son of hospital employee Arvilla. Douglas was born on the same day as Albert and several other points tallied with Watson's notes. Tragically, medical records showed that Douglas had severe neurological problems and died at an early age of hydrocephalus, or water on the brain. According to his records, this seems to have resulted in vision problems, so much so that at times he was considered blind. © Copyright Reed Business Information Ltd.
Wild marmosets in the Brazilian forest can learn quite successfully from video demonstrations featuring other marmosets, Austrian scientists have reported, showing not only that marmosets are even better learners than previously known, but that video can be used successfully in experiments in the wild. Tina Gunhold, a cognitive biologist at the University of Vienna, had worked with a population of marmoset monkeys in a bit of Brazilian forest before this particular experiment. The forest is not wilderness. It lies near some apartment complexes, and the marmosets are somewhat used to human beings. But the monkeys are wild, and each extended family group has its own foraging territory. Dr. Gunhold and her colleagues reported in the journal Biology Letters this month that they had tested 12 family groups, setting up a series of video monitors, each with a kind of complicated box that they called an “artificial fruit.” All the boxes contained food. Six of the monitors showed just an unchanging image of a marmoset near a similar box. Three of them showed a marmoset opening the box by pulling a drawer, and three others a marmoset lifting a lid to get at the food. Marmosets are very territorial and would not tolerate a strange individual on their turf, but the image of a strange marmoset on video didn’t seem to bother them. Individual marmosets “differed in their reactions to the video,” Dr. Gunhold said. “Some were more shy, some more bold. The younger ones were more attracted to the video, perhaps because of greater curiosity.” © 2014 The New York Times Company
By David Z. Hambrick, Fernanda Ferreira, and John M. Henderson A decade ago, Magnus Carlsen, who at the time was only 13 years old, created a sensation in the chess world when he defeated former world champion Anatoly Karpov at a chess tournament in Reykjavik, Iceland, and the next day played then-top-rated Garry Kasparov—who is widely regarded as the best chess player of all time—to a draw. Carlsen’s subsequent rise to chess stardom was meteoric: grandmaster status later in 2004; a share of first place in the Norwegian Chess Championship in 2006; youngest player ever to reach World No. 1 in 2010; and highest-rated player in history in 2012. What explains this sort of spectacular success? What makes someone rise to the top in music, games, sports, business, or science? This question is the subject of one of psychology’s oldest debates. In the late 1800s, Francis Galton—founder of the scientific study of intelligence and a cousin of Charles Darwin—analyzed the genealogical records of hundreds of scholars, artists, musicians, and other professionals and found that greatness tends to run in families. For example, he counted more than 20 eminent musicians in the Bach family. (Johann Sebastian was just the most famous.) Galton concluded that experts are “born.” Nearly half a century later, the behaviorist John Watson countered that experts are “made” when he famously guaranteed that he could take any infant at random and “train him to become any type of specialist [he] might select—doctor, lawyer, artist, merchant-chief and, yes, even beggar-man and thief, regardless of his talents.” Modern research suggests that neither extreme is quite right: in one study of chess players, one player needed 22 times more deliberate practice than another player to become a master. © 2014 The Slate Group LLC.
Keyword: Learning & Memory
Link ID: 20136 - Posted: 09.30.2014
by Elijah Wolfson @elijahwolfson The class was the most difficult of the fall 2013 semester, and J.D. Leadam had missed all but one lecture. His grandfather’s health had worsened, and he left San Jose State, where he was studying for a degree in business, to return home to Los Angeles to help out. Before he knew it, midterm exams had almost arrived. At this point, Leadam had, for a while, been playing around with transcranial direct-current stimulation, or tDCS, an experimental treatment for all sorts of health issues that, at its most basic, involves running a very weak electric current through the brain. When he first came across tDCS, Leadam was immediately intrigued but thought, “There’s no way I’m gonna put electrodes on my head. It’s just not going to happen.” After extensive research, though, he changed his mind. He looked into buying a device online, but there wasn’t much available — just one extremely expensive machine and then a bare-bones $40 device that didn’t even have a switch. So he dug around online and figured he could build one himself. He bought all the pieces he needed and put it together. He tried it a few times, but didn’t notice much, so he put it aside. But now, with the test looming, he picked it back up. The professor had written a book, and Leadam knew all the information he’d be tested on was written in its pages. “But I’m an auditory learner,” he said, “so I knew it wouldn’t work to just read it.” He strapped on the device, turned it on and read the chapters. “Nothing,” he thought. But when he got to the classroom and put pen to paper, he had a revelation. “I could remember concepts down to the exact paragraphs in the textbook,” Leadam said. “I actually ended up getting an A on the test. I couldn’t believe it.”
Keyword: Learning & Memory
Link ID: 20130 - Posted: 09.29.2014
By Dick Miller, CBC News Dan Campbell felt the bullets whiz past his head. The tracer rounds zipped between his legs. It was his first firefight as a Canadian soldier in Afghanistan. “I was completely frightened and scared like I’d never been before in my life,” he says. As the attack continued, the sights, sounds and smells started to form memories inside his brain. The fear he felt released the hormone norepinephrine, and in the complex chemistry of the brain, the memories of the battle became associated with the fear.
“I think one day, hopefully in the not-too-distant future, we will be able to delete a memory,” says Dr. Sheena Josselyn, a senior scientist at the Hospital for Sick Children Research Institute.
Six years later, a sight or sound such as a firecracker or car backfiring can remind him of that night in 2008. The fear comes back and he relives rather than remembers the moments. “It can be hard. Physically, you know, there’s the tapping foot, my heart beating,” he says. Like so many soldiers and victims of assault or people who have experienced horrific accidents, Campbell was diagnosed with post-traumatic stress disorder. Now a newspaper reporter in Yellowknife, Campbell thinks one day he may get therapy. But for now he is working on his own to control the fear and anger the memories bring. © CBC 2014
By Melissa Dahl Recently, I was visiting my family in Seattle, and we were doing that thing families do: retelling old stories. As we talked, a common theme emerged. My brother hardly remembered anything from our childhood, even the stories in which he was the star player. (That time he fell down the basement steps and needed stitches in the ER? Nope. That panicky afternoon when we all thought he’d disappeared, only to discover he’d been hiding in his room, and then fell asleep? Nothing.) “Boys never remember anything,” my mom huffed. She’s right. Researchers are finding some preliminary evidence that women are indeed better at recalling memories, especially autobiographical ones. Girls and women tend to recall these memories faster and with more specific details, and some studies have demonstrated that these memories tend to be more accurate, too, when compared to those of boys and men. And there’s an explanation for this: It could come down to the way parents talk to their daughters, as compared to their sons, when the children are developing memory skills. To understand this apparent gender divide in recalling memories, it helps to start with early childhood—specifically, ages 2 to 6. Whether you knew it or not, during these years, you learned how to form memories, and researchers believe this happens mostly through conversations with others, primarily our parents. These conversations teach us how to tell our own stories, essentially; when a mother asks her child for more details about something that happened that day in school, for example, she is implicitly communicating that these extra details are essential parts to the story. © 2014 The Slate Group LLC
By Maria Konnikova At the turn of the twentieth century, Ivan Pavlov conducted the experiments that turned his last name into an adjective. By playing a sound just before he presented dogs with a snack, he taught them to salivate upon hearing the tone alone, even when no food was offered. That type of learning is now called classical—or Pavlovian—conditioning. Less well known is an experiment that Pavlov was conducting at around the same time: when some unfortunate canines heard the same sound, they were given acid. Just as their luckier counterparts had learned to salivate at the noise, these animals would respond by doing everything in their power to get the imagined acid out of their mouths, each “shaking its head violently, opening its mouth and making movements with its tongue.” For many years, Pavlov’s classical conditioning findings overshadowed the darker version of the same discovery, but, in the nineteen-eighties, the New York University neuroscientist Joseph LeDoux revived the technique to study the fear reflex in rats. LeDoux first taught the rats to associate a certain tone with an electric shock so that they froze upon hearing the tone alone. In essence, the rat had formed a new memory—that the tone signifies pain. He then blunted that memory by playing the tone repeatedly without following it with a shock. After multiple shock-less tones, the animals ceased to be afraid. Now a new generation of researchers is trying to figure out the next logical step: re-creating the same effects within the brain, without deploying a single tone or shock. Is memory formation now understood well enough that memories can be implanted and then removed absent the environmental stimulus?
By Filipa Ioannou Per the Associated Press, the Food and Drug Administration is considering a ban on electric-shock devices that are used to punish unwanted behavior by patients with autism and other developmental disabilities. If it comes as a surprise to you that any involuntary electric shocks are administered to autism patients in the United States, that's because the devices are only used at one facility in the country—the Judge Rotenberg Educational Center in Canton, Mass. The school has been a target of media attention in the past; in 2012, video leaked of 18-year-old patient Andre McCollins being restrained face-down and shocked 31 times. McCollins’ mother sued the center, and the lawsuit was settled outside of court. Rotenberg must get a court’s approval to begin administering skin shocks to a student. The center uses a graduated electronic decelerator, or GED, that is attached to the arms or legs. If the student acts aggressively – head-banging, throwing furniture, attacking someone – then a center worker can press a button to activate the electrode, delivering a two-second shock to the skin. The amount of pain generated by the device is a contentious subject. The Rotenberg Center's Glenda Crookes compared the sensation to “a bee sting” in comments to CBS News, and some Rotenberg parents are strong proponents of the device. But a U.N. official in 2010 said the shocks constituted “torture.” An FDA report also addresses the widely held belief that autistic individuals have a high pain threshold, pointing out the possibility that “not all children with ASD express their pain in the same way as a ‘neurotypical child’ would (e.g., cry, moan, seek comfort, etc.), which may lead to misinterpretation by caregivers and medical professionals that patients are insensitive or to an incorrect belief that the child is not in pain.” © 2014 The Slate Group LLC.
By Elizabeth Pennisi "What's for dinner?" The words roll off the tongue without even thinking about it—for adults, at least. But how do humans learn to speak as children? Now, a new study in mice shows how a gene, called FOXP2, implicated in a language disorder may have changed between humans and chimps to make learning to speak possible—or at least a little easier. As a uniquely human trait, language has long baffled evolutionary biologists. Not until FOXP2 was linked to a genetic disorder that caused problems in forming words could they even begin to study language’s roots in our genes. Soon after that discovery, a team at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, discovered that just two bases, the letters that make up DNA, distinguished the human and chimp versions of FOXP2. To try to determine how those changes influenced the gene's function, that group put the human version of the gene in mice. In 2009, they observed that these "humanized" mice produced more frequent and complex alarm calls, suggesting the human mutations may have been involved in the evolution of more complex speech. Another study showed that humanized mice have different activity in the part of the brain called the striatum, which is involved in learning, among other tasks. But the details of how the human FOXP2 mutations might affect real-world learning remained murky. To solve the mystery, the Max Planck researchers sent graduate student Christiane Schreiweis to work with Ann Graybiel, a neuroscientist at the Massachusetts Institute of Technology in Cambridge. She's an expert in testing mouse smarts by seeing how quickly they can learn to find rewards in mazes. © 2014 American Association for the Advancement of Science
by Michael Slezak It's one of the biggest mysteries of Alzheimer's. The disease is associated with the formation of protein plaques in the brain, but why is it that some people with plaques seem not to have the disease? Research suggests that some people's brains are able to reorganise during the early stages of Alzheimer's, delaying the appearance of initial symptoms. The plaques in question are small mounds of a protein called beta-amyloid, and are found in the brains of people with Alzheimer's disease. Whether these plaques are a cause of the disease has been hotly debated. One reason for doubt is the appearance of plaques in many older people who have no symptoms of dementia at all. Using fMRI to measure changes in blood flow around the brain, William Jagust from the University of California, Berkeley, and colleagues compared brain function in three groups of people without symptoms of dementia: 22 young people, 16 older people with beta-amyloid plaques and 33 older people without the plaques. He asked each of them to memorise a photographed scene while inside the machine. Jagust found that older people with plaques had increased blood flow – which means stronger activation of that brain area – in the regions of the brain that are usually activated during memory formation, compared with the older people who did not have plaques. The team then analysed whether this extra brain activation might be helping to compensate for the plaques. © Copyright Reed Business Information Ltd.
by Simon Makin Talking in your sleep might be annoying, but listening may yet prove useful. Researchers have shown that sleeping brains not only recognise words, but can also categorise them and respond in a previously defined way. This could one day help us learn more efficiently. Sleep appears to render most of us dead to the world, our senses temporarily suspended, but sleep researchers know this is a misleading impression. For instance, a study published in 2012 showed that sleeping people can learn to associate specific sounds and smells. Other work has demonstrated that presenting sounds or smells during sleep boosts performance on memory tasks – providing the sensory cues were also present during the initial learning. Now it seems the capabilities of sleeping brains stretch even further. A team led by Sid Kouider from the École Normale Supérieure in Paris trained 18 volunteers to classify spoken words as either animal or object by pressing buttons with their right or left hand. Brain activity was recorded using EEG, allowing the researchers to measure the telltale spikes in activity that indicate the volunteers were preparing to move one of their hands. Since each hand is controlled by the motor cortex on the opposite side of the brain, these brainwaves can be matched to the intended hand just by looking at which side of the motor cortex is active. © Copyright Reed Business Information Ltd.
By Helen Briggs Health editor, BBC News website There may be a link between a rare blood type and memory loss in later life, American research suggests. People with AB blood, found in 4% of the population, appear more likely to develop thinking and memory problems than those with other blood groups. The study, published in Neurology, builds on previous research showing blood type may influence heart risk. A charity said the best way to keep the brain healthy was a balanced diet, regular exercise and not smoking. A US team led by Dr Mary Cushman, of the University of Vermont College of Medicine, Burlington, analysed data from about 30,000 US citizens aged 45 and above. It identified 495 participants who had developed thinking and memory problems, or cognitive impairment, during the three-year study. They were compared to 587 people with no cognitive problems. People with AB blood type made up 6% of the group who developed cognitive impairment, which is higher than the 4% found in the general population. They were 82% more likely to have difficulties with day-to-day memory, language and attention, which can signal the onset of dementia. However, the study did not look at the risk of dementia. The study supported the idea that having a certain blood group, such as O, may give a lower risk for cardiovascular disease, which in turn protected the brain, the researchers said. "Our study looks at blood type and risk of cognitive impairment, but several studies have shown that factors such as high blood pressure, high cholesterol and diabetes increase the risk of cognitive impairment and dementia," said Dr Cushman. BBC © 2014
By BENEDICT CAREY Imagine that on Day 1 of a difficult course, before you studied a single thing, you got hold of the final exam. The motherlode itself, full text, right there in your email inbox — attached mistakenly by the teacher, perhaps, or poached by a campus hacker. No answer key, no notes or guidelines. Just the questions. Would that help you study more effectively? Of course it would. You would read the questions carefully. You would know exactly what to focus on in your notes. Your ears would perk up anytime the teacher mentioned something relevant to a specific question. You would search the textbook for its discussion of each question. If you were thorough, you would have memorized the answer to every item before the course ended. On the day of that final, you would be the first to finish, sauntering out with an A+ in your pocket. And you would be cheating. But what if, instead, you took a test on Day 1 that was just as comprehensive as the final but not a replica? You would bomb the thing, for sure. You might not understand a single question. And yet as disorienting as that experience might feel, it would alter how you subsequently tuned into the course itself — and could sharply improve your overall performance. This is the idea behind pretesting, one of the most exciting developments in learning science. Across a variety of experiments, psychologists have found that, in some circumstances, wrong answers on a pretest aren’t merely useless guesses. Rather, the attempts themselves change how we think about and store the information contained in the questions. On some kinds of tests, particularly multiple-choice, we benefit from answering incorrectly by, in effect, priming our brain for what’s coming later. That is: The (bombed) pretest drives home the information in a way that studying as usual does not. We fail, but we fail forward. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20043 - Posted: 09.08.2014
by Sandrine Ceurstemont Screening an instructional monkey movie in a forest reveals that marmosets do not only learn from family members: they also copy on-screen strangers. It is the first time such a video has been used for investigations in the wild. Tina Gunhold at the University of Vienna, Austria, and her colleagues filmed a common marmoset retrieving a treat from a plastic device. They then took the device to the Atlantic Forest near Aldeia in Pernambuco, Brazil, and showed the movie to wild marmosets there. Although monkeys are known to learn from others in their social group, especially when they are young, little is known about their ability to learn from monkeys that do not belong to the same group. Marmosets are territorial, so the presence of an outsider – even a virtual one on a screen – could provoke an attack. "We didn't know if wild marmosets would be frightened of the video box but actually they were all attracted to it," says Gunhold. Compared to monkeys shown a static image of the stranger, video-watching marmosets were more likely to manipulate the device, typically copying the technique shown (see video). Young monkeys spent more time near the video box than older family members, suggesting that they found the movie more engaging – although as soon as one monkey mastered the task, it was impossible to tell whether the others were learning from the video or from their relative. "We think it's a combination of both," says Gunhold. © Copyright Reed Business Information Ltd.
By Virginia Morell Figaro, a Goffin’s cockatoo (Cacatua goffini) housed at a research lab in Austria, stunned scientists a few years ago when he began spontaneously making stick tools from the wooden beams of his aviary. The Indonesian parrots are not known to use tools in the wild, yet Figaro confidently employed his sticks to rake in nuts outside his wire enclosure. Wondering if Figaro’s fellow cockatoos could learn by watching his methods, scientists set up experiments for a dozen of them. One group watched as Figaro used a stick to reach a nut placed inside an acrylic box with a wire-mesh front panel; others saw “ghost demonstrators”—magnets that were hidden beneath a table and that the researchers controlled—displace the treats. Each bird was then placed in front of the box, with a stick just like Figaro’s lying nearby. The group of three males and three females that had watched Figaro also picked up the sticks, and made some efforts reminiscent of his actions. But only those three males, such as the one in the photo above, became proficient with the tool and successfully retrieved the nuts, the scientists report online today in the Proceedings of the Royal Society B. None of the females did so; nor did any of the birds, male or female, in the ghost demonstrator group. Because the latter group failed entirely, the study shows that the birds need living teachers, the scientists say. Intriguingly, the clever observers developed a better technique than Figaro’s for getting the treat. Thus, the cockatoos weren’t copying his exact actions, but emulating them—a distinction that implies some degree of creativity. Two of the successful cockatoos were later given a chance to make a tool of their own. One did so immediately (as in the video above), and the other succeeded after watching Figaro. It may be that by learning to use a tool, the birds are stimulated to make tools of their own, the scientists say. © 2014 American Association for the Advancement of Science.
Keyword: Learning & Memory
Link ID: 20027 - Posted: 09.03.2014
by Chris Higgins Neuroscientists have pinpointed where imagination hides in the brain and found it to be functionally distinct from related processes such as memory. The team from Brigham Young University (BYU) in Utah, including undergraduate student Stefania Ashby, who proposed the research, used functional magnetic resonance imaging (fMRI) to observe brain activity when subjects were remembering specific experiences and putting themselves in novel ones. "I was thinking a lot about planning for my own future and imagining myself in the future, and I started wondering how memory and imagination work together," Ashby said. "I wondered if they were separate or if imagination is just taking past memories and combining them in different ways to form something I've never experienced before." The two processes of remembering and imagining have previously been proposed to be the same cognitive task, and so were thought to be carried out by the same areas of the brain. However, the experiments devised by Ashby and her mentor and co-author, BYU professor Brock Kirwan, have refuted these ideas. The studies, published in the journal Cognitive Neuroscience, required participants to submit 60 photographs of previous life events and use them to create prompts for the "remember" sections. Participants then completed a questionnaire before entering the MRI scanner, which determined which scenarios were the most novel to them and so would require imagination. Then, under fMRI testing, the subjects were prompted with various scenarios, and the areas of their brain that became active during each scenario were correlated with each scene's familiarity: pure memory, or imagination. © Condé Nast UK 2014