Chapter 17. Learning and Memory
by Isaac Saul Multi-step puzzles can be difficult for humans, but what if I told you there was a bird that could solve them on its own? In this BBC special, Dr. Alex Taylor has set up an eight-step puzzle to try to stump one of the smartest crows he's seen in captivity. They describe the puzzle as "one of the most complex tests of the animal mind ever." This isn't the first time crows' intelligence has been tested, either. Along with being problem solvers, these animals have an eerie tendency toward complex, human-like memory skills. Through several different studies, we've learned that crows can recognize faces, communicate details of an event to each other and even avoid places they recognize as dangerous. This bird, dubbed "007" for its crafty mind, flies into the caged puzzle and spends only seconds analyzing it before getting down to business. Despite the puzzle's difficulty, the bird seems stumped only momentarily. At the end of the puzzle is a food reward, but how he gets there is what will really blow your mind. © 2014 TheHuffingtonPost.com, Inc
Karen Weintraub Every time you pull up a memory – say, of your first kiss – your mind reinterprets it for the present day, new research suggests. If you're in the middle of an ugly divorce, for example, you might recall it differently than if you're happily married and life is going well. This makes your memory quite unlike the video camera you may imagine it to be. But new research in the Journal of Neuroscience suggests it's very effective for helping us adapt to our environments, said co-author Joel Voss, a researcher at Northwestern University's Feinberg School of Medicine. Voss' findings build on others and may also explain why we can be thoroughly convinced that something happened when it didn't, and why eyewitness testimony is notoriously unreliable. The new research also suggests that memory problems like those seen in Alzheimer's could involve a "freezing" of these memories — an inability to adapt the memory to the present, Voss said. Our memories are thus less a snapshot of the past than "a record of our current view on the past," said Donna Rose Addis, a researcher and associate professor at the University of Auckland in New Zealand, who was not involved in the research. Using brain scans of 17 healthy volunteers as they were taught new information and recalled previously learned information, Voss and his colleagues were able to show for the first time precisely when and where new information gets implanted into existing memories.
Keyword: Learning & Memory
Link ID: 19205 - Posted: 02.05.2014
by Susan Milius Male bee flies fooled into trying to copulate with a daisy may learn from the awkward incident. Certain orchids and several forms of South Africa’s Gorteria diffusa daisy lure pollinators by mimicking female insects. The most effective daisy seducers grow a dark, somewhat fly-shaped bump on one of their otherwise yellow-to-orange petals. Males of small, dark Megapalpus capensis bee flies go wild. But tests show the daisy’s victims waste less time trying to mate with a second deceptive daisy than with the first. “Far from being slow and stupid, these males are actually quite keen observers and fairly perceptive for a fly,” says Marinus L. de Jager of Stellenbosch University in South Africa. Males’ success locating a female bee fly drops in the presence of deceitful daisies, de Jager and Stellenbosch University colleague Allan Ellis report January 29 in the Proceedings of the Royal Society B. That’s the first clear demonstration of sexual deceit’s cost to a pollinator, Ellis says. Such evolutionary costs might push the bee fly to learn from mating mistakes. How long bee flies stay daisy-wary remains unknown. In other studies, wasps tricked by an Australian orchid forgot their lesson after about 24 hours. © Society for Science & the Public 2000 - 2014
Alison Abbott By slicing up and reconstructing the brain of Henry Gustav Molaison, researchers have confirmed predictions about a patient who has already contributed more than most to neuroscience. No big scientific surprises emerge from the anatomical analysis, which was carried out by Jacopo Annese of the Brain Observatory at the University of California, San Diego, and his colleagues, and published today in Nature Communications. But it has confirmed scientists’ deductions about the parts of the brain involved in learning and memory. “The confirmation is surely important,” says Richard Morris, who studies learning and memory at the University of Edinburgh, UK. “The patient is a classic case, and so the paper will be extensively cited.” Molaison, known in the scientific literature as patient H.M., lost his ability to store new memories in 1953 after surgeon William Scoville removed part of his brain — including a large swathe of the hippocampus — to treat his epilepsy. That provided the first conclusive evidence that the hippocampus is fundamental for memory. H.M. was studied extensively by cognitive neuroscientists during his life. After H.M. died in 2008, Annese set out to discover exactly what Scoville had excised. The surgeon had made sketches during the operation, and brain-imaging studies in the 1990s confirmed that the lesion corresponded to the sketches, although it was slightly smaller. But whereas brain imaging is relatively low-resolution, Annese and his colleagues were able to carry out an analysis at the micrometre scale. © 2014 Nature Publishing Group
Henry Molaison, the famous amnesic patient better known as “H.M.,” was unable to form new long-term memories following brain surgery to treat his epilepsy. Scientists who studied his condition made groundbreaking discoveries that revealed how memory works, and before his 2008 death, H.M. and his guardian agreed that his brain would be donated to science. One year after his death, H.M.’s brain was sliced into 2,401 70-micron-thick sections for further study. MIT neuroscience professor emerita Suzanne Corkin studied H.M. during his life and is now part of a team that is analyzing his brain. She is an author of a paper appearing in Nature Communications today reporting preliminary results of the postmortem study. The research team was led by Jacopo Annese at the University of California at San Diego (UCSD). Q: What can we learn from studying H.M.’s brain after his death? And when did you begin laying the groundwork for these postmortem studies? A: It was important to get H.M.’s brain after he died, for three reasons: first of all, to document the exact locus and extent of his lesions, in order to identify the neural substrate for declarative memory. Second, to evaluate the status of the intact brain tissue, revealing the possible brain substrates for the many cognitive functions that H.M. performed normally, including nondeclarative learning without awareness. The third reason was to identify any new abnormalities that occurred as a result of his getting old and were unrelated to the operation. In 1992, I explained to H.M. and his conservator that it would be extremely valuable to have his brain after he died. I told them how important he was to the science of memory, and that he had already made amazing contributions. It would make those even more significant to actually have his brain and see exactly where the damage was. That year, they signed a brain donation form leaving his brain to Massachusetts General Hospital [MGH] and MIT.
Keyword: Learning & Memory
Link ID: 19182 - Posted: 01.29.2014
By BENEDICT CAREY People of a certain age (and we know who we are) don’t spend much leisure time reviewing the research into cognitive performance and aging. The story is grim, for one thing: Memory’s speed and accuracy begin to slip around age 25 and keep on slipping. The story is familiar, too, for anyone who is over 50 and, having finally learned to live fully in the moment, discovers it’s a senior moment. The finding that the brain slows with age is one of the strongest in all of psychology. Over the years, some scientists have questioned this dotage curve. But these challenges have had an ornery-old-person slant: that the tests were biased toward the young, for example. Or that older people have learned not to care about clearly trivial things, like memory tests. Or that an older mind must organize information differently from one attached to some 22-year-old who records his every Ultimate Frisbee move on Instagram. Now comes a new kind of challenge to the evidence of a cognitive decline, from a decidedly digital quarter: data mining, based on theories of information processing. In a paper published in Topics in Cognitive Science, a team of linguistic researchers from the University of Tübingen in Germany used advanced learning models to search enormous databases of words and phrases. Since educated older people generally know more words than younger people, simply by virtue of having been around longer, the experiment simulates what an older brain has to do to retrieve a word. And when the researchers incorporated that difference into the models, the aging “deficits” largely disappeared. “What shocked me, to be honest, is that for the first half of the time we were doing this project, I totally bought into the idea of age-related cognitive decline in healthy adults,” the lead author, Michael Ramscar, said by email.
But the simulations, he added, “fit so well to human data that it slowly forced me to entertain this idea that I didn’t need to invoke decline at all.” © 2014 The New York Times Company
by Helen Thomson The brain that made the greatest contribution to neuroscience and to our understanding of memory has become a gift that keeps on giving. A 3D reconstruction of the brain of Henry Molaison, whose surgery to cure him of epilepsy left him unable to form new memories, will allow scientists to continue to garner insights into the brain for years to come. "Patient HM" became arguably the most famous person in neuroscience after he had several areas of his brain removed in 1953. His resulting amnesia and willingness to be tested have given us unprecedented insights into where memories are formed and stored in the brain. On his death in 2008, HM was revealed to the world as Henry Molaison. Now, a post-mortem examination of his brain, and a new kind of virtual 3D reconstruction, have been published. As a child, Molaison had major epileptic seizures. Anti-epileptic drugs failed, so he sought help from neurosurgeon William Scoville at Hartford Hospital in Connecticut. When Molaison was 27 years old, Scoville removed portions of his medial temporal lobes, which included an area called the hippocampus on both sides of his brain. As a result, Molaison's epilepsy became manageable, but he could not form any new memories, a condition known as anterograde amnesia. He also had difficulty recollecting his long-term past – partial retrograde amnesia.
Keyword: Learning & Memory
Link ID: 19172 - Posted: 01.27.2014
by Bethany Brookshire There are some scientific topics that are bound to generate excitement. A launch to the moon, a potential cure for cancer or any study involving chocolate will always make the news. And then of course there’s caffeine. More than half of Americans have a daily coffee habit, not to mention the boost offered by tea, soda, chocolate and energy drinks. We’d all love to believe that it has more benefit than just papering over a poor night’s sleep. This week, scientists reported that caffeine could give a jolt to memory consolidation, the step right after your brain acquires a memory. During memory consolidation, activity patterns laid down in your brain become more permanent. The study suggested that caffeine might perk up this stage of memory formation. But while it’s an interesting finding, the scientific brew may not be strong enough to justify your coffee habit. Caffeine is a great way to wake you up. It blocks the action of adenosine, a chemical messenger that promotes sleep. Caffeine also has indirect effects on other chemical messengers such as norepinephrine, the neurotransmitter that gives us our famous “fight or flight” response. The net result is increased attention, wakefulness and faster responses. But attention, focus and response time are not memory. And previous studies of memory, says neuroscientist Michael Yassa, the lead author on the new study, were “all over the place.” So Yassa, then at Johns Hopkins University (he’s now at the University of California, Irvine), and undergraduate student Daniel Borota decided to study the effects of caffeine on memory “in a rigorous way.” © Society for Science & the Public 2000 - 2014
A clean slate—that’s what people suffering from posttraumatic stress disorder (PTSD) crave most with their memories. Psychotherapy is more effective at muting more recent traumatic events than those from long ago, but a new study in mice shows that modifying the molecules that attach to our DNA may offer a route to quashing painful memories in both cases. One of the most effective treatments for PTSD is exposure psychotherapy. A behavioral psychologist asks a patient to recall and confront a traumatic event; each time the traumatic memory is revisited, it becomes susceptible to editing through a phenomenon known as memory reconsolidation. As the person relives, for example, a car crash, the details of the event—such as the color and make of the vehicle—gradually uncouple from the anxiety, reducing the likelihood of a panic attack the next time the patient sees, say, a red Mazda. Repeated therapy sessions can also lead to memory extinction, in which the fears tied to an event fade away as old memories are replaced with new ones. Yet this therapy works only for recent memories. If too much time passes before intervention, the haunting visions become stalwart, refusing to budge from the crevices of the mind. This persistence raises the question of how the brain tells the age of a memory in the first place. Researchers at the Massachusetts Institute of Technology, led by neurobiologist Li-Huei Tsai, have now uncovered a chemical modification of DNA that regulates gene activity and dictates whether a memory is too old for reconsolidation in mice. A drug that tweaks these “memory wrinkles” gives old memories a face-lift, allowing them to be edited by reconsolidation and resulting in fear extinction during behavior therapy. © 2014 American Association for the Advancement of Science.
By Ashutosh Jogalekar Popular wisdom holds that caffeine enhances learning, alertness and retention, leading millions to consume coffee or caffeinated drinks before a challenging learning task such as attending a business strategy meeting or a demanding scientific presentation. However, a new study in the journal Nature Neuroscience conducted by researchers from Johns Hopkins hints that when it comes to long-term memory and caffeine, timing may be everything: caffeine may enhance consolidation of memories only if it is consumed after a learning or memory challenge. In the study the authors conducted a randomized, double-blind controlled experiment in which 160 healthy female subjects between the ages of 18 and 30 were asked to perform a series of learning tasks. The subjects were handed cards with pictures of various random indoor and outdoor objects (for instance leaves, ducks and handbags) and asked to classify the objects as indoor or outdoor. Immediately after the task the volunteers were handed pills containing either 200 mg of caffeine or a placebo. Saliva samples to test for caffeine and its metabolites were collected after 1, 3 and 24 hours. After 24 hours the researchers tested the participants’ recollection of the previous day’s test. Along with the items from the test (‘old’), they were presented with new items (‘foils’) and similar-looking items (‘lures’), neither of which had been part of the task. They were then asked to classify the items again as old, new or similar. Volunteers in the caffeinated group were significantly more likely to mark the ‘lure’ items as ‘similar’ rather than ‘old’. That is, caffeinated participants distinguished much better between the old items and the others, indicating that they retained the memory of the old items better than people in the placebo group. © 2014 Scientific American
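The old/similar/new scoring described above is often summarized as a lure discrimination index: the rate at which similar-looking lures are correctly called ‘similar’ minus the rate at which brand-new foils are called ‘similar’, which corrects for any blanket bias toward answering ‘similar’. The sketch below is illustrative only; the function name and data are invented, not taken from the study:

```python
def lure_discrimination_index(responses):
    """p('similar' | lure) - p('similar' | foil).

    responses: (item_type, answer) pairs; item_type is 'old', 'lure'
    (similar-looking) or 'foil' (brand new); answer is the participant's
    'old' / 'similar' / 'new' call.  Subtracting the foil rate corrects
    for a general bias toward answering 'similar'.
    """
    def similar_rate(item_type):
        answers = [a for t, a in responses if t == item_type]
        return sum(a == "similar" for a in answers) / len(answers)

    return similar_rate("lure") - similar_rate("foil")

# Illustrative (invented) data: a participant who usually detects lures
# but rarely calls brand-new foils 'similar'.
responses = ([("lure", "similar")] * 7 + [("lure", "old")] * 3
             + [("foil", "new")] * 9 + [("foil", "similar")])
print(round(lure_discrimination_index(responses), 2))  # 0.6
```

A higher index means finer discrimination between old items and close look-alikes, which is the pattern-separation ability the study probed.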
Training to improve cognitive abilities in older people lasted to some degree 10 years after the training program was completed, according to results of a randomized clinical trial supported by the National Institutes of Health. The findings showed training gains for aspects of cognition involved in the ability to think and learn, but researchers said memory training did not have an effect after 10 years. The report, from the Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) study, appears in the January 2014 issue of the Journal of the American Geriatrics Society. The project was funded by the National Institute on Aging (NIA) and the National Institute of Nursing Research (NINR), components of the NIH. “Previous data from this clinical trial demonstrated that the effects of the training lasted for five years,” said NIA Director Richard J. Hodes, M.D. “Now, these longer term results indicate that particular types of cognitive training can provide a lasting benefit a decade later. They suggest that we should continue to pursue cognitive training as an intervention that might help maintain the mental abilities of older people so that they may remain independent and in the community.” “ACTIVE is an important example of intervention research aimed at enabling older people to maintain their cognitive abilities as they age,” said NINR Director Patricia Grady, Ph.D. “The average age of the individuals who have been followed over the last 10 years is now 82. Given our nation’s aging population, this type of research is an increasingly high priority.”
Ian Sample, science correspondent A cup or two of coffee could boost the brain's ability to store long-term memories, researchers in the US claim. People who had a shot of caffeine after looking at a series of pictures were better at distinguishing them from similar images in tests the next day, the scientists found. The task gives a measure of how precisely information is stored in the brain, which supports a process called pattern separation that can be crucial in everyday situations. If the effect is real, and some scientists are doubtful, then it would add memory enhancement to the growing list of benefits that moderate caffeine consumption seems to provide. Michael Yassa, a neuroscientist who led the study at Johns Hopkins University in Baltimore, said the ability to separate patterns was vital for discriminating between similar scenarios and experiences in life. "If you park in the same parking lot every day, the spot you choose can look the same as many others. But when you go and look for your car, you need to look for where you parked it today, not where you parked it yesterday," he said. Writing in the journal Nature Neuroscience, Yassa described how 44 volunteers who were not heavy caffeine consumers and had abstained for at least a day were shown a rapid sequence of pictures on a computer screen. The pictures included a huge range of items, such as a hammer, a chair, an apple, a seahorse, a rubber duck and a car. © 2014 Guardian News and Media Limited
by Helen Thomson A drug for perfect pitch is just the start: mastering new skills could become easy if we can restore the brain's youthful ability to create new circuits WANNABE maestros, listen up. A mood-stabilising drug can help you achieve perfect pitch – the ability to identify any note you hear without inferring it from a reference note. Since this is a skill that is usually acquired only early in life, the discovery is the first evidence that it may be possible to revert the human brain to a childlike state, enabling us to treat disorders and unlock skills that are difficult, if not impossible, to acquire beyond a certain age. From bilingualism to sporting prowess, many abilities rely on neural circuits that are laid down by our early experiences. Until the age of 7 or so, the brain goes through several "critical periods" during which it can be radically changed by the environment. During these times, the brain is said to have increased plasticity. In order to take advantage of these critical periods, the brain needs to be stimulated appropriately so it lays down the neuronal circuitry needed for a particular ability. For example, young children with poor sight in one eye may develop lazy eye, or amblyopia. It can be treated by covering the better eye, forcing the child to use the lazy eye – but this strategy only works during the critical period. These windows of opportunity are fleeting, but now researchers are beginning to understand what closes them and how they might be reopened. © Copyright Reed Business Information Ltd.
Oliver Burkeman What happens when you attach several electrodes to your forehead, connect them via wires to a nine-volt battery and resistor, ramp up the current and send an electrical charge directly into your brain? Most people would be content just to guess, but last summer a 33-year-old from Alabama named Anthony Lee decided to find out. "Here we go… oooahh, that stings a little!" he says, in one of the YouTube videos recording his exploits. "Whoa. That hurts… Ow!" The video cuts out. When Lee reappears, the electrodes are gone: "Something very strange happened," he says thoughtfully. "It felt like something popped." (In another video, he reports a sudden white flash in his visual field, which he describes, in a remarkably calm voice, as "cool".) You might conclude from this that Lee is a very foolish person, but the quest he's on is one that has occupied scientists, philosophers and fortune-hunters for centuries: to find some artificial way to improve upon the basic cognitive equipment we're born with, and thus become smarter and maintain mental sharpness into old age. "It started with Limitless," Lee told me – the 2011 film in which an author suffering from writer's block discovers a drug that can supercharge his faculties. "I figured, I'm a pretty average-intelligence guy, so I could use a little stimulation." The scientific establishment, it's fair to say, remains far from convinced that it's possible to enhance your brain's capacities in a lasting way – whether via electrical jolts, brain-training games, dietary supplements, drugs or anything else. But that hasn't impeded the growth of a huge industry – and thriving amateur subculture – of "neuro-enhancement", which, according to the American Psychological Association, is worth $1bn a year. "Brain fitness technology" has been projected to be worth up to $8bn in 2015 as baby boomers age. 
Anthony Lee belongs to the sub-subculture of DIY transcranial direct-current stimulation, or tDCS, whose members swap wiring diagrams and cautionary tales online, though if that makes you queasy, you can always pay £179 for Foc.us, a readymade tDCS headset that promises to "make your synapses fire faster" and "excite your prefrontal cortex", so that you can "get the edge in online gaming". © 2014 Guardian News and Media Limited
By GRETCHEN REYNOLDS African tribesmen walk through their landscape in a pattern that eerily echoes the movements of scavenging birds, flocking insects, gliding sharks and visitors to Disneyland, a new study finds, suggesting that aspects of how we choose to move around in our world are deeply hard-wired. For the new study, which appeared online recently in Proceedings of the National Academy of Sciences, researchers at the University of Arizona at Tucson, Yale University, the New York Consortium in Evolutionary Primatology and other institutions traveled to northern Tanzania to study the Hadza, who are among the last human hunter-gatherers on earth. The Hadza generally spend their days following game and foraging for side dishes and condiments such as desert tubers and honey, frequently walking and jogging for miles in the process. The ways in which creatures, including people, navigate their world is a topic of considerable scientific interest, but one that, until the advent of global positioning systems and similar tracking technology, was difficult to quantify. In the past decade, however, scientists have begun strapping GPS units to many varieties of animals and insects, from bumblebees to birds, and measuring how they move. What they have found is that when moving with a purpose such as foraging for food, many creatures follow a particular and shared pattern. They walk (or wing or lope) for a short time in one direction, scouring the ground for edibles, then turn and start moving in another direction for a short while, before turning and strolling or flying in another direction yet again. This is a useful strategy for finding tubers and such, but if maintained indefinitely brings creatures back to the same starting point over and over; they essentially move in circles. Copyright 2014 The New York Times Company
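The forage-turn-forage pattern described above is commonly modeled as a Lévy walk: straight bouts whose lengths follow a heavy-tailed power law, separated by random changes of heading. The simulation below is a hedged sketch; the exponent mu and all parameters are generic illustrations, not values fitted to the Hadza data:

```python
import math
import random

def levy_walk(n_steps, mu=2.0, seed=42):
    """Simulate a 2-D Levy walk: heavy-tailed bout lengths, random headings.

    Bout lengths l >= 1 are drawn from p(l) ~ l^(-mu) by inverse-transform
    sampling.  Between bouts the walker turns to a uniformly random heading,
    like the scan-and-turn foraging pattern described in the article.
    """
    rng = random.Random(seed)
    x = y = 0.0
    path = [(x, y)]
    for _ in range(n_steps):
        heading = rng.uniform(0.0, 2.0 * math.pi)            # new direction
        length = (1.0 - rng.random()) ** (1.0 / (1.0 - mu))  # heavy-tailed bout
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        path.append((x, y))
    return path

path = levy_walk(1000)
print(math.hypot(*path[-1]))  # net displacement from the origin
```

With mu near 2, occasional very long bouts dominate the walker's spread, which is what distinguishes a Lévy walk from ordinary Brownian wandering with a fixed step length.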
By JOHN MARKOFF PALO ALTO, Calif. — Computers have entered the age when they are able to learn from their own mistakes, a development that is about to turn the digital world on its head. The first commercial version of the new kind of computer chip is scheduled to be released in 2014. Not only can it automate tasks that now require painstaking programming — for example, moving a robot’s arm smoothly and efficiently — but it can also sidestep and even tolerate errors, potentially making the term “computer crash” obsolete. The new computing approach, already in use by some large technology companies, is based on the biological nervous system, specifically on how neurons react to stimuli and connect with other neurons to interpret information. It allows computers to absorb new information while carrying out a task, and adjust what they do based on the changing signals. In coming years, the approach will make possible a new generation of artificial intelligence systems that will perform some functions that humans do with ease: see, speak, listen, navigate, manipulate and control. That could have enormous consequences for tasks like facial and speech recognition, navigation and planning, which are still in elementary stages and rely heavily on human programming. Designers say the computing style can clear the way for robots that can safely walk and drive in the physical world, though a thinking or conscious computer, a staple of science fiction, is still far off on the digital horizon. “We’re moving from engineering computing systems to something that has many of the characteristics of biological computing,” said Larry Smarr, an astrophysicist who directs the California Institute for Telecommunications and Information Technology, one of many research centers devoted to developing these new kinds of computer circuits. © 2013 The New York Times Company
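The learning style described here, units that react to inputs and adjust their connections from experience rather than from explicit programming, can be caricatured in ordinary software as a single trainable threshold unit. This generic perceptron sketch is only an analogy for that flavor of learning; it is not the programming model of any actual neuromorphic chip:

```python
def train_perceptron(samples, n_inputs, lr=0.1, epochs=20):
    """Train a single threshold unit with a local learning rule.

    Each weight changes based only on its own input and the output
    error -- no central program dictates the final weights.
    samples: list of (inputs, target) pairs with target 0 or 1.
    """
    w = [0.0] * n_inputs
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            fired = 1 if sum(wi * xi for wi, xi in zip(w, inputs)) + bias > 0 else 0
            error = target - fired
            w = [wi + lr * error * xi for wi, xi in zip(w, inputs)]
            bias += lr * error
    return w, bias

# The unit learns logical OR from examples instead of explicit programming.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, bias = train_perceptron(data, 2)
predictions = [1 if sum(wi * xi for wi, xi in zip(w, x)) + bias > 0 else 0
               for x, _ in data]
print(predictions)  # [0, 1, 1, 1]
```

The appeal the article describes is exactly this: behavior emerges from exposure to examples, and a wrong example merely nudges the weights rather than crashing a hand-written program.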
Tomas Jivanda Being pulled into the world of a gripping novel can trigger actual, measurable changes in the brain that linger for at least five days after reading, scientists have said. The new research, carried out at Emory University in the US, found that reading a good book may cause heightened connectivity in the brain and neurological changes that persist in a similar way to muscle memory. The changes were registered in the left temporal cortex, an area of the brain associated with receptivity for language, as well as the primary sensory motor region of the brain. Neurons of this region have been associated with tricking the mind into thinking it is doing something it is not, a phenomenon known as grounded cognition – for example, just thinking about running can activate the neurons associated with the physical act of running. “The neural changes that we found associated with physical sensation and movement systems suggest that reading a novel can transport you into the body of the protagonist,” said neuroscientist Professor Gregory Berns, lead author of the study. “We already knew that good stories can put you in someone else’s shoes in a figurative sense. Now we’re seeing that something may also be happening biologically.” Twenty-one students took part in the study, with all participants reading the same book – Pompeii, a 2003 thriller by Robert Harris, which was chosen for its page-turning plot. “The story follows a protagonist, who is outside the city of Pompeii and notices steam and strange things happening around the volcano,” said Prof Berns. “It depicts true events in a fictional and dramatic way. It was important to us that the book had a strong narrative line.” © independent.co.uk
Helen Shen The ability to erase memory may jump from the realm of film fantasy (such as Eternal Sunshine of the Spotless Mind) to reality. In the film Eternal Sunshine of the Spotless Mind, unhappy lovers undergo an experimental brain treatment to erase all memories of each other from their minds. No such fix exists for real-life couples, but researchers report today in Nature Neuroscience that a targeted medical intervention helps to reduce specific negative memories in patients who are depressed. "This is one time I would say that science is better than art," says Karim Nader, a neuroscientist at McGill University in Montreal, Canada, who was not involved in the research. "It's a very clever study." The technique, called electroconvulsive therapy (ECT) or electroshock therapy, induces seizures by passing current into the brain through electrode pads placed on the scalp. Despite its sometimes negative reputation, ECT is an effective last-resort treatment for severe depression, and is used today in combination with anaesthesia and muscle relaxants. Marijn Kroes, a neuroscientist at Radboud University Nijmegen in the Netherlands, and his colleagues found that by strategically timing ECT bursts, they could target and disrupt patients' memory of a disturbing episode. A matter of time The strategy relies on a theory called memory reconsolidation, which proposes that memories are taken out of 'mental storage' each time they are accessed and 're-written' over time back onto the brain's circuits. Results from animal studies and limited evidence in humans suggest that during reconsolidation, memories are vulnerable to alteration or even erasure. © 2013 Nature Publishing Group
Don’t worry about watching all those cat videos on the Internet. You’re not wasting time when you are at your computer—you’re honing your fine-motor skills. A study of people’s ability to translate training that involves clicking and twiddling a computer mouse reveals that the brain can apply that expertise to other fine-motor tasks requiring the hands. We know that computers are altering the way that people think. For example, using the Internet changes the way that you remember information. But what about use of the computer itself? You probably got to this story by using a computer mouse, for example, and that is a bizarre task compared with the activities that we’ve encountered in our evolutionary history. You made tiny movements of your hand in a horizontal plane to cause tiny movements of a cursor in a completely disconnected vertical plane. But with daily practice—the average computer user makes more than 1000 mouse clicks per day—you have become such an expert that you don’t even think about this amazing feat of dexterity. Scientists would love to know if that practice affects other aspects of your brain’s control of your body. The problem is finding people with no computer experience. So Konrad Kording, a psychologist at Northwestern University’s Rehabilitation Institute of Chicago in Illinois, and his former postdoc Kunlin Wei, now at Peking University in Beijing, turned to migrant Chinese workers. The country’s vast population covers the whole socioeconomic spectrum, from elite computer hackers to agricultural laborers whose lifestyles have changed little over the past century. The country’s economic boom is bringing people in waves from the countryside to cities in search of employment. © 2013 American Association for the Advancement of Science
Keyword: Learning & Memory
Link ID: 19060 - Posted: 12.21.2013
By Felicity Muth This might seem perplexing to some, but I’ve just spent two days listening to talks and meeting with people who all work on social insects. And it was great. I was at Royal Holloway, University of London, where the IUSSI meeting was taking place. The IUSSI is the ‘International Union for the Study of Social Insects’, although they seem to let people in who work on social spiders too (a nice inclusive attitude if you ask me). This meeting was specifically for researchers who are in the UK and North-West Europe, of which there are a surprisingly large number. The talks were really good, sharing a lot of the recent research that’s happened using social insects, and I thought I’d share my highlight of the first day’s events here. One of my favourite talks from the first day was from Elli Leadbeater who spoke about work carried out primarily by Erika Dawson. I’ve written before about ‘social learning’ in monkeys and whales, where one animal can learn something from observing another animal, normally of the same species. Dawson and her colleagues were looking specifically at whether there is actually anything ‘social’ about ‘social learning’, or whether it can be explained with the same mechanism as other types of learning. In the simplest form of learning, associative learning, an animal learns to associate a particular stimulus (for example a particular colour, smell or sound) with a reward (usually food). The classic example of this was Pavlov’s dogs, who learned to associate the sound of a metronome with food. When Pavlov then sounded the metronome, the dogs salivated even when there was no food present. © 2013 Scientific American
Keyword: Learning & Memory
Link ID: 19058 - Posted: 12.21.2013
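The associative learning described in the talk above, and in Pavlov's metronome experiment, is classically formalized by the Rescorla-Wagner model: on each trial the learned association strength moves toward the outcome by a fraction of the prediction error. A minimal sketch with illustrative parameter values (the learning rate and trial counts are assumptions, not drawn from any study above):

```python
def rescorla_wagner(trials, alpha=0.3, v0=0.0):
    """Rescorla-Wagner learning: association strength V moves toward
    each trial's outcome by a fraction (alpha) of the prediction error.

    trials: sequence of outcomes, 1 = stimulus paired with food,
    0 = stimulus presented alone.  Returns V after each trial.
    """
    v = v0
    history = []
    for outcome in trials:
        v += alpha * (outcome - v)   # prediction-error update
        history.append(v)
    return history

# Acquisition: metronome paired with food -- expectation climbs toward 1.
acq = rescorla_wagner([1] * 10)
# Extinction: metronome now sounds alone -- the learned expectation decays,
# mirroring how a dog would eventually stop salivating.
ext = rescorla_wagner([0] * 5, v0=acq[-1])
```

The same prediction-error mechanism is what the Dawson work probes: if "social" cues act simply as another conditioned stimulus, no separate social-learning machinery is needed to explain the behavior.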