Chapter 17. Learning and Memory
The benefits of training to improve cognitive abilities in older people persisted to some degree 10 years after the training program was completed, according to results of a randomized clinical trial supported by the National Institutes of Health. The findings showed training gains for aspects of cognition involved in the ability to think and learn, but researchers said memory training did not have an effect after 10 years. The report, from the Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) study, appears in the January 2014 issue of the Journal of the American Geriatrics Society. The project was funded by the National Institute on Aging (NIA) and the National Institute of Nursing Research (NINR), components of the NIH. “Previous data from this clinical trial demonstrated that the effects of the training lasted for five years,” said NIA Director Richard J. Hodes, M.D. “Now, these longer term results indicate that particular types of cognitive training can provide a lasting benefit a decade later. They suggest that we should continue to pursue cognitive training as an intervention that might help maintain the mental abilities of older people so that they may remain independent and in the community.” “ACTIVE is an important example of intervention research aimed at enabling older people to maintain their cognitive abilities as they age,” said NINR Director Patricia Grady, Ph.D. “The average age of the individuals who have been followed over the last 10 years is now 82. Given our nation’s aging population, this type of research is an increasingly high priority.”
Ian Sample, science correspondent A cup or two of coffee could boost the brain's ability to store long-term memories, researchers in the US claim. People who had a shot of caffeine after looking at a series of pictures were better at distinguishing them from similar images in tests the next day, the scientists found. The task gives a measure of how precisely information is stored in the brain, reflecting a process called pattern separation that can be crucial in everyday situations. If the effect is real, and some scientists are doubtful, then it would add memory enhancement to the growing list of benefits that moderate caffeine consumption seems to provide. Michael Yassa, a neuroscientist who led the study at Johns Hopkins University in Baltimore, said the ability to separate patterns was vital for discriminating between similar scenarios and experiences in life. "If you park in the same parking lot every day, the spot you choose can look the same as many others. But when you go and look for your car, you need to look for where you parked it today, not where you parked it yesterday," he said. Writing in the journal Nature Neuroscience, Yassa described how 44 volunteers who were not heavy caffeine consumers and had abstained for at least a day were shown a rapid sequence of pictures on a computer screen. The pictures included a huge range of items, such as a hammer, a chair, an apple, a seahorse, a rubber duck and a car. © 2014 Guardian News and Media Limited
by Helen Thomson A drug for perfect pitch is just the start: mastering new skills could become easy if we can restore the brain's youthful ability to create new circuits WANNABE maestros, listen up. A mood-stabilising drug can help you achieve perfect pitch – the ability to identify any note you hear without inferring it from a reference note. Since this is a skill that is usually acquired only early in life, the discovery is the first evidence that it may be possible to revert the human brain to a childlike state, enabling us to treat disorders and unlock skills that are difficult, if not impossible, to acquire beyond a certain age. From bilingualism to sporting prowess, many abilities rely on neural circuits that are laid down by our early experiences. Until the age of 7 or so, the brain goes through several "critical periods" during which it can be radically changed by the environment. During these times, the brain is said to have increased plasticity. In order to take advantage of these critical periods, the brain needs to be stimulated appropriately so it lays down the neuronal circuitry needed for a particular ability. For example, young children with poor sight in one eye may develop lazy eye, or amblyopia. It can be treated by covering the better eye, forcing the child to use the lazy eye – but this strategy only works during the critical period. These windows of opportunity are fleeting, but now researchers are beginning to understand what closes them and how they might be reopened. © Copyright Reed Business Information Ltd.
Oliver Burkeman What happens when you attach several electrodes to your forehead, connect them via wires to a nine-volt battery and resistor, ramp up the current and send an electrical charge directly into your brain? Most people would be content just to guess, but last summer a 33-year-old from Alabama named Anthony Lee decided to find out. "Here we go… oooahh, that stings a little!" he says, in one of the YouTube videos recording his exploits. "Whoa. That hurts… Ow!" The video cuts out. When Lee reappears, the electrodes are gone: "Something very strange happened," he says thoughtfully. "It felt like something popped." (In another video, he reports a sudden white flash in his visual field, which he describes, in a remarkably calm voice, as "cool".) You might conclude from this that Lee is a very foolish person, but the quest he's on is one that has occupied scientists, philosophers and fortune-hunters for centuries: to find some artificial way to improve upon the basic cognitive equipment we're born with, and thus become smarter and maintain mental sharpness into old age. "It started with Limitless," Lee told me – the 2011 film in which an author suffering from writer's block discovers a drug that can supercharge his faculties. "I figured, I'm a pretty average-intelligence guy, so I could use a little stimulation." The scientific establishment, it's fair to say, remains far from convinced that it's possible to enhance your brain's capacities in a lasting way – whether via electrical jolts, brain-training games, dietary supplements, drugs or anything else. But that hasn't impeded the growth of a huge industry – and thriving amateur subculture – of "neuro-enhancement", which, according to the American Psychological Association, is worth $1bn a year. "Brain fitness technology" has been projected to be worth up to $8bn in 2015 as baby boomers age. 
Anthony Lee belongs to the sub-subculture of DIY transcranial direct-current stimulation, or tDCS, whose members swap wiring diagrams and cautionary tales online, though if that makes you queasy, you can always pay £179 for Foc.us, a readymade tDCS headset that promises to "make your synapses fire faster" and "excite your prefrontal cortex", so that you can "get the edge in online gaming". © 2014 Guardian News and Media Limited
By GRETCHEN REYNOLDS African tribesmen walk through their landscape in a pattern that eerily echoes the movements of scavenging birds, flocking insects, gliding sharks and visitors to Disneyland, a new study finds, suggesting that aspects of how we choose to move around in our world are deeply hard-wired. For the new study, which appeared online recently in Proceedings of the National Academy of Sciences, researchers at the University of Arizona at Tucson, Yale University, the New York Consortium in Evolutionary Primatology and other institutions traveled to northern Tanzania to study the Hadza, who are among the last human hunter-gatherers on earth. The Hadza generally spend their days following game and foraging for side dishes and condiments such as desert tubers and honey, frequently walking and jogging for miles in the process. How creatures, including people, navigate their world is a topic of considerable scientific interest, but one that, until the advent of global positioning systems and similar tracking technology, was difficult to quantify. In the past decade, however, scientists have begun strapping GPS units to many varieties of animals and insects, from bumblebees to birds, and measuring how they move. What they have found is that when moving with a purpose such as foraging for food, many creatures follow a particular and shared pattern. They walk (or wing or lope) for a short time in one direction, scouring the ground for edibles, then turn and start moving in another direction for a short while, before turning and strolling or flying in another direction yet again. This is a useful strategy for finding tubers and such, but if maintained indefinitely brings creatures back to the same starting point over and over; they essentially move in circles. Copyright 2014 The New York Times Company
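The short-bout, random-turn foraging pattern described above is commonly modeled as a Lévy walk: straight bouts whose lengths follow a heavy-tailed power law, separated by random changes of direction. A minimal illustrative sketch follows; the exponent `mu`, the minimum bout length, and the sampling scheme are textbook defaults, not values fitted by the study.

```python
import math
import random

def levy_walk(n_steps, mu=2.0, seed=0):
    """Simulate a 2-D Levy-like walk: each bout heads in a fresh random
    direction for a length drawn from a heavy-tailed power law p(l) ~ l**-mu."""
    rng = random.Random(seed)
    x = y = 0.0
    path = [(x, y)]
    for _ in range(n_steps):
        heading = rng.uniform(0.0, 2.0 * math.pi)            # random turn
        # inverse-transform sample from p(l) ~ l**-mu, with l >= 1
        length = (1.0 - rng.random()) ** (-1.0 / (mu - 1.0))
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        path.append((x, y))
    return path

path = levy_walk(1000)   # mostly short bouts, with occasional long relocations
```

The heavy tail is the point of the model: most bouts are short and local, which is efficient for searching a patch, while rare long bouts relocate the searcher and prevent it from circling the same starting point indefinitely.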
By JOHN MARKOFF PALO ALTO, Calif. — Computers have entered the age when they are able to learn from their own mistakes, a development that is about to turn the digital world on its head. The first commercial version of the new kind of computer chip is scheduled to be released in 2014. Not only can it automate tasks that now require painstaking programming — for example, moving a robot’s arm smoothly and efficiently — but it can also sidestep and even tolerate errors, potentially making the term “computer crash” obsolete. The new computing approach, already in use by some large technology companies, is based on the biological nervous system, specifically on how neurons react to stimuli and connect with other neurons to interpret information. It allows computers to absorb new information while carrying out a task, and adjust what they do based on the changing signals. In coming years, the approach will make possible a new generation of artificial intelligence systems that will perform some functions that humans do with ease: see, speak, listen, navigate, manipulate and control. That can hold enormous consequences for tasks like facial and speech recognition, navigation and planning, which are still in elementary stages and rely heavily on human programming. Designers say the computing style can clear the way for robots that can safely walk and drive in the physical world, though a thinking or conscious computer, a staple of science fiction, is still far off on the digital horizon. “We’re moving from engineering computing systems to something that has many of the characteristics of biological computing,” said Larry Smarr, an astrophysicist who directs the California Institute for Telecommunications and Information Technology, one of many research centers devoted to developing these new kinds of computer circuits. © 2013 The New York Times Company
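Chips like the ones described above are modeled loosely on how neurons accumulate input and fire. As a rough illustration of that underlying idea, and not of any vendor's actual circuit design, here is the textbook leaky integrate-and-fire neuron: a unit whose potential decays over time, accumulates input, and emits a spike (then resets) when it crosses a threshold.

```python
def lif_spikes(inputs, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire: the membrane potential decays by `leak`
    each step, accumulates the input current, and emits a spike (then
    resets to zero) whenever it reaches `threshold`."""
    v = 0.0
    spikes = []
    for current in inputs:
        v = leak * v + current     # leaky integration of the input
        if v >= threshold:
            spikes.append(1)       # spike
            v = 0.0                # reset after firing
        else:
            spikes.append(0)
    return spikes

# A constant weak input makes the neuron fire periodically rather than once.
print(lif_spikes([0.3] * 10))      # prints [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

Information in such systems is carried by the timing and rate of spikes rather than by explicit stored values, which is why these architectures can adjust what they do as input signals change.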
Tomas Jivanda Being pulled into the world of a gripping novel can trigger actual, measurable changes in the brain that linger for at least five days after reading, scientists have said. The new research, carried out at Emory University in the US, found that reading a good book may cause heightened connectivity in the brain and neurological changes that persist in a similar way to muscle memory. The changes were registered in the left temporal cortex, an area of the brain associated with receptivity for language, as well as the primary sensory motor region of the brain. Neurons in this region have been associated with making the mind feel as though it is doing something it is not, a phenomenon known as grounded cognition; for example, just thinking about running can activate the neurons associated with the physical act of running. “The neural changes that we found associated with physical sensation and movement systems suggest that reading a novel can transport you into the body of the protagonist,” said neuroscientist Professor Gregory Berns, lead author of the study. “We already knew that good stories can put you in someone else’s shoes in a figurative sense. Now we’re seeing that something may also be happening biologically.” Twenty-one students took part in the study, with all participants reading the same book - Pompeii, a 2003 thriller by Robert Harris, which was chosen for its page-turning plot. “The story follows a protagonist, who is outside the city of Pompeii and notices steam and strange things happening around the volcano,” said Prof Berns. “It depicts true events in a fictional and dramatic way. It was important to us that the book had a strong narrative line.” © independent.co.uk
Helen Shen The ability to erase memory may jump from the realm of film fantasy (such as Eternal Sunshine of the Spotless Mind, shown here) to reality. In the film Eternal Sunshine of the Spotless Mind, unhappy lovers undergo an experimental brain treatment to erase all memories of each other from their minds. No such fix exists for real-life couples, but researchers report today in Nature Neuroscience that a targeted medical intervention helps to reduce specific negative memories in patients who are depressed [1]. "This is one time I would say that science is better than art," says Karim Nader, a neuroscientist at McGill University in Montreal, Canada, who was not involved in the research. "It's a very clever study." The technique, called electroconvulsive therapy (ECT) or electroshock therapy, induces seizures by passing current into the brain through electrode pads placed on the scalp. Despite its sometimes negative reputation, ECT is an effective last-resort treatment for severe depression, and is used today in combination with anaesthesia and muscle relaxants. Marijn Kroes, a neuroscientist at Radboud University Nijmegen in the Netherlands, and his colleagues found that by strategically timing ECT bursts, they could target and disrupt patients' memory of a disturbing episode. A matter of time The strategy relies on a theory called memory reconsolidation, which proposes that memories are taken out of 'mental storage' each time they are accessed and 're-written' over time back onto the brain's circuits. Results from animal studies and limited evidence in humans suggest that during reconsolidation, memories are vulnerable to alteration or even erasure [2–4]. © 2013 Nature Publishing Group
Don’t worry about watching all those cat videos on the Internet. You’re not wasting time when you are at your computer—you’re honing your fine-motor skills. A study of people’s ability to translate training that involves clicking and twiddling a computer mouse reveals that the brain can apply that expertise to other fine-motor tasks requiring the hands. We know that computers are altering the way that people think. For example, using the Internet changes the way that you remember information. But what about use of the computer itself? You probably got to this story by using a computer mouse, for example, and that is a bizarre task compared with the activities that we’ve encountered in our evolutionary history. You made tiny movements of your hand in a horizontal plane to cause tiny movements of a cursor in a completely disconnected vertical plane. But with daily practice—the average computer user makes more than 1000 mouse clicks per day—you have become such an expert that you don’t even think about this amazing feat of dexterity. Scientists would love to know if that practice affects other aspects of your brain’s control of your body. The problem is finding people with no computer experience. So Konrad Kording, a psychologist at Northwestern University’s Rehabilitation Institute of Chicago in Illinois, and his former postdoc Kunlin Wei, now at Peking University in Beijing, turned to migrant Chinese workers. The country’s vast population covers the whole socioeconomic spectrum, from elite computer hackers to agricultural laborers whose lifestyles have changed little over the past century. The country’s economic boom is bringing people in waves from the countryside to cities in search of employment. © 2013 American Association for the Advancement of Science
Keyword: Learning & Memory
Link ID: 19060 - Posted: 12.21.2013
By Felicity Muth This might seem perplexing to some, but I’ve just spent two days listening to talks and meeting with people who all work on social insects. And it was great. I was at Royal Holloway, University of London, where the IUSSI meeting was taking place. The IUSSI is the ‘International Union for the Study of Social Insects’, although they seem to let people in who work on social spiders too (a nice inclusive attitude if you ask me). This meeting was specifically for researchers who are in the UK and North-West Europe, of which there are a surprisingly large number. The talks were really good, sharing a lot of the recent research that’s happened using social insects, and I thought I’d share my highlight of the first day’s events here. One of my favourite talks from the first day was from Elli Leadbeater, who spoke about work carried out primarily by Erika Dawson. I’ve written before about ‘social learning’ in monkeys and whales, where one animal can learn something from observing another animal, normally of the same species. Dawson and her colleagues were looking specifically at whether there is actually anything ‘social’ about ‘social learning’, or whether it can be explained with the same mechanism as other types of learning. In the simplest form of learning, associative learning, an animal learns to associate a particular stimulus (for example a particular colour, smell or sound) with a reward (usually food). The classic example of this was Pavlov’s dogs, who learned to associate the sound of a metronome with food. When Pavlov then sounded the metronome, the dogs salivated even when there was no food present. © 2013 Scientific American
Keyword: Learning & Memory
Link ID: 19058 - Posted: 12.21.2013
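The associative learning described above is often formalized with an error-driven update such as the Rescorla-Wagner rule, in which associative strength grows in proportion to the gap between the reward received and the reward the animal already predicts. A minimal sketch follows; the learning rate `alpha` is an arbitrary illustrative value, and this is a generic textbook model, not the one used in the research discussed in the talk.

```python
def train_association(trials, alpha=0.3):
    """Rescorla-Wagner-style learning: associative strength V moves toward
    the reward received on each trial by a fraction `alpha` of the
    prediction error (reward - V)."""
    v = 0.0
    history = []
    for reward in trials:
        v += alpha * (reward - v)   # learn from the prediction error
        history.append(v)
    return history

# Pairing a cue with reward on every trial: V climbs toward 1 and levels off,
# mirroring how Pavlov's dogs came to expect food at the metronome's sound.
curve = train_association([1.0] * 10)
```

Because learning is driven by the prediction error rather than the reward itself, the curve rises steeply at first and flattens as the cue becomes fully predictive, which is the classic shape of an acquisition curve.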
Skepticism about repressed traumatic memories has increased over time, but new research shows that psychology researchers and practitioners still tend to hold different beliefs about whether such memories occur and whether they can be accurately retrieved. The findings are published in Psychological Science, a journal of the Association for Psychological Science. “Whether repressed memories are accurate or not, and whether they should be pursued by therapists, or not, is probably the single most practically important topic in clinical psychology since the days of Freud and the hypnotists who came before him,” says researcher Lawrence Patihis of the University of California, Irvine. According to Patihis, the new findings suggest that there remains a “serious split in the field of psychology” in beliefs about how memory works. Controversy surrounding repressed memory – sometimes referred to as the “memory wars” – came to a head in the 1990s. While some believed that traumatic memories could be repressed for years only to be recovered later in therapy, others questioned the concept, noting the lack of scientific evidence in support of repressed memory. Spurred by impressions that both researchers and clinicians believed the debate had been resolved, Patihis and colleagues wanted to investigate whether and how beliefs about memory may have changed since the 1990s. To find out, the researchers recruited practicing clinicians and psychotherapists, research psychologists, and alternative therapists to complete an online survey. © Association for Psychological Science
Taking some heartburn medications for more than two years is linked to a higher risk of vitamin B12 deficiency in adults, a U.S. study suggests. Left untreated, vitamin B12 deficiency can lead to dementia, neurological damage, anemia, and other complications. Knowing that stomach acid aids in vitamin B12 absorption, researchers set out to test whether suppressing the acids can lead to vitamin deficiency. The drugs in question are known as proton pump inhibitors and they include such well known brands as Losec, Nexium, Prevacid and Pariet. Doses of more than 1.5 pills per day were more strongly associated with vitamin B12 deficiency than doses of less than 0.75 pills per day, Dr. Douglas Corley, a gastroenterologist and research scientist with the Kaiser Permanente Division of Research in Oakland, Calif., and his co-authors said in Wednesday's issue of the Journal of the American Medical Association. "This research raises the question of whether people who are taking acid-suppressing medications long term should be screened for vitamin B12 deficiency," Corley said in a release. "It's a relatively simple blood test, and vitamin supplements are an effective way of managing the vitamin deficiency, if it is found." For the study, researchers looked at electronic health records of 25,956 adults diagnosed with vitamin B12 deficiency in Northern California between January 1997 and June 2011, and compared them with 184,199 patients without B12 deficiency during the same time period. Among the 25,956 patients who had vitamin B12 deficiency, 12 per cent used proton pump inhibitors for at least two years, compared with 7.2 per cent of those in the control group. © CBC 2013
by Laura Sanders Last Sunday, the Giants battled the Redskins in our living room, and there was no bigger fan than 9-month-old Baby V. Unlike her father, she was not interested in RG3’s shortcomings. The tiny, colorful guys running around on a bright green field, the psychedelic special effects and the bursts of noise drew her in like a moth to a 42-inch high-definition flame. My friends with kids have noticed the same screen fascination in their little ones. Like adults, kids love colorful, shiny, moving screens. The problem, of course, is that watching TV probably isn’t the best way for little kids to spend their time. Long bouts in front of the tube have been linked to obesity, weaker attention spans and aggression in kids. Now, a new study of Japanese children has linked TV time with changes in the growing brain, effects that have been harder to spot. And the more television a kid watches, the more profound the brain differences, scientists report November 20 in Cerebral Cortex. Researchers studied kids between ages 5 and 18 who watched between zero and four hours of television a day. On average, the kids watched TV for about two hours a day. Brain scans revealed that the more television a kid watched, the larger certain parts of the brain were. Gray matter volume was higher in regions toward the front and side of the head in kids who watched a lot of TV. Say that again? Watching television boosts brain volume? Before you rejoice and fire up Season 1 of Breaking Bad, keep in mind: Bigger isn’t always better. In this case, higher brain volume in these kids was associated with a lower verbal IQ. Study coauthor Hikaru Takeuchi of Tohoku University in Japan says that these brain areas need to be pruned during childhood to operate efficiently. © Society for Science & the Public 2000 - 2013.
by Bob Holmes Dying cells may play only a small role in the brain decline that accompanies ageing. That is the suggestion from the first computer simulation of brain function that can solve intelligence tests almost as well as university undergraduates. The model promises to reveal how our brains and behaviour are affected by age, and might even offer a way of testing drugs that compensate for cognitive decline. Psychologists have known for many years that our ability to think through novel problems – our "fluid intelligence" – gradually declines with age. However, the reasons for this decline aren't clear, because many features of the brain change as we age: neurons die; connections become sparser between regions of the brain and between individual brain cells; and our mental representation of different concepts becomes less distinct, among other changes. So far, psychologists have been unable to tease apart these possible explanations for cognitive decline. Enter Chris Eliasmith, a theoretical neuroscientist at the University of Waterloo in Ontario, Canada, and his student Daniel Rasmussen. The pair used a computer to simulate the behaviour of about 35,000 individual brain cells wired together in a biologically realistic way. Just like a real brain, their model encoded information as a pattern of electrical activity in particular sets of cells. The researchers set up the system to solve a widely used intelligence test known as Raven's Progressive Matrices, which involves predicting what abstract symbol comes next in a sequence. © Copyright Reed Business Information Ltd.
Ewen Callaway Certain fears can be inherited through the generations, a provocative study of mice reports [1]. The authors suggest that a similar phenomenon could influence anxiety and addiction in humans. But some researchers are sceptical of the findings because a biological mechanism that explains the phenomenon has not been identified. According to convention, the genetic sequences contained in DNA are the only way to transmit biological information across generations. Random DNA mutations, when beneficial, enable organisms to adapt to changing conditions, but this process typically occurs slowly over many generations. Yet some studies have hinted that environmental factors can influence biology more rapidly through 'epigenetic' modifications, which alter the expression of genes, but not their actual nucleotide sequence. For instance, children who were conceived during a harsh wartime famine in the Netherlands in the 1940s are at increased risk of diabetes, heart disease and other conditions — possibly because of epigenetic alterations to genes involved in these diseases [2]. Yet although epigenetic modifications are known to be important for processes such as development and the inactivation of one copy of the X chromosome in females, their role in the inheritance of behaviour is still controversial. Kerry Ressler, a neurobiologist and psychiatrist at Emory University in Atlanta, Georgia, and a co-author of the latest study, became interested in epigenetic inheritance after working with poor people living in inner cities, where cycles of drug addiction, neuropsychiatric illness and other problems often seem to recur in parents and their children. “There are a lot of anecdotes to suggest that there’s intergenerational transfer of risk, and that it’s hard to break that cycle,” he says. © 2013 Nature Publishing Group
By Tanya Lewis To understand the human brain, scientists must start small, and what better place than the mind of a worm? The roundworm Caenorhabditis elegans is one of biology's most widely studied organisms, and it's the first to have the complete wiring diagram, or connectome, of its nervous system mapped out. Knowing the structure of the animal's connectome will help explain its behavior, and could lead to insights about the brains of other organisms, scientists say. "You can't understand the brain without understanding the connectome," Scott Emmons, a molecular geneticist at Albert Einstein College of Medicine of Yeshiva University in New York, said in a talk earlier this month at the annual meeting of the Society for Neuroscience in San Diego. In 1963, South African biologist Sydney Brenner of the University of Cambridge decided to use C. elegans as a model organism for developmental biology. He chose the roundworm because it has a simple nervous system, it's easy to grow in a lab and its genetics are relatively straightforward. C. elegans was the first multicellular organism to have its genome sequenced, in 1998. Brenner knew that to understand how genes affect behavior, "you would have to know the structure of the nervous system," Emmons told LiveScience.
by Bethany Brookshire Most people take it as a given that distraction is bad for — oh, hey, a squirrel! Where was I? … Right. Most people take it as a given that distraction is bad for memory. And most of the time, it is. But under certain conditions, the right kind of distraction might actually help you remember. Nathan Cashdollar of University College London and colleagues were looking at the effects of distraction on memory in memory-impaired patients. They were specifically looking at distractions that were totally off-topic from a particular task, and how those distractions affected memory performance. Their results were published November 27 in the Journal of Neuroscience. The researchers worked with a small group of people with severe epilepsy who had lesions in the hippocampus, and therefore had memory problems. They compared them to groups of people with epilepsy without lesions, young healthy people, and older healthy people who were matched to the epilepsy group. Each of the participants went through a memory task called “delayed match-to-sample.” For this task, participants are given a set of samples or pictures, usually things like nature scenes. Then there’s a delay, from one second at the beginning of the test on up to nearly a minute. Then participants are shown another nature scene. Is it one they have seen before? Yes or no? The task starts out simply, with only one nature scene to match, but soon becomes harder, with up to five pictures to remember, and a five-second delay. People with memory impairments did a lot worse when they had more items to remember (called high cognitive load), falling off very steeply in their performance. Normal controls did better, still remaining fairly accurate, but making mistakes once in a while. © Society for Science & the Public 2000 - 2013.
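The delayed match-to-sample procedure described above is simple to sketch in code: study a set of items, wait out a delay, then judge whether a probe was among them. The function below generates one such trial at a given memory load; the name `delayed_match_trial` and the scene labels are illustrative conveniences, not materials from the study.

```python
import random

def delayed_match_trial(item_pool, load, rng):
    """One delayed match-to-sample trial: draw `load` study items plus one
    unstudied lure, then probe with either a studied item (match) or the
    lure (non-match), each with probability 0.5."""
    items = rng.sample(item_pool, load + 1)      # distinct items, no repeats
    studied, lure = items[:load], items[load]
    if rng.random() < 0.5:
        probe, is_match = rng.choice(studied), True
    else:
        probe, is_match = lure, False
    return studied, probe, is_match

rng = random.Random(1)
pool = [f"scene_{i}" for i in range(50)]
studied, probe, is_match = delayed_match_trial(pool, load=5, rng=rng)
```

Raising `load` from 1 toward 5 reproduces the manipulation in the study: the participant's answer to "seen before, yes or no?" must be checked against more and more studied scenes, which is where the memory-impaired group's performance fell off steeply.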
Medical marijuana can alleviate pain and nausea, but it can also cause decreased attention span and memory loss. A new study in mice finds that taking an over-the-counter pain medication like ibuprofen may help curb these side effects. "This is what we call a seminal paper," says Giovanni Marsicano, a neuroscientist at the University of Bordeaux in France who was not involved in the work. If the results hold true in humans, they "could broaden the medical use of marijuana," he says. "Many people in clinical trials are dropping out from treatments, because they say, ‘I cannot work anymore. I am stoned all the time.’ ” People have used marijuana for hundreds of years to treat conditions such as chronic pain, multiple sclerosis, and epilepsy. Studies in mice have shown that it can reduce some of the neural damage seen in Alzheimer's disease. The main psychoactive ingredient, tetrahydrocannabinol (THC), is approved by the Food and Drug Administration to treat anorexia in AIDS patients and the nausea triggered by chemotherapy. Although recreational drug users usually smoke marijuana, patients prescribed THC take it as capsules. Many people find the side effects hard to bear, however. The exact cause of these side effects is unclear. In the brain, THC binds to receptors called CB1 and CB2, which are involved in neural development as well as pain perception and appetite. The receptors are normally activated by similar compounds, called endocannabinoids, that are produced by the human body. When one of these compounds binds to CB1, it suppresses the activity of an enzyme called cyclooxygenase-2 (COX-2). The enzyme has many functions. For instance, painkillers such as ibuprofen and aspirin work by blocking COX-2. Researchers have hypothesized that the suppression of COX-2 could be the cause of THC's side effects, such as memory problems. © 2013 American Association for the Advancement of Science
By BENEDICT CAREY Grading college students on quizzes given at the beginning of every class, rather than on midterms or a final exam, increases both attendance and overall performance, scientists reported Wednesday. The findings — from an experiment in which 901 students in a popular introduction to psychology course at the University of Texas took their laptops to class and were quizzed online — demonstrate that the computers can act as an aid to teaching, not just a distraction. Moreover, the study is the latest to show how tests can be used to enhance learning as well as measure it. The report, appearing in the journal PLoS One, found that this “testing effect” was particularly strong in students from lower-income households. Psychologists have known for almost a century that altering the timing of tests can affect performance. In the past decade, they have shown that taking a test — say, writing down all you can remember from a studied prose passage — can deepen the memory of that passage better than further study. The new findings stand as a large-scale prototype for how such testing effects can be exploited in the digital era, experts said, though they cautioned that it was not yet clear how widely they could be applied. “This study is important because it introduces a new method to implement frequent quizzing with feedback in large classrooms, which can be difficult to do,” said Jeffrey D. Karpicke, a professor of psychology at Purdue, who was not involved in the study. He added, “This is the first large study to show that classroom quizzing can help reduce achievement gaps” due to socioeconomic background. © 2013 The New York Times Company
Keyword: Learning & Memory
Link ID: 18960 - Posted: 11.23.2013
By EMILY ANTHES Humans have no exclusive claim on intelligence. Across the animal kingdom, all sorts of creatures have performed impressive intellectual feats. A bonobo named Kanzi uses an array of symbols to communicate with humans. Chaser the border collie knows the English words for more than 1,000 objects. Crows make sophisticated tools, elephants recognize themselves in the mirror, and dolphins have a rudimentary number sense. [Photo caption: Anolis evermanni lizards normally attack their prey from above. In the study, the lizards were challenged to find a way to reach insects kept inside a small hole covered with a tight-fitting blue cap.] And reptiles? Well, at least they have their looks. In the plethora of research over the past few decades on the cognitive capabilities of various species, lizards, turtles and snakes have been left in the back of the class. Few scientists bothered to peer into the reptile mind, and those who did were largely unimpressed. “Reptiles don’t really have great press,” said Gordon M. Burghardt, a comparative psychologist at the University of Tennessee at Knoxville. “Certainly in the past, people didn’t really think too much of their intelligence. They were thought of as instinct machines.” But now that is beginning to change, thanks to a growing interest in “coldblooded cognition” and recent studies revealing that reptile brains are not as primitive as we imagined. The research could not only redeem reptiles but also shed new light on cognitive evolution. Because reptiles, birds and mammals diverged so long ago, with a common ancestor that lived 280 million years ago, the emerging data suggest that certain sophisticated mental skills may be more ancient than had been assumed — or so adaptive that they evolved multiple times. © 2013 The New York Times Company