Chapter 17. Learning and Memory
Fears over surveillance seem to figure large in the bird world, too. Ravens hide their food more quickly if they think they are being watched, even when no other bird is in sight. It’s the strongest evidence yet that ravens have a “theory of mind” – that they can attribute mental states such as knowledge to others. Many studies have shown that certain primates and birds behave differently in the presence of peers who might want to steal their food. While some researchers think this shows a theory of mind, others say the animals might just be reacting to visual cues, rather than having a mental representation of what others can see and know. To rule out the possibility that the birds are merely responding to another’s cues, Thomas Bugnyar and colleagues at the University of Vienna, Austria, devised a peephole experiment. The setup involved two rooms separated by a wooden wall, with windows and peepholes that could be covered. First, a raven was given food with another raven in the next room, with the window either open or covered, to see how quickly it cached its prize. With the window open, the birds hid their food more quickly and avoided going back to conceal it further. Individual ravens were then trained to use the peephole to see where humans were putting food in the other room; the idea was to let each bird realise that it could be seen through the peephole. © Copyright Reed Business Information Ltd.
Nell Greenfieldboyce The state of New Jersey has been trying to help jurors better assess the reliability of eyewitness testimony, but a recent study suggests that the effort may be having unintended consequences. That's because a new set of instructions read to jurors by a judge seems to make them skeptical of all eyewitness testimony — even testimony that should be considered reasonably reliable. Back in 2012, New Jersey's Supreme Court did something groundbreaking. It said that in cases that involve eyewitness testimony, judges must give jurors a special set of instructions. The instructions are basically a tutorial on what scientific research has learned about eyewitness testimony and the factors that can make it more dependable or less so. "The hope with this was that jurors would then be able to tell what eyewitness testimony was trustworthy, what sort wasn't, and at the end of the day it would lead to better decisions, better court outcomes, better justice," says psychologist David Yokum. Yokum was a graduate student at the University of Arizona, doing research on decision-making, when he and two colleagues, Athan Papailiou and Christopher Robertson, decided to test the effect of these new jury instructions, using videos of a mock trial that they showed to volunteers. © 2016 npr
Keyword: Learning & Memory
Link ID: 21828 - Posted: 01.27.2016
James Gorman Spotted hyenas are the animals that got Sarah Benson-Amram thinking about how smart carnivores are and in what ways. Dr. Benson-Amram, a researcher at the University of Wyoming in Laramie, did research for her dissertation on hyenas in the wild under Kay E. Holekamp of Michigan State University. Hyenas have very complicated social structures and they require intelligence to function in their clans, or groups. But the researchers also tested the animals on a kind of intelligence very different from figuring out who ranks the highest: They put out metal boxes that the animals had to open by sliding a bolt in order to get at meat inside. Only 15 percent of the hyenas solved the problem in the wild, but in captivity, the animals showed a success rate of 80 percent. Dr. Benson-Amram and Dr. Holekamp decided to test other carnivores, comparing species and families. They and other researchers presented animals in several different zoos with a metal puzzle box with a treat inside and recorded the animals’ efforts. They tested 140 animals in 39 species that were part of nine families. They reported their findings on Monday in the Proceedings of the National Academy of Sciences. They compared the success rates of different families with absolute brain size, relative brain size, and the size of the social groups that the species form in the wild. Just having a bigger brain did not make a difference, but the relative size of the brain, compared with the size of the body, was the best indication of which animals were able to solve the problem of opening the box. © 2016 The New York Times Company
by Graham McDougall, Jr., behavioral scientist at U. of Alabama Chemo brain is a mental cloudiness reported by about 30 percent of cancer patients who receive chemotherapy. Symptoms typically include impairments in attention, concentration, executive function, memory and visuospatial skills. Since the 1990s researchers have tried to understand this phenomenon, particularly in breast cancer patients. But the exact cause of chemo brain remains unclear. Some studies indicate that chemotherapy may trigger a variety of related neurological symptoms. One study, which examined the effects of chemotherapy in 42 breast cancer patients who underwent a neuropsychological evaluation before and after treatment, found that almost three times more patients displayed signs of cognitive dysfunction after treatment as compared with before (21 versus 61 percent). A 2012 review of 17 studies considering 807 breast cancer patients found that cognitive changes after chemotherapy were pervasive. Other research indicates that the degree of mental fogginess that a patient experiences may be directly related to how much chemotherapy that person receives: higher doses lead to greater dysfunction. There are several possible mechanisms to explain the cognitive changes associated with chemotherapy treatments. The drugs may have direct neurotoxic effects on the brain or may indirectly trigger immunological responses that may cause an inflammatory reaction in the brain. Chemotherapy, however, is not the only possible culprit. Research also shows that cancer itself may cause changes to the brain. In addition, it is possible that the observed cognitive decline may simply be part of the natural aging process, especially considering that many cancer patients are older than 50 years. © 2016 Scientific American,
By Emily Underwood Roughly half of Americans use marijuana at some point in their lives, and many start as teenagers. Although some studies suggest the drug could harm the maturing adolescent brain, the true risk is controversial. Now, in the first study of its kind, scientists have analyzed long-term marijuana use in teens, comparing IQ changes in twin siblings who either used or abstained from marijuana for 10 years. After taking environmental factors into account, the scientists found no measurable link between marijuana use and lower IQ. “This is a very well-conducted study … and a welcome addition to the literature,” says Valerie Curran, a psychopharmacologist at University College London. She and her colleagues reached “broadly the same conclusions” in a separate, nontwin study of more than 2000 British teenagers, published earlier this month in the Journal of Psychopharmacology, she says. But, warning that the study has important limitations, George Patton, a psychiatric epidemiologist at the University of Melbourne in Australia, adds that it in no way proves that marijuana — particularly heavy or chronic use — is safe for teenagers. Most studies that linked marijuana to cognitive deficits, such as memory loss and low IQ, looked at a single “snapshot” in time, says statistician Nicholas Jackson of the University of Southern California in Los Angeles, lead author of the new work. That makes it impossible to tell which came first: drug use or poor cognitive performance. “It's a classic chicken-egg scenario,” he says. © 2016 American Association for the Advancement of Science.
Laura Sanders Pain can sear memories into the brain, a new study finds. A full year after viewing a picture of a random, neutral object, people could remember it better if they had been feeling painful heat when they first saw it. “The results are fun, they are interesting and they are provocative,” says neuroscientist A. Vania Apkarian of Northwestern University in Chicago. The findings “speak to the idea that pain really engages memory.” Neuroscientists G. Elliott Wimmer and Christian Büchel of University Medical Center Hamburg-Eppendorf in Germany reported the results in a paper online at BioRxiv.org, first posted December 24 and revised January 6. The findings are under review at a journal, and Wimmer declined to comment on the study until it is accepted for publication. Wimmer and Büchel recruited 31 brave souls who agreed to have painful heat delivered by a thermode on their left forearms. Each person’s pain sensitivity was used to calibrate the amount of heat they received in the experiment, which was either not painful (a 2 on an 8-point scale) or the highest a person could endure multiple times (a full 8). While undergoing a functional MRI scan, participants looked at a series of pictures of unremarkable household objects, such as a camera, sometimes feeling pain and sometimes not. Right after seeing the images, the people took a pop quiz in which they answered whether an object was familiar. Pain didn’t influence memory right away: just after their ordeal, participants remembered about three-quarters of the previously seen objects, regardless of whether pain was present, the researchers found. © Society for Science & the Public 2000 - 2015.
By Emily Underwood Lumos Labs, the company that produces the popular “brain-training” program Lumosity, yesterday agreed to pay a $2 million settlement to the Federal Trade Commission (FTC) for running deceptive advertisements. Lumos had claimed that its online games can help users perform better at work and in school and stave off cognitive deficits associated with serious diseases such as Alzheimer’s, traumatic brain injury, and post-traumatic stress. The $2 million settlement will be used to compensate Lumosity consumers who were misled by false advertising, says Michelle Rusk, a spokesperson with FTC in Washington, D.C. The company will also be required to provide an easy way to cancel autorenewal billing for the service, which includes online and mobile app subscriptions, with payments ranging from $14.95 monthly to lifetime memberships for $299.95. Before consumers can access the games, a pop-up screen will alert them to FTC’s order and allow them to avoid future billing, Rusk says. The action is part of a larger crackdown on companies selling products that purportedly enhance memory or provide some other cognitive benefit, Rusk says. For some time now, FTC has been “concerned about some of the claims we’re seeing out there,” particularly those from companies like Lumos that suggest their games can reduce the effects of conditions such as dementia, she says. After evaluating the literature on Lumos's products, and the broader research on the benefits of brain-training games, “our assessment was they didn’t have adequate science for the claims that they’re making,” she says. © 2016 American Association for the Advancement of Science
Patricia Neighmond Losing your ability to think and remember is pretty scary. We know the risk of dementia increases with age. But if you have memory lapses, you probably needn't worry. There are pretty clear differences between signs of dementia and age-related memory loss. After age 50, it's quite common to have trouble remembering the names of people, places and things quickly, says Dr. Kirk Daffner, chief of the division of cognitive and behavioral neurology at Brigham and Women's Hospital in Boston. The brain ages just like the rest of the body. Certain parts shrink, especially areas in the brain that are important to learning, memory and planning. Changes in brain cells can affect communication between different regions of the brain. And blood flow can be reduced as arteries narrow. Simply put, this exquisitely complex organ just isn't functioning like it used to. Forgetting the name of an actor in a favorite movie, for example, is nothing to worry about. But if you forget the plot of the movie or don't remember even seeing it, that's far more concerning, Daffner says. When you forget entire experiences, he says, that's "a red flag that something more serious may be involved." Forgetting how to operate a familiar object like a microwave oven or forgetting how to drive to the house of a friend you've visited many times before can also be signs something is wrong. © 2016 npr
By R. Douglas Fields We all heard the warning as kids: “That TV will rot your brain!” You may even find yourself repeating the threat when you see young eyes glued to the tube instead of exploring the real world. The parental scolding dates back to the black-and-white days of I Love Lucy, and today concern is growing amid a flood of video streaming on portable devices. But are young minds really being harmed? With brain imaging, the effects of regular TV viewing on a child's neural circuits are plain to see. Studies suggest watching television for prolonged periods changes the anatomical structure of a child's brain and lowers verbal abilities. Behaviorally, even more detrimental effects may exist: although a cause-and-effect relation is hard to prove, higher rates of antisocial behavior, obesity and mental health problems correlate with hours in front of the set. Now a new study hits the pause button on this line of thinking. The researchers conclude that the entire body of research up to now has overlooked an important confounding variable, heredity, that could call into question the conventional wisdom that TV is bad for the brain. Further study will be needed to evaluate this claim, but the combined evidence suggests we need a more nuanced attitude toward our viewing habits. To understand the argument against television, we should rewind to 2013, when a team of researchers at Tohoku University in Japan, led by neuroscientist Hikaru Takeuchi, first published findings from a study in which the brains of 290 children between the ages of five and 18 were imaged. The kids' TV viewing habits, ranging from zero to four hours each day, were also taken into account. © 2016 Scientific American
By Karen Weintraub Mild cognitive impairment, or M.C.I., is not a disease in itself. Rather, it is a clinical description based on performance on a test of memory and thinking skills. Depending on its cause, mild cognitive impairment is potentially reversible. Poor performance on a cognitive test could be caused by certain medications, sleep apnea, depression or other problems, said Dr. Alvaro Pascual-Leone, a professor of neurology at Harvard Medical School and Beth Israel Deaconess Medical Center. In those cases, when the underlying disease is treated, cognitive abilities can bounce back. But in about half of people with M.C.I. — doctors are not sure of the exact number — memory problems are the first sign of impending Alzheimer’s disease. If M.C.I. progresses to Alzheimer’s, there is no recovery. Alzheimer’s is marked by an inexorable decline that is always fatal, although the path from the first signs of cognitive impairment to death may take three to 15 years, said Dr. David Knopman, a professor of neurology at the Mayo Clinic in Rochester, Minn. As many as 20 percent to 30 percent of those with M.C.I. who score below but near the cutoff for normal can cross back above it in a subsequent cognitive test — perhaps because they are having a better day, he said. But someone whose score is borderline is at higher risk of developing Alzheimer’s than someone who scores higher, said Dr. Knopman, also vice chair of the medical and scientific advisory council of the Alzheimer’s Association. Doctors may be hesitant to label someone with early Alzheimer’s, which can be difficult to diagnose in the early stages, so they often call it mild cognitive impairment instead, said Dr. John C. Morris, a professor of neurology and the director of the Knight Alzheimer's Disease Research Center at Washington University School of Medicine in St. Louis. © 2015 The New York Times Company
Need to remember something important? Take a break. A proper one – no TV or flicking through your phone messages. It seems that resting in a quiet room for 10 minutes without stimulation can boost our ability to remember new information. The effect is particularly strong in people with amnesia, suggesting that they may not have lost the ability to form new memories after all. “A lot of people think the brain is a muscle that needs to be continually stimulated, but perhaps that’s not the best way,” says Michaela Dewar at Heriot-Watt University in Edinburgh, UK. New memories are fragile. They need to be consolidated before being committed to long-term storage, a process thought to happen while we sleep. But at least some consolidation may occur while we’re awake, says Dewar – all you need is a timeout. In 2012, Dewar’s team showed that having a rest helps a person to remember what they were told a few minutes earlier. And the effect seems to last. People who had a 10-minute rest after hearing a story remembered 10 per cent more of it a week later than those who played a spot-the-difference game immediately afterwards. “We dim the lights and ask them to sit in an empty, quiet room, with no mobile phones,” says Dewar. When asked afterwards what they had been thinking about, most volunteers said they had let their minds wander. Now Dewar, Michael Craig at the University of Edinburgh and their colleagues have found that spatial memories can also be consolidated when we rest. © Copyright Reed Business Information Ltd.
By John Bohannon In July 1984, a man broke into the apartment of Jennifer Thompson, a 22-year-old in North Carolina, and threatened her with a knife. She negotiated, convincing him not to kill her. Instead, he raped her and fled. Just hours later, a sketch artist worked with Thompson to create an image of the assailant's face. Then the police showed her a series of mug shots of similar-looking men. Thompson picked out 22-year-old Ronald Cotton, whose photograph was on file because of a robbery committed in his youth. When word reached Cotton that the police were looking for him, he walked into a precinct voluntarily. He was eventually sentenced to life in prison based on Thompson's testimony. Eleven years later, after DNA sequencing technology caught up, samples taken from Thompson's body matched a different man, who finally confessed. Cotton was set free. When Thompson first identified Cotton by photo, she was not convinced of her choice. "I think this is the guy," she told the police after several minutes of hesitation. As time went on, she grew surer. By the time Thompson faced Cotton in court a year later, her doubts were gone. She confidently pointed to him as the man who raped her. Because of examples like these, the U.S. justice system has been changing how eyewitnesses are used in criminal cases. Juries are told to discount the value of eyewitness testimony and ignore how confident the witnesses may be about whom they think they saw. Now, a new study of robbery investigations suggests that these changes may be doing more harm than good. © 2015 American Association for the Advancement of Science
Keyword: Learning & Memory
Link ID: 21715 - Posted: 12.22.2015
Megan Scudellari In 1997, physicians in southwest Korea began to offer ultrasound screening for early detection of thyroid cancer. News of the programme spread, and soon physicians around the region began to offer the service. Eventually it went nationwide, piggybacking on a government initiative to screen for other cancers. Hundreds of thousands took the test for just US$30–50. Across the country, detection of thyroid cancer soared, from 5 cases per 100,000 people in 1999 to 70 per 100,000 in 2011. Two-thirds of those diagnosed had their thyroid glands removed and were placed on lifelong drug regimens, both of which carry risks. Such a costly and extensive public-health programme might be expected to save lives. But this one did not. Thyroid cancer is now the most common type of cancer diagnosed in South Korea, but the number of people who die from it has remained exactly the same — about 1 per 100,000. Even when some physicians in Korea realized this, and suggested that thyroid screening be stopped in 2014, the Korean Thyroid Association, a professional society of endocrinologists and thyroid surgeons, argued that screening and treatment were basic human rights. © 2015 Nature Publishing Group
Human memory is about to get supercharged. A memory prosthesis being trialled next year could not only restore long-term recall but may eventually be used to upload new skills directly to the brain – just like in the film The Matrix. The first trials will involve people with epilepsy. Seizures can sometimes damage the hippocampus, causing the brain to lose its ability to form long-term memories. To repair this ability, Theodore Berger at the University of Southern California and his colleagues used electrodes already implanted in people’s brains as part of epilepsy treatment to record electrical activity associated with memory. The team then developed an algorithm that could predict the neural activity thought to occur when a short-term memory becomes a long-term memory, as it passes through the hippocampus. Early next year, Berger’s team will use this algorithm to instruct the electrodes to predict and then mimic the activity that should occur when long-term memories are formed. “Hopefully, it will repair their long-term memory,” says Berger. Previous studies using animals suggest that the prosthesis might even give people a better memory than they could expect naturally. A similar approach could eventually be used to implant new memories into the brain. Berger’s team recorded brain activity in a rat that had been trained to perform a specific task. The memory prosthesis then replicated that activity in a rat that hadn’t been trained. The second rat was able to learn the task much faster than the first rat – as if it already had some memory of the task. © Copyright Reed Business Information Ltd.
By Elizabeth Pennisi Imagine trying to train wild sea lions—without them ever seeing you. That was Peter Cook's challenge 8 years ago when he was trying to figure out whether poisonous algae were irrevocably damaging the animals’ brains. With a lot of patience and some luck, the comparative neuroscientist from Emory University in Atlanta has succeeded, and the news isn't good. Toxins from the algae mangle a key memory center, likely making it difficult for sick animals to hunt or navigate effectively, Cook and his colleagues report today. "Sea lions can be seen as sentinels of human health," says Kathi Lefebvre, a research biologist at the Northwest Fisheries Science Center in Seattle, Washington, who was not involved with the work. As oceans warm, toxic algae proliferate and cause so-called red tides because the water looks reddish. So "understanding these toxins in wild animals is going to become more important," she says. Red tides are produced by algae called diatoms. They make a toxin called domoic acid, which is consumed by other plankton that in turn become food for fish and other organisms. Predators such as anchovies, sardines, and other schooling fish accumulate this toxin in their bodies. So when algal populations explode, say, because of warming water, domoic acid concentrations increase in these animals to a point that they affect the sea lions that feast on them. Scientists first recognized this problem in 1998, after hundreds of sea lions were found stranded or disoriented along California's coast. Since then, researchers have studied sick and dead sea lions and documented that the toxin causes seizures and damages the brain, sometimes killing the animal. © 2015 American Association for the Advancement of Science.
By Michael M. Torrice, We learn from experience: It sounds like a trite sentiment posted by a friend on Facebook, but neuroscientists would agree. Our interactions with the world around us strengthen and weaken the connections between our neurons, a process that neuroscientists consider to be the cellular mechanism of learning. Now researchers report that boosting signaling of a certain receptor in the brain with a small molecule can enhance these cellular changes and improve learning in people. The findings could lead to new treatments for patients with disorders associated with deficits in learning, such as Alzheimer’s disease and schizophrenia. Through decades of research on how synapses change in animal brains, scientists have found that the N-methyl-d-aspartate receptor (NMDAR) plays a critical role in strengthening synapses during learning. Compounds that increase NMDAR signaling can drive such changes and, as a result, help animals learn new tasks. Robert F. Asarnow at UCLA and colleagues wanted to test whether one such compound, d-cycloserine, would act similarly in people. But neuroscientists measure synapse changes in animals by sticking electrodes into slices of brain tissue to record electrical signals. “Obviously, we don’t do that to our friends,” Asarnow says. So his team used electroencephalography (EEG) to record electrical activity through electrodes stuck to the scalps of its subjects. The team monitored this activity as the subjects watched a certain pattern flash on a screen at high frequency for a couple minutes. Afterward, the subjects showed a spike in EEG activity in their visual cortex when they viewed the pattern at a later time. This suggested a population of neurons had wired themselves together by strengthening their synapses. © 2015 Scientific American
Keyword: Learning & Memory
Link ID: 21684 - Posted: 12.09.2015
by Laura Sanders There’s only so much brainpower to go around, and when the eyes hog it all, the ears suffer. When challenged with a tough visual task, people are less likely to perceive a tone, scientists report in the Dec. 9 Journal of Neuroscience. The results help explain what parents of screen-obsessed teenagers already know. For the study, people heard a tone while searching for a letter on a computer screen. When the letter was easy to find, participants were pretty good at identifying a tone. But when the search got harder, people were less likely to report hearing the sound, a phenomenon called inattentional deafness. Neural responses to the tone were blunted when people worked on a hard visual task, but not when the visual task was easy, researchers found. By showing that a demanding visual job can siphon resources away from hearing, the results suggest that perceptual overload can jump between senses. © Society for Science & the Public 2000 - 2015
By Nicholas Bakalar Watching television may be bad for your brain, a new study suggests. Researchers followed 3,274 people whose average age was 25 at the start of the study for 25 years, using questionnaires every five years to collect data on their physical activity and TV watching habits. At year 25, they administered three tests that measured various aspects of mental acuity. The highest level of TV watching — more than three hours a day most days — was associated with poor performance on all three tests. Compared with those who watched TV the least, those who watched the most had between one-and-a-half and two times the odds of poor performance on the tests, even after adjusting for age, sex, race, educational level, body mass index, smoking, alcohol use, hypertension and diabetes. Those with the lowest levels of physical activity and the highest levels of TV watching were the most likely to have poor test results. The authors acknowledge that their findings, published in JAMA Psychiatry, depend on self-reports, and that they had no baseline tests of cognitive function for comparison. “We can’t separate out what is going on with the TV watching,” said the lead author, Dr. Kristine Yaffe, a professor of psychiatry and neurology at the University of California, San Francisco. “Is it just the inactivity, or is there something about watching TV that’s the opposite of cognitive stimulation?” © 2015 The New York Times Company
Link ID: 21675 - Posted: 12.05.2015
By Nala Rogers If you travel with a group of friends, you might delegate navigation to the person with the best sense of direction. But among homing pigeons, the leader is whoever flies the fastest—even if that pigeon has to pick up navigation skills on the job, according to a new study. To find out how the skills of individual pigeons influence flock direction, researchers tested four flocks on journeys from three different locations, each about 5 kilometers from their home loft near Oxford, U.K. At each site, the researchers tracked the pigeons during solo flights before releasing them together for several group journeys. The fastest birds surged to the front during group flights and determined when the flock turned, despite the fact that these leaders were often poor navigators during their initial solo expeditions. But on a final set of solo flights—made after the group journeys—these same leaders chose straighter routes than followers, the researchers report today in Current Biology. Apparently, being responsible for group decisions helped pigeons learn the route, say scientists, raising questions about the two-way interplay between skills and leadership. © 2015 American Association for the Advancement of Science
By John Bohannon It may sound like a bird-brained idea, but scientists have trained pigeons to spot cancer in images of biopsied tissue. Individually, the avian analysts can't quite match the accuracy of professional pathologists. But as a flock, they did as well as trained humans, according to a new study appearing this week in PLOS ONE. Cancer diagnosis often begins as a visual challenge: Does this lumpy spot in a mammogram image justify a biopsy? And do cells in biopsy slides look malignant or benign? Training doctors and medical technicians to tell the difference is expensive and time-consuming, and computers aren't yet up to the task. To see whether a different type of trainee could do better, a team led by Richard Levenson, a pathologist and technologist at the University of California, Davis, and Edward Wasserman, a psychologist at the University of Iowa, in Iowa City, turned to pigeons. In spite of their limited intellect, the bobble-headed birds have certain advantages. They have excellent visual systems, similar to, if not better than, a human's. They sense five different colors as opposed to our three, and they don’t “fill in” the gaps like we do when expected shapes are missing. However, training animals to do a sophisticated task is tricky. Animals can pick up on unintentional cues from their trainers and other humans that may help them correctly solve problems. For example, a famous 20th century horse named Clever Hans was purportedly able to do simple arithmetic, but was later shown to be observing the reactions of his human audience. And although animals can perform extremely well on tasks that are confined to limited circumstances, overtraining on one set of materials can lead to total inaccuracy when the same information is conveyed slightly differently. © 2015 American Association for the Advancement of Science