Chapter 17. Learning and Memory
By VASILIS K. POZIOS, PRAVEEN R. KAMBAM and H. ERIC BENDER

EARLIER this summer the actor Jim Carrey, a star of the new superhero movie “Kick-Ass 2,” tweeted that he was distancing himself from the film because, in the wake of the Sandy Hook massacre, “in all good conscience I cannot support” the movie’s extensive and graphically violent scenes. Mark Millar, a creator of the “Kick-Ass” comic book series and one of the movie’s executive producers, responded that he has “never quite bought the notion that violence in fiction leads to violence in real life any more than Harry Potter casting a spell creates more boy wizards in real life.”

While Mr. Carrey’s point of view has its adherents, most people reflexively agree with Mr. Millar. After all, the logic goes, millions of Americans see violent imagery in films and on TV every day, but vanishingly few become killers. But a growing body of research indicates that this reasoning may be off base. Exposure to violent imagery does not preordain violence, but it is a risk factor. We would never say: “I’ve smoked cigarettes for a long time, and I don’t have lung cancer. Therefore there’s no link between smoking cigarettes and lung cancer.” So why use such flawed reasoning when it comes to media violence?

There is now consensus that exposure to media violence is linked to actual violent behavior — a link found by many scholars to be on par with the correlation of exposure to secondhand smoke and the risk of lung cancer. In a meta-analysis of 217 studies published between 1957 and 1990, the psychologists George Comstock and Haejung Paik found that the short-term effect of exposure to media violence on actual physical violence against a person was moderate to large in strength. Mr. Comstock and Ms. Paik also conducted a meta-analysis of studies that looked at the correlation between habitual viewing of violent media and aggressive behavior at a point in time. 
They found 200 studies showing a moderate, positive relationship between watching television violence and physical aggression against another person. © 2013 The New York Times Company
By Geoffrey Mohan

If you can’t quite get that nine-note treble opening to "Für Elise," just sleep on it. The brain will rehearse, reorganize and nail the sequential motor tasks that help you play piano or type on a keyboard. How that consolidation of memory happens has remained largely a mystery, despite telling evidence that the brain’s motor cortex appears to be quite busy during sleep. Now, a team led by Brown University neuroscientists believes it has found the source of the sleeping piano lesson, and it’s not where many expected it to be. Neuroscience has been fixated since its founding on why the brain “needs” that peculiar mix of dormancy and random activity known as sleep. And it has equally wondered why we emerge from it better able to do things. Slowly, evidence accrued that we were “learning” during sleep -- consolidating memory in ways that would make waking tasks more successful. It seemed deepest sleep, not the familiar rapid-eye-movement type, had the most effect on our brain’s ability to reorganize and prepare to perform better in waking hours. “It has been very difficult to measure brain activation during sleep,” said Brown University neuroscientist Masako Tamaki, lead author of the study published online Tuesday in the Journal of Neuroscience. “So it was unclear what brain region was involved.”
Linda Carroll, TODAY contributor

Whether it’s “One Flew Over the Cuckoo’s Nest,” “Girl, Interrupted,” or “Homeland,” Hollywood’s portrayals of electroconvulsive therapy have never been pretty. And the images from those movies and TV shows have only added to a stigma that keeps many desperate patients from opting for a therapy that might turn their lives around, experts say. “We can’t get past the stigma of all the visuals we’ve seen from movies and the fact that it seems so antiquated when you consider modern medicine,” NBC chief medical editor Dr. Nancy Snyderman told TODAY’s Matt Lauer. “But time and time and time again if you look at patients who have severe depression who don’t respond to medications, they will tell you that ECT works.” That’s certainly true in Denise Stewart’s case. Stewart, a mother of two, suffers from schizoaffective disorder. Her hallucinations were pushing her closer and closer to suicide each day. “There would be voices in my head that would sit there and say, ‘Denise, see the knife in the kitchen? Cut your wrists. Denise, see those pills over there? Take all those pills,’” she told TODAY. After antidepressants made Stewart’s condition worse, her doctors suggested ECT. And the change was dramatic. “If it hadn’t been for the electroconvulsive therapy, I wouldn’t be alive right now,” Stewart said. These days an estimated 100,000 Americans undergo ECT each year – and the process is a lot different from what you see in the media, experts say.
Moheb Costandi

In the early hours of 9 September, 1984, a stranger entered Mrs M's California home through an open living-room window. Finding Mrs M asleep, he tried to rape her, but fled when other people in the house awoke. Mrs M described her assailant to the police: he was black, weighing about 170 pounds and 5'7” to 5'9” tall, with small braids and a blue baseball cap. Officers cruising her neighbourhood spotted someone roughly matching that description standing beside his car a block away from the house. The man, Joseph Pacely, said that his car had broken down and he was looking for someone to jump-start it. But Mrs M identified him as her attacker and he was charged. At Pacely's trial a few months later, memory researcher Elizabeth Loftus testified on his behalf. She told the jury how memory is fallible; how stress and fear may have impaired Mrs M's ability to identify her assailant, and how people can find it difficult to identify someone of a race other than their own. Pacely was acquitted. “It's cases like this that mean the most to me,” says Loftus, “the ones in which I play a role in bringing justice to an innocent person.” In a career spanning four decades, Loftus, a psychologist at the University of California, Irvine, has done more than any other researcher to document the unreliability of memory in experimental settings. And she has used what she has learned to testify as an expert witness in hundreds of criminal cases — Pacely's was her 101st — informing juries that memories are pliable and that eyewitness accounts are far from perfect recordings of actual events. © 2013 Nature Publishing Group
Keyword: Learning & Memory
Link ID: 18514 - Posted: 08.15.2013
Helen Shen

The false mouse memories made the ethicists uneasy. By stimulating certain neurons in the hippocampus, Susumu Tonegawa and his colleagues caused mice to recall receiving foot shocks in a setting in which none had occurred. Tonegawa, a neuroscientist at the Massachusetts Institute of Technology in Cambridge, says that he has no plans to ever implant false memories into humans — the study, published last month, was designed just to offer insight into memory formation. But the experiment has nonetheless alarmed some neuroethicists. “That was a bell-ringer, the idea that you can manipulate the brain to control the mind,” says James Giordano, chief of neuroethics studies at Georgetown University in Washington DC. He says that the study is one of many raising ethical concerns, and more are sure to come as an ambitious, multi-year US effort to parse the human brain gets under way. The BRAIN (Brain Research through Advancing Innovative Neurotechnologies) Initiative will develop technologies to understand how the brain’s billions of neurons work together to produce thought, emotion, movement and memory. But, along with the discoveries, it could force scientists and society to grapple with a laundry list of ethical issues: the responsible use of cognitive-enhancement devices, the protection of personal neural data, the prediction of untreatable neurodegenerative diseases and the assessment of criminal responsibility through brain scanning. On 20 August, US President Barack Obama’s commission on bioethics will hold a meeting in Philadelphia, Pennsylvania, to begin to craft a set of ethics standards to guide the BRAIN project. There is already one major mechanism for ethical oversight in US research: institutional review boards, which must approve any studies involving human subjects. But many ethicists say that as neuroscience discoveries creep beyond laboratory walls into the marketplace and the courtroom, more comprehensive oversight is needed. 
© 2013 Nature Publishing Group
Keyword: Learning & Memory
Link ID: 18513 - Posted: 08.15.2013
by Douglas Heaven

It's a cognitive leap forward. IBM can now program an experimental chip it unveiled two years ago. The chips, designed to mimic how our brains work, are set to power computers that handle many streams of input data at once – much like the sensory input we deal with all the time. IBM's TrueNorth computer chips contain memory, processors and communication channels wired up like the synapses, neurons and axons of a brain. A key idea is that the chips can be hooked up into vast grids with many thousands working together in parallel. For certain types of task, such as quickly responding to large amounts of input data from sensors, they are much faster and less power-hungry than standard chips. They could one day replace human reflexes in self-driving cars or power the sensory systems of a robot, for example. But because the chips rewrite the rulebook for how computers are normally put together, they are not easy to program. Dharmendra Modha and his colleagues at IBM Research in San Jose, California, learned this the hard way. The team's first attempts were full of errors: "The programs were very unintuitive and extremely difficult to debug," says Modha. "Things looked hopeless." So they designed a new way of programming. This involves telling the computer how to yoke together the many individual chips in play at once. The IBM team came up with a way to package the functionality of each chip inside blocks of code they call "corelets". © Copyright Reed Business Information Ltd.
Link ID: 18490 - Posted: 08.12.2013
“OUR primary goal is for our users to see us as a gym, where they can work out and keep mentally fit,” says Michael Scanlon, the co-founder and chief scientist of Lumos Labs. For $14.95 a month, subscribers to the firm’s Lumosity website get to play a selection of online games designed to improve their cognitive performance. There are around 40 exercises available, including “speed match”, in which players click if an image matches a previous one; “memory matrix”, which requires remembering which squares on a matrix were shaded; and “raindrops”, which involves solving arithmetic problems before the raindrops containing them hit the ground. The puzzles are varied, according to how well users perform, to ensure they are given a suitably challenging brain-training session each day. The popularity of Lumosity since its launch in 2007 has been, well, mind-blowing. Its smartphone app has been the top education app in the iTunes store at some point in 38 countries. On August 1st it launched an iPad version, which it expects to boost its existing 45m registered users in 180-plus countries. Lumos Labs has already raised almost $70m in venture capital, and is one of two firms vying to become the first public company serving the new “digital brain health” market, says Alvaro Fernandez of SharpBrains, a research firm. (The firm hoping to beat it to the punch is NeuroSky, which makes “brainwave sensors”—including some shaped like cats’ ears that will apparently wiggle if you are enjoying yourself and droop if you are relaxed.) The metaphor of workouts for the mind will set alarm bells ringing for anyone familiar with Brain Gym, a series of physical exercises for children, adopted unquestioningly by many British schools, whose supposed cognitive benefits were debunked in “Bad Science”, a 2008 book by Ben Goldacre. 
However, Mr Scanlon, who quit his neuroscience PhD at Stanford University to co-found Lumos Labs, says he was inspired to do so by the mounting academic evidence of the plasticity of the brain and of the ability to improve cognitive function through simple exercises. © The Economist Newspaper Limited 2013
Keyword: Learning & Memory
Link ID: 18480 - Posted: 08.10.2013
By GRETCHEN REYNOLDS

Over the past decade, in study after study in animals and people, exercise has been shown to improve the ability to learn and remember. But the specifics of that process have remained hazy. Is it better to exercise before you learn something new? What about during? And should the exercise be vigorous or gentle? Two new studies helpfully tackle those questions, with each reaching the conclusion that the timing and intensity of even a single bout of exercise can definitely affect your ability to remember — though not always beneficially. To reach that conclusion, scientists conducting the larger and more ambitious of the new studies, published in May in PLoS One, first recruited 81 healthy young women who were native German speakers and randomly divided them into three groups. Each group wore headphones and listened for 30 minutes to lists of paired words, one a common German noun and the other its Polish equivalent. The women were asked to memorize the unfamiliar words. But they heard the words under quite different circumstances. One group listened after sitting quietly for 30 minutes. A second group rode a stationary bicycle at a gentle pace for 30 minutes and then sat down and donned the headphones. And the third group rode a bicycle at a mild intensity for 30 minutes while wearing the headphones and listening to the new words. Two days later, the women completed tests of their new vocabulary. Everyone could recall some new words. But the women who had gently ridden a bicycle while hearing the new words — who had exercised lightly during the process of creating new memories — performed best. They had the most robust recall of the new information, significantly better than the group that had sat quietly and better than the group that had exercised before learning. Those women performed only slightly better than the women who had not exercised at all. Copyright 2013 The New York Times Company
Keyword: Learning & Memory
Link ID: 18475 - Posted: 08.08.2013
Jason Bruck

Ever been at a party where you recognize everyone’s faces but can’t think of their names? That wouldn’t happen if you were a bottlenose dolphin (Tursiops truncatus). The marine mammals can remember each other’s signature contact whistles—calls that function as names—for more than 20 years, the longest social memory ever recorded for a nonhuman animal, according to a new study. “The ability to remember individuals is thought to be extremely important to the ‘social brain,’ ” says Janet Mann, a marine mammal biologist at Georgetown University in Washington, D.C., who was not involved in the research. Yet, she notes, no one has succeeded in designing a test for this talent in the great apes—our closest kin—let alone in dolphins. Dolphins use their signature whistles to stay in touch. Each has its own unique whistle, and they learn and can repeat the whistles of other dolphins. A dolphin will answer when another dolphin mimics its whistle—just as we reply when someone calls our name. The calls enable the marine mammals to communicate over long distances—which is necessary because they live in “fission-fusion” societies, meaning that dolphins in one group split off to join other groups and later return. By whistling, they’re able to find each other again. Scientists don’t know how long dolphins are separated in the wild, but they do know the animals can live almost 50 years. So how long do the dolphins remember the calls of their friends? To find out, Jason Bruck, a cognitive ethologist at the University of Chicago in Illinois, spent 5 years collecting 71 whistles from 43 dolphins at six captive facilities, including Brookfield Zoo near Chicago and Dolphin Quest in Bermuda. The six sites belong to a consortium that rotates the marine mammals for breeding and has decades-long records of which dolphins have lived together. © 2012 American Association for the Advancement of Science
By NICK BILTON

Scientists haven’t yet found a way to mend a broken heart, but they’re edging closer to manipulating memory and downloading instructions from a computer right into a brain. Researchers from the Riken-M.I.T. Center for Neural Circuit Genetics at the Massachusetts Institute of Technology took us closer to this science-fiction world of brain tweaking last week when they said they were able to create a false memory in a mouse. The scientists reported in the journal Science that they caused mice to remember receiving an electrical shock in one location, when in reality they were zapped in a completely different place. The researchers weren’t able to create entirely new thoughts, but they applied good or bad feelings to memories that already existed. “It wasn’t so much writing a memory from scratch, it was basically connecting two different types of memories. We took a neutral memory, and we artificially updated that to make it a negative memory,” said Steve Ramirez, one of the M.I.T. neuroscientists on the project. It may sound insignificant and perhaps not a nice way to treat mice, but it is not a dramatic leap to imagine that one day this research could lead to computer-manipulation of the mind for things like the treatment of post-traumatic stress disorder, Mr. Ramirez said. Technologists are already working on brain-computer interfaces, which will allow us to interact with our smartphones and computers simply by using our minds. And there are already gadgets that read our thoughts and allow us to do things like dodge virtual objects in a computer game or turn switches on and off with a thought. Copyright 2013 The New York Times Company
by Helen Thomson

We all get lost sometimes. Luckily, specialised cells in the brain that help animals find their way have now been identified in humans for the first time. The discovery could lead to better treatments for people who have problems navigating. We know that animals use three cell types to navigate the world. Direction cells fire only when an animal is facing a particular direction, place cells fire only in a particular location, and grid cells fire at regular intervals as an animal moves around. To understand how grid cells work, imagine the carpet in front of you has a grid pattern of interlocking triangles. One grid cell will fire whenever you reach the corner of any triangle in that grid. Shift the grid pattern along ever so slightly to another section of the carpet, and another grid cell will be responsible for firing every time you reach the corners of that grid's triangles – and so on. Grid cells send information to place cells and both kinds of cell send information to the hippocampus – responsible for memory formation. Together, this network of activity helps form a mental representation of an animal's location in its environment. Direction and place cells have been identified in humans but the existence of grid cells has so far only been hinted at in brain scans. To find out whether these cells do exist in humans, Joshua Jacobs at Drexel University in Philadelphia, Pennsylvania, and colleagues tested 14 people who had already had electrodes implanted in their brains for epilepsy therapy. © Copyright Reed Business Information Ltd.
Keyword: Learning & Memory
Link ID: 18459 - Posted: 08.05.2013
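The interlocking-triangles picture in the article above has a standard mathematical idealization: a grid cell's firing rate is often modeled as the sum of three cosine gratings whose wave vectors are rotated 60 degrees apart, which produces a triangular lattice of firing fields. A minimal sketch of that idealization follows; the function name and the `spacing`, `phase` and `orientation` parameters are illustrative choices, not taken from the study described above.

```python
import math

def grid_cell_rate(x, y, spacing=0.5, phase=(0.0, 0.0), orientation=0.0):
    """Idealized grid-cell firing rate at position (x, y).

    Sums three cosine gratings whose wave vectors lie 60 degrees
    apart, yielding a triangular lattice of firing fields;
    `spacing` sets the distance between neighboring field centers,
    and `phase` shifts the whole lattice across the floor.
    """
    k = 4 * math.pi / (math.sqrt(3) * spacing)  # wave number for the lattice
    angles = [orientation + i * math.pi / 3 for i in range(3)]  # 0, 60, 120 deg
    dx, dy = x - phase[0], y - phase[1]
    rate = sum(math.cos(k * (math.cos(a) * dx + math.sin(a) * dy))
               for a in angles)
    return max(0.0, rate)  # rectify: firing rates cannot be negative

# The cell responds maximally at a lattice vertex (here, its phase offset):
print(grid_cell_rate(0.0, 0.0))  # → 3.0
```

Sweeping (x, y) over an enclosure and thresholding this rate reproduces the triangle-corner firing pattern the article describes, and shifting `phase` slides the lattice slightly, which is how different grid cells tile the same space.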
By Andrea Anderson

In spring a band of brainy rodents made headlines for zipping through mazes and mastering memory tricks. Scientists credited the impressive intellectual feats to human cells transplanted into their brains shortly after birth. But the increased mental muster did not come from neurons, the lanky nerve cells that swap electrical signals and stimulate muscles. The mice benefited from human stem cells called glial progenitors, immature cells poised to become astrocytes and other glial cells, the supposed support cells of the brain. Astrocytes are known for mopping up excess neurotransmitters and maintaining balance in brain systems. During the past couple of decades, however, researchers started suspecting astrocytes of making more complex cognitive contributions. In the 1990s the cells got caught using calcium to accomplish a form of nonelectrical signaling. Studies since then have revealed how extensively astrocytes interact with neurons, even coordinating their activity in some cases. Perhaps even more intriguing, our astrocytes are enormous compared with the astrocytes of other animals—20 times larger than rodent astrocytes—and they make contact with millions of neurons apiece. Neurons, on the other hand, are nearly identical in all mammals, from rodents to great apes like us. Such clues suggest astrocytes could be evolutionary contributors to our outsized intellect. The new study, published in March in Cell Stem Cell, tested this hypothesis. A subset of the implanted human stem cells matured into rotund, humanlike astrocytes in the animals' brains, taking over operations from the native mouse astrocytes. When tested under a microscope, these human astrocytes accomplished calcium signaling at least three times faster than the mouse astrocytes did. 
The enhanced mice masterfully memorized new objects, swiftly learned to link certain sounds or situations to an unpleasant foot shock, and displayed unusually savvy maze navigation—signs of mental acuity that surpassed skills exhibited by either typical mice or mice transplanted with glial progenitor cells from their own species. © 2013 Scientific American
Andrew M. Seaman, Reuters

Children with an autism spectrum disorder spend about twice as much time playing video games as kids who don't have a developmental disability, according to a new study. Researchers also found that children with an autism spectrum disorder or attention deficit/hyperactivity disorder (ADHD) are at an increased risk of gaming addictions, compared to children without the disabilities. "What we found is that it looks like (addictive gaming) was largely driven by inattention," Christopher Engelhardt, one of the study's authors from the University of Missouri in Columbia, told Reuters Health. Previous studies have found that children with an autism spectrum disorder or ADHD spend more time playing video games and are at increased risk for gaming addictions than other children, write the researchers in the journal Pediatrics. No single study, however, has looked at the three groups to see whether shared features of autism and ADHD - such as inattention or hyperactivity - seem to drive video game use. For the new study, Engelhardt and his colleague surveyed the parents of 141 boys between the ages of 8 and 18 years old. Of those, 56 had an autism spectrum disorder, 44 had ADHD and 41 were developing normally. Overall, they found that kids with an autism spectrum disorder played - on average - 2.1 hours of video games per day. Children with ADHD spent about 1.7 hours per day playing video games and normally developing kids played about 1.2 hours per day.
By Darold Treffert

So much of what happens to us in life is not by plan, but rather by coincidence or serendipity. Thus it was with me and my career. After completing my residency in psychiatry I was assigned the responsibility of developing a Children’s Unit at Winnebago Mental Health Institute here in Wisconsin. There were over 800 patients at the hospital, some under age 18. We gathered about 30 such children and adolescents and put them on this new unit. Three patients particularly caught my eye. One boy had memorized the bus system of the entire city of Milwaukee with exhaustive detail and precision. Another little guy, even though mute and severely disabled with autism, could put a 200-piece jigsaw puzzle together—picture side down—just from the geometric shapes of the puzzle pieces. And a third lad was an expert on what happened on this day in history and even though I would study up the night before, knowing he would quiz me the next day, I could never surpass his recall of events on that day in history. I was stunned, and intrigued, by this jarring juxtaposition of ability and disability in the same individual and began to study all that I could about savant syndrome—“islands of genius” amidst a sea of impairment. Then in 1980 Leslie Lemke came to Fond du Lac to give a concert. Leslie—blind, cognitively impaired and with such spasticity in his hands that he could not hold a fork or spoon to eat—had become an accomplished pianist, never having had a piano lesson in his life. Somehow the hand spasticity magically disappears when he sits at the keyboard. The 1983 60 Minutes program, which many still remember, recounted in detail the astonishment of Leslie’s mother, May Lemke, one evening, when Leslie, age 14, played back Tchaikovsky’s Piano Concerto No. 1 flawlessly, having heard it earlier for the first time that evening as the soundtrack to the movie Sincerely Yours. 
© 2013 Scientific American
Keyword: Learning & Memory
Link ID: 18437 - Posted: 08.01.2013
By Julie Hecht

AFTER A LONG DAY of being a dog, no dog in existence has ever curled up on a comfy couch to settle in with a good book. Dogs just don’t roll like that. But that shouldn’t imply that human words don’t or can’t have meaning for dogs. Chaser, a Border Collie from South Carolina, first entered the news in 2011 when a Behavioral Processes paper reported she had learned and retained the distinct names of over 1,000 objects. But that’s not all. When tested on the ability to associate a novel word with an unfamiliar item, she could do that, too. She also learned that different objects fell into different categories: certain things are general “toys,” while others are the more specific “Frisbees” and, of course, there are many, many exciting “balls.” She differentiates between object labels and action commands, interpreting “fetch sock” as two separate words, not as the single phrase “fetchsock.” Fast forward two years. Chaser and her owner and trainer Dr. John Pilley, an emeritus professor of psychology at Wofford College, appeared again in a scientific journal. This time, the study highlighted Chaser’s attention to the syntactical relationships between words, for example, differentiating “to ball take Frisbee” from “to Frisbee take ball.” I’ve been keeping an eye on Chaser, and I’ve been keeping an eye on Rico, Sofia, Bailey, Paddy and Betsy, all companion dogs whose way with human language has been reported in scientific journals. Most media reports tend to focus on outcomes: what these dogs can — or can’t — do with our words. But I think these reports are missing the point. Learning the names of over 1,000 words doesn’t just happen overnight. What does the behind-the-scenes learning and training look like? How did Chaser develop this intimate relationship with human language? © 2013 Scientific American
By Jason Castro

It’s the premise of every third sci-fi thriller. Man wakes up to his normal-seeming life, but of course it isn’t. At first, just the little things are off – the dog won’t eat and the TV keeps looping some strange video – but whatever. A few cuts later, with only his granddad’s rusty brass knuckles and a steely-eyed contempt for authority, our hero reveals a conspiracy that kicks up straight to the top. There were deals. Some blackmailing. A probe or two. But in the end, what’s most important is that everything he thought he knew was wrong. Because the scientists (Noooo!!) got to him with one of those electrode caps and rewrote his memory. Everything – the job, the daughter, the free parking – is a lie. The dramatic ploy works on us because memory seems inviolable, or at least, we desperately hope that it is. We allow that our memories may fade and fail a bit, but otherwise, we go on the sanity-preserving assumption that there is one reason why we remember a particular thing: because we were there, and it actually happened. Now, a new set of experiments, led by MIT neuroscientists Steve Ramirez and Xu Liu in Susumu Tonegawa’s lab, shows that this needn’t be the case. Using a stunning set of molecular neuroscience techniques (no electrode caps involved), these scientists have captured specific memories in mice, altered them, and shown that the mice behave in accord with these new, false, implanted memories. The era of memory engineering is upon us, and naturally, there are big implications for basic science and, perhaps someday, human health and society. © 2013 Scientific American
Keyword: Learning & Memory
Link ID: 18432 - Posted: 07.31.2013
Kelly Servick

Our imperfect memory is inconvenient at the grocery store and downright dangerous on the witness stand. In extreme cases, we may be confident that we remember something that never happened at all. Now, a group of neuroscientists say that they’ve identified a potential mechanism of false memory creation and have planted such a memory in the brain of a mouse. Neuroscientists are only beginning to tackle the phenomenon of false memory, says Susumu Tonegawa of the Massachusetts Institute of Technology in Cambridge, whose team conducted the new research. “It’s there, and it’s well established,” he says, “but the brain mechanisms underlying this false memory are poorly known.” With optogenetics—the precise stimulation of neurons with light—scientists can seek out the physical basis of recall and even tweak it a bit, using mouse models. Like us, mice develop memories based on context. When a mouse returns to an environment where it felt pain in the past, it recalls that experience and freezes with fear. Tonegawa’s team knew that the hippocampus, a part of the brain responsible for establishing memory, plays a role in encoding context-based experiences, and that stimulating cells in a part of the hippocampus called the dentate gyrus can make a mouse recall and react to a mild electric shock that it received in the past. The new goal was to connect that same painful shock memory to a context where the mouse had not actually received a shock. © 2012 American Association for the Advancement of Science
Keyword: Learning & Memory
Link ID: 18416 - Posted: 07.27.2013
Researchers in Canada and Ireland have discovered that blood pressure drugs, known as ACE inhibitors, can improve brain function while slowing down the onset of dementia. ACE inhibitors, known by names such as ramipril and perindopril, have already been shown in previous studies to delay the onset of dementia. What the medical community didn’t know was that these drugs may also enhance cognitive function. The study, published in the British Medical Journal, concludes that the use of ACE inhibitors could become useful in the management of dementia. The study examined 361 patients, all of whom had been diagnosed with Alzheimer’s, vascular dementia (triggered by lack of blood supply to the brain) or a mix of the two. Many Alzheimer's patients suffer dementia, which can affect memory, thinking, reasoning, planning and the ability to speak. Eighty-five of the patients were already taking the ACE inhibitors while the rest were not. Researchers also separately tested 30 patients, put on the drugs for the first time, for changes in their brain function. The average age was 77 and participants were followed for one year. © CBC 2013
by Virginia Morell

The next time your dog digs a hole in the backyard after watching you garden, don't punish him. He's just imitating you. A new study reveals that our canine pals are capable of copying our behavior as long as 10 minutes after it's happened. The ability is considered mentally demanding and, until this discovery, something that only humans and apes were known to do. Scientists first discovered that dogs are excellent at imitating their owners in 2006. Or at least, one dog had the talent: Philip, a 4-year-old Belgian Tervuren working with József Topál, a behavioral ethologist at the Hungarian Academy of Sciences in Budapest. Topál adapted the method (called "Do as I do") that Keith and Catherine Hayes developed in the 1950s for teaching an infant chimpanzee to copy their actions. Philip was already a trained assistant dog for his disabled owner and readily followed Topál's commands. First, Topál told him to stay, and then commanded "Do as I do." The researcher then performed a simple action, such as jumping in place, barking, putting an object in a box, or carrying it to Philip's owner. Next, Topál ordered, "Do it!", and Philip responded by matching the scientist's actions. The experiment was designed to explore dogs' imitative abilities, not to measure how long Philip's memory lasted; but his owner used Philip's skill to teach him how to do new, useful behaviors, such as fetching objects or putting things away. Despite Philip's abilities, "nobody really cared, or saw that it could be useful for investigating how dogs learn or see their world," says Ádám Miklósi, a behavioral ethologist at Eötvös Loránd University in Budapest who was part of Topál's team. And in 2009, another team concluded that dogs were only able to correctly imitate if there was no more than a 5-second delay between watching the action and repeating it. With such a short retention span, dogs' vaunted imitation skills seemed useless. 
© 2010 American Association for the Advancement of Science
by Virginia Morell

A single cue—the taste of a madeleine, a small cake, dipped in lime tea—was all Marcel Proust needed to be transported down memory lane. He had what scientists term an autobiographical memory of the events, a type of memory that many researchers consider unique to humans. Now, a new study argues that at least two species of great apes, chimpanzees and orangutans, have a similar ability; in zoo experiments, the animals drew on 3-year-old memories to solve a problem. The findings are the first report of such a long-lasting memory in nonhuman animals. The work supports the idea that autobiographical memory may have evolved as a problem-solving aid, but researchers caution that the type of memory system the apes used remains an open question. Elephants can remember, they say, but many scientists think that animals have a very different kind of memory than our own. Many can recall details about their environment and routes they've traveled. But having explicit autobiographical memories of things "I" did, or remembering events that occurred in the past, or imagining those in the future—so-called mental time travel—are considered by many psychologists to be uniquely human skills. Until recently, scientists argued that animals are stuck in time, meaning that they have no sense of the past or future and that they aren't able to recall specific events from their lives—that is, they don't have episodic memories, the what-where-when of an event that happened. © 2010 American Association for the Advancement of Science