Chapter 17. Learning and Memory
By Melinda Wenner Moyer

What if you could pop a pill that made you smarter? It sounds like a Hollywood movie plot, but a new systematic review suggests that the decades-long search for a safe and effective “smart drug” might have notched its first success. Researchers have found that modafinil boosts higher-order cognitive function without causing serious side effects.

Modafinil, which has been prescribed in the U.S. since 1998 to treat sleep-related conditions such as narcolepsy and sleep apnea, heightens alertness much as caffeine does. A number of studies have suggested that it could provide other cognitive benefits, but results were uneven. To clear up the confusion, researchers then at the University of Oxford analyzed 24 studies published between 1990 and 2014 that specifically looked at how modafinil affects cognition. In their review, which was published last year in European Neuropsychopharmacology, they found that the methods used to evaluate modafinil strongly affected the outcomes. Research that looked at the drug's effects on the performance of simple tasks—such as pressing a particular button after seeing a certain color—did not detect many benefits. Yet studies that asked participants to do complex and difficult tasks after taking modafinil or a placebo found that those who took the drug were more accurate, which suggests that it may affect “higher cognitive functions—mainly executive functions but also attention and learning,” explains study co-author Ruairidh Battleday, now a medical doctor and Ph.D. student at the University of California, Berkeley.

But don't run to the pharmacy just yet. Although many doctors very likely prescribe the drug off-label to help people concentrate—indeed, a 2008 survey by the journal Nature found that one in five of its readers had taken brain-boosting drugs, and half of those people had used modafinil—trials have not yet been done on modafinil's long-term effectiveness or safety.

© 2016 Scientific American
Laura Sanders

NEW YORK — Cells in a brain structure known as the hippocampus are known to be cartographers, drawing mental maps of physical space. But new studies show that this seahorse-shaped hook of neural tissue can also keep track of social space, auditory space and even time, deftly mapping these various types of information into their proper places. Neuroscientist Rita Tavares described details of one of these new maps April 2 at the annual meeting of the Cognitive Neuroscience Society.

Brain scans had previously revealed that activity in the hippocampus was linked to movement through social space. In an experiment reported last year in Neuron, people went on a virtual quest to find a house and job by interacting with a cast of characters. Through these social interactions, the participants formed opinions about how much power each character held, and how kindly they felt toward him or her. These judgments put each character in a position on a “social space” map. Activity in the hippocampus was related to this social mapmaking, Tavares and colleagues found.

It turns out that this social map depends on the traits of the person who is drawing it, says Tavares, of the Icahn School of Medicine at Mount Sinai in New York City. People with more social anxiety tended to give more power to characters they interacted with. What’s more, these people's social space maps were smaller overall, suggesting that they explored social space less, Tavares says. Tying these behavioral traits to the hippocampus may lead to a greater understanding of social behavior — and how this social mapping may go awry in psychiatric conditions, Tavares said.

© Society for Science & the Public 2000 - 2016.
Keyword: Learning & Memory
Link ID: 22076 - Posted: 04.06.2016
By Daniel Galef

Footage from a revolutionary behavioural experiment showed non-primates making and using tools just like humans. In the video, a crow is trying to get food out of a narrow vessel, but its beak is too short for it to reach through the container. Nearby, the researchers placed a straight wire, which the crow bent against a nearby surface into a hook. Then, holding the hook in its beak, it fished the food from the bottle.

Corvids—the family of birds that includes crows, ravens, rooks, jackdaws, and jays—are pretty smart overall. Although not to the level of parrots and cockatoos, ravens can also mimic human speech. They also have a highly developed system of communication and are believed to be among the most intelligent non-primate animals in existence.

McGill Professor Andrew Reisner recalls meeting a graduate student studying corvid intelligence at Oxford University when these results were first published in 2015. “I had read early in the year that some crows had been observed making tools, and I mentioned this to him,” Reisner explained. “He said that he knew about that, as it had been he who had first observed it happening. Evidently the graduate students took turns watching the ‘bird box,’ […] and the tool making first occurred there on his shift.”
Laura Sanders

NEW YORK — Sometimes forgetting can be harder than remembering. When people forced themselves to forget a recently seen image, select brain activity was higher than when they tried to remember that image. Forgetting is often a passive process, one in which the memory slips out of the brain, Tracy Wang of the University of Texas at Austin said April 2 at the annual meeting of the Cognitive Neuroscience Society. But in some cases, forgetting can be deliberate.

Twenty adults saw images of faces, scenes and objects while an fMRI scanner recorded their brains’ reactions to the images. If instructed to forget the preceding image, people were less likely to remember that image later. Researchers used the scan data to build a computer model that could infer how strongly the brain responds to each particular kind of image. In the ventral temporal cortex, a part of the brain above the ear, brain patterns elicited by a particular image were stronger when a participant was told to forget the sight than when instructed to remember it.

Of course, everyone knows that it’s easy to forget something without even trying. But these results show that intentional forgetting isn’t a passive process — the brain has to actively work to wipe out a memory on purpose.

Citations: T.H. Wang et al. Forgetting is more work than remembering. Annual meeting of the Cognitive Neuroscience Society, New York City, April 2, 2016.

© Society for Science & the Public 2000 - 2016
Keyword: Learning & Memory
Link ID: 22068 - Posted: 04.05.2016
The mystery is starting to untangle. It has long been known that twisted fibres of a protein called tau collect in the brain cells of people with Alzheimer’s, but their exact role in the disease is unclear. Now a study in mice has shown how tau interferes with the strengthening of connections between neurons – the key mechanism by which we form memories.

In healthy cells, the tau protein helps to stabilise microtubules that act as rails for transporting materials around the cell. In people with Alzheimer’s, these proteins become toxic, but an important unanswered question is what forms of tau are toxic: the tangles may not be the whole story.

In the new study, Li Gan and her colleagues at the Gladstone Institute of Neurological Disease in San Francisco found that the brains of those with Alzheimer’s have high levels of tau with a particular modification, called acetylated tau. They then looked at what acetylated tau does in a mouse model of Alzheimer’s, finding that it accumulates at synapses – the connections between neurons. When we form memories, synapses become strengthened through extra receptors inserted into the cell membranes, and this heightens their response. But acetylated tau depletes another protein called KIBRA, which is essential for this synapse-strengthening mechanism.

“We’re excited because we think we now have a handle on the link between tau and memory,” says Gan. “We’re also cautious because we know this may not be the only link. It’s still early days in understanding the mechanism.”

© Copyright Reed Business Information Ltd.
By David Z. Hambrick

Nearly a century after James Truslow Adams coined the phrase, the “American dream” has become a staple of presidential campaign speeches. Kicking off her 2016 campaign, Hillary Clinton told supporters that “we need to do a better job of getting our economy growing again and producing results and renewing the American dream.” Marco Rubio lamented that “too many Americans are starting to doubt” that it is still possible to achieve the American dream, and Ted Cruz asked his supporters to “imagine a legal immigration system that welcomes and celebrates those who come to achieve the American dream.” Donald Trump claimed that “the American dream is dead” and Bernie Sanders quipped that for many “the American dream has become a nightmare.”

But the American dream is not just a pie-in-the-sky notion—it’s a scientifically testable proposition. The American dream, Adams wrote, “is not a dream of motor cars and high wages merely, but a dream of social order in which each man and each woman shall be able to attain to the fullest stature of which they are innately capable…regardless of the fortuitous circumstances of birth or position.” In the parlance of behavioral genetics—the scientific study of genetic influences on individual differences in behavior—Adams’ idea was that all Americans should have an equal opportunity to realize their genetic potential.

A study just published in Psychological Science by psychologists Elliot Tucker-Drob and Timothy Bates reveals that this version of the American dream is in serious trouble. Tucker-Drob and Bates set out to evaluate evidence for the influence of genetic factors on IQ-type measures (aptitude and achievement) that predict success in school, work, and everyday life. Their specific question was how the contribution of genes to these measures would compare at low versus high levels of socioeconomic status (or SES), and whether the results would differ across countries.

The results reveal, ironically, that the American dream is more of a reality for other countries than it is for America: genetic influences on IQ were uniform across levels of SES in Western Europe and Australia, but, in the United States, were much higher for the rich than for the poor.

© 2016 Scientific American
Chris French

The fallibility of human memory is one of the best-established findings in psychology. There have been thousands of demonstrations of the unreliability of eyewitness testimony under well-controlled conditions dating back to the very earliest years of the discipline. Relatively recently, it was discovered that some apparent memories are not just distorted memories of witnessed events: they are false memories for events that simply never took place at all. Psychologists have developed several reliable methods for implanting false memories in a sizeable proportion of experimental participants.

It is only in the last few years, however, that scientists have begun to systematically investigate the phenomenon of non-believed memories. These are subjectively vivid memories of personal experiences that an individual once believed were accurate but now accepts are not based upon real events.

Prior to this, there were occasional anecdotal reports of non-believed memories. One of the most famous was provided by the influential developmental psychologist Jean Piaget. He had a clear memory of almost being kidnapped at about the age of two and of his brave nurse beating off the attacker. His grateful family were so impressed with the nurse that they gave her a watch as a reward. Years later, the nurse confessed that she had made the whole story up. Even after he no longer believed that the event had taken place, Piaget still retained his vivid and detailed memory of it.

© 2016 Guardian News and Media Limited
Keyword: Learning & Memory
Link ID: 22050 - Posted: 03.30.2016
Brendan Maher

It took less than a minute of playing League of Legends for a homophobic slur to pop up on my screen. Actually, I hadn't even started playing. It was my first attempt to join what many agree to be the world's leading online game, and I was slow to pick a character. The messages started to pour in. “Pick one, kidd,” one nudged. Then, “Choose FA GO TT.” It was an unusual spelling, and the spaces may have been added to ease the word past the game's default vulgarity filter, but the message was clear.

Online gamers have a reputation for hostility. In a largely consequence-free environment inhabited mostly by anonymous and competitive young men, the antics can be downright nasty. Players harass one another for not performing well and can cheat, sabotage games and do any number of things to intentionally ruin the experience for others — a practice that gamers refer to as griefing. Racist, sexist and homophobic language is rampant; aggressors often threaten violence or urge a player to commit suicide; and from time to time, the vitriol spills beyond the confines of the game. In the notorious 'gamergate' controversy that erupted in late 2014, several women involved in the gaming industry were subjected to a campaign of harassment, including invasions of privacy and threats of death and rape.

League of Legends has 67 million players and grossed an estimated US$1.25 billion in revenue last year. But it also has a reputation for toxic in-game behaviour, which its parent company, Riot Games in Los Angeles, California, sees as an obstacle to attracting and retaining players.

© 2016 Nature Publishing Group
By Patrick Monahan

Yesterday, mountaineer Richard Parks set out for Kathmandu to begin some highly unusual data-gathering. As part of Project Everest Cynllun, he will climb Mount Everest without supplemental oxygen and perform—on himself—a series of blood draws, muscle biopsies, and cognitive tests. If he makes it to the summit, these will be the highest-elevation blood and tissue samples ever collected. Damian Bailey, a physiologist at the University of South Wales, Pontypridd, in the United Kingdom and the project’s lead scientist, hopes the risky experiment will yield new information about how the human body responds to low-oxygen conditions, and how similar mechanisms might drive cognitive decline with aging. As Parks began the acclimatization process with warm-up climbs on two smaller peaks, Bailey told ScienceInsider about his ambitions for the project. This interview has been edited for clarity and brevity.

Q: Parks is an extreme athlete who has climbed Everest before. What can his performance tell us about regular people?

A: What we’re trying to understand is, what is it about Richard’s brain that is potentially different from other people’s brains, and can that provide us with some clues to accelerated cognitive decline, which occurs with aging [and] dementia. We know that sedentary aging is associated with a progressive decline in blood flow to the brain. … And the main challenge for sedentary aging is we have to wait so long to see the changes occurring. So this is almost a snapshot, a day in the life of a patient with cognitive decline.

© 2016 American Association for the Advancement of Science.
Healthy body, healthy mind. Elderly people who are physically active seem to be able to stave off memory loss – but only if they start exercising before symptoms appear. At the end of a five-year period, the brains of non-exercisers look 10 years older than those of people who did moderate exercise.

That’s what Clinton Wright at the University of Miami in Florida and his colleagues found when they followed 876 people, starting at an average age of 71, for five years. At the start of the study, each participant underwent a number of memory and cognition tests, and had the health of their brain assessed during an MRI scan. Each person was also asked how much exercise they had done in recent weeks, ranging from “no/light”, such as walking or gardening, to “moderate/heavy”, which included running and swimming.

Five years later, the volunteers were called back to repeat all the tests. The participants generally performed less well than they had five years earlier. But their scores were linked to their level of exercise – those who reported no or low levels of exercise scored lower in all tests, the team found. The 10 per cent of people who said they had been engaged in moderate-to-heavy exercise not only started with higher scores in the first round of tests, but showed less of a decline five years later.

Those who did little or no exercise also seemed to have worse vascular health – they had higher blood pressure, and their MRI scans showed evidence of undetected strokes.

© Copyright Reed Business Information Ltd.
By NATALIE ANGIER

Juan F. Masello never intended to study wild parrots. Twenty years ago, as a graduate student visiting the northernmost province of Patagonia in Argentina, he planned to write his dissertation on colony formation among seabirds. But when he asked around for flocks of, say, cormorants or storm petrels, a park warden told him he was out of luck. “He said, ‘This is the only part of Patagonia with no seabird colonies,’” recalled Dr. Masello, a principal investigator in animal ecology and systematics at Justus Liebig University in Germany. Might the young scientist be interested in seeing a large colony of parrots instead?

The sight that greeted Dr. Masello was “amazing” and “incredible,” he said. “It was almost beyond words.” On a 160-foot-high sandstone cliff that stretched some seven miles along the Atlantic coast, tens of thousands of pairs of burrowing parrots had used their powerful bills to dig holes — their nests — deep into the rock face. And when breeding season began not long afterward, the sky around the cliffs erupted into a raucous carnival of parrot: 150,000 crow-size, polychromed aeronauts with olive backsides, turquoise wings, white epaulets and bright red belly patches ringed in gold.

Dr. Masello was hooked. Today, Dr. Masello’s hands are covered with bite scars. He has had four operations to repair a broken knee, a broken nose — “the little accidents you get from working with parrots,” he said. Still, he has no regrets. “Their astonishing beauty and intelligence,” Dr. Masello said, “are inspirational.”

© 2016 The New York Times Company
By Manuel Valdes

For nearly every step of his almost 12-mile walks around Seattle, Darryl Dyer has company. Flocks of crows follow him, signaling each other, because they all know that he’s the guy with the peanuts. “They know your body type. The way you walk,” Dyer said. “They’ll take their young down and say: ‘You want to get to know this guy. He’s got the food.’ ”

Scientists have known for years that crows have great memories, that they can recognize a human face and behavior, that they can pass that information on to their offspring. Researchers are trying to understand more about the crow’s brain and behavior, specifically what the birds do when they see one of their own dead. They react loudly, but the reasons aren’t entirely known. Among the guesses is that they are mourning; given that crows mate for life, losing a partner could be a significant moment for the social animals. There are anecdotes of crows placing sticks and other objects on dead birds — a funeral of sorts.

Using masks with dark-haired wigs that looked creepily nonhuman, researchers showed up at Seattle parks carrying a stuffed crow and recorded the reactions. One crow signals an alarm, then dozens show up. They surround the dead crow, looking at it as they perch on trees or fly above it, a behavior called mobbing.

“Crows have evolved to have these complex social relationships, and they have a big brain,” said Kaeli Swift, a University of Washington graduate student who led the study.
Mo Costandi

In order to remember, we must forget. Recent research shows that when your brain retrieves newly encoded information, it suppresses older related information so that it does not interfere with the process of recall. Now a team of European researchers has identified a neural pathway that induces forgetting by actively erasing memories. The findings could eventually lead to novel treatments for conditions such as post-traumatic stress disorder (PTSD).

We’ve known since the early 1950s that a brain structure called the hippocampus is critical for memory formation and retrieval, and subsequent work using modern techniques has revealed a great deal of information about the underlying cellular mechanisms. The hippocampus contains neural circuits that loop through three of its sub-regions – the dentate gyrus and the CA3 and CA1 areas – and it’s widely believed that memories form by the strengthening and weakening of synaptic connections within these circuits.

The dentate gyrus gives rise to so-called mossy fibres, which form the main ‘input’ to the hippocampus, by relaying sensory information from an upstream region called the entorhinal cortex first to CA3 and then on to CA1. It’s thought that the CA3 region integrates the information to encode, store, and retrieve new memories, before transferring them to the cerebral cortex for long-term storage. Exactly how each of these hippocampal sub-regions contributes to memory formation, storage, and retrieval is still not entirely clear, however.

© 2016 Guardian News and Media Limited
Keyword: Learning & Memory
Link ID: 22007 - Posted: 03.19.2016
Laura Sanders

Using flashes of blue light, scientists have pulled forgotten memories out of the foggy brains of mice engineered to have signs of early Alzheimer’s disease. This memory rehab feat, described online March 16 in Nature, offers new clues about how the brain handles memories, and how that process can go awry.

The result “provides a theoretical mechanism for reviving old, forgotten memories,” says Yale School of Medicine neurologist Arash Salardini. Memory manipulations, such as the retrieval of lost memories and the creation of false memories, were “once the realm of science fiction,” he says. But this experiment and other recent work have now accomplished these feats, at least in rodents (SN: 12/27/14, p. 19), he says.

To recover a lost memory, scientists first had to mark it. Neuroscientist Susumu Tonegawa of MIT and colleagues devised a system that tagged the specific nerve cells that stored a memory — in this case, an association between a particular cage and a shock. A virus delivered a gene for a protein that allowed researchers to control this collection of memory-holding nerve cells. The genetic tweak caused these cells to fire off signals in response to blue laser light, letting Tonegawa and colleagues call up the memory with light delivered by an optic fiber implanted in the brain.

A day after receiving a shock in a particular cage, mice carrying two genes associated with Alzheimer’s seemed to have forgotten their ordeal; when put back in that cage, these mice didn’t seem as frightened as mice without the Alzheimer’s-related genes. But when the researchers used light to restore this frightening memory, it caused the mice to freeze in place in a different cage. (Freezing in a new venue showed that laser activation of the memory cells, and not environmental cues, caused the fear reaction.)

© Society for Science & the Public 2000 - 2016. All rights reserved.
THERE they are! Newborn neurons vital for memory have been viewed in a live brain for the first time. The work could aid treatments for anxiety and stress disorders.

Attila Losonczy at Columbia University Medical Center in New York and his team implanted a tiny microscope into the brains of live mice whose brain cells had been modified to make newly generated neurons glow. The mice then ran on a treadmill as the team tweaked the surrounding sights, smells and sounds. The researchers paired a small electric shock with some cues, so the mice learned to associate these with an unpleasant experience.

They then deactivated the newborn neurons – present in areas of the brain responsible for learning and memory – using optogenetics, which switches off specific cells with light. After this, the mice were unable to tell the difference between the scary and safe cues, becoming fearful of them all (Neuron, doi.org/bc7v). “It suggests that newborn cells do something special that allows animals to tell apart and separate memories,” says Losonczy.

An inability to discriminate between similar sensory information triggered by different events – such as the sound of a gunshot and a car backfiring – is often seen in panic and anxiety disorders, such as PTSD. This suggests that new neurons, or a lack of them, plays a part in such conditions and could guide novel treatments.

© Copyright Reed Business Information Ltd.
Barbara Bradley Hagerty

Faced with her own forgetfulness, former NPR correspondent and author Barbara Bradley Hagerty tried to do something about it. She's written about her efforts in her book on midlife, called Life Reimagined. To her surprise, she discovered that an older dog can learn new tricks.

A confession: I loathe standardized tests, and one of the perks of reaching midlife is that I thought I'd never have to take another. But lately I've noticed that in my 50s, my memory isn't the same as it once was. And so I decided to take a radical leap into the world of brain training.

At the memory laboratory at the University of Maryland, manager Ally Stegman slides a sheet of paper in front of me. It has a series of boxes containing different patterns and one blank space. My job is to figure out the missing pattern. The test measures a sort of raw intelligence, the ability to figure out novel problems. Time races by. It takes me two minutes to crack the first question. I am stumped by the second and third. Finally, I begin to guess. After 25 minutes, the test is over, and to my relief, Stegman walks in. This test was really, really hard.

The reason I am here, voluntarily reliving my nightmare, is simple: I want to tune up my 50-something brain. So over the next month, I will do brain-training exercises, then come back, take the test again and see if I made myself smarter.

© 2016 npr
By Julia Shaw

Our brains play tricks on us all the time, and these tricks can mislead us into believing we can accurately reconstruct our personal past. In reality, false memories are everywhere. False memories are recollections of things that you never actually experienced. These can be small memory errors, such as thinking you saw a yield sign when you actually saw a stop sign, or big errors like thinking you took a hot air balloon ride that never actually happened.

If you want to know more about how we can come to misremember complex autobiographical events, here is a recipe and here is a video with footage from my own research. A few weeks ago I reached out to see what you actually wanted to know about this phenomenon on Reddit, and here are the answers to my six favorite questions.

1. Is there any way a person can check if their own memories are real or false?

The way that I have interpreted the academic literature, once they take hold, false memories are no different from true memories in the brain. This means that they have the same properties as any other memories, and are indistinguishable from memories of events that actually happened. The only way to check is to find corroborating evidence for any particular memory that you are interested in “validating”.

© 2016 Scientific American
Keyword: Learning & Memory
Link ID: 21985 - Posted: 03.15.2016
By Emily Underwood

Nestled deep within a brain region that processes memory is a sliver of tissue that continually sprouts brand-new neurons, at least into late adulthood. A study in mice now provides the first glimpse at how these newborn neurons behave in animals as they learn, and hints at the purpose of the new arrivals: to keep closely related but separate memories distinct.

A number of previous studies have suggested that the birth of new neurons is key to memory formation. In particular, scientists believe the new cell production—known as neurogenesis—plays a role in pattern separation, the ability to discriminate between similar experiences, events, or contexts based on sensory cues such as a certain smell or visual landmark. Pattern separation helps us use cues such as the presence of a particular tree or cars nearby, for example, to distinguish which parking space we chose today, as opposed to yesterday or the day before. This ability appears to be particularly diminished in people with anxiety and mood disorders.

Scientists can produce deficits in pattern separation in animals by blocking neurogenesis, using x-ray radiation to kill targeted populations of cells in the dentate gyrus. Because such studies have not established the precise identity of which cells are being recorded from, however, no one has been able to address the “burning question” in the field: "how young, adult-born neurons and mature dentate granule neurons differ in their activity," says Amar Sahay, a neuroscientist at the Massachusetts General Hospital and Harvard Medical School.

© 2016 American Association for the Advancement of Science
How is the brain able to use past experiences to guide decision-making? A few years ago, researchers supported by the National Institutes of Health discovered in rats that awake mental replay of past experiences is critical for learning and making informed choices. Now, the team has discovered key secrets of the underlying brain circuitry – including a unique system that encodes location during inactive periods.

“Advances such as these in understanding cellular and circuit-level processes underlying such basic functions as executive function, social cognition, and memory fit into NIMH’s mission of discovering the roots of complex behaviors,” said NIMH acting director Bruce Cuthbert, Ph.D.

While a rat is moving through a maze — or just mentally replaying the experience — an area in the brain’s memory hub, or hippocampus, specialized for locations, called CA1, communicates with a decision-making area in the executive hub or prefrontal cortex (PFC). A distinct subset of PFC neurons excited during mental replay of the experience are activated during movement, while another distinct subset, less engaged during movement in the maze – and therefore potentially distracting – are inhibited during replay.

“Such strongly coordinated activity within this CA1-PFC circuit during awake replay is likely to optimize the brain’s ability to consolidate memories and use them to decide on future action,” explained Shantanu Jadhav, Ph.D., now an assistant professor at Brandeis University, Waltham, Mass., the study’s co-first author. His contributions to this line of research were made possible, in part, by a Pathway to Independence award from the Office of Research Training and Career Development of the NIH’s National Institute of Mental Health (NIMH).
By Roberto A. Ferdman

In the mid-1970s, psychologist Merrill Elias began tracking the cognitive abilities of more than a thousand people in the state of New York. The goal was fairly specific: to observe the relationship between people's blood pressure and brain performance. And for decades he did just that, eventually expanding the Maine-Syracuse Longitudinal Study (MSLS) to observe other cardiovascular risk factors, including diabetes, obesity, and smoking. There was never an inkling that his research would lead to any sort of discovery about chocolate. And yet, 40 years later, it seems to have done just that.

Late in the study, Elias and his team had an idea. Why not ask the participants what they were eating too? It wasn't unreasonable to wonder if what someone ate might add to the discussion. Diets, after all, had been shown to affect the risk factors Elias was already monitoring. Plus, they had this large pool of participants at their disposal, a perfect chance to learn a bit more about the decisions people were making about food.

The researchers incorporated a new questionnaire into the sixth wave of their data collection, which spanned the five years between 2001 and 2006 (there have been seven waves in all, each conducted in five-year intervals). The questionnaire gathered all sorts of information about the dietary habits of the participants. And the dietary habits of the participants revealed an interesting pattern.

"We found that people who eat chocolate at least once a week tend to perform better cognitively," said Elias. "It's significant—it touches a number of cognitive domains."

© 1996-2016 The Washington Post