Chapter 17. Learning and Memory
By Christian Jarrett One of the saddest things about loneliness is that it leads to what psychologists call a “negative spiral.” People who feel isolated come to dread bad social experiences and they lose faith that it’s possible to enjoy good company. The usual result, as Melissa Dahl recently noted, is more loneliness. This hardly seems adaptive, but experts say it’s because we’ve evolved to enter a self-preservation mode when we’re alone. Without the backup of friends and family, our brains become alert to threat, especially the potential danger posed by strangers. Until now, much of the evidence to support this account has come from behavioral studies. For example, when shown a video depicting a social scene, lonely people spend more time than others looking at signs of social threat, such as a person being ignored by their friends or one person turning their back on another. Unpublished work also shows that lonely people’s attention seems to be grabbed more quickly by words that pertain to social threat, such as rejected or unwanted. Now the University of Chicago’s husband-and-wife research team of Stephanie and John Cacioppo — leading authorities on the psychology and neuroscience of loneliness — have teamed up with their colleague, Stephen Balogh, to provide the first evidence that lonely people’s brains, compared to the non-lonely, are exquisitely alert to the difference between social and nonsocial threats. The finding, reported online in the journal Cortex, supports their broader theory that, for evolutionary reasons, loneliness triggers a cascade of brain-related changes that put us into a socially nervous, vigilant mode. The researchers used a loneliness questionnaire to recruit 38 very lonely people and 32 people who didn’t feel lonely (note that loneliness was defined here as the subjective feeling of isolation, as opposed to the number of friends or close relatives one has). 
Next, the researchers placed an electrode array of 128 sensors on each of the participants’ heads, allowing them to record the participants’ brain waves using an established technique known as electro-encephalography (EEG) that’s particularly suited to measuring brain activity changes over very short time periods. © 2015, New York Media LLC.
Steve Connor A computer game designed by neuroscientists has helped patients with schizophrenia to recover their ability to carry out everyday tasks that rely on having good memory, a study has found. Patients who played the game regularly for a month were four times better than non-players at remembering the kind of things that are critical for normal, day-to-day life, researchers said. The computer game was based on scientific principles that are known to “train” the brain in episodic memory, which helps people to remember events such as where they parked a car or placed a set of keys, said Professor Barbara Sahakian of Cambridge University, the lead author of the study. People recovering from schizophrenia suffer serious lapses in episodic memory which prevent them from returning to work or studying at university, so anything that can improve the ability of the brain to remember everyday events will help them to lead a normal life, Professor Sahakian said. Schizophrenia affects about one in every hundred people and results in hallucinations and delusions. “This kind of memory is essential for everyday learning and everything we do really both at home and at work. We have formulated an iPad game that could drive the neural circuitry behind episodic memory by stimulating the ability to remember where things were on the screen,” Professor Sahakian said. © independent.co.uk
Michael Sullivan It's 5:45 in the morning, and in a training field outside Siem Reap, home of Angkor Wat, Cambodia's demining rats are already hard at work. Their noses are close to the wet grass, darting from side to side, as they try to detect explosives buried just beneath the ground. Each rat is responsible for clearing a 200-square-meter (239-square-yard) patch of land. Their Cambodian supervisor, Hulsok Heng, says they're good at it. "They are very good," he says. "You see this 200 square meters? They clear in only 30 minutes or 35 minutes. If you compare that to a deminer, maybe two days or three days. The deminer will pick up all the fragmentation, the metal in the ground, but the rat picks up only the smell of TNT. Not fragmentation or metal or a nail or a piece of crap in the ground." That's right: Someone using a metal-detecting machine will take a lot longer to detect a land mine than a rat using its nose. There's plenty of work for the rats here in Cambodia. The government estimates there are 4 million to 6 million land mines or other pieces of unexploded ordnance — including bombs, shells and grenades — littering the countryside, remnants of decades of conflict. Neighboring Vietnam and Laos also have unexploded ordnance left over from the Vietnam War. Dozens of people are killed or maimed in the region every year — and there's a financial toll as well, since the presence of these potentially deadly devices decreases the amount of land available to farmers. © 2015 NPR
By Robert Gebelhoff Just in case sea snails aren't slow enough, new research has found that they get more sluggish when they grow old — and the discovery is helping us to understand how memory loss happens in humans. It turns out that the sea snail, which has a one-year lifespan, is actually a good model for studying nerve cells and how the nervous system works in people. How neurons work is fundamentally identical in almost all animals, and the simplicity of the snail's body gives researchers the chance to view how the system works more directly. "You can count the number of nerve cells that are relevant to a reflex," said Lynne Fieber, a professor at the University of Miami who leads research with the snails at the school. She and a team of researchers have been using the slimy little critters to learn how nerve cells respond to electric shock. They "taught" the snails to quickly contract their muscle tails by administering electric shocks and then poking the tails, a process called "sensitization." They then studied the responses at various ages. The scientists, whose work was published this week in the journal PLOS ONE, found that the senior-citizen specimens do not learn to contract in response to the shock very well. As the snails grew older, their tail startle reflex lessened, and then disappeared. So I guess you could say the frail snails' tails fail to avail (okay, I'll stop).
By Bret Stetka The brain is extraordinarily good at alerting us to threats. Loud noises, noxious smells, approaching predators: they all send electrical impulses buzzing down our sensory neurons, pinging our brain’s fear circuitry and, in some cases, causing us to fight or flee. The brain is also adept at knowing when an initially threatening or startling stimulus turns out to be harmless or resolved. But sometimes this system fails and unpleasant associations stick around, a malfunction thought to be at the root of post-traumatic stress disorder (PTSD). New research has identified a neuronal circuit responsible for the brain’s ability to purge bad memories, findings that could have implications for treating PTSD and other anxiety disorders. Like most emotions, fear is neurologically complicated. But previous work has consistently implicated two specific areas of the brain as contributing to and regulating fear responses. The amygdala, two small arcs of brain tissue deep beneath our temples, is involved in emotional reactions, and it flares with activity when we are scared. If a particular threat turns out to be harmless, a brain region behind the forehead called the prefrontal cortex steps in and the fright subsides. Our ability to extinguish painful memories is known to involve some sort of coordinated effort between the amygdala and the prefrontal cortex. The new study, led by Andrew Holmes at the National Institutes of Health, however, confirms that a working connection between the two brain regions is necessary to do away with fear. Normally mice that repeatedly listen to a sound previously associated with a mild foot shock will learn that on its own the tone is harmless, and they will stop being afraid. Using optogenetic stimulation technology, or controlling specific neurons and animal behavior using light, the authors found that disrupting the amygdala–prefrontal cortex connection prevents mice from overcoming the negative association with the benign tone. 
In neurobiology speak, memory “extinction” fails to occur. They also found that the opposite is true—that stimulating the circuit results in increased extinction of fearful memories. © 2015 Scientific American
By Neuroskeptic According to British biochemist Donald R. Forsdyke in a new paper in Biological Theory, the existence of people who seem to be missing most of their brain tissue calls into question some of the “cherished assumptions” of neuroscience. I’m not so sure. Forsdyke discusses the disease called hydrocephalus (‘water on the brain’). Some people who suffer from this condition as children are cured thanks to prompt treatment. Remarkably, in some cases, these post-hydrocephalics turn out to have grossly abnormal brain structure: huge swathes of their brain tissue are missing, replaced by fluid. Even more remarkably, in some cases, these people have normal intelligence and display no obvious symptoms, despite their brains being mostly water. This phenomenon was first noted by a British pediatrician called John Lorber. Lorber never published his observations in a scientific journal, although a documentary was made about them. However, his work was famously discussed in Science in 1980 by Lewin in an article called “Is Your Brain Really Necessary?”. There have been a number of other more recent published cases. Forsdyke argues that such cases pose a problem for mainstream neuroscience. If a post-hydrocephalic brain can store the same amount of information as a normal brain, he says, then “brain size does not scale with information quantity”, therefore, “it would seem timely to look anew at possible ways our brains might store their information.”
Chris Woolston A study that did not find cognitive benefits of musical training for young children triggered a “media firestorm”. Researchers often complain about inaccurate science stories in the popular press, but few air their grievances in a journal. Samuel Mehr, a PhD student at Harvard University in Cambridge, Massachusetts, discussed in a Frontiers in Psychology article some examples of media missteps from his own field — the effects of music on cognition. The opinion piece gained widespread attention online, including from researchers such as Arseny Khakhalin, a neuroscientist at Bard College in Annandale-on-Hudson, New York, who shared it on Twitter. Mehr gained first-hand experience of the media as the first author of a 2013 study in PLoS ONE. The study involved two randomized, controlled trials of a total of 74 four-year-olds. For children who did six weeks of music classes, there was no sign that musical activities improved scores on specific cognitive tests compared with children who did six weeks of art projects or took part in no organized activities. The authors cautioned, however, that the lack of effect of the music classes could have been a result of how they did the studies. The intervention in the trials was brief and not especially intensive — the children mainly sang songs and played with rhythm instruments — and older children might have had a different response than the four-year-olds. There are many possible benefits of musical training, Mehr said in an interview, but finding them was beyond the scope of the study. © 2015 Nature Publishing Group
Kashmira Gander Performing well at school and going on to have a complex job could lower the risk of dementia, scientists have found. Conversely, loneliness, watching too much TV and a sedentary lifestyle can make a person’s cognitive abilities decline more quickly, according to new research being presented at the Alzheimer's Association International Conference in Washington DC. Researchers are also due to show attendees the results from trials of Solanezumab – believed to be the first drug to halt the progression of the disease if a patient is diagnosed early enough. One study, which followed 7,500 people aged 65 and above in Sweden over a 20-year period, showed that dementia rates were 21 per cent higher in those whose grades were in the bottom fifth of the population. Meanwhile, participants with complex jobs involving data and numbers saw their chance of developing the disease cut by 23 per cent. In a separate Swedish study, scientists followed the lives of 440 people aged 75 or over for nine years, and discovered that those in the bottom fifth for school grades had a 50 per cent increase in the risk of developing dementia. © independent.co.uk
by Sarah Zielinski It may not be polite to eavesdrop, but sometimes, listening in on others’ conversations can provide valuable information. And in this way, humans are like most other species in the animal world, where eavesdropping is a common way of gathering information about potential dangers. Because alarm calls can vary from species to species, scientists have assumed that eavesdropping on these calls of “danger!” requires some kind of learning. Evidence of that learning has been scant, though. The only study to look at this topic tested five golden-mantled ground squirrels and found that the animals may have learned to recognize previously unknown alarm calls. But the experiment couldn’t rule out other explanations for the squirrels’ behavior, such as that the animals had simply become more wary in general. So Robert Magrath and colleagues at Australian National University in Canberra turned to small Australian birds called superb fairy-wrens. In the wild, these birds will flee to safety when they hear unfamiliar sounds that sound like their own alarm calls, but not when they hear alarm calls that sound different from their own. There’s an exception, though: They’ll take to cover in response to the alarm calls of other species that are common where they live. That suggests the birds learn to recognize those calls. In the lab, the team played the alarm call from a thornbill or a synthetic alarm call to 10 fairy-wrens. The birds didn’t respond to the noise. Then the birds went through two days of training in which the alarm call was played as a mock predator glided overhead. Another group of birds heard the calls but there was no pretend predator. © Society for Science & the Public 2000 - 2015
By Claire Asher Even fish have role models. In a new study, researchers paired up inexperienced fathead minnows (Pimephales promelas, pictured) with two types of mentors: a minnow raised in an environment free of predators or a minnow raised in a dangerous one simulated by the odors of predatory pike and sturgeon. Fish from dangerous environments were fearful of the smell of both unknown and familiar predators, whereas fish that grew up in safety hid when they smelled a known predator but were curious about new smells. Both types of fish passed on their fears to their protégés: Minnows that spent time with fish raised in dangerous environments were scared of all smells they came across, but those that learned from fish raised in safety feared only specific predators and took new experiences in stride, the team reports online this week in the Proceedings of the Royal Society B. The authors say this is the first experiment to show that environment can influence the social transmission of fear and reveals how risk aversion can be learned. The researchers also suggest their study may shed light on how fear disorders such as post-traumatic stress disorder (PTSD) develop in humans, which research shows can be influenced by social environment; PTSD symptoms can be acquired from friends or family who have suffered trauma, for example. © 2015 American Association for the Advancement of Science
By Emily Underwood Glance at a runner's wrist or smartphone, and you'll likely find a GPS-enabled app or gadget ticking off miles and minutes as she tries to break her personal record. Long before FitBit or MapMyRun, however, the brain evolved its own system for tracking where we go. Now, scientists have discovered a key component of this ancient navigational system in rats: a group of neurons called "speed cells" that alter their firing rates with the pace at which the rodents run. The findings may help explain how the brain maintains a constantly updated map of our surroundings. In the 1970s, neuroscientist John O'Keefe, now at University College London, discovered neurons called place cells, which fire whenever a rat enters a specific location. Thirty-five years later, neuroscientists May-Britt and Edvard Moser, now at the Norwegian University of Science and Technology in Trondheim, Norway, discovered a separate group of neurons, called grid cells, which fire at regular intervals as rats traverse an open area, creating a hexagonal grid with coordinates similar to those in GPS. The Mosers and O'Keefe shared last year's Nobel Prize in Physiology or Medicine for their findings, which hint at how the brain constructs a mental map of an animal's environment. Still mysterious, however, is how grid and place cells obtain the information that every GPS system requires: the angle and speed of an object's movement relative to a known starting point, says Edvard Moser, co-author of the new study along with May-Britt Moser, his spouse and collaborator. If the brain does indeed contain a dynamic, internal map of the world, "there has to be a speed signal" that tells the network how far an animal has moved in a given period of time, he says. © 2015 American Association for the Advancement of Science.
OLIVER SACHGAU Marc Lewis spends a lot of his time thinking about addiction. He has good reason to: In his 20s he struggled with his own addiction to opiates. He was eventually able to quit, and began researching addiction and neuroscience. Mr. Lewis became a professor of developmental psychology at the University of Toronto in 1989, and moved to Radboud University in the Netherlands in 2010. His new book, The Biology of Desire: Why Addiction is Not a Disease, looks at the neuroscience of addiction, mixing personal narratives with scientific data. The book will be released in Canada on Aug. 4. You argue addiction is not a disease, but an example of very normal brain activity. What do you mean? [It’s] an exaggerated form of learning. Let’s put it that way. People in neuroscience agree that addiction corresponds with brain changes, and that’s the basis of the disease argument: That addiction changes the brain, or hijacks the brain, as they say. As though it were a pathology or disease process. Whereas I argue that all learning changes – the brain is designed to change – but when you have highly motivated learning, especially something that gets repeated over and over, then the learning curve rises extremely rapidly, and you have a kind of exaggerated learning phenomenon, where the learning is deep and specialized, and blots out other available habits or other available perceptions. You chose to mix hard scientific data with these anecdotal stories. How come? I love that way of writing. It seems to me so amazing that brain changes are going on at the same time as lived experiences: The moment-to-moment changes of thoughts and feelings are completely yoked to changes and activity in your brain, but it’s almost impossible to tell both stories at the same time, because one is under the skin, in terms of cell firings and electrochemical impulses and stuff, and the other one is in terms of behavior and human values and norms and so forth. 
© Copyright 2015 The Globe and Mail Inc
Zoë Corbyn Jesper Noehr, 30, reels off the ingredients in the chemical cocktail he’s been taking every day before work for the past six months. It’s a mixture of exotic dietary supplements and research chemicals that he says gives him an edge in his job without ill effects: better memory, more clarity and focus and enhanced problem-solving abilities. “I can keep a lot of things on my mind at once,” says Noehr, who is chief technology officer for a San Francisco startup. The chemicals he takes, dubbed nootropics from the Greek “noos” for “mind”, are intended to safely improve cognitive functioning. They must not be harmful, have significant side-effects or be addictive. That means well-known “smart drugs” such as the prescription-only stimulants Adderall and Ritalin, popular with swotting university students, are out. What’s left under the nootropic umbrella is a dizzying array of over-the-counter supplements, prescription drugs and unclassified research chemicals, some of which are being trialled in older people with fading cognition. There is no official data on their usage, but nootropics as well as other smart drugs appear popular in Silicon Valley. “I would say that most tech companies will have at least one person on something,” says Noehr. It is a hotbed of interest because it is a mentally competitive environment, says Jesse Lawler, an LA-based software developer and nootropics enthusiast who produces the podcast Smart Drug Smarts. “They really see this as translating into dollars.” But Silicon Valley types also do care about safely enhancing their most prized asset – their brains – which can give nootropics an added appeal, he says. © 2015 Guardian News and Media Limited
By Sarah C. P. Williams The next time you forget where you left your car keys, you might be able to blame an immune protein that builds up in your blood as you age. The protein impairs the formation of new brain cells and contributes to age-related memory loss—at least in mice, according to a new study. Blocking it could help prevent run-of-the-mill memory decline or treat cognitive disorders, the researchers say. “The findings are really exciting,” says neurologist Dena Dubal of the University of California, San Francisco (UCSF), who was not involved in the study. “The importance of this work cannot be underestimated as the world’s population is aging rapidly.” Multiple groups of scientists have shown that adding the blood of older mice to younger animals’ bodies makes them sluggish, weaker, and more forgetful. Likewise, young blood can restore the memory and energy of older mice. Neuroscientist Saul Villeda of UCSF homed in on one actor he thought might be responsible for some of that effect: β2 microglobulin (B2M), an immune protein normally involved in distinguishing one’s own cells from invading pathogens. B2M has also been found at increased levels in patients with Alzheimer’s disease and other cognitive disorders. Villeda and his colleagues first measured B2M levels in the blood of both people and mice of different ages; they found that those levels increased with age. When the researchers injected B2M into 3-month-old mice, the young animals suddenly had trouble remembering how to complete a water maze, making more than twice as many errors after they’d already been trained to navigate the maze. Moreover, their brains had fewer new neurons than other mice. Thirty days later, however, when the protein had been cleared from their bodies, the animals' memory troubles were gone as well, and the number of newly formed brain cells was back to normal. © 2015 American Association for the Advancement of Science
By Michael T. Ullman and Mariel Y. Pullman The human brain possesses an incredible capacity to adapt to new conditions. This plasticity enables us not only to constantly learn but also to overcome brain injury and loss of function. Take away one capability, and little by little we often compensate for these deficits. Our brain may be especially well suited to overcome limitations in the case of psychiatric or neurological conditions that originate early in life, what clinicians call neurodevelopmental disorders. Given the brain's considerable plasticity during early years, children with these disorders may have particular advantages in learning compensatory strategies. It now appears that a single brain system—declarative memory—can pick up slack for many kinds of problems across multiple neurodevelopmental disorders. This system, rooted in the brain's hippocampus, is what we typically refer to when we think of learning and memory. It allows us to memorize facts and names or recall a first grade teacher or a shopping list. Whereas other memory systems are more specialized—helping us learn movements or recall emotional events, for instance—declarative memory absorbs and retains a much broader range of knowledge. In fact, it may allow us to learn just about anything. Given declarative memory's powerful role in learning, one might expect it to help individuals acquire all kinds of compensatory strategies—as long as it remains functional. Indeed, research suggests that it not only remains largely intact but also compensates for diverse impairments in five common conditions that are rarely studied in conjunction: autism spectrum disorder, obsessive-compulsive disorder (OCD), Tourette's syndrome, dyslexia and developmental language disorder (which is often referred to as specific language impairment, or SLI). © 2015 Scientific American
By David Robson William’s internal clock is eternally jammed at 13:40 on 14 March 2005 – right in the middle of a dentist appointment. A member of the British Armed Forces, he had returned to his post in Germany the night before after attending his grandfather’s funeral. He had gym in the morning, where he played volleyball for 45 minutes. He then entered his office to clear a backlog of emails, before heading to the dentist’s for root-canal surgery. “I remember getting into the chair and the dentist inserting the local anaesthetic,” he tells me. After that? A complete blank. It is as if all new memories are being written in invisible ink that slowly disappears. Since then, he has been unable to remember almost anything for longer than 90 minutes. So while he can still tell me about the first time he met the Duke of York for a briefing at the Ministry of Defence, he can’t even remember where he’s living now; he wakes up every morning believing he is still in Germany in 2005, waiting to visit the dentist. Without a record of new experiences, the passing of time means nothing to him. Today, he only knows that there is a problem because he and his wife have written detailed notes on his smartphone, in a file labelled “First thing – read this”. How could minor dental work have affected his brain in such a profound way? This real-life medical mystery offers a rare glimpse at the hidden depths of the brain’s workings. © 2015 BBC.
By SINDYA N. BHANOO Learning can be traced back to individual neurons in the brain, according to a new study. “What we wanted to do was see if we could actually create a new association — a memory — and see if we would be able to see actual change in the neurons,” said Matias Ison, a neuroscientist at the University of Leicester in England and one of the study’s authors. He and his colleagues were able to monitor the brain activity of neurosurgical patients at UCLA Medical Center. The patients already had electrodes implanted in their medial temporal lobes for clinical reasons. The patients were first presented with images of notable people — like Jennifer Aniston, Clint Eastwood and Halle Berry. Then, they were shown images of the same people against different backdrops — like the Eiffel Tower, the Leaning Tower of Pisa and the Sydney Opera House. The same neurons that fired for the images of each of the actors also fired when patients were shown the associated landmark images. In other words, the researchers were able to watch as the patients’ neurons recorded a new memory — not just of a particular person, but of the person at a particular place. © 2015 The New York Times Company
Jon Hamilton If you run into an old friend at the train station, your brain will probably form a memory of the experience. And that memory will forever link the person you saw with the place where you saw them. For the first time, researchers have been able to see that sort of link being created in people's brains, according to a study published Wednesday in the journal Neuron. The process involves neurons in one area of the brain that change their behavior as soon as someone associates a particular person with a specific place. "This type of study helps us understand the neural code that serves memory," says Itzhak Fried, an author of the paper and head of the Cognitive Neurophysiology Laboratory at UCLA. It also could help explain how diseases like Alzheimer's make it harder for people to form new memories, Fried says. The research is an extension of work that began more than a decade ago. That's when scientists discovered special neurons in the medial temporal lobe that respond only to a specific place, or a particular person, like the actress Jennifer Aniston. The experiment used a fake photo of actor Clint Eastwood and Pisa's leaning tower to test how the brain links person and place. More recently, researchers realized that some of these special neurons would respond to two people, but only if the people were connected somehow. For example, "a neuron that was responding to Jennifer Aniston was also responding to pictures of Lisa Kudrow," [another actress on the TV series Friends], says Matias Ison of the University of Leicester in the U.K. © 2015 NPR
By Erika Beras Marijuana is the drug of choice for people who drink alcohol. And people who use both are twice as likely to do so at the same time as to indulge in just one or the other. That’s according to a study in the journal Alcoholism: Clinical and Experimental Research. [Meenakshi S. Subbaraman and William C. Kerr, Simultaneous Versus Concurrent Use of Alcohol and Cannabis in the National Alcohol Survey] The data came from self-reported answers that more than 8,600 people provided to what’s called the National Alcohol Surveys, done by phone in 2005 and 2010. People who used pot and alcohol were about twice as likely to drive drunk as those who just drank. And they doubled their chances of what are referred to as negative social consequences, such as arrests, fights and job problems. Meanwhile, another new study finds that if you’re chronically stoned, you’re more likely to remember things differently from how they happened, or not at all. Researchers showed a series of words to people who do not use marijuana and to regular pot users who had not partaken in a month. A few minutes later, all participants were shown the same list of words along with other words. The volunteers were then asked to identify only the original words. The pot smokers thought more of the new words were in the original list than did the nonusers. And brain scans revealed that the regular pot users showed less activity in brain regions associated with memory and cognitive resources than did the nonusers. The study is in the journal Molecular Psychiatry. [J. Riba et al, Telling true from false: cannabis users show increased susceptibility to false memories] © 2015 Scientific American
By Ariana Eunjung Cha One of the most heartbreaking things about Alzheimer's is that it has been impossible for doctors to predict who will get it before symptoms begin. And without early detection, researchers say, a treatment or cure may be impossible. Governments, drug companies and private foundations have poured huge amounts of money into trying to come up with novel ways to detect risk through cutting-edge technologies ranging from brain imaging, protein analysis of cerebrospinal fluid and DNA profiling. Now a new study, published in the journal Neurology, shows that perhaps something more old-fashioned could be the answer: a memory test. The researchers tracked 2,125 participants in four Chicago neighborhoods for 18 years, giving them tests of memory and thinking every three years. They found that those who scored lowest on the tests during the first year were 10 times more likely to be diagnosed with Alzheimer's down the road -- indicating that cognitive impairment may be affecting the brain "substantially earlier than previously established," the researchers wrote.