Chapter 17. Learning and Memory
By Jessica Morrison

Bees, like birds and butterflies, use the Sun as a compass for navigation, whereas mammals typically find their way by remembering familiar landmarks on a continuous mental map. However, the latest research suggests that bees also use this type of map, despite their much smaller brain size. The work adds a new dimension to complex bee-navigation abilities that have long captivated researchers.

“The surprise comes for many people that such a tiny little brain is able to form such a rich memory described as a cognitive map,” says co-author Randolf Menzel, a neurobiologist at the Free University of Berlin.

The research by Menzel and his team, published today in the Proceedings of the National Academy of Sciences [1], demonstrates that bees can find their way back to their hives without relying solely on the Sun. Instead, they seem to use a 'cognitive map' that is made up of memorized landscape snapshots that direct them home.

The cognitive map used by mammals is thought to originate in the brain’s hippocampus. Humans employ such maps on a daily basis; for example, even in a windowless office, many people can point towards their home, orienting themselves in space based on knowledge of their location relative to the outside world. “They can point to their home generally even though they can’t see it, even along a path through a wall that they haven’t travelled,” explains Fred Dyer, a behavioural biologist at Michigan State University in East Lansing, who was not involved in the research. The study authors argue that bees can do something similar, albeit on a much more rudimentary level.

© 2014 Nature Publishing Group
Keyword: Animal Migration
Link ID: 19684 - Posted: 06.03.2014
By DAAN HEERMA VAN VOSS

I was 25 when I lost my memory. It happened on Jan. 16, 2012. I woke up, not knowing where I was. I was lying in bed, sure, but whose bed was it? There was no one in the room, no sound that I recognized: I was alone with my body. Of course, my relationship to my body was radically different than before. My body parts seemed to belong to someone else or, rather, to something else. The vague sense of identity that I possessed was confined to the knowledge of my name, but even that felt arbitrary — a collection of random letters, crumbling. No words can accurately describe the feeling of losing your memory, your life.

Underlying the loss of facts is a deeper problem: the loss of logic and causality. A person can function, ask questions, only when he recognizes a fundamental link between circumstances and time, past and present. The links between something happening to you, leading you to do or say something, which leads to someone else responding. No act is without an act leading up to it, no word is without a word that came before. Without the sense of causality provided by memory, there is chaos. When I woke up, I had no grip on logic, and logic none on me. It was a profound not-knowing, and it was terrifying. I started hyperventilating.

What struck me has a name: Transient Global Amnesia. T.G.A., as it’s referred to, is a neurological disorder. The name sounds definitive, but in fact it’s just a fancy way of saying: We don’t know the cause, we know only what the symptoms are. Its most defining symptom is a near total disruption of short-term memory. In many cases, there is a temporary loss of long-term memory as well. But there is a bright side. T.G.A. lasts for approximately two to 20 hours, so it’s a one-day thing. At the time, though, I didn’t know this. Two names popped into my mind: Daniel and Sophie. I didn’t know where the names came from, or to whom they belonged.
I stumbled across the room, opened a door, and discovered that I was alone in the apartment. (It was, in fact, my apartment.) I found an iPhone and, quite magically, I thought, knew how to work it. As it turns out, there was nothing magical about this: A characteristic of T.G.A. is that those afflicted with it can perform familiar tasks, even ones as difficult as driving a car. (But I wouldn’t recommend that.)

Occurrence of T.G.A. is rare, with at most 10 cases per 100,000 people. It is most likely to happen when you’re between 40 and 80; the average age of a T.G.A. patient is 62 years old. But I have always been in the fast lane.

© 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 19681 - Posted: 06.02.2014
Elizabeth Norton It's a sad fact that children born in poverty start out at a disadvantage and continue to fall further behind kids who are more privileged as they grow up. In developing countries, chiefly in Africa and Asia, some 200 million children under age 5 won't reach the same milestones—for physical growth, school performance, and earnings later on—as children who are less deprived. But a new analysis of a long-term study in Jamaica shows that surprisingly simple ways of stimulating children’s mental development can have dramatic benefits later in life. The children were participants in the Jamaican Study, a project geared toward improving cognitive development begun in the mid-1980s by child health specialists Sally Grantham-McGregor of University College London and Susan Walker of the University of the West Indies, Mona, in Jamaica. They focused on children between the ages of 9 and 24 months whose growth was stunted, placing them in the bottom 5% of height for their age and sex (an easy-to-quantify gauge of extreme poverty). Children of normal height in the same neighborhoods were also studied for comparison. For 2 years, community health workers visited the families weekly. One group was given nutritional assistance only (a formula containing 66% of daily recommended calories, along with vitamins and minerals). One group received a mental and social stimulation program only, and one group got stimulation and nutritional assistance. A final group had no intervention and served as a control. The mental stimulation program involved giving parents simple picture books and handmade toys, and encouraging them to read and sing to their children and point out names of objects, shapes, and colors. They were also taught better ways to converse and respond to their toddlers. These everyday interactions aren't always part of the culture in low-income countries, explains Paul Gertler, an economist at the University of California, Berkeley. 
"Parents might have five or six kids and few toys. They might be working really hard and have a lot of competing demands. They might not have been taught how to talk to their children, or how important and effective it is," he says. Past research attests to the importance of everyday conversation for children’s mental development: A recent study suggests that children of affluent parents do better in life in large part because their parents talk to them more. © 2014 American Association for the Advancement of Science
By KATE MURPHY

The baseball hurtles toward the batter, and he must decide from its rotation whether it’s a fastball worth a swing or a slider about to drop out of the strike zone. Running full speed, the wide receiver tracks both the football flying through the air and the defensive back on his heels. Golfers must rapidly shift visual focus in order to drive the ball at their feet toward a green in the distance.

Many athletes need excellent vision to perform well in their sports, and now many are adding something new to their practice regimens: vision training. The idea has been around for years, but only recently have studies hinted that it might really work — that it might be possible to train yourself to see better without resorting to glasses or surgery. “Vision training has been out there for a long time,” said Mark Blumenkranz, a professor of ophthalmology at Stanford University Medical School. “But it’s being made more respectable lately thanks to the attention it’s been getting from psychophysicists, vision scientists, neurologists and optometrists.”

Vision training actually has little to do with improving eyesight. The techniques, a form of perceptual learning, are intended to improve the ability to process what is seen. The idea is that if visual sensory neurons are repeatedly activated, they increase their ability to send electrical signals from one cell to another across connecting synapses. If neurons are not used, over time these transmissions are weakened. “With sensory neurons, just like muscles, it’s use it or lose it,” said Dr. Bernhard Sabel, a neuroscientist at Otto von Guericke University in Magdeburg, Germany, who studies plasticity in the brain. “This applies both to athletes and the partially blind.”

Vision training may involve simple strategies — for instance, focusing sequentially on beads knotted at intervals on a length of string with one end held at the tip of the nose.
This is said to improve convergence (inward turning of the eye to maintain binocular vision) and the ability to focus near and far. © 2014 The New York Times Company
By BENEDICT CAREY SAN DIEGO – The last match of the tournament had all the elements of a classic showdown, pitting style versus stealth, quickness versus deliberation, and the world’s foremost card virtuoso against its premier numbers wizard. If not quite Ali-Frazier or Williams-Sharapova, the duel was all the audience of about 100 could ask for. They had come to the first Extreme Memory Tournament, or XMT, to see a fast-paced, digitally enhanced memory contest, and that’s what they got. The contest, an unusual collaboration between industry and academic scientists, featured one-minute matches between 16 world-class “memory athletes” from all over the world as they met in a World Cup-like elimination format. The grand prize was $20,000; the potential scientific payoff was large, too. One of the tournament’s sponsors, the company Dart NeuroScience, is working to develop drugs for improved cognition. The other, Washington University in St. Louis, sent a research team with a battery of cognitive tests to determine what, if anything, sets memory athletes apart. Previous research was sparse and inconclusive. Yet as the two finalists, both Germans, prepared to face off — Simon Reinhard, 35, a lawyer who holds the world record in card memorization (a deck in 21.19 seconds), and Johannes Mallow, 32, a teacher with the record for memorizing digits (501 in five minutes) — the Washington group had one preliminary finding that wasn’t obvious. “We found that one of the biggest differences between memory athletes and the rest of us,” said Henry L. Roediger III, the psychologist who led the research team, “is in a cognitive ability that’s not a direct measure of memory at all but of attention.” People have been performing feats of memory for ages, scrolling out pi to hundreds of digits, or phenomenally long verses, or word pairs. 
Most store the studied material in a so-called memory palace, associating the numbers, words or cards with specific images they have already memorized; then they mentally place the associated pairs in a familiar location, like the rooms of a childhood home or the stops on a subway line. The Greek poet Simonides of Ceos is credited with first describing the method, in the fifth century B.C., and it has been vividly described in popular books, most recently “Moonwalking With Einstein,” by Joshua Foer. © 2014 The New York Times Company
By Beth Skwarecki

The protein family notorious for causing neurodegenerative diseases such as Parkinson's—not to mention mad cow—appears to play an important role in healthy cells. “Do you think God created prions just to kill?” muses Eric R. Kandel of Columbia University. “These things have evolved initially to have a physiological function.”

Kandel's work on memory helped to reveal that animals make and use prions in their nervous systems as part of an essential function: stabilizing the synapses involved with forming long-term memories. These natural prions are not infectious, but on a molecular level they chain up exactly the same way as their disease-causing brethren. (Some researchers call them “prionlike” to avoid confusion.)

Now neuroscientist Kausik Si of the Stowers Institute for Medical Research in Kansas City, Mo., one of Kandel's former students, has shown that the prion's action is tightly controlled by the cell and can be turned on when a new long-term memory needs to be formed. Once the prion's chain reaction gets started, it is self-perpetuating, and thus the synapse—where neurons connect—can be maintained after the initial trigger is gone, perhaps for a lifetime. But that still does not explain how the first prion is triggered or why it happens at only certain synapses, which play a crucial role in forming memories.

Si's work, published February 11 in PLOS Biology, traces the biochemistry of this protein-preservation process in fruit flies, showing how the cell turns on the machinery responsible for the persistence of memory—and how the memory can be stabilized at just the right time and in the right place.

© 2014 Scientific American
By DANIEL GOLEMAN

Which will it be — the berries or the chocolate dessert? Homework or the Xbox? Finish that memo, or roam Facebook? Such quotidian decisions test a mental ability called cognitive control, the capacity to maintain focus on an important choice while ignoring other impulses. Poor planning, wandering attention and trouble inhibiting impulses all signify lapses in cognitive control. Now a growing stream of research suggests that strengthening this mental muscle, usually with exercises in so-called mindfulness, may help children and adults cope with attention deficit hyperactivity disorder and its adult equivalent, attention deficit disorder.

The studies come amid growing disenchantment with the first-line treatment for these conditions: drugs. In 2007, researchers at the University of California, Los Angeles, published a study finding that the incidence of A.D.H.D. among teenagers in Finland, along with difficulties in cognitive functioning and related emotional disorders like depression, was virtually identical to rates among teenagers in the United States. The real difference? Most adolescents with A.D.H.D. in the United States were taking medication; most in Finland were not. “It raises questions about using medication as a first line of treatment,” said Susan Smalley, a behavior geneticist at U.C.L.A. and the lead author.

In a large study published last year in The Journal of the American Academy of Child & Adolescent Psychiatry, researchers reported that while most young people with A.D.H.D. benefit from medications in the first year, these effects generally wane by the third year, if not sooner. “There are no long-term, lasting benefits from taking A.D.H.D. medications,” said James M. Swanson, a psychologist at the University of California, Irvine, and an author of the study. “But mindfulness seems to be training the same areas of the brain that have reduced activity in A.D.H.D.”

© 2014 The New York Times Company
By DAVID L. KIRP Whenever President Obama proposes a major federal investment in early education, as he did in his two most recent State of the Union addresses, critics have a two-word riposte: Head Start. Researchers have long cast doubt on that program’s effectiveness. The most damning evidence comes from a 2012 federal evaluation that used gold-standard methodology and concluded that children who participated in Head Start were not more successful in elementary school than others. That finding was catnip to the detractors. “Head Start’s impact is no better than random,” The Wall Street Journal editorialized. Why throw good money after bad? Though the faultfinders have a point, the claim that Head Start has failed overstates the case. For one thing, it has gotten considerably better in the past few years because of tougher quality standards. For another, researchers have identified a “sleeper effect” — many Head Start youngsters begin to flourish as teenagers, maybe because the program emphasizes character and social skills as well as the three R’s. Still, few would give Head Start high marks, and the bleak conclusion of the 2012 evaluation stands in sharp contrast to the impressive results from well-devised studies of state-financed prekindergartens. Head Start, a survivor of President Lyndon B. Johnson’s war on poverty, enrolls only poor kids. That’s a big part of the problem — as the adage goes, programs for the poor often become poor programs. Whether it’s health care (compare the trajectories of Medicare, for those 65 and older of all incomes, and Medicaid, only for the poor), education or housing, the sorry truth is that “we” don’t like subsidizing “them.” Head Start is no exception. It has been perpetually underfunded, never able to enroll more than half of eligible children or pay its teachers a decent wage. If Head Start is going to realize its potential, it has to break out of the antipoverty mold. 
One promising but unfortunately rarely used strategy is to encourage all youngsters, not just poor kids, to enroll, with poor families paying nothing and middle-class families contributing on a sliding scale. Another is to merge Head Start with high-quality state prekindergarten. © 2014 The New York Times Company
Helen Shen

For anyone fighting to save old memories, a fresh crop of brain cells may be the last thing they need. Research published today in Science suggests that newly formed neurons in the hippocampus — an area of the brain involved in memory formation — could dislodge previously learned information [1]. The work may provide clues as to why childhood memories are so difficult to recall.

“The finding was very surprising to us initially. Most people think new neurons mean better memory,” says Sheena Josselyn, a neuroscientist who led the study together with her husband Paul Frankland at the Hospital for Sick Children in Toronto, Canada.

Humans, mice and several other mammals grow new neurons in the hippocampus throughout their lives — rapidly at first, but more and more slowly with age. Researchers have previously shown that boosting neural proliferation before learning can enhance memory formation in adult mice [2, 3]. But the latest study shows that after information is learned, neuron growth can degrade those memories. Although seemingly counterintuitive, the disruptive role of these neurons makes some sense, says Josselyn. She notes that some theoretical models have predicted such an effect [4]. “More neurons increase the capacity to learn new memories in the future,” she says. “But memory is based on a circuit, so if you add to this circuit, it makes sense that it would disrupt it.” Newly added neurons could have a useful role in clearing old memories and making way for new ones, says Josselyn.

Forgetting curve

The researchers tested newborn and adult mice on a conditioning task, training the animals to fear an environment in which they received repeated electric shocks. All the mice learned the task quickly, but whereas infant mice remembered the negative experience for only one day after training, adult mice retained the negative memory for several weeks.

© 2014 Nature Publishing Group
By GRETCHEN REYNOLDS

The more physically active you are at age 25, the better your thinking tends to be when you reach middle age, according to a large-scale new study. Encouragingly, the findings also suggest that if you neglected to exercise when young, you can start now and still improve the health of your brain.

Those of us past age 40 are generally familiar with those first glimmerings of forgetfulness and muddled thinking. We can’t easily recall people’s names, certain words, or where we left the car keys. “It’s what we scientists call having a C.R.S. problem,” said David R. Jacobs, a professor of public health at the University of Minnesota in Minneapolis and a co-author of the new study. “You can’t remember stuff.”

But these slight, midlife declines in thinking skills strike some people later or less severely than others, and scientists have not known why. Genetics almost certainly play a role, most researchers agree. Yet the contribution of lifestyle, and in particular of exercise habits, has been unclear.

So recently, Dr. Jacobs and colleagues from universities in the United States and overseas turned to a large trove of data collected over several decades for the Cardia study. The study, whose name is short for Coronary Artery Risk Development in Young Adults, began in the mid-1980s with the recruitment of thousands of men and women then ages 18 to 30 who underwent health testing to determine their cholesterol levels, blood pressure and other measures. Many of the volunteers also completed a treadmill run to exhaustion, during which they strode at an increasingly brisk pace until they could go no farther. The average time to exhaustion among these young adults was 10 minutes, meaning that most were moderately but not tremendously fit.

© 2014 The New York Times Company
Erin Allday

The game seems pretty simple. An alien-looking creature stands on a block of ice that's flowing down a river. The goal is to maneuver the ice around whales and other hurdles and periodically cause the alien to "jump" to grab green fish as they leap out of the water. The game is played on a tablet, and it looks a lot like any of hundreds of apps that can be downloaded for some mindless entertainment during an afternoon commute on BART.

Here's what sets the game apart: It was designed by scientists at UCSF looking for a new way to treat serious symptoms of depression. "We're trying to see whether we can get the same effects with the game as with therapy," said Patricia Arean, a clinical psychologist at UCSF who is studying the potential mental health benefits of video game play in older adults.

Arean is joining the burgeoning field of research into the use of video games as tools for promoting brain health. Video games undoubtedly have some kind of effect on our brains, but harnessing the technology and forcing a lasting - and positive - change is the challenge. So far, what little evidence does exist that video games can have a measurable impact on brain activity has been gathered almost entirely on healthy subjects. But in small clinical trials - like Arean's study of depression in older adults - the effects of games on both healthy and unhealthy people are being studied to find out whether they're useful in treating mental illness, such as autism, attention deficit hyperactivity disorder, and post-traumatic stress disorder.

Some neuroscientists say video games may also strengthen neural networks, potentially preventing or slowing down the brain deterioration associated with old age or diseases like Alzheimer's or Parkinson's. "We're in the infancy of this idea that entertaining and gaming stuff can be useful for you," said Joaquin Anguera, a UCSF neuroscientist who designs cognitive training games, including the one Arean is testing with patients.
© 2014 Hearst Communications, Inc.
by Helen Thomson

A 22-year-old man has been instantaneously transported to his family's pizzeria and his local railway station – by having his brain zapped. These fleeting visual hallucinations have helped researchers pinpoint places where the brain stores visual location information.

Pierre Mégevand at the Feinstein Institute for Medical Research in Manhasset, New York, and his colleagues wanted to discover just where in the brain we store and retrieve information about locations and places. They sought the help of a 22-year-old man being treated for epilepsy, because the treatment involved implanting electrodes into his brain that would record his neural activity.

Mégevand and his colleagues scanned the volunteer's brain using functional MRI while he looked at pictures of different objects and scenes. They then recorded activity from the implanted electrodes as he looked at a similar set of pictures. In both situations, a specific area of the cortex around the hippocampus responded to images of places, but not to images of other kinds of objects, such as body parts or tools. "There are these little spots of tissues that seem to care about houses and places more than any other class of object," says research team member Ashesh Mehta, also at the Feinstein Institute.

Next, the team used the implanted electrodes to stimulate the brain in this area – a move that the volunteer said triggered a series of complex visual hallucinations. First he described seeing a railway station in the neighbourhood where he lives. Stimulation of a nearby area elicited another hallucination, this time of a staircase and a blue closet in his home. When stimulation of these areas was repeated, the same scenes arose.

© Copyright Reed Business Information Ltd.
By Emily Chung, CBC News

If you're in your late 20s or older, you're not as sharp as you used to be, suggests a study of gamers playing the popular video game Starcraft 2. The study analyzed the way 3,305 people, aged 16 to 44, played the game against a single random opponent of similar skill, in order to measure the gamers' cognitive motor performance. Cognitive motor performance is how quickly your brain reacts to things happening around you, allowing you to act during tasks such as driving.

The analysis revealed exactly when advancing age starts to take its toll on brain performance – at the tender age of 24 years. The results were published late last week in the journal PLOS ONE. Joe Thompson, lead author of the study, said he was surprised by how early the decline started and how big the age effect was, even among those in their 30s. "If you're 39, competing against a 24-year-old and you're both in the otherwise same level of skill," Thompson said, "the effect of age is expected to offset a great deal of your learning."

Starcraft 2 is a popular strategy game, similar in concept to Risk, where players compete to build armies and conquer a science-fictional world. Unlike Risk, however, players don't take turns. "Starcraft is like high-speed chess," said Thompson, a PhD student who plays the game himself. "You simply can make as many moves as you want, as fast as you can go." Players can't see the whole "world" at once; as they mine resources needed to build up their armies, attack their opponents, and defend against opponents' attacks, they need to quickly move their screen around from one part of the world to another.

© CBC 2014
By Janali Gustafson

Cravings—we all have them. These intense desires can be triggered by a place, a smell, even a picture. For recovering drug addicts, such memory associations can increase vulnerability to relapse. Now researchers at the Florida campus of the Scripps Research Institute have found a chemical that prevents rats from recalling their drug-associated memories. The study, published online in Biological Psychiatry last fall, is also the first of its kind to disrupt memories without requiring active recollection.

Over the course of six days the rats in this study alternated between two chambers. On days one, three and five, the animals were injected with methamphetamine hydrochloride—the street drug known as meth—and placed in one room. On the even-numbered days they received a saline placebo and entered a different chamber. After two more days, half the rodents were given a choice between the rooms. As expected, they showed a clear preference for the place they visited after receiving meth.

The other half of the animals were injected with a solution containing Latrunculin A (LatA). This chemical interferes with actin, a protein known to be involved in memory formation. These animals showed no preference between rooms, even up to a day later: their choices seemed not to be driven by a memory of meth.

Previous research has suggested that drugs of abuse alter the way actin functions, causing it to constantly refresh memories associated with these drugs rather than tucking them away into typical memory storage, which is more inert. As a result of their active status, drug memories might remain susceptible to disruption long after their initial formation.

© 2014 Scientific American
By Stephanie Pappas

A little stress may be a good thing for teenagers learning to drive. In a new study, teens whose levels of the stress hormone cortisol increased more during times of stress got into fewer car crashes or near crashes in their first months of driving than their less-stress-responsive peers did.

The study suggests that biological differences may affect how teens learn to respond to crises on the road, the researchers reported today (April 7) in the journal JAMA Pediatrics. Efforts to reduce teen car accidents include graduated driver licensing programs, safety messages and increased parental management, but these efforts seem to work better for some teens than others, the researchers said. Alternatives, such as in-vehicle technologies aimed at reducing accidents, may be especially useful for teens with a "neurological basis" for their increased risk of getting into an accident, they said.

Automobile accidents are the No. 1 cause of death of teenagers in the United States, according to the Centers for Disease Control and Prevention. Car crashes also kill more 15- to 29-year-olds globally than any other cause, according to the World Health Organization.
By Sam Kean

Kent Cochrane, the amnesiac known throughout the world of neuroscience and psychology as K.C., died last week at age 62 in his nursing home in Toronto, probably of a stroke or heart attack. Although not as celebrated as the late American amnesiac H.M., for my money K.C. taught us more important and poignant things about how memory works. He showed how we make memories personal and personally meaningful.

He also had a heck of a life story. During a wild and extended adolescence, K.C. jammed in rock bands, partied at Mardi Gras, played cards till all hours, and got into fights in bars; he was also knocked unconscious twice, once in a dune-buggy accident, once when a bale of hay conked him on the head. In October 1981, at age 30, he skidded off an exit ramp on his motorcycle. He spent a month in intensive care and lost, among other brain structures, both his hippocampuses.

As H.M.’s case demonstrated in the early 1950s, the hippocampus—you have one in each hemisphere of your brain—helps form and store new memories and retrieve old ones. Without a functioning hippocampus, names, dates, and other information falls straight through the mind like a sieve. At least that’s what’s supposed to happen. K.C. proved that that’s not quite true—memories can sometimes bypass the hippocampus.

After the motorcycle accident, K.C. lost most of his past memories and could make almost no new memories. But a neuroscientist named Endel Tulving began studying K.C., and he determined that K.C. could remember certain things from his past life just fine. Oddly, though, everything K.C. remembered fell within one restricted category: It was all stuff you could look up in reference books, like the difference between stalactites and stalagmites or between spares and strikes in bowling. Tulving called these bare facts “semantic memories,” memories devoid of all context and emotion.

© 2014 The Slate Group LLC
Keyword: Learning & Memory
Link ID: 19455 - Posted: 04.08.2014
He was known in his many appearances in the scientific literature as simply K.C., an amnesiac who was unable to form new memories. But to the people who knew him, and the scientists who studied him for decades, he was Kent Cochrane, or just Kent.

Cochrane, who suffered a traumatic brain injury in a motorcycle accident when he was 30 years old, helped to rewrite the understanding of how the brain forms new memories and whether learning can occur without that capacity. "From a scientific point of view, we've really learned a lot [from him], not just about memory itself but how memory contributes to other abilities," said Shayna Rosenbaum, a cognitive neuropsychologist at York University who started working with Cochrane in 1998 when she was a graduate student.

Cochrane was 62 when he died late last week. The exact cause of death is unknown, but his sister, Karen Casswell, said it is believed he had a heart attack or stroke. He died in his room at the assisted living facility where he lived, and the family opted not to authorize an autopsy.

Few in the general public would know about Cochrane, though some may have seen or read media reports on the man whose life was like that of the lead character of the 2000 movie Memento. But anyone who works on the science of human memory would know K.C. Casswell and her mother, Ruth Cochrane, said the family was proud of the contribution Kent Cochrane made to science. Casswell noted her eldest daughter was in a psychology class at university when the professor started to lecture about the man the scientific literature knows as K.C.

© CBC 2014
Keyword: Learning & Memory
Link ID: 19442 - Posted: 04.03.2014
By SABRINA TAVERNISE In 1972, researchers in North Carolina started following two groups of babies from poor families. In the first group, the children were given full-time day care up to age 5 that included most of their daily meals, talking, games and other stimulating activities. The other group, aside from baby formula, got nothing. The scientists were testing whether the special treatment would lead to better cognitive abilities in the long run. Forty-two years later, the researchers found something that they had not expected to see: The group that got care was far healthier, with sharply lower rates of high blood pressure and obesity, and higher levels of so-called good cholesterol. The study, which was published in the journal Science on Thursday, is part of a growing body of scientific evidence that hardship in early childhood has lifelong health implications. But it goes further than outlining the problem, offering evidence that a particular policy might prevent it. “This tells us that adversity matters and it does affect adult health,” said James Heckman, a professor of economics at the University of Chicago who led the data analysis. “But it also shows us that we can do something about it, that poverty is not just a hopeless condition.” The findings come amid a political push by the Obama administration for government-funded preschool for 4-year-olds. But a growing number of experts, Professor Heckman among them, say they believe that more effective public programs would start far earlier — in infancy, for example, because that is when many of the skills needed to take control of one’s life and become a successful adult are acquired. © 2014 The New York Times Company
By Shelly Fan One of the tragedies of aging is the slow but steady decline in memory. Phone numbers slipping your mind? Forgetting crucial items on your grocery list? Opening the door but can’t remember why? Up to 50 percent of adults aged 64 years or older report memory complaints. For many of us, senile moments are the result of normal changes in brain structure and function rather than a sign of dementia, and will inevitably haunt us all. Rather than taking it lying down, scientists are devising interventions to help keep the elderly mind sharp. One popular approach—borrowed from the training of memory experts—is to teach the elderly mnemonics, or little tricks to help encode and recall new information using rhythm, imagery or spatial navigation. By far the most widely used mnemonic device is the method of loci (MoL), a technique devised in ancient Greece. In a 2002 study looking at the neural correlates of superior human memory, nine of 10 memory masters employed the method spontaneously. It involves picturing highly familiar routes through a building (your childhood home) or a town (your way to work). Walk down the route and imagine placing to-be-remembered items at attention-grabbing spots along the way; the more surreal or bizarre you make these images, the better they can help you remember. To recall these stored items, simply retrace your steps. Like fishing lines, the loci are hooked to the memories and help you pull them to the surface. Although generally used to remember objects, numbers or names, the MoL has also been used in people with depression to successfully store bits and pieces of happy autobiographical memories that they can easily retrieve in times of stress. © 2014 Scientific American
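The method of loci described above is essentially an ordered mapping: items get attached, in sequence, to spots along a familiar route, and recall means walking the route again. A minimal Python sketch makes the structure explicit; the route and grocery list here are illustrative assumptions, not from the article.

```python
# A familiar route, as an ordered list of attention-grabbing spots ("loci").
route = ["front door", "hallway mirror", "kitchen table", "back porch"]

def memorize(items):
    """Attach each to-be-remembered item to the next spot along the route."""
    if len(items) > len(route):
        raise ValueError("route has too few loci for this list")
    # zip stops at the shorter sequence, so unused loci are simply left empty.
    return dict(zip(route, items))

def recall(loci_map):
    """Retrace the route in order, retrieving whatever was placed at each spot."""
    return [loci_map[spot] for spot in route if spot in loci_map]

groceries = ["eggs", "milk", "bread"]
palace = memorize(groceries)
print(recall(palace))  # the list comes back in route order
```

The point of the technique, mirrored in the code, is that recall is driven by the route rather than by the items themselves: walking the loci in their fixed order reproduces the list without having to remember its order separately.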
By TARA PARKER-POPE For a $14.95 monthly membership, the website Lumosity promises to “train” your brain with games designed to stave off mental decline. Users view a quick succession of bird images and numbers to test attention span, for instance, or match increasingly complex tile patterns to challenge memory. While Lumosity is perhaps the best known of the brain-game websites, with 50 million subscribers in 180 countries, the cognitive training business is booming. Happy Neuron of Mountain View, Calif., promises “brain fitness for life.” Cogmed, owned by the British education company Pearson, says its training program will give students “improved attention and capacity for learning.” The Israeli firm Neuronix is developing a brain stimulation and cognitive training program that the company calls a “new hope for Alzheimer’s disease.” And last month, in a move that could significantly improve the financial prospects for brain-game developers, the Centers for Medicare and Medicaid Services began seeking comments on a proposal that would, in some cases, reimburse the cost of “memory fitness activities.” Much of the focus of the brain fitness business has been on helping children with attention-deficit problems, and on improving cognitive function and academic performance in healthy children and adults. An effective way to stave off memory loss or prevent Alzheimer’s — particularly if it were a simple website or video game — is the “holy grail” of neuroscience, said Dr. Murali Doraiswamy, director of the neurocognitive disorders program at Duke Institute for Brain Sciences. The problem, Dr. Doraiswamy added, is that the science of cognitive training has not kept up with the hype. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 19346 - Posted: 03.11.2014