Chapter 17. Learning and Memory
By DAVID L. KIRP

Whenever President Obama proposes a major federal investment in early education, as he did in his two most recent State of the Union addresses, critics have a two-word riposte: Head Start. Researchers have long cast doubt on that program’s effectiveness. The most damning evidence comes from a 2012 federal evaluation that used gold-standard methodology and concluded that children who participated in Head Start were not more successful in elementary school than others. That finding was catnip to the detractors. “Head Start’s impact is no better than random,” The Wall Street Journal editorialized. Why throw good money after bad?

Though the faultfinders have a point, the claim that Head Start has failed overstates the case. For one thing, it has gotten considerably better in the past few years because of tougher quality standards. For another, researchers have identified a “sleeper effect” — many Head Start youngsters begin to flourish as teenagers, maybe because the program emphasizes character and social skills as well as the three R’s. Still, few would give Head Start high marks, and the bleak conclusion of the 2012 evaluation stands in sharp contrast to the impressive results from well-devised studies of state-financed prekindergartens.

Head Start, a survivor of President Lyndon B. Johnson’s war on poverty, enrolls only poor kids. That’s a big part of the problem — as the adage goes, programs for the poor often become poor programs. Whether it’s health care (compare the trajectories of Medicare, for those 65 and older of all incomes, and Medicaid, only for the poor), education or housing, the sorry truth is that “we” don’t like subsidizing “them.” Head Start is no exception. It has been perpetually underfunded, never able to enroll more than half of eligible children or pay its teachers a decent wage. If Head Start is going to realize its potential, it has to break out of the antipoverty mold.
One promising but unfortunately rarely used strategy is to encourage all youngsters, not just poor kids, to enroll, with poor families paying nothing and middle-class families contributing on a sliding scale. Another is to merge Head Start with high-quality state prekindergarten.

© 2014 The New York Times Company
Helen Shen

For anyone fighting to save old memories, a fresh crop of brain cells may be the last thing they need. Research published today in Science suggests that newly formed neurons in the hippocampus — an area of the brain involved in memory formation — could dislodge previously learned information [1]. The work may provide clues as to why childhood memories are so difficult to recall.

“The finding was very surprising to us initially. Most people think new neurons mean better memory,” says Sheena Josselyn, a neuroscientist who led the study together with her husband Paul Frankland at the Hospital for Sick Children in Toronto, Canada.

Humans, mice and several other mammals grow new neurons in the hippocampus throughout their lives — rapidly at first, but more and more slowly with age. Researchers have previously shown that boosting neural proliferation before learning can enhance memory formation in adult mice [2, 3]. But the latest study shows that after information is learned, neuron growth can degrade those memories.

Although seemingly counterintuitive, the disruptive role of these neurons makes some sense, says Josselyn. She notes that some theoretical models have predicted such an effect [4]. “More neurons increase the capacity to learn new memories in the future,” she says. “But memory is based on a circuit, so if you add to this circuit, it makes sense that it would disrupt it.” Newly added neurons could have a useful role in clearing old memories and making way for new ones, says Josselyn.

Forgetting curve

The researchers tested newborn and adult mice on a conditioning task, training the animals to fear an environment in which they received repeated electric shocks. All the mice learned the task quickly, but whereas infant mice remembered the negative experience for only one day after training, adult mice retained the negative memory for several weeks.

© 2014 Nature Publishing Group
By GRETCHEN REYNOLDS

The more physically active you are at age 25, the better your thinking tends to be when you reach middle age, according to a large-scale new study. Encouragingly, the findings also suggest that if you negligently neglected to exercise when young, you can start now and still improve the health of your brain.

Those of us past age 40 are generally familiar with those first glimmerings of forgetfulness and muddled thinking. We can’t easily recall people’s names, certain words, or where we left the car keys. “It’s what we scientists call having a C.R.S. problem,” said David R. Jacobs, a professor of public health at the University of Minnesota in Minneapolis and a co-author of the new study. “You can’t remember stuff.”

But these slight, midlife declines in thinking skills strike some people later or less severely than others, and scientists have not known why. Genetics almost certainly play a role, most researchers agree. Yet the contribution of lifestyle, and in particular of exercise habits, has been unclear.

So recently, Dr. Jacobs and colleagues from universities in the United States and overseas turned to a large trove of data collected over several decades for the Cardia study. The study, whose name is short for Coronary Artery Risk Development in Young Adults, began in the mid-1980s with the recruitment of thousands of men and women then ages 18 to 30 who underwent health testing to determine their cholesterol levels, blood pressure and other measures. Many of the volunteers also completed a treadmill run to exhaustion, during which they strode at an increasingly brisk pace until they could go no farther. The average time to exhaustion among these young adults was 10 minutes, meaning that most were moderately but not tremendously fit.

© 2014 The New York Times Company
Erin Allday

The game seems pretty simple. An alien-looking creature stands on a block of ice that's flowing down a river. The goal is to maneuver the ice around whales and other hurdles and periodically cause the alien to "jump" to grab green fish as they leap out of the water. The game is played on a tablet, and it looks a lot like any of hundreds of apps that can be downloaded for some mindless entertainment during an afternoon commute on BART.

Here's what sets the game apart: It was designed by scientists at UCSF looking for a new way to treat serious symptoms of depression. "We're trying to see whether we can get the same effects with the game as with therapy," said Patricia Arean, a clinical psychologist at UCSF who is studying the potential mental health benefits of video game play in older adults.

Arean is joining the burgeoning field of research into the use of video games as tools for promoting brain health. Video games undoubtedly have some kind of effect on our brains, but harnessing the technology and forcing a lasting - and positive - change is the challenge. So far, what little evidence does exist that video games can have a measurable impact on brain activity has been gathered almost entirely on healthy subjects. But in small clinical trials - like Arean's study of depression in older adults - the effects of games on both healthy and unhealthy people are being studied to find out whether they're useful in treating conditions such as autism, attention deficit hyperactivity disorder, and post-traumatic stress disorder. Some neuroscientists say video games may also strengthen neural networks, potentially preventing or slowing down the brain deterioration associated with old age or diseases like Alzheimer's or Parkinson's.

"We're in the infancy of this idea that entertaining and gaming stuff can be useful for you," said Joaquin Anguera, a UCSF neuroscientist who designs cognitive training games, including the one Arean is testing with patients.
© 2014 Hearst Communications, Inc.
By Helen Thomson

A 22-year-old man has been instantaneously transported to his family's pizzeria and his local railway station – by having his brain zapped. These fleeting visual hallucinations have helped researchers pinpoint places where the brain stores visual location information.

Pierre Mégevand at the Feinstein Institute for Medical Research in Manhasset, New York, and his colleagues wanted to discover just where in the brain we store and retrieve information about locations and places. They sought the help of a 22-year-old man being treated for epilepsy, because the treatment involved implanting electrodes into his brain that would record his neural activity.

Mégevand and his colleagues scanned the volunteer's brain using functional MRI while he looked at pictures of different objects and scenes. They then recorded activity from the implanted electrodes as he looked at a similar set of pictures. In both situations, a specific area of the cortex around the hippocampus responded to images of places, but not to images of other kinds of objects, such as body parts or tools. "There are these little spots of tissues that seem to care about houses and places more than any other class of object," says research team member Ashesh Mehta, also at the Feinstein Institute.

Next, the team used the implanted electrodes to stimulate the brain in this area – a move that the volunteer said triggered a series of complex visual hallucinations. First he described seeing a railway station in the neighbourhood where he lives. Stimulation of a nearby area elicited another hallucination, this time of a staircase and a blue closet in his home. When stimulation of these areas was repeated, the same scenes arose.

© Copyright Reed Business Information Ltd.
By Emily Chung, CBC News

If you're in your late 20s or older, you're not as sharp as you used to be, suggests a study of gamers playing the popular video game Starcraft 2. The study analyzed the way 3,305 people, aged 16 to 44, played the game against a single random opponent of similar skill, in order to measure the gamers' cognitive motor performance. Cognitive motor performance is how quickly your brain reacts to things happening around you, allowing you to act during tasks such as driving.

The analysis revealed exactly when advancing age starts to take its toll on brain performance – at the tender age of 24 years. The results were published late last week in the journal PLOS ONE. Joe Thompson, lead author of the study, said he was surprised by how early the decline started and how big the age effect was, even among those in their 30s. "If you're 39, competing against a 24-year-old, and you're both otherwise at the same level of skill," Thompson said, "the effect of age is expected to offset a great deal of your learning."

Starcraft 2 is a popular strategy game, similar in concept to Risk, where players compete to build armies and conquer a science-fiction world. Unlike Risk, however, players don't take turns. "Starcraft is like high-speed chess," said Thompson, a PhD student who plays the game himself. "You simply can make as many moves as you want, as fast as you can go." Players can't see the whole "world" at once. As they mine resources needed to build up their armies, attack their opponents, and defend against opponents' attacks, they need to quickly move their screen around from one part of the world to another.

© CBC 2014
By Janali Gustafson

Cravings—we all have them. These intense desires can be triggered by a place, a smell, even a picture. For recovering drug addicts, such memory associations can increase vulnerability to relapse. Now researchers at the Florida campus of the Scripps Research Institute have found a chemical that prevents rats from recalling their drug-associated memories. The study, published online in Biological Psychiatry last fall, is also the first of its kind to disrupt memories without requiring active recollection.

Over the course of six days the rats in this study alternated between two chambers. On days one, three and five, the animals were injected with methamphetamine hydrochloride—the street drug known as meth—and placed in one room. On the even-numbered days they received a saline placebo and entered a different chamber. After two more days, half the rodents were given a choice between the rooms. As expected, they showed a clear preference for the place they visited after receiving meth.

The other half of the animals were injected with a solution containing Latrunculin A (LatA). This chemical interferes with actin, a protein known to be involved in memory formation. These animals showed no preference between rooms, even up to a day later: their choices seemed not to be driven by a memory of meth.

Previous research has suggested that drugs of abuse alter the way actin functions, causing it to constantly refresh memories associated with these drugs rather than tucking them away into typical memory storage, which is more inert. As a result of their active status, drug memories might remain susceptible to disruption long after their initial formation.

© 2014 Scientific American
By Stephanie Pappas

A little stress may be a good thing for teenagers learning to drive. In a new study, teens whose levels of the stress hormone cortisol increased more during times of stress got into fewer car crashes or near crashes in their first months of driving than their less-stress-responsive peers did. The study suggests that biological differences may affect how teens learn to respond to crises on the road, the researchers reported today (April 7) in the journal JAMA Pediatrics.

Efforts to reduce teen car accidents include graduated driver licensing programs, safety messages and increased parental management, but these efforts seem to work better for some teens than others, the researchers said. Alternatives, such as in-vehicle technologies aimed at reducing accidents, may be especially useful for teens with a "neurological basis" for their increased risk of getting into an accident, they said.

Automobile accidents are the No. 1 cause of death of teenagers in the United States, according to the Centers for Disease Control and Prevention. Car crashes also kill more 15- to 29-year-olds globally than any other cause, according to the World Health Organization.
By Sam Kean

Kent Cochrane, the amnesiac known throughout the world of neuroscience and psychology as K.C., died last week at age 62 in his nursing home in Toronto, probably of a stroke or heart attack. Although not as celebrated as the late American amnesiac H.M., for my money K.C. taught us more important and poignant things about how memory works. He showed how we make memories personal and personally meaningful. He also had a heck of a life story.

During a wild and extended adolescence, K.C. jammed in rock bands, partied at Mardi Gras, played cards till all hours, and got into fights in bars; he was also knocked unconscious twice, once in a dune-buggy accident, once when a bale of hay conked him on the head. In October 1981, at age 30, he skidded off an exit ramp on his motorcycle. He spent a month in intensive care and lost, among other brain structures, both his hippocampuses.

As H.M.’s case demonstrated in the early 1950s, the hippocampus—you have one in each hemisphere of your brain—helps form and store new memories and retrieve old ones. Without a functioning hippocampus, names, dates, and other information fall straight through the mind like a sieve. At least that’s what’s supposed to happen. K.C. proved that that’s not quite true—memories can sometimes bypass the hippocampus.

After the motorcycle accident, K.C. lost most of his past memories and could make almost no new memories. But a neuroscientist named Endel Tulving began studying K.C., and he determined that K.C. could remember certain things from his past life just fine. Oddly, though, everything K.C. remembered fell within one restricted category: It was all stuff you could look up in reference books, like the difference between stalactites and stalagmites or between spares and strikes in bowling. Tulving called these bare facts “semantic memories,” memories devoid of all context and emotion.

© 2014 The Slate Group LLC
Keyword: Learning & Memory
Link ID: 19455 - Posted: 04.08.2014
He was known in his many appearances in the scientific literature as simply K.C., an amnesiac who was unable to form new memories. But to the people who knew him, and the scientists who studied him for decades, he was Kent Cochrane, or just Kent. Cochrane, who suffered a traumatic brain injury in a motorcycle accident when he was 30 years old, helped to rewrite the understanding of how the brain forms new memories and whether learning can occur without that capacity.

"From a scientific point of view, we've really learned a lot [from him], not just about memory itself but how memory contributes to other abilities," said Shayna Rosenbaum, a cognitive neuropsychologist at York University who started working with Cochrane in 1998 when she was a graduate student.

Cochrane was 62 when he died late last week. The exact cause of death is unknown, but his sister, Karen Casswell, said it is believed he had a heart attack or stroke. He died in his room at the assisted living facility where he lived, and the family opted not to authorize an autopsy.

Few in the general public would know about Cochrane, though some may have seen or read media reports on the man whose life was like that of the lead character of the 2000 movie Memento. But anyone who works on the science of human memory would know K.C. Casswell and her mother, Ruth Cochrane, said the family was proud of the contribution Kent Cochrane made to science. Casswell noted her eldest daughter was in a psychology class at university when the professor started to lecture about the man the scientific literature knows as K.C.

© CBC 2014
By SABRINA TAVERNISE

In 1972, researchers in North Carolina started following two groups of babies from poor families. In the first group, the children were given full-time day care up to age 5 that included most of their daily meals, talking, games and other stimulating activities. The other group, aside from baby formula, got nothing. The scientists were testing whether the special treatment would lead to better cognitive abilities in the long run.

Forty-two years later, the researchers found something that they had not expected to see: The group that got care was far healthier, with sharply lower rates of high blood pressure and obesity, and higher levels of so-called good cholesterol. The study, which was published in the journal Science on Thursday, is part of a growing body of scientific evidence that hardship in early childhood has lifelong health implications. But it goes further than outlining the problem, offering evidence that a particular policy might prevent it.

“This tells us that adversity matters and it does affect adult health,” said James Heckman, a professor of economics at the University of Chicago who led the data analysis. “But it also shows us that we can do something about it, that poverty is not just a hopeless condition.”

The findings come amid a political push by the Obama administration for government-funded preschool for 4-year-olds. But a growing number of experts, Professor Heckman among them, say they believe that more effective public programs would start far earlier — in infancy, for example, because that is when many of the skills needed to take control of one’s life and become a successful adult are acquired.

© 2014 The New York Times Company
By Shelly Fan

One of the tragedies of aging is the slow but steady decline in memory. Phone numbers slipping your mind? Forgetting crucial items on your grocery list? Opening the door but can’t remember why? Up to 50 percent of adults aged 64 years or older report memory complaints. For many of us, senile moments are the result of normal changes in brain structure and function instead of a sign of dementia, and will inevitably haunt us all.

Rather than taking it lying down, scientists are devising interventions to help keep the elderly mind sharp. One popular approach—borrowed from the training of memory experts—is to teach the elderly mnemonics, or little tricks to help encode and recall new information using rhythm, imagery or spatial navigation.

By far the most widely used mnemonic device is the method of loci (MoL), a technique devised in ancient Greece. In a 2002 study looking at the neural correlates of superior human memory, nine of 10 memory masters employed the method spontaneously. It involves picturing highly familiar routes through a building (your childhood home) or a town (your way to work). Walk down the route and imagine placing to-be-remembered items at attention-grabbing spots along the way; the more surreal or bizarre you make these images, the better they can help you remember. To recall these stored items, simply retrace your steps. Like fishing lines, the loci are hooked to the memory and help you pull them to the surface.

Although generally used to remember objects, numbers or names, the MoL has also been used in people with depression to successfully store bits and pieces of happy autobiographical memories that they can easily retrieve in times of stress.

© 2014 Scientific American
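The walk-and-retrace procedure behind the method of loci is essentially a key–value scheme: the loci along a familiar route are the keys, the to-be-remembered items are the values, and recall is an ordered traversal of the route. A minimal sketch in Python (the route landmarks and grocery items here are invented for illustration, not taken from the study):

```python
# Toy sketch of the method of loci: hook each to-be-remembered item
# to a landmark along a familiar route, then "walk" the route to
# recall the items in order. Route and items are invented examples.
route = ["front door", "hallway mirror", "kitchen table", "back porch"]
items = ["eggs", "basil", "light bulbs", "stamps"]

# Encoding: pair each landmark with a (preferably bizarre) image of an item.
palace = dict(zip(route, items))

# Recall: retrace the route and read the items back off the landmarks.
recalled = [palace[landmark] for landmark in route]
print(recalled)  # the original list, in order
```

Because the route order never changes, the technique recovers not just the items but their sequence, which is why it suits lists, numbers and speeches.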
By TARA PARKER-POPE

For a $14.95 monthly membership, the website Lumosity promises to “train” your brain with games designed to stave off mental decline. Users view a quick succession of bird images and numbers to test attention span, for instance, or match increasingly complex tile patterns to challenge memory.

While Lumosity is perhaps the best known of the brain-game websites, with 50 million subscribers in 180 countries, the cognitive training business is booming. Happy Neuron of Mountain View, Calif., promises “brain fitness for life.” Cogmed, owned by the British education company Pearson, says its training program will give students “improved attention and capacity for learning.” The Israeli firm Neuronix is developing a brain stimulation and cognitive training program that the company calls a “new hope for Alzheimer’s disease.” And last month, in a move that could significantly improve the financial prospects for brain-game developers, the Centers for Medicare and Medicaid Services began seeking comments on a proposal that would, in some cases, reimburse the cost of “memory fitness activities.”

Much of the focus of the brain fitness business has been on helping children with attention-deficit problems, and on improving cognitive function and academic performance in healthy children and adults. An effective way to stave off memory loss or prevent Alzheimer’s — particularly if it were a simple website or video game — is the “holy grail” of neuroscience, said Dr. Murali Doraiswamy, director of the neurocognitive disorders program at Duke Institute for Brain Sciences. The problem, Dr. Doraiswamy added, is that the science of cognitive training has not kept up with the hype.

© 2014 The New York Times Company
By Christie Nicholson

Our memories are inaccurate, more than we’d like to believe. And now a study demonstrates one reason: we apparently add current experiences onto memories.

Study subjects examined the location of objects on a computer screen against a background of an underwater ocean scene. Researchers then showed the subjects a fresh screen with a different background, this time a photo of farmland. And the subjects had to place an object in the same position it was in on the original screen. And they always placed the object in the wrong position.

The researchers then presented three objects on the original ocean background. One was in the original location, another was in the location the subject just chose in the previous task and the third was in a new location. The subject was asked to pick the original location of the object in the original ocean background. And instead of choosing the original correct location, they always picked the position they had chosen. That is, they now believed the position they’d picked on the farm scene was the original position on the ocean background.

The study is in the Journal of Neuroscience. [Donna J. Bridge and Joel L. Voss, Hippocampal Binding of Novel Information with Dominant Memory Traces Can Support Both Memory Stability and Change] The researchers note that recent and easily retrievable information “can overwrite what was there to begin with.” Consider that next time you hear eyewitness testimony.

© 2014 Scientific American
Carl Zimmer

Forcing male flies into monogamy has a startling effect: After a few dozen generations, the flies become worse at learning. This discovery, published on Wednesday in the Proceedings of the Royal Society, isn’t a biological excuse for men who have strayed from their significant other. Instead, it’s a tantalizing clue about why intelligence evolved.

The new study was carried out by Brian Hollis and Tadeusz J. Kawecki, biologists at the University of Lausanne in Switzerland. They investigated a fly species called Drosophila melanogaster that normally has a very un-monogamous way of life. To find a mate, the male flies seek out females on rotting pieces of fruit. They often engage in battles to chase their rivals away, and then pick a female to court. “The males will do this wing song, where they use one wing or the other to generate a song,” said Dr. Hollis. This wing song may last from 10 minutes to an hour.

Virgin females usually accept the overtures. But if a female has just mated, she will reject a new male’s advances. “If a male comes at her from behind and she’s not interested, she’ll kick at him with her rear legs,” said Dr. Hollis. If a couple of days have passed since her last mating, however, the female may choose to mate again.

Seven years ago, while he was a graduate student at Florida State University, Dr. Hollis set out to study how the competition among males shapes their evolution. He began breeding two groups of flies — one polygamous, the other monogamous. In 2011, he took his flies to the University of Lausanne, where he met Dr. Kawecki, an expert on learning. The two scientists wondered if the different mating habits of Dr. Hollis’s flies had altered their brains.

© 2014 The New York Times Company
A brain-training video game that improved the vision of college baseball players by as much as two lines on an eye chart has been developed by U.S. researchers. "This is something which I think could help almost anybody," said Aaron Seitz, a neuroscientist at the University of California, Riverside, who led the research.

Players on the university's baseball team improved their visual acuity by 31 per cent after training with the app. And that translated into better performance on the baseball field, where better vision improves the odds of hitting a ball travelling well over 100 km/h. "What we found is they had fewer strikeouts, they were able to create more runs," Seitz told CBC's Quirks & Quarks in an interview that airs Saturday. The players had more runs than predicted even after taking into account the natural improvement that would be expected over the course of the season. Further calculations suggest the improved performance helped the team to win four or five additional games. Following 30 sessions of training with the app, players had better vision, fewer strikeouts, more runs and more wins.

But Seitz thinks the app has even more potential to help people with eye conditions such as lazy eye, glaucoma, or age-related macular degeneration. There are 100 million people around the world who have such low vision that glasses don't help, he added. "All that they have to gain is the brain training element.… For these people, there's just really big real-world benefits that could be achieved if we're able to improve their vision."
By Beth Skwarecki

Prions, the protein family notorious for causing "mad cow" and neurodegenerative diseases like Parkinson's, can play an important role in healthy cells. "Do you think God created prions just to kill?" mused Nobel laureate Eric Kandel. "These things must have evolved initially to have a physiological function."

His work on memory helped reveal that animals make and use prions in their nervous systems as part of an essential function: stabilizing the synapses that constitute long-term memories. These natural prions aren't infectious but on a molecular level they chain up exactly the same way as their disease-causing brethren. (Some researchers call them "prionlike" to avoid confusion.) This week, work from neuroscientist Kausik Si of the Stowers Institute for Medical Research, one of Kandel's former students, shows that the prion's action is tightly controlled by the cell, and can be turned on when a new long-term memory needs to be formed.

Prions are proteins with two unusual properties: First, they can switch between two possible shapes, one that is stable on its own and an alternate conformation that can form chains. Second, the chain-forming version has to be able to trigger others to change shape and join the chain. Say that in the normal version the protein is folded so that one portion of the protein structure—call it "tab A"—fits into its own "slot B." In the alternate form, though, tab A is available to fit into its neighbor's slot B. That means the neighbor can do the same thing to the next protein to come along, forming a chain or clump that can grow indefinitely.

© 2014 Scientific American
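The tab-A/slot-B description amounts to a simple templating rule: a protein whose tab A is exposed can dock into a neighbor's slot B and flip that neighbor into the same chain-forming shape. A toy simulation of just that rule (the two-state "N"/"P" encoding is an illustrative assumption of mine, not a model from the study):

```python
# Toy model of templated prion conversion. "N" is the normal fold
# (tab A tucked into its own slot B); "P" is the chain-forming fold
# (tab A exposed). A "P" at the chain's end converts each "N" it
# meets, so a single seed can recruit an entire pool.
def grow_chain(seed, pool):
    chain = [seed]
    for _ in pool:
        if chain[-1] == "P":   # exposed tab A docks into the newcomer's slot B
            chain.append("P")  # ...flipping the newcomer to the P fold
        else:
            break              # a normal fold cannot recruit anything
    return chain

print(grow_chain("P", ["N"] * 5))  # one seed converts all five
print(grow_chain("N", ["N"] * 5))  # without a seed, no chain forms
```

The all-or-nothing flip is why a single misfolded seed suffices to grow a clump indefinitely, and why, as Si's work suggests, the cell must tightly gate when the self-templating form is allowed to appear.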
James Hamblin

Brain training is becoming big business. Everywhere you look, someone is talking about neuroplasticity and trying to train your brain. Soon there will be no wild brains left. At the same time, everyone who spends more than two continuous hours using a computer is, according to the American Optometric Association, ruining their eyes with Computer Vision Syndrome. So, Dr. Aaron Seitz might be onto something with his new brain-training program that promises better vision.

UltimEyes is a game-based app that's sold as "fun and rewarding" as it improves your vision and "reverse[s] the effects of aging eyes." It doesn't claim to work on the eyes themselves, but on the brain cortex that processes vision—the part that takes blurry puzzle pieces from the eyes and arranges them into a sweet puzzle. (Brain training for memory, the kind we hear about the most on TV, would be the part that lacquers the finished puzzle, frames it, and hangs it on the wall.)

A standard 25-minute session using UltimEyes forces your eyes to work in ways they probably don't in everyday life, and its website warns that after the first use, "just like the first time that you go to the gym, your eyes may feel a bit tired. This experience typically goes away by your third session as your visual system adjusts to its new work-out routine."

Seitz is a neuroscientist at the University of California, Riverside. To test out his vision-training game, he had players on the university's baseball team use the app. Half the team trained for 30 sessions. For comparison, the other half did no training.

© 2014 by The Atlantic Monthly Group
By GREGORY COWLES

David Stuart MacLean’s first book, “The Answer to the Riddle Is Me,” opens with a scene out of Robert Ludlum: The protagonist wakes from a blackout to find himself on a crowded train platform in India, with no idea who he is or what he’s doing in a foreign country. The catch is that the protagonist is Mr. MacLean himself, and his book isn’t an international thriller but a “memoir of amnesia,” as his agreeably paradoxical subtitle puts it — the true story of how his memory was wiped clean and how that condition has subsequently affected his life. It is all the more thrilling for that.

In 2002, Mr. MacLean was a 28-year-old Fulbright scholar visiting India to research a novel. It wasn’t his first trip; he had gone a few years earlier and stayed for months. But this time around, his anti-malaria medication touched off a break with reality as sudden as it was severe. He hallucinated angels and demons, and felt his thoughts “puddling in the carpet near the doorway and sloshing down the hall.” Delirious, he agreed with the police officer who surmised he must be a drug addict, and apologized profusely for misdeeds he had never committed. At the hospital, a nurse called him “the most entertaining psychotic that they’d ever had.”

As harrowing as this territory is, Mr. MacLean makes an affable, sure-footed guide. In his descriptions, you can recognize the good fiction writer he must have been even before amnesia forced him to view the world anew; if the writer’s task is to “make it new,” then losing your memory turns out to be an unexpected boon.

An avid drinker before his breakdown, he recoils the first time he tries Scotch again, thinking it smells “like Band-Aids.” He can’t remember his girlfriend of a year, but her voice is “faintly familiar, like the smell of the car heater the first time you turn it on in the fall.” He grasps at hope when his parents arrive to take him home: “I still didn’t have my memory, but I now had an outline of myself, like a tin form waiting for batter.”

© 2014 The New York Times Company
Katherine Sharpe

Ben Harkless could not sit still. At home, the athletic ten-year-old preferred doing three activities at once: playing with his iPad, say, while watching television and rolling on an exercise ball. Sometimes he kicked the walls; other times, he literally bounced off them. School was another story, however. Ben sat in class most days with his head down on his desk, “a defeated heap”, remembers his mother, Suzanne Harkless, a social worker in Berkeley, California. His grades were poor, and his teacher was at a loss for what to do.

Harkless took Ben to a therapist who diagnosed him with attention deficit hyperactivity disorder (ADHD). He was prescribed methylphenidate, a stimulant used to improve focus in people with the condition. Harkless was reluctant to medicate her child, so she gave him a dose on a morning when she could visit the school to observe. “He didn't whip through his work, but he finished his work,” she says. “And then he went on and helped his classmate next to him. My jaw dropped.”

ADHD diagnoses are rising rapidly around the world and especially in the United States, where 11% of children aged between 4 and 17 years old have been diagnosed with the disorder. Between half and two-thirds of those are put on medication, a decision often influenced by a child's difficulties at school. And there are numerous reports of adolescents and young adults without ADHD using the drugs as study aids.

© 2014 Nature Publishing Group