Chapter 17. Learning and Memory
by Lauren Hitchings

Our brain's ability to rapidly interpret and analyse new information may lie in the musical hum of our brainwaves. We continuously take in information about the world, but establishing new neural connections and pathways – the process thought to underlie memory formation – is too slow to account for our ability to learn rapidly.

Evan Antzoulatos and Earl Miller at the Massachusetts Institute of Technology decided to see if brainwaves – the surges of electricity produced by individual neurons firing en masse – play a role. They used EEG to observe patterns of electrical activity in the brains of monkeys as they taught the animals to categorise patterns of dots into two distinct groups. At first, the monkeys memorised which dots went where, but as the task became harder, they shifted to learning the rules that defined the categories.

Humming brainwaves

The researchers found that, initially, brainwaves of different frequencies were being produced independently by the prefrontal cortex and the striatum – two brain regions involved in learning. But as the monkeys made sense of the game, the waves began to synchronise and "hum" at the same frequency – with each category of dots having its own frequency.

Miller says the synchronised brainwaves indicate the formation of a communication circuit between the two brain regions. He believes this happens before anatomical changes in brain connections take place, giving our minds time to think through various options when presented with new information before the right one gets laid down as a memory. Otherwise, the process is too time-consuming to account for the flexibility and speed of the human mind, says Miller.

© Copyright Reed Business Information Ltd.
Keyword: Learning & Memory
Link ID: 19746 - Posted: 06.19.2014
by Bethany Brookshire

When a cartoon character gets an idea, you know it. A lightbulb goes on over Wile E. Coyote's head, or a ding sounds as Goofy puts two and two together. While the lightbulb and sound effects are the stuff of cartoons, scientists can, in a way, watch learning in action. In a new study, a learning task in rats was linked to increases in activity patterns in groups of brain cells. The results might help scientists pin down what learning looks like at the nerve cell level, and give us a clue about how memories are made.

Different areas of the brain communicate with each other, transferring information from one area to another for processing and interpretation. Brain cell meets brain cell at connections called synapses. But transferring information between areas often takes more than one neuron firing a lonely signal. It takes cortical oscillations — networks of brain cells sending electrical signals in concert — over and over again for a message to transmit from one brain area to another. Changes in electrical fields increase the probability that neurons in a population will fire.

These cortical oscillations are like a large crowd chanting. Not all voices may be yelling at once, some people may be ahead or behind, some may even be whispering, but you still hear an overwhelming "USA! USA!" Cortical oscillations can occur within a single brain area, or they can extend from one area to another. "The oscillation tells you what the other brain area is likely to 'see' when it gets that input," explains Leslie Kay, a neuroscientist at the University of Chicago. Once the receiving area 'sees' the incoming oscillation, it may synchronize its own population firing, joining in the chant. "A synchronized pattern of oscillations in two separate brain regions serves to communicate between the two regions," says Kei Igarashi, a neuroscientist at the Norwegian University of Science and Technology in Trondheim.

© Society for Science & the Public 2000 - 2013
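The synchrony that Kay and Igarashi describe is commonly quantified as phase locking: when two rhythms share a frequency and keep a stable phase relationship, a phase-locking value approaches 1; when they drift independently, it falls toward 0. The following toy simulation (not from the study; the signals, frequencies, and noise levels are invented for illustration) sketches the idea with two synthetic "regions":

```python
import numpy as np

def phase_locking(sig_a, sig_b):
    """Phase-locking value between two real signals, via the analytic phase.
    Near 1 when the rhythms keep a fixed phase relationship; near 0 otherwise."""
    def phase(x):
        # Analytic signal by FFT (a simple stand-in for scipy.signal.hilbert)
        n = len(x)
        f = np.fft.fft(x)
        h = np.zeros(n)
        h[0] = 1.0
        h[1:(n + 1) // 2] = 2.0
        if n % 2 == 0:
            h[n // 2] = 1.0
        return np.angle(np.fft.ifft(f * h))
    dphi = phase(sig_a) - phase(sig_b)
    return abs(np.mean(np.exp(1j * dphi)))

t = np.linspace(0, 1, 1000, endpoint=False)
rng = np.random.default_rng(0)
hum = np.sin(2 * np.pi * 8 * t)                                   # an 8 Hz "chant"
synced = np.sin(2 * np.pi * 8 * t + 0.3) + 0.1 * rng.standard_normal(1000)
unsynced = np.sin(2 * np.pi * 13 * t) + 0.1 * rng.standard_normal(1000)

print(phase_locking(hum, synced))    # high: regions "humming" together
print(phase_locking(hum, unsynced))  # low: independent rhythms
```

The crowd-chanting analogy maps directly: individual samples (voices) are noisy, but the population rhythm is what the phase-locking value picks up.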
Keyword: Learning & Memory
Link ID: 19742 - Posted: 06.17.2014
Virginia Morell

Teaching isn't often seen in animals other than humans—and it's even more difficult to demonstrate in animals living in the wild rather than in a laboratory setting. But researchers studying the Australian superb fairy-wren (Malurus cyaneus) in the wild think the small songbirds practice the behavior. They regard a female fairy-wren sitting on her nest and incubating her eggs as the teacher, and her embryonic chicks as her pupils. She must teach her unhatched chicks a password—a call they will use after emerging to solicit food from their parents; the better they learn the password, the more they will be fed.

Since 1992, there's been a well-accepted definition of teaching that consists of three criteria. First, the teacher must modify his or her behavior in the presence of a naive individual—which the birds do; the mothers increase their teaching (that is, the rate at which they make the call) when their chicks are in a late stage of incubation. Second, there must be a benefit to the pupil, which there clearly is. Scientists reported online yesterday in Behavioral Ecology that the fairy-wrens also meet the third criterion: there must be a cost to the teacher.

And for the small birds, there can be a hefty price to pay. The more often a female repeats the password, the more likely she is to attract a parasitical cuckoo, which will sneak in and lay its eggs in her nest. From careful field observations, the scientists discovered that at nests that were parasitized, the females had recited their password 20 times an hour. But at nests that were not parasitized, the females had called only 10 times per hour. Superb fairy-wrens thus join a short but growing list of animal teachers, such as rock ants, meerkats, and pied babblers.

© 2014 American Association for the Advancement of Science.
by Ashley Yeager

Being put under anesthesia as an infant may make it harder to recall details or events later in life. Previous studies on animals had shown that anesthesia impairs parts of the brain that help with recollection, but it was not clear how this type of temporary loss of consciousness affected humans. Comparing the memory of 28 children ages 6 to 11 who had undergone anesthesia as infants with that of 28 children of similar ages who had not been put under suggests that the early treatment impairs recollection later in life, researchers report June 9 in Neuropsychopharmacology. The team reported similar results for a small study on rats and notes that early anesthesia did not appear to affect the children's familiarity with objects and events or their IQ.

© Society for Science & the Public 2000 - 2013.
by Moheb Costandi

Rest easy after learning a new skill. Experiments in mice suggest that a good night's sleep helps us lay down memories by promoting the growth of new connections between brain cells.

Neuroscientists believe that memory involves the modification of synapses, which connect brain cells, and numerous studies published over the past decade have shown that sleep enhances the consolidation of newly formed memories in people. But exactly how these observations were related was unclear.

To find out, Wenbiao Gan of the Skirball Institute of Biomolecular Medicine at New York University Medical School and his colleagues trained 15 mice to run backwards or forwards on a rotating rod. They allowed some of them to fall asleep afterwards for 7 hours, while the rest were kept awake. The team monitored the activity and microscopic structure of the mice's motor cortex, the part of the brain that controls movement, through a small transparent "window" in their skulls. This allowed them to watch in real time how the brain responded to learning the different tasks.

Sprouting spines

They found that learning a new task led to the formation of new dendritic spines – tiny structures that project from the branches of nerve cells and help pass electric signals from one neuron to another – but only in the mice left to sleep. This happened during the non-rapid eye movement stage of sleep. Each task caused a different pattern of spines to sprout along the branches of the same motor cortex neurons.

© Copyright Reed Business Information Ltd.
By Sadie Dingfelder

Want to become famous in the field of neuroscience? You could go the usual route, spending decades collecting advanced degrees, slaving away in science labs and publishing your results. Or you could simply fall victim to a freak accident.

The stars of local science writer Sam Kean's new book, "The Tale of the Dueling Neurosurgeons" (which he'll discuss Saturday at Politics and Prose), took the latter route. Be it challenging the wrong guy to a joust, spinning out on a motorcycle, or suffering a stroke, these folks sustained brain injuries with bizarre and fascinating results. One man, for instance, lost the ability to identify different kinds of animals but had no trouble naming plants and objects. Another man lost his short-term memory. The result? A diary filled with entries like: "I am awake for the very first time." "Now, I'm really awake." "Now, I'm really, completely awake."

Unfortunate mishaps like these have advanced our understanding of how the gelatinous gray mass that (usually) stays hidden inside our skulls gives rise to thoughts, feelings and ideas, Kean says. "Traditionally, every major discovery in the history of neuroscience came about this way," he says. "We had no other way of looking at the brain for centuries and centuries, because we didn't have things like MRI machines."

Rather than covering the case studies textbook-style, Kean provides all the gory details. Consider Phineas Gage. You may remember from Psych 101 that Gage, a railroad worker, survived having a metal rod launched through his skull. You might not know, however, that one doctor "shaved Gage's scalp and peeled off the dried blood and gelatinous brains. He then extracted skull fragments from the wound by sticking his fingers in from both ends, Chinese-finger-trap-style," as Kean writes in his new book.

© 1996-2014 The Washington Post
Ewen Callaway

By controlling rat brain cells that they had genetically engineered to respond to light, researchers were able to create fearful memories of events that never happened — and then to erase those memories again.

Neuroscientists can breathe a collective sigh of relief. Experiments have confirmed a long-standing theory for how memories are made and stored in the brain. Researchers have created and erased frightening associations in rats' brains using light, providing the most direct demonstration yet that the strengthening and weakening of connections between neurons is the basis for memory.

"This is the best evidence so far available, period," says Eric Kandel, a neuroscientist at Columbia University in New York. Kandel, who shared the 2000 Nobel Prize in Physiology or Medicine for his work unravelling the molecular basis of memory, was not involved in the latest study, which was published online in Nature on 1 June.

In the 1960s and 1970s, researchers in Norway noticed a peculiar property of brain cells. Repeatedly delivering a burst of electricity to a neuron in an area of the brain known as the hippocampus seemed to boost the cell's ability to talk to a neighbouring neuron. These communiqués occur across tiny gaps called synapses, which neurons can form with thousands of other nerve cells. The process was called long-term potentiation (LTP), and neuroscientists suspected that it was the physical basis of memory. The hippocampus, they realized, was important for forming long-term memories, and the long-lasting nature of LTP hinted that information might be stored in a neural circuit for later recall.

© 2014 Nature Publishing Group
Jessica Morrison

Bees, like birds and butterflies, use the Sun as a compass for navigation, whereas mammals typically find their way by remembering familiar landmarks on a continuous mental map. However, the latest research suggests that bees also use this type of map, despite their much smaller brain size. The work adds a new dimension to complex bee-navigation abilities that have long captivated researchers.

"The surprise comes for many people that such a tiny little brain is able to form such a rich memory described as a cognitive map," says co-author Randolf Menzel, a neurobiologist at the Free University of Berlin. The research by Menzel and his team, published today in the Proceedings of the National Academy of Sciences, demonstrates that bees can find their way back to their hives without relying solely on the Sun. Instead, they seem to use a 'cognitive map' made up of memorized landscape snapshots that direct them home.

The cognitive map used by mammals is thought to originate in the brain's hippocampus. Humans employ such maps on a daily basis; for example, even in a windowless office, many people can point towards their home, orienting themselves in space based on knowledge of their location relative to the outside world. "They can point to their home generally even though they can't see it, even along a path through a wall that they haven't travelled," explains Fred Dyer, a behavioural biologist at Michigan State University in East Lansing, who was not involved in the research. The study authors argue that bees can do something similar, albeit on a much more rudimentary level.

© 2014 Nature Publishing Group
Keyword: Animal Migration
Link ID: 19684 - Posted: 06.03.2014
By DAAN HEERMA VAN VOSS

I was 25 when I lost my memory. It happened on Jan. 16, 2012. I woke up, not knowing where I was. I was lying in bed, sure, but whose bed was it? There was no one in the room, no sound that I recognized: I was alone with my body. Of course, my relationship to my body was radically different than before. My body parts seemed to belong to someone else or, rather, to something else. The vague sense of identity that I possessed was confined to the knowledge of my name, but even that felt arbitrary — a collection of random letters, crumbling. No words can accurately describe the feeling of losing your memory, your life.

Underlying the loss of facts is a deeper problem: the loss of logic and causality. A person can function, ask questions, only when he recognizes a fundamental link between circumstances and time, past and present. The links between something happening to you, leading you to do or say something, which leads to someone else responding. No act is without an act leading up to it, no word is without a word that came before. Without the sense of causality provided by memory, there is chaos. When I woke up, I had no grip on logic, and logic none on me. It was a profound not-knowing, and it was terrifying. I started hyperventilating.

What struck me has a name: Transient Global Amnesia. T.G.A., as it's referred to, is a neurological disorder. The name sounds definitive, but in fact it's just a fancy way of saying: we don't know the cause, we know only what the symptoms are. Its most defining symptom is a near total disruption of short-term memory. In many cases, there is a temporary loss of long-term memory as well. But there is a bright side. T.G.A. lasts for approximately two to 20 hours, so it's a one-day thing. At the time, though, I didn't know this. Two names popped into my mind: Daniel and Sophie. I didn't know where the names came from, or to whom they belonged.
I stumbled across the room, opened a door, and discovered that I was alone in the apartment. (It was, in fact, my apartment.) I found an iPhone and, quite magically, I thought, knew how to work it. As it turns out, there was nothing magical about this: a characteristic of T.G.A. is that those afflicted with it can perform familiar tasks, even ones as difficult as driving a car. (But I wouldn't recommend that.)

Occurrence of T.G.A. is rare, with at most 10 cases per 100,000 people. It is most likely to happen when you're between 40 and 80; the average age of a T.G.A. patient is 62. But I have always been in the fast lane.

© 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 19681 - Posted: 06.02.2014
Elizabeth Norton

It's a sad fact that children born in poverty start out at a disadvantage and fall further behind more privileged kids as they grow up. In developing countries, chiefly in Africa and Asia, some 200 million children under age 5 won't reach the same milestones—for physical growth, school performance, and earnings later on—as children who are less deprived. But a new analysis of a long-term study in Jamaica shows that surprisingly simple ways of stimulating children's mental development can have dramatic benefits later in life.

The children were participants in the Jamaican Study, a project geared toward improving cognitive development begun in the mid-1980s by child health specialists Sally Grantham-McGregor of University College London and Susan Walker of the University of the West Indies, Mona, in Jamaica. They focused on children between the ages of 9 and 24 months whose growth was stunted, placing them in the bottom 5% of height for their age and sex (an easy-to-quantify gauge of extreme poverty). Children of normal height in the same neighborhoods were also studied for comparison.

For 2 years, community health workers visited the families weekly. One group was given nutritional assistance only (a formula containing 66% of daily recommended calories, along with vitamins and minerals). One group received a mental and social stimulation program only, and one group got stimulation and nutritional assistance. A final group had no intervention and served as a control. The mental stimulation program involved giving parents simple picture books and handmade toys, and encouraging them to read and sing to their children and point out names of objects, shapes, and colors. They were also taught better ways to converse and respond to their toddlers.

These everyday interactions aren't always part of the culture in low-income countries, explains Paul Gertler, an economist at the University of California, Berkeley. "Parents might have five or six kids and few toys. They might be working really hard and have a lot of competing demands. They might not have been taught how to talk to their children, or how important and effective it is," he says. Past research attests to the importance of everyday conversation for children's mental development: a recent study suggests that children of affluent parents do better in life in large part because their parents talk to them more.

© 2014 American Association for the Advancement of Science
By KATE MURPHY

The baseball hurtles toward the batter, and he must decide from its rotation whether it's a fastball worth a swing or a slider about to drop out of the strike zone. Running full speed, the wide receiver tracks both the football flying through the air and the defensive back on his heels. Golfers must rapidly shift visual focus in order to drive the ball at their feet toward a green in the distance.

Many athletes need excellent vision to perform well in their sports, and now many are adding something new to their practice regimens: vision training. The idea has been around for years, but only recently have studies hinted that it might really work — that it might be possible to train yourself to see better without resorting to glasses or surgery. "Vision training has been out there for a long time," said Mark Blumenkranz, a professor of ophthalmology at Stanford University Medical School. "But it's being made more respectable lately thanks to the attention it's been getting from psychophysicists, vision scientists, neurologists and optometrists."

Vision training actually has little to do with improving eyesight. The techniques, a form of perceptual learning, are intended to improve the ability to process what is seen. The idea is that if visual sensory neurons are repeatedly activated, they increase their ability to send electrical signals from one cell to another across connecting synapses. If neurons are not used, over time these transmissions are weakened. "With sensory neurons, just like muscles, it's use it or lose it," said Dr. Bernhard Sabel, a neuroscientist at Otto von Guericke University in Magdeburg, Germany, who studies plasticity in the brain. "This applies both to athletes and the partially blind."

Vision training may involve simple strategies — for instance, focusing sequentially on beads knotted at intervals on a length of string with one end held at the tip of the nose. This is said to improve convergence (inward turning of the eye to maintain binocular vision) and the ability to focus near and far.

© 2014 The New York Times Company
By BENEDICT CAREY

SAN DIEGO – The last match of the tournament had all the elements of a classic showdown, pitting style versus stealth, quickness versus deliberation, and the world's foremost card virtuoso against its premier numbers wizard. If not quite Ali-Frazier or Williams-Sharapova, the duel was all the audience of about 100 could ask for. They had come to the first Extreme Memory Tournament, or XMT, to see a fast-paced, digitally enhanced memory contest, and that's what they got.

The contest, an unusual collaboration between industry and academic scientists, featured one-minute matches between 16 world-class "memory athletes" from all over the world as they met in a World Cup-like elimination format. The grand prize was $20,000; the potential scientific payoff was large, too. One of the tournament's sponsors, the company Dart NeuroScience, is working to develop drugs for improved cognition. The other, Washington University in St. Louis, sent a research team with a battery of cognitive tests to determine what, if anything, sets memory athletes apart. Previous research was sparse and inconclusive.

Yet as the two finalists, both Germans, prepared to face off — Simon Reinhard, 35, a lawyer who holds the world record in card memorization (a deck in 21.19 seconds), and Johannes Mallow, 32, a teacher with the record for memorizing digits (501 in five minutes) — the Washington group had one preliminary finding that wasn't obvious. "We found that one of the biggest differences between memory athletes and the rest of us," said Henry L. Roediger III, the psychologist who led the research team, "is in a cognitive ability that's not a direct measure of memory at all but of attention."

People have been performing feats of memory for ages, scrolling out pi to hundreds of digits, or phenomenally long verses, or word pairs.
Most store the studied material in a so-called memory palace, associating the numbers, words or cards with specific images they have already memorized; then they mentally place the associated pairs in a familiar location, like the rooms of a childhood home or the stops on a subway line. The Greek poet Simonides of Ceos is credited with first describing the method, in the fifth century B.C., and it has been vividly described in popular books, most recently “Moonwalking With Einstein,” by Joshua Foer. © 2014 The New York Times Company
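The memory-palace procedure described above is mechanical enough to sketch in a few lines of code: pair each item with a pre-memorized image, then park each pair at the next stop on a familiar route. The peg images and household loci below are illustrative inventions, not the actual systems any competitor uses:

```python
# Toy sketch of the "memory palace" (method of loci) for digit strings.
# The digit-to-image pegs and the route are invented examples.

PEGS = {  # one pre-memorized image per digit
    "0": "hero", "1": "tie", "2": "Noah", "3": "ma", "4": "rye",
    "5": "law", "6": "shoe", "7": "cow", "8": "ivy", "9": "bee",
}

LOCI = ["front door", "hallway mirror", "kitchen table",
        "staircase", "bedroom window"]  # a well-known route, e.g. a childhood home

def build_palace(digits):
    """Place each digit's peg image at the next stop on the route."""
    return [(locus, PEGS[d]) for locus, d in zip(LOCI, digits)]

def recall(journey):
    """Walk the route again and read the digits back off the images."""
    image_to_digit = {image: d for d, image in PEGS.items()}
    return "".join(image_to_digit[image] for _, image in journey)

palace = build_palace("31415")
for locus, image in palace:
    print(f"At the {locus}, picture a {image}.")
print(recall(palace))  # "31415"
```

The dictionaries do the rote pairing; the athlete's real work is the vivid imagery at each locus, which is what makes the ordered sequence stick.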
By Beth Skwarecki

The protein family notorious for causing neurodegenerative diseases such as Parkinson's—not to mention mad cow—appears to play an important role in healthy cells. "Do you think God created prions just to kill?" muses Eric R. Kandel of Columbia University. "These things have evolved initially to have a physiological function."

Kandel's work on memory helped to reveal that animals make and use prions in their nervous systems as part of an essential function: stabilizing the synapses involved with forming long-term memories. These natural prions are not infectious, but on a molecular level they chain up exactly the same way as their disease-causing brethren. (Some researchers call them "prionlike" to avoid confusion.)

Now neuroscientist Kausik Si of the Stowers Institute for Medical Research in Kansas City, Mo., one of Kandel's former students, has shown that the prion's action is tightly controlled by the cell and can be turned on when a new long-term memory needs to be formed. Once the prion's chain reaction gets started, it is self-perpetuating, and thus the synapse—where neurons connect—can be maintained after the initial trigger is gone, perhaps for a lifetime. But that still does not explain how the first prion is triggered or why it happens at only certain synapses, which play a crucial role in forming memories.

Si's work, published February 11 in PLOS Biology, traces the biochemistry of this protein-preservation process in fruit flies, showing how the cell turns on the machinery responsible for the persistence of memory—and how the memory can be stabilized at just the right time and in the right place.

© 2014 Scientific American
By DANIEL GOLEMAN

Which will it be — the berries or the chocolate dessert? Homework or the Xbox? Finish that memo, or roam Facebook? Such quotidian decisions test a mental ability called cognitive control, the capacity to maintain focus on an important choice while ignoring other impulses. Poor planning, wandering attention and trouble inhibiting impulses all signify lapses in cognitive control. Now a growing stream of research suggests that strengthening this mental muscle, usually with exercises in so-called mindfulness, may help children and adults cope with attention deficit hyperactivity disorder and its adult equivalent, attention deficit disorder.

The studies come amid growing disenchantment with the first-line treatment for these conditions: drugs. In 2007, researchers at the University of California, Los Angeles, published a study finding that the incidence of A.D.H.D. among teenagers in Finland, along with difficulties in cognitive functioning and related emotional disorders like depression, was virtually identical to rates among teenagers in the United States. The real difference? Most adolescents with A.D.H.D. in the United States were taking medication; most in Finland were not. "It raises questions about using medication as a first line of treatment," said Susan Smalley, a behavior geneticist at U.C.L.A. and the lead author.

In a large study published last year in The Journal of the American Academy of Child & Adolescent Psychiatry, researchers reported that while most young people with A.D.H.D. benefit from medications in the first year, these effects generally wane by the third year, if not sooner. "There are no long-term, lasting benefits from taking A.D.H.D. medications," said James M. Swanson, a psychologist at the University of California, Irvine, and an author of the study. "But mindfulness seems to be training the same areas of the brain that have reduced activity in A.D.H.D."

© 2014 The New York Times Company
By DAVID L. KIRP

Whenever President Obama proposes a major federal investment in early education, as he did in his two most recent State of the Union addresses, critics have a two-word riposte: Head Start. Researchers have long cast doubt on that program's effectiveness. The most damning evidence comes from a 2012 federal evaluation that used gold-standard methodology and concluded that children who participated in Head Start were no more successful in elementary school than others. That finding was catnip to the detractors. "Head Start's impact is no better than random," The Wall Street Journal editorialized. Why throw good money after bad?

Though the faultfinders have a point, the claim that Head Start has failed overstates the case. For one thing, the program has gotten considerably better in the past few years because of tougher quality standards. For another, researchers have identified a "sleeper effect" — many Head Start youngsters begin to flourish as teenagers, maybe because the program emphasizes character and social skills as well as the three R's. Still, few would give Head Start high marks, and the bleak conclusion of the 2012 evaluation stands in sharp contrast to the impressive results from well-devised studies of state-financed prekindergartens.

Head Start, a survivor of President Lyndon B. Johnson's war on poverty, enrolls only poor kids. That's a big part of the problem — as the adage goes, programs for the poor often become poor programs. Whether it's health care (compare the trajectories of Medicare, for those 65 and older of all incomes, and Medicaid, only for the poor), education or housing, the sorry truth is that "we" don't like subsidizing "them." Head Start is no exception. It has been perpetually underfunded, never able to enroll more than half of eligible children or pay its teachers a decent wage.

If Head Start is going to realize its potential, it has to break out of the antipoverty mold. One promising but rarely used strategy is to encourage all youngsters, not just poor kids, to enroll, with poor families paying nothing and middle-class families contributing on a sliding scale. Another is to merge Head Start with high-quality state prekindergarten.

© 2014 The New York Times Company
Helen Shen

For anyone fighting to save old memories, a fresh crop of brain cells may be the last thing they need. Research published today in Science suggests that newly formed neurons in the hippocampus — an area of the brain involved in memory formation — could dislodge previously learned information. The work may provide clues as to why childhood memories are so difficult to recall.

"The finding was very surprising to us initially. Most people think new neurons mean better memory," says Sheena Josselyn, a neuroscientist who led the study together with her husband Paul Frankland at the Hospital for Sick Children in Toronto, Canada.

Humans, mice and several other mammals grow new neurons in the hippocampus throughout their lives — rapidly at first, but more and more slowly with age. Researchers have previously shown that boosting neural proliferation before learning can enhance memory formation in adult mice. But the latest study shows that after information is learned, neuron growth can degrade those memories. Although seemingly counterintuitive, the disruptive role of these neurons makes some sense, says Josselyn. She notes that some theoretical models have predicted such an effect. "More neurons increase the capacity to learn new memories in the future," she says. "But memory is based on a circuit, so if you add to this circuit, it makes sense that it would disrupt it." Newly added neurons could have a useful role in clearing old memories and making way for new ones, says Josselyn.

Forgetting curve

The researchers tested newborn and adult mice on a conditioning task, training the animals to fear an environment in which they received repeated electric shocks. All the mice learned the task quickly, but whereas infant mice remembered the negative experience for only one day after training, adult mice retained the negative memory for several weeks.

© 2014 Nature Publishing Group
By GRETCHEN REYNOLDS

The more physically active you are at age 25, the better your thinking tends to be when you reach middle age, according to a large-scale new study. Encouragingly, the findings also suggest that if you neglected to exercise when young, you can start now and still improve the health of your brain.

Those of us past age 40 are generally familiar with those first glimmerings of forgetfulness and muddled thinking. We can't easily recall people's names, certain words, or where we left the car keys. "It's what we scientists call having a C.R.S. problem," said David R. Jacobs, a professor of public health at the University of Minnesota in Minneapolis and a co-author of the new study. "You can't remember stuff."

But these slight midlife declines in thinking skills strike some people later or less severely than others, and scientists have not known why. Genetics almost certainly play a role, most researchers agree. Yet the contribution of lifestyle, and in particular of exercise habits, has been unclear.

So recently, Dr. Jacobs and colleagues from universities in the United States and overseas turned to a large trove of data collected over several decades for the Cardia study. The study, whose name is short for Coronary Artery Risk Development in Young Adults, began in the mid-1980s with the recruitment of thousands of men and women then ages 18 to 30 who underwent health testing to determine their cholesterol levels, blood pressure and other measures. Many of the volunteers also completed a treadmill run to exhaustion, during which they strode at an increasingly brisk pace until they could go no farther. The average time to exhaustion among these young adults was 10 minutes, meaning that most were moderately but not tremendously fit.

© 2014 The New York Times Company
Erin Allday The game seems pretty simple. An alien-looking creature stands on a block of ice that's flowing down a river. The goal is to maneuver the ice around whales and other hurdles and periodically cause the alien to "jump" to grab green fish as they leap out of the water. The game is played on a tablet, and it looks a lot like any of hundreds of apps that can be downloaded for some mindless entertainment during an afternoon commute on BART. Here's what sets the game apart: It was designed by scientists at UCSF looking for a new way to treat serious symptoms of depression. "We're trying to see whether we can get the same effects with the game as with therapy," said Patricia Arean, a clinical psychologist at UCSF who is studying the potential mental health benefits of video game play in older adults. Arean is joining the burgeoning field of research into the use of video games as tools for promoting brain health. Video games undoubtedly have some kind of effect on our brains, but harnessing the technology and forcing a lasting - and positive - change is the challenge. So far, what little evidence exists that video games can have a measurable impact on brain activity has been gathered almost entirely on healthy subjects. But in small clinical trials - like Arean's study of depression in older adults - the effects of games on both healthy and unhealthy people are being studied to find out whether they're useful in treating conditions such as autism, attention deficit hyperactivity disorder, and post-traumatic stress disorder. Some neuroscientists say video games may also strengthen neural networks, potentially preventing or slowing down the brain deterioration associated with old age or diseases like Alzheimer's or Parkinson's. "We're in the infancy of this idea that entertaining and gaming stuff can be useful for you," said Joaquin Anguera, a UCSF neuroscientist who designs cognitive training games, including the one Arean is testing with patients. © 2014 Hearst Communications, Inc.
by Helen Thomson A 22-year-old man has been instantaneously transported to his family's pizzeria and his local railway station – by having his brain zapped. These fleeting visual hallucinations have helped researchers pinpoint places where the brain stores visual location information. Pierre Mégevand at the Feinstein Institute for Medical Research in Manhasset, New York, and his colleagues wanted to discover just where in the brain we store and retrieve information about locations and places. They sought the help of a 22-year-old man being treated for epilepsy, because the treatment involved implanting electrodes into his brain that would record his neural activity. Mégevand and his colleagues scanned the volunteer's brain using functional MRI while he looked at pictures of different objects and scenes. They then recorded activity from the implanted electrodes as he looked at a similar set of pictures. In both situations, a specific area of the cortex around the hippocampus responded to images of places, but not to images of other kinds of objects, such as body parts or tools. "There are these little spots of tissues that seem to care about houses and places more than any other class of object," says research team member Ashesh Mehta, also at the Feinstein Institute. Next, the team used the implanted electrodes to stimulate the brain in this area – a move that the volunteer said triggered a series of complex visual hallucinations. First he described seeing a railway station in the neighbourhood where he lives. Stimulation of a nearby area elicited another hallucination, this time of a staircase and a blue closet in his home. When stimulation of these areas was repeated, the same scenes arose. © Copyright Reed Business Information Ltd.
By Emily Chung, CBC News If you're in your late 20s or older, you're not as sharp as you used to be, suggests a study of gamers playing the popular video game Starcraft 2. The study analyzed the way 3,305 people, aged 16 to 44, played the game against a single random opponent of similar skill, in order to measure the gamers' cognitive motor performance. Cognitive motor performance is how quickly your brain reacts to things happening around you, allowing you to act during tasks such as driving. The analysis revealed exactly when advancing age starts to take its toll on brain performance – at the tender age of 24 years. The results were published late last week in the journal PLOS ONE. Joe Thompson, lead author of the study, said he was surprised by how early the decline started and how big the age effect was, even among those in their 30s. "If you're 39, competing against a 24-year-old and you're both in the otherwise same level of skill," Thompson said, "the effect of age is expected to offset a great deal of your learning." Starcraft 2 is a popular strategy game, similar in concept to Risk, in which players compete to build armies and conquer a science-fiction world. Unlike Risk, however, players don't take turns. "Starcraft is like high-speed chess," said Thompson, a PhD student who plays the game himself. "You simply can make as many moves as you want, as fast as you can go." Players can't see the whole "world" at once; as they mine the resources needed to build up their armies, attack their opponents, and defend against opponents' attacks, they need to quickly move their screen from one part of the world to another. © CBC 2014