Chapter 17. Learning and Memory

By recording from the brains of bats as they flew and landed, scientists have found that the animals have a "neural compass" - allowing them to keep track of exactly where and even which way up they are. These head-direction cells track bats in three dimensions as they manoeuvre. The researchers think a similar 3D internal navigation system is likely to be found throughout the animal kingdom. The findings are published in the journal Nature. Lead researcher Arseny Finkelstein, from the Weizmann Institute of Science in Rehovot, Israel, explained that this was the first time measurements had been taken from animals as they flew around a space in any direction and even carried out their acrobatic upside-down landings. "We're the only lab currently able to conduct wireless recordings in flying animals," he told BBC News. "A tiny device attached to the bats allows us to monitor the activity of single neurons while the animal is freely moving." Decades of study of the brain's internal navigation system garnered three renowned neuroscientists this year's Nobel Prize for physiology and medicine. The research, primarily in rats, revealed how animals had "place" and "grid" cells - essentially building a map in the brain and coding for where on that map an animal was at any time. Mr Finkelstein and his colleagues' work in bats has revealed that their brains also have "pitch" and "roll" cells. These tell the animal whether it is pointing upwards or downwards and whether its head is tilted one way or the other. BBC © 2014

Keyword: Hearing; Learning & Memory
Link ID: 20393 - Posted: 12.04.2014

By CHRISTOPHER F. CHABRIS and DANIEL J. SIMONS NEIL DEGRASSE TYSON, the astrophysicist and host of the TV series “Cosmos,” regularly speaks to audiences on topics ranging from cosmology to climate change to the appalling state of science literacy in America. One of his staple stories hinges on a line from President George W. Bush’s speech to Congress after the 9/11 terrorist attacks. In a 2008 talk, for example, Dr. Tyson said that in order “to distinguish we from they” — meaning to divide Judeo-Christian Americans from fundamentalist Muslims — Mr. Bush uttered the words “Our God is the God who named the stars.” Dr. Tyson implied that President Bush was prejudiced against Islam in order to make a broader point about scientific awareness: Two-thirds of the named stars actually have Arabic names, given to them at a time when Muslims led the world in astronomy — and Mr. Bush might not have said what he did if he had known this fact. This is a powerful example of how our biases can blind us. But not in the way Dr. Tyson thought. Mr. Bush wasn’t blinded by religious bigotry. Instead, Dr. Tyson was fooled by his faith in the accuracy of his own memory. In his post-9/11 speech, Mr. Bush actually said, “The enemy of America is not our many Muslim friends,” and he said nothing about the stars. Mr. Bush had indeed once said something like what Dr. Tyson remembered; in 2003 Mr. Bush said, in tribute to the astronauts lost in the Columbia space shuttle explosion, that “the same creator who names the stars also knows the names of the seven souls we mourn today.” Critics pointed these facts out; some accused Dr. Tyson of lying and argued that the episode should call into question his reliability as a scientist and a public advocate. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20387 - Posted: 12.03.2014

By David Z. Hambrick If you’ve spent more than about 5 minutes surfing the web, listening to the radio, or watching TV in the past few years, you will know that cognitive training—better known as “brain training”—is one of the hottest new trends in self-improvement. Lumosity, which offers web-based tasks designed to improve cognitive abilities such as memory and attention, boasts 50 million subscribers and advertises on National Public Radio. Cogmed claims to be “a computer-based solution for attention problems caused by poor working memory,” and BrainHQ will help you “make the most of your unique brain.” The promise of all of these products, implied or explicit, is that brain training can make you smarter—and make your life better. Yet, according to a statement released by the Stanford University Center on Longevity and the Berlin Max Planck Institute for Human Development, there is no solid scientific evidence to back up this promise. Signed by 70 of the world’s leading cognitive psychologists and neuroscientists, the statement minces no words: "The strong consensus of this group is that the scientific literature does not support claims that the use of software-based “brain games” alters neural functioning in ways that improve general cognitive performance in everyday life, or prevent cognitive slowing and brain disease." The statement also cautions that although some brain training companies “present lists of credentialed scientific consultants and keep registries of scientific studies pertinent to cognitive training…the cited research is [often] only tangentially related to the scientific claims of the company, and to the games they sell.” © 2014 Scientific American.

Keyword: Learning & Memory
Link ID: 20380 - Posted: 12.02.2014

By Nicholas Bakalar Researchers have found that people diagnosed with diabetes in their 50s are significantly more likely than others to suffer mental decline by their 70s. The study, published Monday in the Annals of Internal Medicine, started in 1990. Scientists examined 13,351 black and white adults, aged 48 to 67, for diabetes and prediabetes using self-reported physician diagnoses and glucose control tests. They also administered widely used tests of memory, reasoning, problem solving and planning. About 13 percent had diabetes at the start. The researchers followed them with five periodic examinations over the following 20 years. By that time, 5,987 participants were still enrolled. After adjusting for numerous health and behavioral factors, and for the large attrition in the study, the researchers found people with diabetes suffered a 30 percent larger decline in mental acuity than those without the disease. Diabetes can impair blood circulation, and the authors suggest that the association of diabetes with thinking and memory problems may be the result of damage to small blood vessels in the brain. “People may think cognitive decline with age is inevitable, but it’s not,” said the senior author, Elizabeth Selvin, an associate professor of epidemiology at the Johns Hopkins Bloomberg School of Public Health. “Factors like diabetes are potentially modifiable. If we can better control diabetes, we can stave off cognitive decline and future dementia.” © 2014 The New York Times Company

Keyword: Obesity; Learning & Memory
Link ID: 20377 - Posted: 12.02.2014

by Andy Coghlan What would Stuart Little make of it? Mice have been created whose brains are half human. As a result, the animals are smarter than their siblings. The idea is not to mimic fiction, but to advance our understanding of human brain diseases by studying them in whole mouse brains rather than in dishes. The altered mice still have mouse neurons – the "thinking" cells that make up around half of all their brain cells. But practically all the glial cells in their brains, the ones that support the neurons, are human. "It's still a mouse brain, not a human brain," says Steve Goldman of the University of Rochester Medical Center in New York. "But all the non-neuronal cells are human." Goldman's team extracted immature glial cells from donated human fetuses. They injected them into mouse pups where they developed into astrocytes, a star-shaped type of glial cell. Within a year, the mouse glial cells had been completely usurped by the human interlopers. The 300,000 human cells each mouse received multiplied until they numbered 12 million, displacing the native cells. "We could see the human cells taking over the whole space," says Goldman. "It seemed like the mouse counterparts were fleeing to the margins." Astrocytes are vital for conscious thought, because they help to strengthen the connections between neurons, called synapses. Their tendrils are involved in coordinating the transmission of electrical signals across synapses. © Copyright Reed Business Information Ltd.

Keyword: Learning & Memory; Glia
Link ID: 20375 - Posted: 12.01.2014

By BENEDICT CAREY Quick: Which American president served before slavery ended, John Tyler or Rutherford B. Hayes? If you need Google to get the answer, you are not alone. (It is Tyler.) Collective cultural memory — for presidents, for example — works according to the same laws as the individual kind, at least when it comes to recalling historical names and remembering them in a given order, researchers reported on Thursday. The findings suggest that leaders who are well known today, like the elder President George Bush and President Bill Clinton, will be all but lost to public memory in just a few decades. The particulars from the new study, which tested Americans’ ability to recollect the names of past presidents, are hardly jaw-dropping: People tend to recall best the presidents who served recently, as well as the first few in the country’s history. They also remember those who navigated historic events, like the ending of slavery (Abraham Lincoln) and World War II (Franklin D. Roosevelt). But the broader significance of the report — the first to measure forgetfulness over a 40-year period, using a constant list — is that societies collectively forget according to the same formula as, say, a student who has studied a list of words. Culture imitates biology, even though the two systems work in vastly different ways. The new paper was published in the journal Science. “It’s an exciting study, because it mixes history and psychology and finds this one-on-one correspondence” in the way memory functions, said David C. Rubin, a psychologist at Duke University who was not involved in the research. The report is based on four surveys by psychologists now at Washington University in St. Louis, conducted from 1974 to 2014. In the first three, in 1974, 1991 and 2009, Henry L. Roediger III gave college students five minutes to write down as many presidents as they could remember, in order. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20364 - Posted: 11.29.2014

By Linda Searing

THE QUESTION Keeping your brain active by working is widely believed to protect memory and thinking skills as we age. Does the type of work matter?

THIS STUDY involved 1,066 people who, at an average age of 70, took a battery of tests to measure memory, processing speed and cognitive ability. The jobs they had held were rated by the complexity of dealings with people, data and things. Those whose main jobs required complex work, especially in dealings with people — such as social workers, teachers, managers, graphic designers and musicians — had higher cognitive scores than those who had held jobs requiring less-complex dealings, such as construction workers, food servers and painters. Overall, more-complex occupations were tied to higher cognitive scores, regardless of someone’s IQ, education or environment.

WHO MAY BE AFFECTED? Older adults. Cognitive abilities change with age, so it can take longer to recall information or remember where you placed your keys. That is normal and not the same thing as dementia, which involves severe memory loss as well as declining ability to function day to day. Commonly suggested ways to maintain memory and thinking skills include staying socially active, eating healthfully and getting adequate sleep as well as such things as doing crossword puzzles, learning to play a musical instrument and taking varied routes to common destinations when driving.

Keyword: Learning & Memory
Link ID: 20359 - Posted: 11.26.2014

By Gary Stix One area of brain science that has drawn intense interest in recent years is the study of what psychologists call reconsolidation—a ponderous technical term that, once translated, means giving yourself a second chance. Memories of our daily experience are formed, often during sleep, by inscribing—or “consolidating”—a record of what happened into neural tissue. Joy at the birth of a child or terror in response to a violent personal assault. A bad memory, once fixed, may replay again and again, turning toxic and all-consuming. For the traumatized, the desire to forget becomes an impossible dream. Reconsolidation allows for a do-over by drawing attention to the emotional and factual content of traumatic experience. In the safety of a therapist’s office, the patient lets the demons return, and the goal is then to reshape karma to form a new, more benign memory. The details remain the same, but the power of the old terror to overwhelm and induce psychic paralysis begins to subside. The clinician would say that the memory has undergone a change in “valence”—from negative to neutral and detached. The trick to successful reconsolidation is to revive these memories without provoking panic and chaos that can only make things worse. Talk therapies and psycho-pharm may not be enough. One new idea just starting to be explored is the toning down of memories while a patient is fast asleep. © 2014 Scientific American.

Keyword: Learning & Memory; Emotions
Link ID: 20357 - Posted: 11.25.2014

By MAX BEARAK MUMBAI, India — The young man sat cross-legged atop a cushioned divan on an ornately decorated stage, surrounded by other Jain monks draped in white cloth. His lip occasionally twitched, his hands lay limp in his lap, and for the most part his eyes were closed. An announcer repeatedly chastised the crowd for making even the slightest noise. From daybreak until midafternoon, members of the audience approached the stage, one at a time, to show the young monk a random object, pose a math problem, or speak a word or phrase in one of at least six different languages. He absorbed the miscellany silently, letting it slide into his mind, as onlookers in their seats jotted everything down on paper. After six hours, the 500th and last item was uttered — it was the number 100,008. An anxious hush descended over the crowd. And the monk opened his eyes and calmly recalled all 500 items, in order, detouring only once to fill in a blank he had momentarily set aside. When he was done, and the note-keepers in the audience had confirmed his achievement, the tense atmosphere dissolved and the announcer led the crowd in a series of triumphant chants. The opportunity to witness the feat of memory drew a capacity crowd of 6,000 to the Sardar Vallabhbhai Patel stadium in Mumbai on Sunday. The exhibition was part of a campaign to encourage schoolchildren to use meditation to build brainpower, as Jain monks have done for centuries in India, a country drawn both toward ancient religious practices and more recent ambitions. But even by Jain standards, the young monk — Munishri Ajitchandrasagarji, 24 — is something special. His guru, P. P. Acharya Nayachandrasagarji, said no other monk in many years had come close to his ability. © 2014 The New York Times Company

Keyword: Learning & Memory; Attention
Link ID: 20334 - Posted: 11.20.2014

By Gretchen Reynolds Exercise seems to be good for the human brain, with many recent studies suggesting that regular exercise improves memory and thinking skills. But an interesting new study asks whether the apparent cognitive benefits from exercise are real or just a placebo effect — that is, if we think we will be “smarter” after exercise, do our brains respond accordingly? The answer has significant implications for any of us hoping to use exercise to keep our minds sharp throughout our lives. In experimental science, the best, most reliable studies randomly divide participants into two groups, one of which receives the drug or other treatment being studied and the other of which is given a placebo, similar in appearance to the drug, but not containing the active ingredient. Placebos are important, because they help scientists to control for people’s expectations. If people believe that a drug, for example, will lead to certain outcomes, their bodies may produce those results, even if the volunteers are taking a look-alike dummy pill. That’s the placebo effect, and its occurrence suggests that the drug or procedure under consideration isn’t as effective as it might seem to be; some of the work is being done by people’s expectations, not by the medicine. Recently, some scientists have begun to question whether the apparently beneficial effects of exercise on thinking might be a placebo effect. While many studies suggest that exercise may have cognitive benefits, those experiments all have had a notable scientific limitation: They have not used placebos. This issue is not some abstruse scientific debate. If the cognitive benefits from exercise are a result of a placebo effect rather than of actual changes in the brain because of the exercise, then those benefits could be ephemeral and unable in the long term to help us remember how to spell ephemeral. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20329 - Posted: 11.20.2014

By NICK BILTON Ebola sounds like the stuff of nightmares. Bird flu and SARS also send shivers down my spine. But I’ll tell you what scares me most: artificial intelligence. The first three, with enough resources, humans can stop. The last, which humans are creating, could soon become unstoppable. Before we get into what could possibly go wrong, let me first explain what artificial intelligence is. Actually, skip that. I’ll let someone else explain it: Grab an iPhone and ask Siri about the weather or stocks. Or tell her “I’m drunk.” Her answers are artificially intelligent. Right now these artificially intelligent machines are pretty cute and innocent, but as they are given more power in society, these machines may not take long to spiral out of control. In the beginning, the glitches will be small but eventful. Maybe a rogue computer momentarily derails the stock market, causing billions in damage. Or a driverless car freezes on the highway because a software update goes awry. But the upheavals can escalate quickly and become scarier and even cataclysmic. Imagine how a medical robot, originally programmed to rid cancer, could conclude that the best way to obliterate cancer is to exterminate humans who are genetically prone to the disease. Nick Bostrom, author of the book “Superintelligence,” lays out a number of petrifying doomsday settings. One envisions self-replicating nanobots, which are microscopic robots designed to make copies of themselves. In a positive situation, these bots could fight diseases in the human body or eat radioactive material on the planet. But, Mr. Bostrom says, a “person of malicious intent in possession of this technology might cause the extinction of intelligent life on Earth.” © 2014 The New York Times Company

Keyword: Robotics; Intelligence
Link ID: 20283 - Posted: 11.06.2014

By James Gallagher Health editor, BBC News website Working antisocial hours can prematurely age the brain and dull intellectual ability, scientists warn. Their study, in the journal Occupational and Environmental Medicine, suggested a decade of shifts aged the brain by more than six years. There was some recovery after people stopped working antisocial shifts, but it took five years to return to normal. Experts say the findings could be important in dementia, as many patients have disrupted sleep. The body's internal clock is designed for us to be active in the day and asleep at night. The damaging effects on the body of working against the body clock, from breast cancer to obesity, are well known. Now a team at the University of Swansea and the University of Toulouse has shown an impact on the mind as well. Three thousand people in France performed tests of memory, speed of thought and wider cognitive ability. The brain naturally declines as we age, but the researchers said working antisocial shifts accelerated the process. Those with more than 10 years of shift work under their belts had the same results as someone six and a half years older. The good news is that when people in the study quit shift work, their brains did recover, even if it took five years. Dr Philip Tucker, part of the research team in Swansea, told the BBC: "It was quite a substantial decline in brain function. It is likely that when people are trying to undertake complex cognitive tasks they might make more mistakes and slip-ups; maybe one in 100 makes a mistake with a very large consequence, but it's hard to say how big a difference it would make in day-to-day life." BBC © 2014

Keyword: Biological Rhythms; Sleep
Link ID: 20281 - Posted: 11.05.2014

By RICHARD A. FRIEDMAN ATTENTION deficit hyperactivity disorder is now the most prevalent psychiatric illness of young people in America, affecting 11 percent of them at some point between the ages of 4 and 17. The rates of both diagnosis and treatment have increased so much in the past decade that you may wonder whether something that affects so many people can really be a disease. And for a good reason. Recent neuroscience research shows that people with A.D.H.D. are actually hard-wired for novelty-seeking — a trait that had, until relatively recently, a distinct evolutionary advantage. Compared with the rest of us, they have sluggish and underfed brain reward circuits, so much of everyday life feels routine and understimulating. To compensate, they are drawn to new and exciting experiences and get famously impatient and restless with the regimented structure that characterizes our modern world. In short, people with A.D.H.D. may not have a disease, so much as a set of behavioral traits that don’t match the expectations of our contemporary culture. From the standpoint of teachers, parents and the world at large, the problem with people with A.D.H.D. looks like a lack of focus and attention and impulsive behavior. But if you have the “illness,” the real problem is that, to your brain, the world that you live in essentially feels not very interesting. One of my patients, a young woman in her early 20s, is prototypical. “I’ve been on Adderall for years to help me focus,” she told me at our first meeting. Before taking Adderall, she found sitting in lectures unendurable and would lose her concentration within minutes. Like many people with A.D.H.D., she hankered for exciting and varied experiences and also resorted to alcohol to relieve boredom. But when something was new and stimulating, she had laserlike focus. I knew that she loved painting and asked her how long she could maintain her interest in her art. “No problem. I can paint for hours at a stretch.” Rewards like sex, money, drugs and novel situations all cause the release of dopamine in the reward circuit of the brain, a region buried deep beneath the cortex. Aside from generating a sense of pleasure, this dopamine signal tells your brain something like, “Pay attention, this is an important experience that is worth remembering.” © 2014 The New York Times Company

Keyword: ADHD; Learning & Memory
Link ID: 20272 - Posted: 11.03.2014

Maanvi Singh How does a sunset work? We love to look at one, but Jolanda Blackwell wanted her eighth-graders to really think about it, to wonder and question. So Blackwell, who teaches science at Oliver Wendell Holmes Junior High in Davis, Calif., had her students watch a video of a sunset on YouTube as part of a physics lesson on motion. "I asked them: 'So what's moving? And why?' " Blackwell says. The students had a lot of ideas. Some thought the sun was moving; others, of course, knew that a sunset is the result of the Earth spinning around on its axis. Once she got the discussion going, the questions came rapid-fire. "My biggest challenge usually is trying to keep them patient," she says. "They just have so many burning questions." Students asking questions and then exploring the answers. That's something any good teacher lives for. And at the heart of it all is curiosity. Blackwell, like many others teachers, understands that when kids are curious, they're much more likely to stay engaged. But why? What, exactly, is curiosity and how does it work? A study published in the October issue of the journal Neuron suggests that the brain's chemistry changes when we become curious, helping us better learn and retain information. © 2014 NPR

Keyword: Learning & Memory; Attention
Link ID: 20271 - Posted: 11.03.2014

By Eric Niiler Has our reliance on iPhones and other instant-info devices harmed our memories? Michael Kahana, a University of Pennsylvania psychology professor who studies memory, says maybe: “We don’t know what the long-lasting impact of this technology will be on our brains and our ability to recall.” Kahana, 45, who has spent the past 20 years looking at how the brain creates memories, is leading an ambitious four-year Pentagon project to build a prosthetic memory device that can be implanted into human brains to help veterans with traumatic brain injuries. He spoke by telephone with The Post about what we can do to preserve or improve memory. Practicing the use of your memory is helpful. The other thing which I find helpful is sleep, which I don’t get enough of. As a general principle, skills that one continues to practice are skills that one will maintain in the face of age-related changes in cognition. [As for all those brain games available], I am not aware of any convincing data that mental exercises have a more general effect other than maintaining the skills for those exercises. I think the jury is out on that. If you practice doing crossword puzzles, you will preserve your ability to do crossword puzzles. If you practice any other cognitive skill, you will get better at that as well. Kahana once could name every student in a class of 100. Now, he says, “I find it too difficult even with a class of 20.”

Keyword: Learning & Memory
Link ID: 20249 - Posted: 10.28.2014

By PAM BELLUCK Science edged closer on Sunday to showing that an antioxidant in chocolate appears to improve some memory skills that people lose with age. In a small study in the journal Nature Neuroscience, healthy people, ages 50 to 69, who drank a mixture high in antioxidants called cocoa flavanols for three months performed better on a memory test than people who drank a low-flavanol mixture. On average, the improvement of high-flavanol drinkers meant they performed like people two to three decades younger on the study’s memory task, said Dr. Scott A. Small, a neurologist at Columbia University Medical Center and the study’s senior author. They performed about 25 percent better than the low-flavanol group. “An exciting result,” said Craig Stark, a neurobiologist at the University of California, Irvine, who was not involved in the research. “It’s an initial study, and I sort of view this as the opening salvo.” He added, “And look, it’s chocolate. Who’s going to complain about chocolate?” The findings support recent research linking flavanols, especially epicatechin, to improved blood circulation, heart health and memory in mice, snails and humans. But experts said the new study, although involving only 37 participants and partly funded by Mars Inc., the chocolate company, goes further and was a well-controlled, randomized trial led by experienced researchers. Besides improvements on the memory test — a pattern recognition test involving the kind of skill used in remembering where you parked the car or recalling the face of someone you just met — researchers found increased function in an area of the brain’s hippocampus called the dentate gyrus, which has been linked to this type of memory. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20246 - Posted: 10.27.2014

By Gary Stix Scott Small, a professor of neurology at Columbia University’s College of Physicians and Surgeons, researches Alzheimer’s, but he also studies the memory loss that occurs during the normal aging process. Research on the commonplace “senior moments” focuses on the hippocampus, an area of the brain involved with formation of new memories. In particular, one area of the hippocampus, the dentate gyrus, which helps distinguish one object from another, has lured researchers on age-related memory problems. In a study by Small and colleagues published Oct. 26 in Nature Neuroscience, naturally occurring chemicals in cocoa increased dentate gyrus blood flow. Psychological testing showed that the pattern recognition abilities of a typical 60-year-old on a high dose of the cocoa phytochemicals in the 37-person study matched those of a 30- or 40-year-old after three months. The study received support from the food company Mars, but Small cautions against going out to gorge on Snickers Bars, as most of the beneficial chemicals, or flavanols, are removed when processing cocoa. An edited transcript of an interview with Small follows.

Can you explain what you found in your study?

The main motive of the study was to causally establish an anatomical source of age-related memory loss. A number of labs have shown in the last 10 years that there’s one area of the brain called the dentate gyrus that is linked to the aging process. But no one has tested that concept. Until now the observations have been correlational. There is decreased function in that region and, to prove causation, we were trying to see if we could reverse that. © 2014 Scientific American

Keyword: Learning & Memory
Link ID: 20245 - Posted: 10.27.2014

by Neurobonkers A paper published in Nature Reviews Neuroscience last week addressed the prevalence of neuromyths among educators. The paper has been widely reported, but the lion's share of the coverage glossed over the impact that neuromyths have had in the real world. Your first thought after reading the neuromyths in the table below — which were widely believed by teachers — may well be, "so what?" It is true that some of the false beliefs are relatively harmless. For example, encouraging children to drink a little more water might perhaps result in the consumption of less sugary drinks. This may do little if anything to reduce hyperactivity but could encourage a more nutritious diet, which might help with problems such as Type II diabetes. So, what's the harm? The paper addressed a number of areas where neuromyths have had real-world impacts on educators and policymakers, which may have had a negative effect on the provision of education. The graph above, reprinted in the Nature Reviews Neuroscience paper, has been included as empirical data in educational policy documents to provide evidence for an "allegedly scientific argument for withdrawing public funding of university education." The problem? The data is made up. The graph is in fact a model that is based on the false assumption that investment before the age of three will have many times the benefit of investment made in education later in life. The myth of three — the belief that there is a critical window to educate children before the age of three, after which point the trajectory is fixed — is one of the most persistent neuromyths. Viewed on another level, while some might say investment in early education can never be a bad thing, consider the implication that the potential of a child is fixed at such an early point in their life, when in reality their journey has just begun. © Copyright 2014, The Big Think, Inc

Keyword: Development of the Brain; Learning & Memory
Link ID: 20239 - Posted: 10.25.2014

By CLIVE THOMPSON “You just crashed a little bit,” Adam Gazzaley said. It was true: I’d slammed my rocket-powered surfboard into an icy riverbank. This was at Gazzaley’s San Francisco lab, in a nook cluttered with multicolored skullcaps and wires that hooked up to an E.E.G. machine. The video game I was playing wasn’t the sort typically pitched at kids or even middle-aged, Gen X gamers. Indeed, its intended users include people over 60 — because the game might just help fend off the mental decline that accompanies aging. It was awfully hard to play, even for my Call of Duty-toughened brain. Project: Evo, as the game is called, was designed to tax several mental abilities at once. As I maneuvered the surfboard down winding river pathways, I was supposed to avoid hitting the sides, which required what Gazzaley said was “visual-motor tracking.” But I also had to watch out for targets: I was tasked with tapping the screen whenever a red fish jumped out of the water. The game increased in difficulty as I improved, making the river twistier and obliging me to remember turns I’d taken. (These were “working-memory challenges.”) Soon the targets became more confusing — I was trying to tap blue birds and green fish, but the game faked me out by mixing in green birds and blue fish. This was testing my “selective attention,” or how quickly I could assess a situation and react to it. The company behind Project: Evo is now seeking approval from the Food and Drug Administration for the game. If it gets that government stamp, it might become a sort of cognitive Lipitor or Viagra, a game that your doctor can prescribe for your aging mind. After only two minutes of play, I was making all manner of mistakes, stabbing frantically at the wrong fish as the game sped up. “It’s hard,” Gazzaley said, smiling broadly as he took back the iPad I was playing on. “It’s meant to really push it.” “Brain training” games like Project: Evo have become big business, with Americans spending an estimated $1.3 billion a year on them. They are also a source of controversy. © 2014 The New York Times Company

Keyword: Alzheimers; Learning & Memory
Link ID: 20238 - Posted: 10.23.2014

By Emily Underwood Aging baby boomers and seniors would be better off going for a hike than sitting down in front of one of the many video games designed to aid the brain, a group of nearly 70 researchers asserted this week in a critique of some of the claims made by the brain-training industry. With yearly subscriptions running as much as $120, an expanding panoply of commercial brain games promises to improve memory, processing speed, and problem-solving, and even, in some cases, to stave off Alzheimer’s disease. Many companies, such as Lumosity and Cogmed, describe their games as backed by solid scientific evidence and prominently note that neuroscientists at top universities and research centers helped design the programs. But the cited research is often “only tangentially related to the scientific claims of the company, and to the games they sell,” according to the statement released Monday by the Stanford Center on Longevity in Palo Alto, California, and the Max Planck Institute for Human Development in Berlin. Although the letter, whose signatories include many researchers outside those two organizations, doesn’t point to specific bad actors, it concludes that there is “little evidence that playing brain games improves underlying broad cognitive abilities, or that it enables one to better navigate a complex realm of everyday life.” A similar statement of concern was published in 2008 with a smaller number of signatories, says Ulman Lindenberger of the Max Planck Institute for Human Development, who helped organize both letters. Although Lindenberger says there was no particular trigger for the current statement, he calls it the “expression of a growing collective concern among a large number of cognitive psychologists and neuroscientists who study human cognitive aging.” © 2014 American Association for the Advancement of Science

Keyword: Alzheimers; Learning & Memory
Link ID: 20237 - Posted: 10.23.2014