Chapter 13. Memory, Learning, and Development
Sara Reardon

Delivering medications to the brain could become easier, thanks to molecules that can escort drugs through the notoriously impervious sheath that separates blood vessels from neurons. In a proof-of-concept study in monkeys, biologists used the system to reduce levels of the protein amyloid-β, which accumulates in the brain plaques associated with Alzheimer's disease [1].

The blood–brain barrier is a layer of cells lining the inner surface of the capillaries that feed the central nervous system. It is nature's way of protecting the delicate brain from infectious agents and toxic compounds, while letting nutrients and oxygen in and waste products out. Because the barrier strictly regulates the passage of larger molecules and often prevents drug molecules from entering the brain, it has long posed one of the most difficult challenges in developing treatments for brain disorders.

Several approaches to bypassing the barrier are being tested, including nanoparticles that are small enough to pass through the barrier's cellular membranes and deliver their payload; catheters that carry a drug directly into the brain; and ultrasound pulses that push microbubbles through the barrier. But no approach has yet found broad therapeutic application.

Neurobiologist Ryan Watts and his colleagues at the biotechnology company Genentech in South San Francisco have sought to break through the barrier by exploiting transferrin, a protein that sits on the surface of blood vessels and carries iron into the brain. The team created an antibody with two ends. One end binds loosely to transferrin and uses the protein to transport itself into the brain. Once the antibody is inside, its other end targets an enzyme called β-secretase 1 (BACE1), which produces amyloid-β. Crucially, the antibody binds more tightly to BACE1 than to transferrin, and this pulls it off the blood vessel and into the brain. It locks BACE1 shut and prevents it from making amyloid-β.
© 2014 Nature Publishing Group
Link ID: 20286 - Posted: 11.06.2014
By Lindsey Konkel and Environmental Health News

New York City children exposed in the womb to high levels of pollutants in vehicle exhaust had a five times higher risk of attention problems at age 9, according to research by Columbia University scientists published Wednesday. The study adds to earlier evidence that mothers' exposures to polycyclic aromatic hydrocarbons (PAHs), which are emitted by the burning of fossil fuels and other organic materials, are linked to children's behavioral problems associated with Attention Deficit Hyperactivity Disorder (ADHD).

“Our research suggests that environmental factors may be contributing to attention problems in a significant way,” said Frederica Perera, an environmental health scientist at Columbia’s Mailman School of Public Health who was the study's lead author.

About one in 10 U.S. kids is diagnosed with ADHD, according to the Centers for Disease Control and Prevention. Children with ADHD are at greater risk of poor academic performance, risky behaviors and lower earnings in adulthood, the researchers wrote.

“Air pollution has been linked to adverse effects on attention span, behavior and cognitive functioning in research from around the globe. There is little question that air pollutants may pose a variety of potential health risks to children of all ages, possibly beginning in the womb,” said Dr. Andrew Adesman, chief of developmental and behavioral pediatrics at Steven & Alexandra Cohen Children’s Medical Center of New York. He did not participate in the new study.

© 2014 Scientific American
By NICK BILTON Ebola sounds like the stuff of nightmares. Bird flu and SARS also send shivers down my spine. But I’ll tell you what scares me most: artificial intelligence. The first three, with enough resources, humans can stop. The last, which humans are creating, could soon become unstoppable. Before we get into what could possibly go wrong, let me first explain what artificial intelligence is. Actually, skip that. I’ll let someone else explain it: Grab an iPhone and ask Siri about the weather or stocks. Or tell her “I’m drunk.” Her answers are artificially intelligent. Right now these artificially intelligent machines are pretty cute and innocent, but as they are given more power in society, these machines may not take long to spiral out of control. In the beginning, the glitches will be small but eventful. Maybe a rogue computer momentarily derails the stock market, causing billions in damage. Or a driverless car freezes on the highway because a software update goes awry. But the upheavals can escalate quickly and become scarier and even cataclysmic. Imagine how a medical robot, originally programmed to rid cancer, could conclude that the best way to obliterate cancer is to exterminate humans who are genetically prone to the disease. Nick Bostrom, author of the book “Superintelligence,” lays out a number of petrifying doomsday settings. One envisions self-replicating nanobots, which are microscopic robots designed to make copies of themselves. In a positive situation, these bots could fight diseases in the human body or eat radioactive material on the planet. But, Mr. Bostrom says, a “person of malicious intent in possession of this technology might cause the extinction of intelligent life on Earth.” © 2014 The New York Times Company
By James Gallagher, Health editor, BBC News website

Working antisocial hours can prematurely age the brain and dull intellectual ability, scientists warn. Their study, in the journal Occupational and Environmental Medicine, suggested a decade of shifts aged the brain by more than six years. There was some recovery after people stopped working antisocial shifts, but it took five years to return to normal. Experts say the findings could be important in dementia, as many patients have disrupted sleep.

The body's internal clock is designed for us to be active in the day and asleep at night. The damaging effects on the body of working against the body clock, from breast cancer to obesity, are well known. Now a team at the University of Swansea and the University of Toulouse has shown an impact on the mind as well.

Three thousand people in France performed tests of memory, speed of thought and wider cognitive ability. The brain naturally declines as we age, but the researchers said working antisocial shifts accelerated the process. Those with more than 10 years of shift work under their belts had the same results as someone six and a half years older. The good news is that when people in the study quit shift work, their brains did recover, even if it took five years.

Dr Philip Tucker, part of the research team in Swansea, told the BBC: "It was quite a substantial decline in brain function. It is likely that when people are trying to undertake complex cognitive tasks they might make more mistakes and slip-ups. Maybe one in 100 makes a mistake with a very large consequence, but it's hard to say how big a difference it would make in day-to-day life."

BBC © 2014
Kate Baggaley

Much of the increase in autism diagnoses in recent decades may be tied to changes in how the condition is reported. Sixty percent of the increase in autism cases in Denmark can be explained by these changes, scientists report November 3 in JAMA Pediatrics.

The researchers followed all 677,915 people born in Denmark from 1980 through 1991, monitoring them from birth through the end of 2011. Among children born in this period, diagnoses increased fivefold, until 1 percent of children born in the early 1990s were diagnosed with autism by age 20.

During these decades, Denmark experienced two changes in the way autism is reported. In 1994, the criteria physicians rely on to diagnose autism were updated in both the International Classification of Diseases manual used by Denmark and in its American counterpart, the Diagnostic and Statistical Manual of Mental Disorders. Then in 1995, the Danish Psychiatric Register began reporting diagnoses where doctors had only outpatient contact with children, in addition to cases where autism was diagnosed after children had been kept overnight.

The researchers estimated Danish children’s likelihood of being diagnosed with autism before and after the two reporting changes. These changes accounted for 60 percent of the increase in diagnoses.

© Society for Science & the Public 2000 - 2014.
Link ID: 20280 - Posted: 11.05.2014
By SINDYA N. BHANOO

BERKELEY, CALIF. — Lilith Sadil, 12, climbs into an examination chair here at the Myopia Control Center at the University of California. “Do you know why you are here?” asks Dr. Maria Liu, an optometrist.

“Because my eyes are changing fast,” Lilith says.

“Do you read a lot?” Dr. Liu asks.

“Yes.”

“Do you use the computer a lot?”

“Yes.”

Lilith is an active child who practices taekwondo. But like an increasing number of children, she has myopia — she can see close up but not farther away. Her mother, Jinnie Sadil, has brought her to the center because she has heard about a new treatment that could help. Eye specialists are offering young patients special contact lenses worn overnight that correct vision for the next day.

Myopia has become something of a minor epidemic: More than 40 percent of Americans are nearsighted, a 16 percent increase since the 1970s. People with so-called high myopia — generally, blurry vision beyond about five inches — face an increased likelihood of developing cataracts and glaucoma, and are at higher risk for retinal detachments that can result in blindness.

Exactly what is causing the nationwide rise in nearsightedness is not known. “It can’t be entirely genetic, because genes don’t change that fast,” said Susan Vitale, an epidemiologist at the National Institutes of Health who studies myopia. “It’s probably something that’s environmental, or a combination of genetic and environmental factors.”

Some research indicates that “near work” — reading, computer work, playing video games, and using tablets and smartphones — is contributing to the increase. A recent study found that the more educated a person is, the more likely he or she will be nearsighted. A number of other studies show that children who spend time outdoors are less likely to develop high myopia. But no one is certain whether the eye benefits from ultraviolet light or whether time outside simply means time away from near work.

© 2014 The New York Times Company
Joan Raymond, TODAY contributor

It’s well established that baby talk plays a huge role in helping the wee widdle babies learn to tawk. And — no surprise — moms talk more to babies than dads do. But it seems that the baby's sex plays a role, too: Moms may be talking more to their infant daughters than to their sons during the early weeks and months of a child’s life.

In a new study published Monday in the online edition of Pediatrics, researchers looked at the language interactions between 33 late preterm and term infants and their parents by capturing 3,000 hours of recordings. Somewhat surprisingly, the researchers found that moms interacted vocally more with infant daughters than with sons, both at birth and at 44 weeks post-menstrual age (equivalent to 1 month old). Male adults responded more frequently to infant boys than infant girls, but the difference did not reach statistical significance, say the researchers.

“We wanted to look more at gender and factors that affect these essentially mini-conversations that parents have with infants,” says lead author and neonatologist Dr. Betty Vohr, director of the Neonatal Follow-Up Program at Women & Infants Hospital of Rhode Island. “Infants are primed to vocalize and have reciprocal interactions.”
By RICHARD A. FRIEDMAN

ATTENTION deficit hyperactivity disorder is now the most prevalent psychiatric illness of young people in America, affecting 11 percent of them at some point between the ages of 4 and 17. The rates of both diagnosis and treatment have increased so much in the past decade that you may wonder whether something that affects so many people can really be a disease. And for a good reason.

Recent neuroscience research shows that people with A.D.H.D. are actually hard-wired for novelty-seeking — a trait that had, until relatively recently, a distinct evolutionary advantage. Compared with the rest of us, they have sluggish and underfed brain reward circuits, so much of everyday life feels routine and understimulating. To compensate, they are drawn to new and exciting experiences and get famously impatient and restless with the regimented structure that characterizes our modern world. In short, people with A.D.H.D. may not have a disease, so much as a set of behavioral traits that don’t match the expectations of our contemporary culture.

From the standpoint of teachers, parents and the world at large, the problem with people with A.D.H.D. looks like a lack of focus and attention and impulsive behavior. But if you have the “illness,” the real problem is that, to your brain, the world that you live in essentially feels not very interesting.

One of my patients, a young woman in her early 20s, is prototypical. “I’ve been on Adderall for years to help me focus,” she told me at our first meeting. Before taking Adderall, she found sitting in lectures unendurable and would lose her concentration within minutes. Like many people with A.D.H.D., she hankered for exciting and varied experiences and also resorted to alcohol to relieve boredom. But when something was new and stimulating, she had laserlike focus. I knew that she loved painting and asked her how long she could maintain her interest in her art. “No problem. I can paint for hours at a stretch.”

Rewards like sex, money, drugs and novel situations all cause the release of dopamine in the reward circuit of the brain, a region buried deep beneath the cortex. Aside from generating a sense of pleasure, this dopamine signal tells your brain something like, “Pay attention, this is an important experience that is worth remembering.”

© 2014 The New York Times Company
Maanvi Singh

How does a sunset work? We love to look at one, but Jolanda Blackwell wanted her eighth-graders to really think about it, to wonder and question. So Blackwell, who teaches science at Oliver Wendell Holmes Junior High in Davis, Calif., had her students watch a video of a sunset on YouTube as part of a physics lesson on motion.

"I asked them: 'So what's moving? And why?' " Blackwell says. The students had a lot of ideas. Some thought the sun was moving; others, of course, knew that a sunset is the result of the Earth spinning around on its axis. Once she got the discussion going, the questions came rapid-fire. "My biggest challenge usually is trying to keep them patient," she says. "They just have so many burning questions."

Students asking questions and then exploring the answers. That's something any good teacher lives for. And at the heart of it all is curiosity. Blackwell, like many other teachers, understands that when kids are curious, they're much more likely to stay engaged. But why? What, exactly, is curiosity and how does it work? A study published in the October issue of the journal Neuron suggests that the brain's chemistry changes when we become curious, helping us better learn and retain information.

© 2014 NPR
By CATHERINE SAINT LOUIS

More than 50 children in 23 states have had mysterious episodes of paralysis of their arms or legs, according to data gathered by the Centers for Disease Control and Prevention. The cause is not known, although some doctors suspect the cases may be linked to infection with enterovirus 68, a respiratory virus that has sickened thousands of children in recent months.

Concerned by a cluster of cases in Colorado, the C.D.C. last month asked doctors and state health officials nationwide to begin compiling detailed reports about cases of unusual limb weakness in children. Experts convened by the agency plan next week to release interim guidelines on managing the condition. That so many children have had full or partial paralysis in a short period is unusual, but officials said that the cases seemed to be extremely rare. “At the moment, it looks like whatever the chances are of getting this syndrome are less than one in a million,” said Mark A. Pallansch, the director of the division of viral diseases at the C.D.C.

Some of the affected children have lost the use of a leg or an arm, and are having physical therapy to keep their muscles conditioned. Others have sustained more extensive damage and require help breathing.

Marie, who asked to be identified by her middle name to protect her family’s privacy, said her 4-year-old son used to climb jungle gyms. But in late September, after the whole family had been sick with a respiratory illness, he started having trouble climbing onto the couch. He walked into Boston Children’s Hospital the day he was admitted. But soon his neck grew so weak, it “flopped completely back like he was a newborn,” Marie said. Typically, the time from when weakness begins until it reaches its worst is one to three days. But for her son, eight mornings in a row, he awoke with a "brand new deficit" until he had some degree of weakness in each limb and had trouble breathing.
He was eventually transferred to a Spaulding rehabilitation center, where he is now. © 2014 The New York Times Company
By Virginia Morell Human fetuses are clever students, able to distinguish male from female voices and the voices of their mothers from those of strangers between 32 and 39 weeks after conception. Now, researchers have demonstrated that the embryos of the superb fairy-wren (Malurus cyaneus, pictured), an Australian songbird, also learn to discriminate among the calls they hear. The scientists played 1-minute recordings to 43 fairy-wren eggs collected from nests in the wild. The eggs were between days 9 and 13 of a 13- to 14-day incubation period. The sounds included white noise, a contact call of a winter wren, or a female fairy-wren’s incubation call. Those embryos that listened to the fairy-wrens’ incubation calls and the contact calls of the winter wrens lowered their heart rates, a sign that they were learning to discriminate between the calls of a different species and those of their own kind, the researchers report online today in the Proceedings of the Royal Society B. (None showed this response to the white noise.) Thus, even before hatching, these small birds’ brains are engaged in tasks requiring attention, learning, and possibly memory—the first time embryonic learning has been seen outside humans, the scientists say. The behavior is key because fairy-wren embryos must learn a password from their mothers’ incubation calls; otherwise, they’re less successful at soliciting food from their parents after hatching. © 2014 American Association for the Advancement of Science.
By Paula Span First, an acknowledgment: Insomnia bites. S. Bliss, a reader from Albuquerque, comments that even taking Ativan, he or she awakens at 4:30 a.m., can’t get back to sleep and suffers “a state of sleep deprivation and eventually a kind of walking exhaustion.” Molly from San Diego bemoans “confusion, anxiety, exhaustion, depression, loss of appetite, frankly a loss of will to go on,” all consequences of her sleeplessness. She memorably adds, “Give me Ambien or give me death.” Marciacornute reports that she’s turned to vodka (prompting another reader to wonder if Medicare will cover booze). After several rounds of similar laments here (and not only here; insomnia is prevalent among older adults), I found the results of a study by University of Chicago researchers particularly striking. What if people who report sleep problems are actually getting enough hours of sleep, overall? What if they’re not getting significantly less sleep than people who don’t complain of insomnia? Maybe there’s something else going on. It has always been difficult to ascertain how much people sleep; survey questions are unreliable (how can you tell when you’ve dozed off?), and wiring people with electrodes creates such an abnormal situation that the results may bear little resemblance to ordinary nightlife. Enter the actigraph, a wrist-motion monitor. “The machines have gotten better, smaller, less clunky and more reliable,” said Linda Waite, a sociologist and a co-author of the study. By having 727 older adults across the United States (average age: almost 72) wear actigraphs for three full days, Dr. Waite and her colleagues could tell when subjects were asleep and when they weren’t. Then they could compare their reported insomnia to their actual sleep patterns. Overall, in this random sample, taken from an ongoing national study of older adults, people didn’t appear sleep-deprived. They fell asleep at 10:27 p.m. on average, and awakened at 6:22 a.m. 
After subtracting wakeful periods during the night, they slept an average of seven and a quarter hours. But averages don’t tell us much, so let’s look more closely at their reported insomnia. “What was surprising to us is that there’s very little association between people’s specific sleep problems and what the actigraph shows,” Dr. Waite said.

© 2014 The New York Times Company
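The arithmetic behind these actigraph figures is simple interval accounting: time in bed minus wakeful episodes. A minimal sketch follows, using the bedtime and wake time reported above but a made-up 40-minute wakefulness figure (the study reports only the averages quoted here, so that number is an illustrative assumption):

```python
from datetime import datetime, timedelta

# Reported average bedtime and wake time (dates are arbitrary placeholders).
bedtime = datetime(2014, 1, 1, 22, 27)    # fell asleep 10:27 p.m.
waketime = datetime(2014, 1, 2, 6, 22)    # awakened 6:22 a.m.

time_in_bed = waketime - bedtime          # 7 h 55 min in bed

# Hypothetical actigraph-flagged nighttime wakefulness (not from the study).
wake_after_sleep_onset = timedelta(minutes=40)

total_sleep = time_in_bed - wake_after_sleep_onset
hours_asleep = total_sleep.total_seconds() / 3600
print(round(hours_asleep, 2))  # 7.25 — matches the reported average
```

With a 40-minute wakefulness assumption the figures line up exactly with the reported seven and a quarter hours, which is the point of the sketch: the averages alone do not look sleep-deprived.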
By Eric Niiler

Has our reliance on iPhones and other instant-info devices harmed our memories? Michael Kahana, a University of Pennsylvania psychology professor who studies memory, says maybe: “We don’t know what the long-lasting impact of this technology will be on our brains and our ability to recall.”

Kahana, 45, who has spent the past 20 years looking at how the brain creates memories, is leading an ambitious four-year Pentagon project to build a prosthetic memory device that can be implanted into human brains to help veterans with traumatic brain injuries. He spoke by telephone with The Post about what we can do to preserve or improve memory:

Practicing the use of your memory is helpful. The other thing which I find helpful is sleep, which I don’t get enough of. As a general principle, skills that one continues to practice are skills that one will maintain in the face of age-related changes in cognition. [As for all those brain games available], I am not aware of any convincing data that mental exercises have a more general effect other than maintaining the skills for those exercises. I think the jury is out on that. If you practice doing crossword puzzles, you will preserve your ability to do crossword puzzles. If you practice any other cognitive skill, you will get better at that as well.

Michael Kahana once could name every student in a class of 100. Now, says the University of Pennsylvania psychology professor who studies memory, “I find it too difficult even with a class of 20.”
Keyword: Learning & Memory
Link ID: 20249 - Posted: 10.28.2014
By PAM BELLUCK Science edged closer on Sunday to showing that an antioxidant in chocolate appears to improve some memory skills that people lose with age. In a small study in the journal Nature Neuroscience, healthy people, ages 50 to 69, who drank a mixture high in antioxidants called cocoa flavanols for three months performed better on a memory test than people who drank a low-flavanol mixture. On average, the improvement of high-flavanol drinkers meant they performed like people two to three decades younger on the study’s memory task, said Dr. Scott A. Small, a neurologist at Columbia University Medical Center and the study’s senior author. They performed about 25 percent better than the low-flavanol group. “An exciting result,” said Craig Stark, a neurobiologist at the University of California, Irvine, who was not involved in the research. “It’s an initial study, and I sort of view this as the opening salvo.” He added, “And look, it’s chocolate. Who’s going to complain about chocolate?” The findings support recent research linking flavanols, especially epicatechin, to improved blood circulation, heart health and memory in mice, snails and humans. But experts said the new study, although involving only 37 participants and partly funded by Mars Inc., the chocolate company, goes further and was a well-controlled, randomized trial led by experienced researchers. Besides improvements on the memory test — a pattern recognition test involving the kind of skill used in remembering where you parked the car or recalling the face of someone you just met — researchers found increased function in an area of the brain’s hippocampus called the dentate gyrus, which has been linked to this type of memory. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20246 - Posted: 10.27.2014
By Gary Stix

Scott Small, a professor of neurology at Columbia University’s College of Physicians and Surgeons, researches Alzheimer’s, but he also studies the memory loss that occurs during the normal aging process. Research on the commonplace “senior moments” focuses on the hippocampus, an area of the brain involved with formation of new memories. In particular, one area of the hippocampus, the dentate gyrus, which helps distinguish one object from another, has lured researchers on age-related memory problems.

In a study by Small and colleagues published Oct. 26 in Nature Neuroscience, naturally occurring chemicals in cocoa increased dentate gyrus blood flow. Psychological testing showed that the pattern recognition abilities of a typical 60-year-old on a high dose of the cocoa phytochemicals in the 37-person study matched those of a 30- or 40-year-old after three months. The study received support from the food company Mars, but Small cautions against going out to gorge on Snickers bars, as most of the beneficial chemicals, or flavanols, are removed when processing cocoa.

An edited transcript of an interview with Small follows:

Can you explain what you found in your study?

The main motive of the study was to causally establish an anatomical source of age-related memory loss. A number of labs have shown in the last 10 years that there’s one area of the brain called the dentate gyrus that is linked to the aging process. But no one has tested that concept. Until now the observations have been correlational. There is decreased function in that region and, to prove causation, we were trying to see if we could reverse that.

© 2014 Scientific American
Keyword: Learning & Memory
Link ID: 20245 - Posted: 10.27.2014
By Neuroskeptic

A new paper threatens to turn the world of autism neuroscience upside down. Its title is Anatomical Abnormalities in Autism?, and it claims that, well, there aren’t very many. Published in Cerebral Cortex by Israeli researchers Shlomi Haar and colleagues, the new research reports that there are virtually no differences in brain anatomy between people with autism and those without.

What makes Haar et al.’s essentially negative claims so powerful is that their study had a huge sample size: they included structural MRI scans from 539 people diagnosed with high-functioning autism spectrum disorder (ASD) and 573 controls. This makes the paper an order of magnitude bigger than a typical structural MRI anatomy study in this field. The age range was 6 to 35. The scans came from the public Autism Brain Imaging Data Exchange (ABIDE) database, a data-sharing initiative which pools scans from 18 different neuroimaging centers. Haar et al. examined the neuroanatomy of the cases and controls using the popular FreeSurfer software package.

What did they find? Well… not much. First off, the ASD group had no differences in overall brain size (intracranial volume). Nor were there any group differences in the volumes of most brain areas; the only significant finding here was an increased ventricle volume in the ASD group, but even this had a small effect size (d = 0.34). Enlarged ventricles are not specific to ASD by any means – the same thing has been reported in schizophrenia, dementia, and many other brain disorders.
by Neurobonkers

A paper published in Nature Reviews Neuroscience last week addressed the prevalence of neuromyths among educators. The paper has been widely reported, but the lion's share of the coverage glossed over the impact that neuromyths have had in the real world.

Your first thought after reading the neuromyths in the table below — which were widely believed by teachers — may well be, "so what?" It is true that some of the false beliefs are relatively harmless. For example, encouraging children to drink a little more water might perhaps result in the consumption of fewer sugary drinks. This may do little if anything to reduce hyperactivity but could encourage a more nutritious diet, which might have an impact on problems such as Type II diabetes. So, what's the harm?

The paper addressed a number of areas where neuromyths have had real-world impacts on educators and policymakers, and may have harmed the provision of education. The graph above, reprinted in the Nature Reviews Neuroscience paper, has been included as empirical data in educational policy documents to provide evidence for an "allegedly scientific argument for withdrawing public funding of university education." The problem? The data is made up. The graph is in fact a model based on the false assumption that investment before the age of three will have many times the benefit of investment made in education later in life.

The myth of three — the belief that there is a critical window to educate children before the age of three, after which point the trajectory is fixed — is one of the most persistent neuromyths. And while some might say investment in early education can never be a bad thing, consider the implication: that a child's potential is fixed at such an early point in life, when in reality the journey has just begun.

© Copyright 2014, The Big Think, Inc
By CLIVE THOMPSON “You just crashed a little bit,” Adam Gazzaley said. It was true: I’d slammed my rocket-powered surfboard into an icy riverbank. This was at Gazzaley’s San Francisco lab, in a nook cluttered with multicolored skullcaps and wires that hooked up to an E.E.G. machine. The video game I was playing wasn’t the sort typically pitched at kids or even middle-aged, Gen X gamers. Indeed, its intended users include people over 60 — because the game might just help fend off the mental decline that accompanies aging. It was awfully hard to play, even for my Call of Duty-toughened brain. Project: Evo, as the game is called, was designed to tax several mental abilities at once. As I maneuvered the surfboard down winding river pathways, I was supposed to avoid hitting the sides, which required what Gazzaley said was “visual-motor tracking.” But I also had to watch out for targets: I was tasked with tapping the screen whenever a red fish jumped out of the water. The game increased in difficulty as I improved, making the river twistier and obliging me to remember turns I’d taken. (These were “working-memory challenges.”) Soon the targets became more confusing — I was trying to tap blue birds and green fish, but the game faked me out by mixing in green birds and blue fish. This was testing my “selective attention,” or how quickly I could assess a situation and react to it. The company behind Project: Evo is now seeking approval from the Food and Drug Administration for the game. If it gets that government stamp, it might become a sort of cognitive Lipitor or Viagra, a game that your doctor can prescribe for your aging mind. After only two minutes of play, I was making all manner of mistakes, stabbing frantically at the wrong fish as the game sped up. “It’s hard,” Gazzaley said, smiling broadly as he took back the iPad I was playing on. 
“It’s meant to really push it.”

“Brain training” games like Project: Evo have become big business, with Americans spending an estimated $1.3 billion a year on them. They are also a source of controversy.

© 2014 The New York Times Company
By Emily Underwood Aging baby boomers and seniors would be better off going for a hike than sitting down in front of one of the many video games designed to aid the brain, a group of nearly 70 researchers asserted this week in a critique of some of the claims made by the brain-training industry. With yearly subscriptions running as much as $120, an expanding panoply of commercial brain games promises to improve memory, processing speed, and problem-solving, and even, in some cases, to stave off Alzheimer’s disease. Many companies, such as Lumosity and Cogmed, describe their games as backed by solid scientific evidence and prominently note that neuroscientists at top universities and research centers helped design the programs. But the cited research is often “only tangentially related to the scientific claims of the company, and to the games they sell,” according to the statement released Monday by the Stanford Center on Longevity in Palo Alto, California, and the Max Planck Institute for Human Development in Berlin. Although the letter, whose signatories include many researchers outside those two organizations, doesn’t point to specific bad actors, it concludes that there is “little evidence that playing brain games improves underlying broad cognitive abilities, or that it enables one to better navigate a complex realm of everyday life.” A similar statement of concern was published in 2008 with a smaller number of signatories, says Ulman Lindenberger of the Max Planck Institute for Human Development, who helped organize both letters. Although Lindenberger says there was no particular trigger for the current statement, he calls it the “expression of a growing collective concern among a large number of cognitive psychologists and neuroscientists who study human cognitive aging.” © 2014 American Association for the Advancement of Science
By PAUL VITELLO

Most adults do not remember anything before the age of 3 or 4, a gap that researchers had chalked up to the vagaries of the still-developing infant brain. By some accounts, the infant brain was just not equipped to remember much. Textbooks referred to the deficiency as infant amnesia.

Carolyn Rovee-Collier, a developmental psychologist at Rutgers University who died on Oct. 2 at 72, challenged the theory, showing in a series of papers in the early 1980s that babies remember plenty. A 3-month-old can recall what he or she learned yesterday, she found, and a 9-month-old can remember a game for as long as a month and a half. She cited experiments suggesting that memory processes in adults and infants are virtually the same, and argued that infant memories were never lost. They just become increasingly harder to retrieve as the child grows, learns language and loses touch with the visual triggers that had kept those memories sharp — a view from between the bars of a crib, say, or the view of the floor as a crawler, not a toddler, sees it.

Not all of Dr. Rovee-Collier’s theories won over the psychology establishment, which still uses the infant amnesia concept to explain why people do not remember life as a baby. But her insights about an infant’s short-term memory and ability to learn have been widely accepted, and have helped recast scientific thinking about the infant mind over the past 30 years. Since the first of her 200 papers was published, the field of infant cognitive studies has undergone a boom in university programs around the country. It was a field that had been largely unexplored in any systematic way by the giants of psychological theory. Freud and Jean Piaget never directly addressed the subject of infant memory. William James, considered the father of American psychology, once hazarded a guess that the human baby’s mind was a place of “blooming, buzzing confusion.”

© 2014 The New York Times Company