Chapter 13. Memory, Learning, and Development
By Kate Baggaley

A buildup of rare versions of genes that control the activity of nerve cells in the brain increases a person’s risk for bipolar disorder, researchers suggest in a paper posted online the week of February 16 in Proceedings of the National Academy of Sciences. “There are many different variants in many different genes that contribute to the genetic risk,” says coauthor Jared Roach, a geneticist at the Institute for Systems Biology in Seattle. “We think that most people with bipolar disorder will have inherited several of these…risk variants.”

The bulk of a person’s risk for bipolar disorder comes from genetics, but only a quarter of that risk can be explained by common variations in genes. Roach’s team sequenced the genomes of 200 people from 41 families with a history of bipolar disorder. They then identified 164 rare forms of genes that show up more often in people with the condition. People with bipolar disorder had, on average, six of these rare forms, compared with just one, on average, found in their healthy relatives and the general population.

The identified genes control the ability of ions, or charged particles, to enter or leave nerve cells, or neurons. This affects neurons’ ability to pass information through the brain. Some of the gene variants probably increase how much neurons fire while others decrease it, the researchers say. Future research will need to explain what role these brain changes play in bipolar disorder.

Citation: S.A. Ament et al. Rare variants in neuronal excitability genes influence risk for bipolar disorder. Proceedings of the National Academy of Sciences. Published online the week of February 16, 2015. doi:10.1073/pnas.1424958112.

© Society for Science & the Public 2000 - 2015
By PAULA SPAN

The word “benzodiazepines” and the phrase “widely prescribed for anxiety and insomnia” appear together so frequently that they may remind you of the apparently unbreakable connection between “powerful” and “House Ways and Means Committee.” But now we have a better sense of just how widely prescribed these medications are.

A study in this month’s JAMA Psychiatry reports that among 65- to 80-year-old Americans, close to 9 percent use one of these sedative-hypnotics, drugs like Valium, Xanax, Ativan and Klonopin. Among older women, nearly 11 percent take them. “That’s an extraordinarily high rate of use for any class of medications,” said Michael Schoenbaum, a senior adviser at the National Institute of Mental Health and a co-author of the new report. “It seemed particularly striking given the identified clinical concerns associated with benzodiazepine use in anybody, but especially in older adults.”

He was referring to decades of warnings about the potentially unhappy consequences of benzodiazepines for older users. The drugs still are recommended for a handful of specific disorders, including acute alcohol withdrawal and, sometimes, seizures and panic attacks. But concerns about the overuse of benzodiazepines have been aired again and again: in the landmark nursing home reform law of 1987, in the American Geriatrics Society’s Choosing Wisely list of questionable practices in 2013, in last year’s study in the journal BMJ suggesting an association with Alzheimer’s disease.

Benzodiazepine users face increased risks of falls and fractures, of auto accidents, of reduced cognition. “Even after one or two doses, you have impaired cognitive performance on memory and other neuropsychological tests, compared to a placebo,” said Dr. D.P. Devanand, director of geriatric psychiatry at Columbia University Medical Center.

© 2015 The New York Times Company
By Carl Zimmer

In 2010, a graduate student named Tamar Gefen got to know a remarkable group of older people. They had volunteered for a study of memory at the Feinberg School of Medicine at Northwestern University. Although they were all over age 80, Ms. Gefen and her colleagues found that they scored as well on memory tests as people in their 50s. Some complained that they remembered too much. She and her colleagues referred to them as SuperAgers. Many were also friends. “A couple tried to set me up with their grandsons,” Ms. Gefen said. She was impressed by their resilience and humor: “It takes wisdom to a whole new level.”

Recently, Ms. Gefen’s research has taken a sharp turn. At the outset of the study, the volunteers agreed to donate their brains for medical research. Some of them have died, and it has been Ms. Gefen’s job to look for anatomical clues to their extraordinary minds. “I had this enormous privilege I can’t even begin to describe,” she said. “I knew them and tested them in life and in death. At the end, I was the one looking at them through a microscope.”

Ms. Gefen and her colleagues are now starting to publish the results of these post-mortem studies. Last month in The Journal of Neuroscience, the scientists reported that one of the biggest differences involves peculiar, oversize brain cells known as von Economo neurons. SuperAgers have almost five times as many of them as other people. Learning what makes these brains special could help point researchers to treatments for Alzheimer’s disease and other kinds of mental decline. But it is hard to say how an abundance of von Economo neurons actually helps the brain.

© 2015 The New York Times Company
By Esther Landhuis

Whereas cholesterol levels measured in a routine blood test can serve as a red flag for heart disease, there’s no simple screen for impending Alzheimer’s. A new Silicon Valley health start-up hopes to change that.

A half million Americans die of Alzheimer’s disease each year. Most are diagnosed after a detailed medical workup and extensive neurological and psychological tests that gauge mental function and rule out other causes of dementia. Yet things begin going awry some 10 to 15 years before symptoms show. Spinal fluid analyses and positron emission tomography (PET) scans can detect a key warning sign—buildup of amyloid-beta protein in the brain. Studies suggest that adults with high brain amyloid have elevated risk for Alzheimer’s and stand the best chance of benefiting from treatments should they become available.

Getting Alzheimer’s drugs to market requires long and costly clinical studies, which some experts say have failed thus far because experimental drugs were tested too late in the disease process. By the time people show signs of dementia, their brains have lost neurons and no current therapy can revive dead cells. That is why drug trials are looking to recruit seniors with preclinical Alzheimer’s who are on the verge of decline but otherwise look healthy.

This poses a tall order. Spinal taps are cumbersome and PET costs $3,000 per scan. “There’s no cheap, fast, noninvasive test that can accurately identify people at risk of Alzheimer’s,” says Brad Dolin, chief technology officer of Neurotrack. The company is developing a computerized visual test that might fit the bill.

© 2015 Scientific American
By Siri Carpenter

“I don’t look like I have a disability, do I?” Jonas Moore asks me. I shake my head. No, I say — he does not. Bundled up in a puffy green coat in a drafty Starbucks, Moore, 35 and sandy-haired, doesn’t stand out in the crowd seeking refuge from the Wisconsin cold. His handshake is firm and his blue eyes meet mine as we talk. He comes across as intelligent and thoughtful, if perhaps a bit reserved. His disability — autism — is invisible.

That’s part of the problem, says Moore. Like most people with autism spectrum disorders, he finds relationships challenging. In the past, he has been quick to anger and has had what he calls “meltdowns.” Those who don’t know he has autism can easily misinterpret his actions. “People think that when I do misbehave I’m somehow intentionally trying to be a jerk,” Moore says. “That’s just not the case.” His difficulty managing emotions has gotten him into some trouble, and he’s had a hard time holding onto jobs — an outcome he might have avoided, he says, if his coworkers and bosses had better understood his intentions.

Over time, things have gotten better. Moore has held the same job for five years, vacuuming commercial buildings on a night cleaning crew. He attributes his success to getting the right amount of medication and therapy, to time maturing him and to the fact that he now works mostly alone.

Moore is fortunate. His parents help support him financially. He has access to good mental health care. And with the help of the state’s division of vocational rehabilitation, he has found a job that suits him. Many adults with autism are not so lucky.

© Society for Science & the Public 2000 - 2015.
By Michelle Roberts, Health editor, BBC News online

Women trying for a baby and those in the first three months of pregnancy should not drink any alcohol, updated UK guidelines say. The Royal College of Obstetricians and Gynaecologists (RCOG) had previously said a couple of glasses of wine a week was acceptable. It now says abstinence is the only way to be certain that the baby is not harmed. There is no proven safe amount that women can drink during pregnancy.

The updated advice now chimes with guidelines from the National Institute for Health and Care Excellence (NICE). In the US, experts say there is no safe time to drink during pregnancy. But the RCOG highlights around the time of conception and the first three months of pregnancy as the most risky.

Drinking alcohol may affect the unborn baby as some will pass through the placenta. Around conception and during the first three months, it may increase the chance of miscarriage, says the RCOG. After this time, women are advised to not drink more than one to two units, more than once or twice a week, it says. Drinking more than this could affect the development of the baby, in particular the way the baby's brain develops and the way the baby grows in the womb, which can lead to foetal growth restriction and increase the risk of stillbirth and premature labour, says the advice.

© 2015 BBC
By Amy Ellis Nutt

When we tell stories about our lives, most of us never have our memories questioned. NBC's Brian Williams, like other high-profile people in the past, is finding out what happens when questions arise. Williams's faux pas – retelling a story of his helicopter coming under fire in Iraq a dozen years ago when it was actually the helicopter flying ahead of him – was much like Hillary Rodham Clinton's during the 2008 presidential campaign. Her story was about coming under fire during a visit to an airfield in Bosnia 12 years earlier. George W. Bush also misremembered when, on several occasions, he told audiences that on 9/11 he watched the first plane fly into the north tower of the World Trade Center on TV, just before entering that classroom in Florida to read a book to school kids.

In each case, these were highly emotional moments. Williams's helicopter made an emergency landing in the desert behind the aircraft that was hit; Clinton was made to don a flak jacket and was told her airplane might not be able to land at the airport in Bosnia because of sniper fire in the area; and Bush was told by an aide about the first crash into the World Trade Center just before entering the classroom. That each of those memories was false created huge public relations headaches for Clinton and Williams.

But the fact is that false memories are not that uncommon, especially when they involve highly emotional events. Scientists have been telling us for years that memory of autobiographical events, also known as episodic memory, is pliable and even unreliable. The consensus from neuroimaging studies and laboratory experiments is that episodic memory is not like replaying a film but more like reconstructing an event from bits and pieces of information. Memories are stored in clusters of neurons called engrams, and the proteins responsible for storing those memories, scientists say, are modified and changed just by the reconstruction process of remembering.
By Kate Baggaley

Stem cells can help heal long-term brain damage suffered by rats blasted with radiation, researchers report in the Feb. 5 Cell Stem Cell. The treatment allows the brain to rebuild the insulation on its nerve cells so they can start carrying messages again. The researchers directed human stem cells to become a type of brain cell that is destroyed by radiation, a common cancer treatment, then grafted the cells into the brains of irradiated rats. Within a few months, the rats’ performance on learning and memory tests improved.

“This technique, translated to humans, could be a major step forward for the treatment of radiation-induced brain … injury,” says Jonathan Glass, a neurologist at Emory University in Atlanta. Steve Goldman, a neurologist at the University of Rochester in New York, agrees that the treatment could repair a lot of the damage caused by radiation. “Radiation therapy … is very effective, but the problem is patients end up with severe disability,” he says. “Fuzzy thinking, a loss in higher intellectual functions, decreases in memory — all those are part and parcel of radiation therapy to the brain.” For children, the damage can be profound. “Those kids have really significant detriments in their adult IQs,” Goldman says.

Radiation obliterates cells that mature into oligodendrocytes, a type of cell that coats the message-carrying part of nerve cells with insulation. Without that cover, known as the myelin sheath, nerve cells can’t transmit information, leading to memory and other brain problems.

© Society for Science & the Public 2000 - 2015
By Alison Abbott

Fabienne never found out why she went into labour three months too early. But on a quiet afternoon in June 2007, she was hit by accelerating contractions and was rushed to the nearest hospital in rural Switzerland, near Lausanne. When her son, Hugo, was born at 26 weeks of gestation rather than the typical 40, he weighed just 950 grams and was immediately placed in intensive care.

Three days later, doctors told Fabienne that ultrasound pictures of Hugo's brain indicated that he had had a severe haemorrhage from his immature blood vessels. “I just exploded into tears,” she says. Both she and her husband understood that the prognosis for Hugo was grim: he had a very high risk of cerebral palsy, a neurological condition that can lead to a life of severe disability. The couple agreed that they did not want to subject their child to that. “We immediately told the doctors that we did not want fierce medical intervention to keep him alive — and saw the relief on the doctors' faces,” recalls Fabienne, who requested that her surname not be used. That night was the most tortured of her life.

The next day, however, before any change had been made to Hugo's treatment, his doctors proposed a new option to confirm the diagnosis: a brain scan using magnetic resonance imaging (MRI). This technique, which had been newly adapted for premature babies, would allow the doctors to predict the risk of cerebral palsy more accurately than with ultrasound alone, which has a high false-positive rate. Hugo's MRI scan showed that the damage caused by the brain haemorrhage was limited, and his risk of severe cerebral palsy was likely to be relatively low. So just 24 hours after their decision to let his life end, Hugo's parents did an about-turn. They agreed that the doctors should try to save him.

© 2015 Nature Publishing Group
By Amanda Baker

While we all may vary in just how much time we like spending with other people, humans are overall very social beings. Scientists have already found this to be reflected in our health and well-being, with social isolation being associated with more depression, worse health, and a shorter life. Looking even deeper, they find evidence of our social nature reflected in the very structure of our brains.

Just thinking through your daily interactions with your friends or siblings probably gives you dozens of examples of times when it was important to interpret or predict the feelings and behaviors of other people. Our brains agree. Over time, parts of our brains have become specialized for those tasks, but apparently not all social interaction is created equal. When researchers study the brains of people trying to predict the thoughts and feelings of others, they can actually see a difference in the brain activity depending on whether that person is trying to understand a friend versus a stranger. Even at the level of blood flowing through your brain, you treat people you know well differently than people you don’t.

These social interactions also extend into another important area of the brain: the nucleus accumbens. This structure is key in the reward system of the brain, with activity being associated with things that leave you feeling good. Curious whether this could have a direct connection with behavior, one group of scientists studied a very current part of our behavior as modern social beings: Facebook use.

© 2015 Scientific American
By ALAN SCHWARZ

High numbers of students are beginning college having felt depressed and overwhelmed during the previous year, according to an annual survey released on Thursday, reinforcing some experts’ concern about the emotional health of college freshmen. The survey of more than 150,000 students nationwide, “The American Freshman: National Norms Fall 2014,” found that 9.5 percent of respondents had frequently “felt depressed” during the past year, a significant rise over the 6.1 percent reported five years ago. Those who “felt overwhelmed” by schoolwork and other commitments rose to 34.6 percent from 27.1 percent.

Conducted by the Cooperative Institutional Research Program at the University of California, Los Angeles’s Higher Education Research Institute for almost 50 years, the survey assesses hundreds of matters ranging from political views to exercise habits. It is considered one of the most comprehensive snapshots of trends among recent high school seniors and is of particular interest to people involved in mental well-being.

“It’s a public health issue,” said Dr. Anthony L. Rostain, a psychiatrist and co-chairman of a University of Pennsylvania task force on students’ emotional health. “We’re expecting more of students: There’s a sense of having to compete in a global economy, and they think they have to be on top of their game all the time. It’s no wonder they feel overwhelmed.”

Other survey results indicated that students were spending more time on academics and socializing less — trends that would normally be lauded. But the lead author of the study, Kevin Eagan, cautioned that the shift could result in higher levels of stress.

© 2015 The New York Times Company
By Andrea Anderson and Victoria Stern

Blood type may affect brain function as we age, according to a new large, long-term study. People with the rare AB blood type, present in less than 10 percent of the population, have a higher than usual risk of cognitive problems as they age.

University of Vermont hematologist Mary Cushman and her colleagues used data from a national study called REGARDS, which has been following 30,239 African-American and Caucasian individuals older than 45 since 2007. The aim of the study is to understand the heavy stroke toll seen in the southeastern U.S., particularly among African-Americans. Cushman's team focused on information collected twice yearly via phone surveys that evaluate cognitive skills such as learning, short-term memory and executive function.

The researchers zeroed in on 495 individuals who showed significant declines on at least two of the three phone survey tests. When they compared that cognitively declining group with 587 participants whose mental muster remained robust, researchers found that impairment in thinking was roughly 82 percent more likely in individuals with AB blood type than in those with A, B or O blood types, even after taking their race, sex and geography into account. The finding was published online last September in Neurology.

The seemingly surprising result has some precedent: past studies suggest non-O blood types are linked to elevated incidence of heart disease, stroke and blood clots—vascular conditions that could affect brain function. Yet these cardiovascular consequences are believed to be linked to the way non-O blood types coagulate, which did not seem to contribute to the cognitive effects described in the new study. The researchers speculate that other blood-group differences, such as how likely cells are to stick to one another or to blood vessel walls, might affect brain function.

© 2015 Scientific American
By Maria Konnikova

R. T. first heard about the Challenger explosion as she and her roommate sat watching television in their Emory University dorm room. A news flash came across the screen, shocking them both. R. T., visibly upset, raced upstairs to tell another friend the news. Then she called her parents. Two and a half years after the event, she remembered it as if it were yesterday: the TV, the terrible news, the call home. She could say with absolute certainty that that’s precisely how it happened. Except, it turns out, none of what she remembered was accurate.

R. T. was a student in a class taught by Ulric Neisser, a cognitive psychologist who had begun studying memory in the seventies. Early in his career, Neisser became fascinated by the concept of flashbulb memories—the times when a shocking, emotional event seems to leave a particularly vivid imprint on the mind. William James had described such impressions, in 1890, as “so exciting emotionally as almost to leave a scar upon the cerebral tissues.”

The day following the explosion of the Challenger, in January, 1986, Neisser, then a professor of cognitive psychology at Emory, and his assistant, Nicole Harsch, handed out a questionnaire about the event to the hundred and six students in their ten o’clock psychology 101 class, “Personality Development.” Where were the students when they heard the news? Whom were they with? What were they doing? The professor and his assistant carefully filed the responses away. In the fall of 1988, two and a half years later, the questionnaire was given a second time to the same students. It was then that R. T. recalled, with absolute confidence, her dorm-room experience. But when Neisser and Harsch compared the two sets of answers, they found barely any similarities.
By Katherine Ellison

Dr. Mark Bertin is no A.D.H.D. pill-pusher. The Pleasantville, N.Y., developmental pediatrician won’t allow drug marketers in his office, and says he doesn’t always prescribe medication for children diagnosed with attention deficit hyperactivity disorder. Yet Dr. Bertin has recently changed the way he talks about medication, offering parents a powerful argument. Recent research, he says, suggests the pills may “normalize” the child’s brain over time, rewiring neural connections so that a child would feel more focused and in control, long after the last pill was taken. “There might be quite a profound neurological benefit,” he said in an interview.

A growing number of doctors who treat the estimated 6.4 million American children diagnosed with A.D.H.D. are hearing that stimulant medications not only help treat the disorder but may actually be good for their patients’ brains. In an interview last spring with Psych Congress Network (http://www.psychcongress.com/video/are-A.D.H.D.-medications-neurotoxic-or-neuroprotective-16223), an Internet news site for mental health professionals, Dr. Timothy Wilens, chief of child and adolescent psychiatry at Massachusetts General Hospital, said “we have enough data to say they’re actually neuroprotective.” The pills, he said, help “normalize” the function and structure of brains in children with A.D.H.D., so that, “over years, they turn out to look more like non-A.D.H.D. kids.”

Medication is already by far the most common treatment for A.D.H.D., with roughly 4 million American children taking the pills — mostly stimulants, such as amphetamines and methylphenidate. Yet the decision can be anguishing for parents who worry about both short-term and long-term side effects. If the pills can truly produce long-lasting benefits, more parents might be encouraged to start their children on these medications early and continue them for longer. Leading A.D.H.D. experts, however, warn the jury is still out.
© 2015 The New York Times Company
The longer a teenager spends using electronic devices such as tablets and smartphones, the worse their sleep will be, a study of nearly 10,000 16- to 19-year-olds suggests. More than two hours of screen time after school was strongly linked to both delayed and shorter sleep. Almost all the teens from Norway said they used the devices shortly before going to bed. Many said they often got less than five hours sleep a night, BMJ Open reports.

The teens were asked questions about their sleep routine on weekdays and at weekends, as well as how much screen time they clocked up outside school hours. On average, girls said they spent around five and a half hours a day watching TV or using computers, smartphones or other electronic devices. And boys spent slightly more time in front of a screen - around six and a half hours a day, on average. Playing computer games was more popular among the boys, whereas girls were more likely to spend their time chatting online.

Any type of screen use during the day and in the hour before bedtime appeared to disrupt sleep - making it more difficult for teenagers to nod off. And the more hours they spent on gadgets, the more disturbed their sleep became. When daytime screen use totalled four or more hours, teens had a 49% greater risk of taking longer than an hour to fall asleep. These teens also tended to get less than five hours of sleep per night. Sleep duration went steadily down as gadget use increased.

© 2015 BBC
By Esther Landhuis

One in nine Americans aged 65 and older has Alzheimer's disease, a fatal brain disorder with no cure or effective treatment. Therapy could come in the form of new drugs, but some experts suspect drug trials have failed so far because compounds were tested too late in the disease's progression. By the time people show signs of dementia, their brains have lost neurons. No therapy can revive dead cells, and little can be done to create new ones. So researchers running trials now seek participants who still pass as cognitively normal but are on the verge of decline. These “preclinical” Alzheimer's patients may represent a window of opportunity for therapeutic intervention.

How to identify such individuals before they have symptoms presents a challenge, however. Today most Alzheimer's patients are diagnosed after a detailed medical workup and extensive tests that gauge mental function. Other tests, such as spinal fluid analyses and positron-emission tomography (PET) scans, can detect signs of approaching disease and help pinpoint the preclinical window but are cumbersome or expensive. “There's no cheap, fast, noninvasive test that can identify people at risk of Alzheimer's,” says Brad Dolin, chief technology officer of Neurotrack in Palo Alto, Calif.—a company developing a computerized visual screening test for Alzheimer's.

Unlike other cognitive batteries, the Neurotrack test requires no language or motor skills. Participants view images on a monitor while a camera tracks their eye movements. The test draws on research by co-founder Stuart Zola of Emory University, who studies learning and memory in monkeys. When presented with a pair of images—one novel, the other familiar—primates fixate longer on the novel one. But if the hippocampus is damaged, as it is in people with Alzheimer's, the subject does not show a clear preference for the novel images.

© 2015 Scientific American
By KEN BELSON

A new study of N.F.L. retirees found that those who began playing tackle football when they were younger than 12 years old had a higher risk of developing memory and thinking problems later in life. The study, published in the medical journal Neurology by researchers at the Boston University School of Medicine, was based on tests given to 42 former N.F.L. players, ages 41 to 65, who had experienced cognitive problems for at least six months. Half the players started playing tackle football before age 12, and the other half began at 12 or older.

Those former N.F.L. players who started playing before 12 years old performed “significantly worse” on every test measure after accounting for the total number of years played and the age of the players when they took the tests. Those players recalled fewer words from a list they had learned 15 minutes earlier, and their mental flexibility was diminished compared with players who began playing tackle football at 12 or older. The age of 12 was chosen as a benchmark because it is roughly the point by which brains in young boys are thought to have already undergone key periods of development. Research has shown that boys younger than 12 who injure their brains can take longer to recover and have poor cognition in childhood.

The findings are likely to fuel an already fierce debate about when it is safe to allow children to begin playing tackle football and other contact sports. Youth leagues are under scrutiny for putting children at risk with head injuries. Pop Warner and many other youth leagues have added training protocols, have limited contact in practice and have adjusted weight and age limits to try to reduce head injuries and the risks associated with them. But some leagues continue to allow children as young as 5 to play tackle football.

© 2015 The New York Times Company
Criminal psychopaths learn to respond differently to punishment cues than others in jail and may need more reward-focused treatments, new research suggests. Criminals such as Paul Bernardo, Ted Bundy and Clifford Olson, who scored high on psychopathy checklists, were known to be callous and unemotional. Psychopaths derive pleasure from being manipulative and use premeditated aggression to get what they want with no regard for those who are hurt. The search for what makes them tick has shown some physical differences in their brains, such as reductions in grey matter.

Now researchers in London, Montreal and Bethesda, Md., have used functional MRI imaging to assess how the brains of 12 violent criminals with psychopathy, 20 violent criminals with antisocial personality disorder but not psychopathy (such as those with a history of impulsivity and risk-taking), and 18 healthy people who were not criminals responded differently to rewards and punishment.

"In the room with them, there's the sense that the weight of what they've done and the deleterious effect this is having on their lives doesn't really hold for them," said Dr. Nigel Blackwood of King's College London, a senior author of the paper in Wednesday's issue of Lancet Psychiatry. It's only at the moment in the scanner when the sanction of lost points cues them to change their behaviour that the differences between violent psychopaths and those with antisocial personality disorder appear.

©2015 CBC/Radio-Canada
By Alison Abbott

If you have to make a complex decision, will you do a better job if you absorb yourself in, say, a crossword puzzle instead of ruminating about your options? The idea that unconscious thought is sometimes more powerful than conscious thought is attractive, and echoes ideas popularized by books such as writer Malcolm Gladwell’s best-selling Blink. But within the scientific community, the ‘unconscious-thought advantage’ (UTA) has been controversial.

Now Dutch psychologists have carried out the most rigorous study yet of UTA — and find no evidence for it. Their conclusion, published this week in Judgment and Decision Making, is based on a large experiment that they designed to provide the best chance of capturing the effect should it exist, along with a sophisticated statistical analysis of previously published data [1]. The report adds to broader concerns about the quality of psychology studies and to an ongoing controversy about the extent to which unconscious thought in general can influence behaviour.

“The bigger debate is about how clever our unconscious is,” says cognitive psychologist David Shanks of University College London. “This carefully constructed paper makes a great contribution.” Shanks published a review last year that questioned research claiming that various unconscious influences, including UTA, affect decision making [2].

© 2015 Nature Publishing Group
By Ian Sample, science editor

People who carry a mutated gene linked to longer lifespan have extra tissue in a part of the brain that seems to protect them against mental decline in old age. The finding has shed light on a biological pathway that researchers now hope to turn into a therapy that slows the progression of Alzheimer’s disease and other forms of dementia.

Brain scans of more than 400 healthy men and women aged 53 and over found that those who carried a single copy of a particular gene variant had a larger brain region that deals with planning and decision making. Further tests on the group found that those with an enlarged right dorsolateral prefrontal cortex (rDLPFC), as the brain region is known, fared better on a series of mental tasks.

About one in five people inherits a single copy of the gene variant, or allele, known as KL-VS, which improves heart and kidney function, and on average adds about three years to human lifespan, according to Dena Dubal, a neurologist at the University of California, San Francisco. Her latest work suggests that the same genetic mutation has broader effects on the brain. While having a larger rDLPFC accounted for only 12% of the improvement in people’s mental test scores, Dubal suspects the gene alters the brain in other ways, perhaps by improving the connections that form between neurons.