Links for Keyword: Attention
By Jessica Shugart People who need sugary snacks to stay sharp throughout the day could be prisoners of their own beliefs. The brain works just fine without regular shots of sugar in people who believe their willpower is unlimited, a new study shows. “There's a dominant theory in psychology that willpower is limited, and whenever you exert yourself to do a hard task or to resist a temptation, you deplete this limited resource,” says psychologist Carol Dweck from Stanford University. Previous studies have shown that mental exertion diminishes blood glucose levels and that a person’s willpower can be rejuvenated by ingesting a sugary drink. But Dweck’s earlier work led her to suspect that people’s attitudes about willpower may be responsible for that effect. In the new study, published online August 19 in the Proceedings of the National Academy of Sciences, Dweck, along with colleagues at the University of Zurich in Switzerland, focused on how attitudes about willpower may shape a person’s sugar dependence in the face of a challenge. The scientists also tested whether altering these beliefs might liberate a person from such a calorie-rich requirement. In the first of three experiments, the researchers asked students about their attitudes on willpower, then gave them lemonade sweetened with either sugar or a sugar substitute. Ten minutes after downing the sweet beverage, the students took tests of self-control and mental acuity. The students who subscribed to a self-generating belief about unlimited willpower scored equally well whether their drinks contained sugar or not. But the students who felt willpower was limited needed sugar to perform as well as the other group did. © Society for Science & the Public 2000 - 2013
Related chapters from BP7e: Chapter 13: Homeostasis: Active Regulation of the Internal Environment; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 9: Homeostasis: Active Regulation of the Internal Environment; Chapter 14: Attention and Consciousness
Link ID: 18537 - Posted: 08.20.2013
What people experience as death creeps in—after the heart stops and the brain becomes starved of oxygen—seems to lie beyond the reach of science. But the authors of a new study on dying rats make a bold claim: After cardiac arrest, the rodents’ brains enter a state similar to heightened consciousness in humans. The researchers suggest that if the same is true for people, such brain activity could be the source of the visions and other sensations that make up so-called near-death experiences. Estimated to occur in about 20% of patients who survive cardiac arrest, near-death experiences are frequently described as hypervivid or “realer-than-real,” and often include leaving the body and observing oneself from outside, or seeing a bright light. The similarities between these reports are hard to ignore, but the conversation about near-death experiences often bleeds into metaphysics: Are these visions produced solely by the brain, or are they a glimpse at an afterlife outside the body? Neurologist Jimo Borjigin of the University of Michigan, Ann Arbor, got interested in near-death experiences during a different project—measuring the hormone levels in the brains of rodents after a stroke. Some of the animals in her lab died unexpectedly, and her measurements captured a surge in neurochemicals at the moment of their death. Previous research in rodents and humans has shown that electrical activity surges in the brain right after the heart stops, then goes flat after a few seconds. Without any evidence that this final blip contains meaningful brain activity, Borjigin says “it’s perhaps natural for people to assume that [near-death] experiences came from elsewhere, from more supernatural sources.” But after seeing those neurochemical surges in her animals, she wondered about those last few seconds, hypothesizing that even experiences seeming to stretch for days in a person’s memory could originate from a brief “knee-jerk reaction” of the dying brain. 
© 2012 American Association for the Advancement of Science.
By Nathan Seppa The late rock and roll singer Jim Morrison was not a poster boy for public safety — and was no authority on safe driving. After all, later in “Roadhouse Blues,” he has beer for breakfast. But the opening line of that Doors’ song still resonates as sound guidance. If only such good advice could stand the test of time. “Roadhouse Blues” hit the airwaves in 1970, long before the unlikely marriage of driving and talking on a cell phone. Millions of people now routinely conduct remote conversations while driving, despite research showing that it’s dangerous — even with two eyes on the road and both hands upon the wheel. It turns out that hands don’t matter. It’s the conversation that can be lethal. Cell phone conversations impede what a driver sees and processes, a number of studies have shown. That, in turn, slows reactions and other faculties. This distracted state should be familiar to everyone. “That’s why you can drive home and not remember having driven home,” says Daniel Simons, a psychologist at the University of Illinois at Urbana-Champaign. “Just because you look at something doesn’t mean you see it.” Simons has shown that people assigned to observe certain activities in a lab setting can totally miss other events occurring in the very same space. The on-road versions of such blind spots show up when drivers engaged in a cell phone conversation fail to look at side streets or watch for pedestrians. This distraction may seem subtle and even fleeting, but it takes a toll: The risk of an accident quadruples when the driver is on the phone, studies have suggested. © Society for Science & the Public 2000 - 2013
Keith Barry, skilled magician that he is, doesn't give away his tricks but he does give the audience a clue when he says "magic is all about directing attention." Neuroscientists have long known that attention plays a key role in perception, and yet, we still don't fully understand the details of how attention works and what neural mechanisms are involved. Only a small fraction of the information that comes in through our eyes is actually perceived by our conscious brains. Attention is the filter that directs what is most salient in our environment to our conscious awareness. Almost all magic tricks somehow take advantage of loopholes in attention. For instance, a key strategy for magicians taps into something which cognitive neuroscientists call "inattentional or perceptual blindness", our inability to notice an object or feature in a visual scene because attention is directed elsewhere. You have experienced this phenomenon yourself. Your brain is constantly bombarded with stimuli, and it is impossible to pay attention to them all. While your attention is focused on one thing -- neuroscientists call this the "attentional spotlight" -- your ability to perceive objects outside this focus area is compromised. Indeed, if we could record the activity of the neurons in your brain that respond to the visual scene, the neural responses for those areas outside the attentional spotlight would be dampened. The magician takes advantage of this phenomenon. By distracting your attention with sly hand movements, lively banter, humor, or skillful shifts of gaze, he can move your "attentional spotlight," while manipulating the action elsewhere, all without your knowing it and indeed while you think you are paying close attention! So, there you have it -- the neuroscientific answer to "how did he do that?" But, the fact that there is a logical, brain-based explanation behind the magician's tricks is not so surprising. 
The more interesting question is, if neuroscience can explain magic, can magic teach us anything about neuroscience? © 2013 TheHuffingtonPost.com, Inc.
By Lucas Laursen My cousin Guillermo Cassinello Toscano was on the train that derailed in Santiago de Compostela, Spain, last week when it went around a bend at twice the speed limit. Cassinello heard a loud vibration and then a powerful bump and then found himself surrounded by bloody bodies in wagon number nine. Shaking, he escaped the wreckage through either a door or a hole in the train—he cannot recall—then sat amid the smoke and debris next to the track and began to cry. Seventy-nine passengers died. Cassinello doesn’t remember everything that happened to him. The same mechanisms that kept his brain sharp enough to escape immediate danger may also make it harder for him both to recall the accident, and to put the trauma behind him. "The normal thing is that the person doesn't remember the moment of the accident or right after," says clinical psychologist Javier Rodriguez Escobar of trauma therapy team Grupo Isis in Seville, who helped treat and study victims of the 2004 Madrid train bombings. That's because the mind and the body enter a more alert but also more stressed state, with trade-offs that can save your life, but harm your mind’s memory-making abilities. As the train fell over, several changes would have swept through Cassinello’s body. His adrenal glands, near his kidneys, would have released adrenaline (also known as epinephrine) into his bloodstream. The adrenaline would have directed blood to the powerful muscles of his arms and legs, where it would help him escape the wreckage faster. The hormone would have raised his heart and breathing rates. It also would have stimulated his vagus nerve, which runs between his torso and his brainstem. Although adrenaline cannot cross the blood–brain barrier, the vagus can promote noradrenaline production in the brain. That hormone activates the amygdala, which helps form memories. © 2013 Scientific American
By Jessica Shugart Lunch at a restaurant with a friend could lessen the brain’s aptitude for detailed tasks back at work, a new study suggests. If an error-free afternoon is the goal, perhaps workers should consider hastily consuming calories alone at their desks. But bosses shouldn’t rush to glue workers to their chairs just yet. The research is only a first stab at teasing out how a sociable lunch affects work performance, says study leader Werner Sommer of Humboldt University in Berlin. Researchers have long thought that dining with others fosters mental well-being, cooperation and creativity. To test the effects of a midday social hour on the brain’s capacity to get through the workday, Sommer and his colleagues gave 32 university students lunch in one of two settings and then tested their mental focus. Half of the students enjoyed meals over a leisurely hour with a friend at a casual Italian restaurant. The other group picked up their meals from the same restaurant, but had only 20 minutes to eat alone in a drab office. People who went out to lunch got to choose from a limited vegetarian menu; participants in the office group had meals that matched the choice of a member of the other group. After lunch, the group that dined in bland solitude performed better on a task that assesses rapid decision making and focus, the researchers report July 31 in PLOS ONE. Measurements of brain activity also suggested that the brain’s error-monitoring system could be running at sub-par levels in those who ate out. © Society for Science & the Public 2000 - 2013
Avoiding temptation works better than relying on willpower alone, a study of brain activity finds. "Struggles with self-control pervade daily life and characterize an array of dysfunctional behaviours, including addiction, overeating, overspending and procrastination," Molly Crockett, a postdoctoral fellow at University College London, and her co-authors said in today's issue of the journal Neuron. "Our research suggests that the most effective way to beat temptations is to avoid facing them in the first place," she said in a release. In the experiment, researchers studied 58 healthy heterosexual males in Cambridge and 20 in Amsterdam. Investigators used functional MRI as part of the study of self-control to explore the neural mechanisms involved. At the beginning of the trial, participants were shown a series of 400 images of women in lingerie or swimwear and were asked to rank them on a scale of zero to 10 on how enjoyable they were. Each man's preferences were then used to present small, short-term rewards or a large reward after a delay. Small rewards were mildly enjoyable erotic pictures and large rewards were extremely enjoyable ones. (The scientists said they could not use money, for example, since subjects could only reap the rewards of money once they left the lab. Food rewards like juice could interfere with the MRI readings.) © CBC 2013
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 18405 - Posted: 07.25.2013
by Helen Thomson "I told my daughter her living room TV was out of sync. Then I noticed the kitchen telly was also dubbed badly. Suddenly I noticed that her voice was out of sync too. It wasn't the TV, it was me." Ever watched an old movie, only for the sound to go out of sync with the action? Now imagine every voice you hear sounds similarly off-kilter – even your own. That's the world PH lives in. Soon after surgery for a heart problem, he began to notice that something wasn't quite right. "I was staying with my daughter and they like to have the television on in their house. I turned to my daughter and said 'you ought to get a decent telly, one where the sound and programme are synchronised'. I gave a little chuckle. But they said 'there's nothing wrong with the TV'." Puzzled, he went to the kitchen to make a cup of tea. "They've got another telly up on the wall and it was the same. I went into the lounge and I said to her 'hey you've got two TVs that need sorting!'." That was when he started to notice that his daughter's speech was out of time with her lip movements too. "It wasn't the TV, it was me. It was happening in real life." PH is the first confirmed case of someone who hears people speak before registering the movement of their lips. His situation is giving unique insights into how our brains unify what we hear and see. It's unclear why PH's problem started when it did – but it may have had something to do with having acute pericarditis, inflammation of the sac around the heart, or the surgery he had to treat it. © Copyright Reed Business Information Ltd
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 15: Language and Our Divided Brain
Link ID: 18350 - Posted: 07.06.2013
by Emily Underwood Pay attention! Whether it's listening to a teacher giving instructions or completing a word problem, the ability to tune out distractions and focus on a task is key to academic success. Now, a new study suggests that a brief training program in attention for 3- to 5-year-olds and their families could help boost brain activity and narrow the academic achievement gap between low- and high-income students. Children from families of low socioeconomic status generally score lower than more affluent kids on standardized tests of intelligence, language, spatial reasoning, and math, says Priti Shah, a cognitive neuroscientist at the University of Wisconsin who was not involved in the study. "That's just a plain fact." A more controversial question that scientists and politicians have batted around for decades, says Shah, is "What is the source of that difference?" Part of it may be genetic, but environmental factors, ranging from prenatal nutrition to exposure to toxic substances like lead, may also account for the early childhood differences in cognitive ability that appear by age 3 or 4. So far, however, "there aren't that many randomized, controlled trials that show that the environment has an impact on a child's abilities," Shah says. The new study does just that. It focuses on the ability to home in on a task and ignore distractions, which "leverages every single thing we do," says cognitive neuroscientist Helen Neville at the University of Oregon, Eugene. For more than 30 years, Neville and her colleagues have been studying the neural bases of this ability, called selective attention. © 2010 American Association for the Advancement of Science
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 18339 - Posted: 07.03.2013
Zoe Cormier By trawling through data from 35 million users of online ‘brain-training’ tools, researchers have conducted a survey of what they say is the world’s largest data set of human cognitive performance. Their preliminary results show that drinking moderately correlates with better cognitive performance and that sleeping too little or too much has a negative association. The study, published this week in Frontiers in Human Neuroscience, analysed user data from Lumosity, a collection of web-based games made by Lumos Labs, based in San Francisco, California. Researchers at Lumos conducted the study in collaboration with scientists at two US universities as part of the Human Cognition Project, which the authors describe as “a collaborative research effort to describe the human mind”. The authors examined results from more than 600 million completed tasks — which measured players’ speed, memory capacity and cognitive flexibility — to get a snapshot of how lifestyle factors can affect cognition and how learning ability changes with age. Users who enjoyed one or two alcoholic drinks a day tended to perform better on cognitive tasks than teetotallers and heavier drinkers, whose scores dropped as the number of daily drinks increased. The optimal sleep time was seven hours, with performance worsening for every hour of sleep lost or added. The study authors also looked at performance over time for users who returned to the same brain-training tasks at least 25 times. Performance decreased with age, but the ability to learn new tasks that relied on ‘crystallized knowledge’ (such as vocabulary) did not decline as quickly as it did for those that measured ‘fluid intelligence’ (such as the ability to memorize new sets of information). © 2013 Nature Publishing Group
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 18301 - Posted: 06.24.2013
Meghan Holohan NBC News Most of us can't actually be as attractive as professional good-looking people like Kate Upton. But new research shows that an electrical shock to the brain can make people perceive other people to be more attractive. The research may one day point toward new treatments for neurological disorders like depression or Parkinson's. Another workday with your drab, dull-looking coworkers. If only your world was filled with the beautiful people - more Kate Uptons than Katie from accounting, more Jon Hamms than John from HR. Actually, technology exists that could almost make that possible -- provided you're OK with an electric shock to your brain. But the brain zap isn't some party game. Findings from a new California Institute of Technology study could one day help lead to new, noninvasive ways to study and treat mental disorders. The Caltech researchers found that people who receive a mild electrical shock deep within the brain ranked people as more attractive than they did before the jolt. It might sound like a silly thing to study, but Vikram Chib, lead author of the paper, explains that rating the attractiveness of faces is one of the hallmark tasks used to diagnose neurological problems like depression, schizophrenia or Parkinson's. Chib, a postdoctoral scholar at Caltech, wanted to know how an area nestled deep within the brain called the midbrain influenced mood and behavior, and if there were a way to manipulate it noninvasively. The midbrain is believed to be the source of dopamine, a neurotransmitter that plays a role in disorders like depression, schizophrenia, and Parkinson’s disease. While drugs do treat these disorders, Chib and his colleague, Shinsuke Shimojo, hoped that noninvasive deep brain stimulation could change only the midbrain, without influencing the entire body.
MONKEYS may have a primitive version of the human ability to put ourselves in another's shoes. Intelligent animals such as apes can intuit others' intentions, suggesting they have some theory of mind capability. But only humans can reason that others may not hold their own beliefs. To study this difference, Rogier Mars of the University of Oxford and colleagues scanned 36 people's brains. Using an algorithm, they created a map of how an area associated with theory of mind is connected to brain regions linked to abilities such as face recognition and interpretation. Next, the researchers scanned 12 macaque brains for a similar pattern of connections. An area involved in facial recognition had a similar pattern, suggesting involvement in abstract thought. That doesn't necessarily mean the structures share a function, Mars says. Theory of mind is probably a spectrum of ways of thinking, he says, and humans got better at it as they evolved. Laurie Santos of Yale University says the structural differences may one day tell us why non-human primates lack the ability to think about others' beliefs. © Copyright Reed Business Information Ltd.
By Keith Payne It was a summer evening when Tony Cornell tried to make the residents of Cambridge, England see a ghost. He got dressed up in a sheet and walked through a public park waving his arms about. Meanwhile his assistants observed the bystanders for any hint that they noticed something strange. No, this wasn’t Candid Camera. Cornell was a researcher interested in the paranormal. The idea was first to get people to notice the spectacle, and then see how they understood what their eyes were telling them. Would they see the apparition as a genuine ghost or as something more mundane, like a bloke in a bed sheet? The plan was foiled when not a single bystander so much as raised an eyebrow. Several cows did notice, however, and they followed Cornell on his ghostly rambles. Was it just a fluke, or did people “not want to see” the besheeted man, as Cornell concluded in his 1959 report? Okay, that stunt was not a very good experiment, but twenty years later the eminent psychologist Ulric Neisser did a better job. He filmed a video of two teams of students passing a basketball back and forth, and superimposed another video of a girl with an umbrella walking right through the center of the screen. When he asked subjects in his study to count the number of times the ball was passed, an astonishing 79 percent failed to notice the girl with the umbrella. In the years since, hundreds of studies have backed up the idea that when attention is occupied with one thing, people often fail to notice other things right before their eyes. When you first learn about these studies they seem deeply strange. Is it really possible that we are constantly failing to notice things right in front of us? Is there some mysterious force screening what we see and what remains hidden? © 2013 Scientific American
by Helen Thomson Sean O'Connor is a very rational man. But he also tried, unsuccessfully, to sever his spine, and still feels a need to be paralysed. Sean has body integrity identity disorder (BIID), which causes him to feel that his limbs just don't belong to his body. Sean's legs function correctly and he has full sensation in them, but they feel disconnected from him. "I don't hate my limbs – they just feel wrong," he says. "I'm aware that they are as nature designed them to be, but there is an intense discomfort at being able to feel my legs and move them." The cause of his disorder has yet to be pinpointed, but it almost certainly stems from a problem in the early development of his brain. "My earliest memories of feeling I should be paralysed go back to when I was 4 or 5 years old," says Sean. The first case of BIID was reported in the 18th century, when a French surgeon was held at gunpoint by an Englishman who demanded that one of his legs be removed. The surgeon, against his will, performed the operation. Later, he received a handsome payment from the Englishman, with an accompanying letter of thanks for removing "a limb which put an invincible obstacle to my happiness" (Experimental Brain Research, DOI: 10.1007/s00221-009-2043-7). We now think that there are at least two forms of BIID. In one, people wish that part of their body were paralysed. Another form causes people to want to have a limb removed. BIID doesn't have to affect limbs either – there have been anecdotal accounts of people wishing they were blind or deaf. © Copyright Reed Business Information Ltd.
By Bruce Bower In its idealized form, science resembles a championship boxing match. Theories square off, each vying for the gold belt engraved with “Truth.” Under the stern eyes of a host of referees, one theory triumphs by best explaining available evidence — at least until the next bout. But in the real world, science sometimes works more like a fashion show. Researchers clothe plausible explanations of experimental findings in glittery statistical suits and gowns. These gussied-up hypotheses charm journal editors and attract media coverage with carefully orchestrated runway struts, never having to battle competitors. Then there’s psychology. Even more than other social scientists — and certainly more than physical scientists — psychologists tend to overlook or dismiss hypotheses that might topple their own, says Klaus Fiedler of the University of Heidelberg in Germany. They explain experimental findings with ambiguous terms that make no testable predictions at all; they build careers on theories that have never bested a competitor in a fair scientific fight. In many cases, no one knows or bothers to check how much common ground one theory shares with others that address the same topic. Problems like these, Fiedler and his colleagues contended last November in Perspectives on Psychological Science, afflict sets of related theories about such psychological phenomena as memory and decision making. In the end, that affects how well these phenomena are understood. © Society for Science & the Public 2000 - 2013
Linda Carroll TODAY contributor We all get lost or disoriented once in a while, but for Sharon Roseman, being lost is a way of life. A little quirk in her brain makes it impossible to recognize landmarks and find her way around neighborhoods that should have become familiar long ago. “I can literally see my house out the car window, but I have no clue that it’s my house,” Roseman told NBC’s Kristen Dahlgren. Roseman, 64, suffers from developmental topographical disorientation, or DTD, a disorder that had flown under brain researchers’ radar until very recently. DTD was first described as a single case study in a paper published online in 2008 in the journal Neuropsychologia. At the time, it was thought to be extremely rare, says the study’s lead author, Giuseppe Iaria, professor of cognitive neuroscience at the University of Calgary. But since then, Iaria has discovered nearly 1,000 other people with DTD and he thinks there may be a lot more. He currently estimates that about 2 percent of the population may be constantly coping with orientation and navigation problems caused by the disorder. DTD is a profound and disabling deficit. Nothing, not even the layout of a house you’ve lived in for decades, ever becomes familiar. And for Roseman that has made life very trying. When her kids would cry in the night, she would struggle to find her way to them.
by Helen Thomson "I've been in a crowded elevator with mirrors all around, and a woman will move and I'll go to get out the way and then realise: 'oh that woman is me'." Heather Sellers has prosopagnosia, more commonly known as face blindness. "I can't remember any image of the human face. It's simply not special to me," she says. "I don't process them like I do a car or a dog. It's not a visual problem, it's a perception problem." Heather knew from a young age that something was different about the way she navigated her world, but her condition wasn't diagnosed until she was in her 30s. "I always knew something was wrong – it was impossible for me to trust my perceptions of the world. I was diagnosed as anxious. My parents thought I was crazy." The condition is estimated to affect around 2.5 per cent of the population, and it's common for those who have it not to realise that anything is wrong. "In many ways it's a subtle disorder," says Heather. "It's easy for your brain to compensate because there are so many other things you can use to identify a person: hair colour, gait or certain clothes. But meet that person out of context and it's socially devastating." As a child, she was once separated from her mum at a grocery store. Store staff reunited the pair, but it was confusing for Heather, since she didn't initially recognise her mother. "But I didn't know that I wasn't recognising her." © Copyright Reed Business Information Ltd
by Lizzie Wade If you were a rat living in a completely virtual world like in the movie The Matrix, could you tell? Maybe not, but scientists studying your brain might be able to. Today, researchers report that certain cells in rat brains work differently when the animals are in virtual reality than when they are in the real world. The neurons in question are known as place cells, which fire in response to specific physical locations in the outside world and reside in the hippocampus, the part of the brain responsible for spatial navigation and memory. As you walk out of your house every day, the same place cell fires each time you reach the shrub that's two steps away from your door. It fires again when you reach the same place on your way back home, even though you are traveling in the opposite direction. Scientists have long suspected that these place cells help the brain generate a map of the world around us. But how do the place cells know when to fire in the first place? Previous research showed that the cells rely on three different kinds of information. First, they analyze "visual cues," or what you see when you look around. Then, there are what researchers call "self-motion cues." These cues come from how your body moves in space and are the reason you can still find your way around a room with the lights out. The final type of information is the "proximal cues," which encompass everything else about the environment you're in. The smell of a bakery on your way to work, the sounds of a street jammed with traffic, and the springy texture of grass in a park are all proximal cues. © 2010 American Association for the Advancement of Science.
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 15: Language and Our Divided Brain
Link ID: 18112 - Posted: 05.04.2013
Alison Abbott Thinking about a professor just before you take an intelligence test makes you perform better than if you think about football hooligans. Or does it? An influential theory that certain behaviour can be modified by unconscious cues is under serious attack. A paper published in PLoS ONE last week (ref. 1) reports that nine different experiments failed to replicate this example of ‘intelligence priming’, first described in 1998 (ref. 2) by Ap Dijksterhuis, a social psychologist at Radboud University Nijmegen in the Netherlands, and now included in textbooks. David Shanks, a cognitive psychologist at University College London, UK, and first author of the paper in PLoS ONE, is among sceptical scientists calling for Dijksterhuis to design a detailed experimental protocol to be carried out in different laboratories to pin down the effect. Dijksterhuis has rejected the request, saying that he “stands by the general effect” and blames the failure to replicate on “poor experiments”. An acrimonious e-mail debate on the subject has been dividing psychologists, who are already jittery about other recent exposures of irreproducible results (see Nature 485, 298–300; 2012). “It’s about more than just replicating results from one paper,” says Shanks, who circulated a draft of his study in October; the failed replications call into question the underpinnings of ‘unconscious-thought theory’. © 2013 Nature Publishing Group
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 18104 - Posted: 05.01.2013
By JAMES GORMAN TRONDHEIM, Norway — In 1988, two determined psychology students sat in the office of an internationally renowned neuroscientist in Oslo and explained to him why they had to study with him. Unfortunately, the researcher, Per Oskar Andersen, was hesitant, May-Britt Moser said as she and her husband, Edvard I. Moser, now themselves internationally recognized neuroscientists, recalled the conversation recently. He was researching physiology and they were interested in the intersection of behavior and physiology. But, she said, they wouldn’t take no for an answer. “We sat there for hours. He really couldn’t get us out of his office,” Dr. May-Britt Moser said. “Both of us come from nonacademic families and nonacademic places,” Edvard said. “The places where we grew up, there was no one with any university education, no one to ask. There was no recipe on how to do these things.” “And how to act politely,” May-Britt interjected. “It was just a way to get to the point where we wanted to be. But seen now, when I know the way people normally do it,” he said, smiling at the memory of his younger self, “I’m quite impressed.” So, apparently, was Dr. Andersen. In the end, he yielded to the Mosers’ combination of furious curiosity and unwavering determination and took them on as graduate students. They have impressed more than a few people since. In 2005, they and their colleagues reported the discovery of cells in rats’ brains that function as a kind of built-in navigation system that is at the very heart of how animals know where they are, where they are going and where they have been. They called them grid cells. © 2013 The New York Times Company