Links for Keyword: Attention



Links 1 - 20 of 429

by Helen Thomson We meet in a pub, have a few drinks and some dinner, and then you lean in for a kiss. You predict, based on our previous interactions, that the kiss will be reciprocated – rather than landing you with a slap in the face. All our social interactions require us to anticipate another person's undecided intentions and actions. Now, researchers have discovered specific brain cells that allow monkeys to do this. It is likely that the cells do the same job in humans. Keren Haroush and Ziv Williams at Harvard Medical School trained monkeys to play a version of the prisoner's dilemma, a game used to study cooperation. The monkeys sat next to each other and decided whether or not to cooperate with their companion by moving a joystick to pick either option. Moving the joystick towards an orange circle meant cooperate; moving it towards a blue triangle meant "not this time". Neither monkey could see the other's face or receive any clues about its planned action. If the monkeys cooperated, both received four drops of juice. If one cooperated and the other decided not to, the one who cooperated received one drop, and the other received six drops of juice. If both declined to work together, they both received two drops of juice. Once both had made their selections, they could see what the other monkey had chosen and hear the amount of juice their companion was enjoying. © Copyright Reed Business Information Ltd.
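
The payoff structure described above is a standard prisoner's dilemma and is easiest to see laid out explicitly. The short Python sketch below simply encodes the juice rewards reported in the article; the action names and helper function are illustrative assumptions, not the researchers' code.

# Minimal sketch of the juice-reward payoff structure described above.
# Action names and the helper function are illustrative, not the authors' code.

PAYOFFS = {
    # (monkey_a_choice, monkey_b_choice): (drops_for_a, drops_for_b)
    ("cooperate", "cooperate"): (4, 4),
    ("cooperate", "defect"):    (1, 6),
    ("defect",    "cooperate"): (6, 1),
    ("defect",    "defect"):    (2, 2),
}

def juice_reward(choice_a, choice_b):
    """Drops of juice each monkey receives for a given pair of choices."""
    return PAYOFFS[(choice_a, choice_b)]

print(juice_reward("cooperate", "defect"))  # (1, 6): the lone cooperator gets the least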

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20627 - Posted: 02.27.2015

By Christie Aschwanden Paul Offit likes to tell a story about how his wife, pediatrician Bonnie Offit, was about to give a child a vaccination when the kid was struck by a seizure. Had she given the injection a minute sooner, Paul Offit says, it would surely have appeared as though the vaccine had caused the seizure and probably no study in the world would have convinced the parent otherwise. (The Offits have such studies at the ready — Paul is the director of the Vaccine Education Center at the Children’s Hospital of Philadelphia and author of “Deadly Choices: How the Anti-Vaccine Movement Threatens Us All.”) Indeed, famous anti-vaxxer Jenny McCarthy has said her son’s autism and seizures are linked to “so many shots” because vaccinations preceded his symptoms. But, as Offit’s story suggests, the fact that a child became sick after a vaccine is not strong evidence that the immunization was to blame. Psychologists have a name for the cognitive bias that makes us prone to assigning a causal relationship to two events simply because they happened one after the other: the “illusion of causality.” A study recently published in the British Journal of Psychology investigates how this illusion influences the way we process new information. Its finding: Causal illusions don’t just cement erroneous ideas in the mind; they can also prevent new information from correcting them. Helena Matute, a psychologist at Deusto University in Bilbao, Spain, and her colleagues enlisted 147 college students to take part in a computer-based task in which they each played a doctor who specializes in a fictitious rare disease and assessed whether new medications could cure it. ©2015 ESPN Internet Ventures.
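
For readers unfamiliar with how such contingency tasks are scored, a standard measure in this literature is the difference between the probability of recovery with and without the treatment (often called delta-P). The sketch below illustrates that generic measure with made-up numbers; it is an assumption-laden illustration, not the analysis used in the study itself.

# Generic sketch of the standard delta-P contingency measure often used to
# score tasks like this one (illustrative; not taken from the study itself).
# A causal illusion is inferred when people rate the drug as effective even
# though delta-P is near zero, e.g. because patients often recover anyway.

def delta_p(recovered_with_drug, trials_with_drug,
            recovered_without_drug, trials_without_drug):
    """P(recovery | drug given) - P(recovery | no drug given)."""
    return (recovered_with_drug / trials_with_drug
            - recovered_without_drug / trials_without_drug)

# Example: recovery rate is 75% whether or not the drug is given.
print(delta_p(30, 40, 15, 20))  # 0.0 -> no genuine causal relation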

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20595 - Posted: 02.19.2015

Tom Stafford Trusting your instincts may help you to make better decisions than thinking hard, a study suggests. It is a common misconception that we know our own minds. As I move around the world, walking and talking, I experience myself thinking thoughts. "What shall I have for lunch?", I ask myself. Or I think, "I wonder why she did that?" and try and figure it out. It is natural to assume that this experience of myself is a complete report of my mind. It is natural, but wrong. There's an under-mind, all psychologists agree – an unconscious which does a lot of the heavy lifting in the process of thinking. If I ask myself what is the capital of France the answer just comes to mind – Paris! If I decide to wiggle my fingers, they move back and forth in a complex pattern that I didn't consciously prepare, but which was delivered for my use by the unconscious. The big debate in psychology is exactly what is done by the unconscious, and what requires conscious thought. Or to use the title of a notable paper on the topic, 'Is the unconscious smart or dumb?' One popular view is that the unconscious can prepare simple stimulus-response actions, deliver basic facts, recognise objects and carry out practised movements. Complex cognition involving planning, logical reasoning and combining ideas, on the other hand, requires conscious thought. A recent experiment by a team from Israel scores points against this position. Ran Hassin and colleagues used a neat visual trick called Continuous Flash Suppression to put information into participants’ minds without them becoming consciously aware of it.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20594 - Posted: 02.19.2015

By Virginia Morell To prevent their hives from being attacked by invaders, wasps must quickly distinguish friend from foe. They typically do this by sniffing out foreigners, as outsiders tend to have a different scent than the home colony. Now researchers have discovered that, like a few other wasp species, a tiny social wasp (Liostenogaster flavolineata) from Malaysia employs an additional security measure: facial recognition. The wasps’ nests are typically found in large aggregations with as many as 150 built close together, and each colony faces persistent landing attempts by outsiders from these other nests. To find out why and how these wasps employ both vision and scent to determine if an incoming wasp is a comrade, scientists carried out a series of experiments on 50 colonies (see photo above) in the wild. Close to the nests, the researchers dangled lures made of captured and killed wasps. The lures had been given different treatments. For instance, some lures made from nest mates were coated with a foe’s scent, whereas outsiders were painted with the colony’s odor. The wasps, it turns out, pay more attention to facial markings than to scent when faced with a possible intruder, the team reports online today in the Proceedings of the Royal Society B. Indeed, in tests where the wasps could assess both an intruder’s face and scent, they relied solely on facial recognition and immediately attacked those whose faces they didn’t know, ignoring their odor. That’s the safest strategy, the scientists note, because the wasps can recognize another’s face at a distance, but need to actually touch another wasp to detect her scent—not a bad ploy for a tiny-brained insect. © 2015 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 15: Language and Our Divided Brain
Link ID: 20547 - Posted: 02.05.2015

By ERICA GOODE Asked to picture the numbers from one to 10, most people will imagine a straight line with one at the left end and 10 at the right. This “mental number line,” as researchers have termed it, is so pervasive that some scientists have argued that the spatial representation of numbers is hard-wired into the brain, part of a primitive number system that underlies humans’ capacity for higher mathematics. Now a team of Italian researchers has found that newborn chicks, like humans, appear to map numbers spatially, associating smaller amounts with the left side and larger amounts with the right side. The chicks, trained to seek out mealworms behind white plastic panels printed with varying numbers of identical red squares, repeatedly demonstrated a preference for the left when the number of squares was small and for the right when the number was larger. The research, led by Rosa Rugani, a psychologist who at the time was at the University of Padova, will appear in Friday’s issue of the journal Science. In their report, the researchers said the findings supported the idea that the left-right orientation for numbers is innate rather than determined by culture or education — a possibility raised by some studies finding that in Arabic-speaking countries, where letters and numbers are read right to left, the mental number scale was reversed. But the new research, Dr. Rugani and her colleagues wrote, indicates that orienting numbers in space may represent “a universal cognitive strategy available soon after birth.” Tyler Marghetis, a doctoral candidate in psychology at the University of California, San Diego, who has published research on the spatial association of numbers, called the researchers’ studies “very cool.” © 2015 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 15: Language and Our Divided Brain
Link ID: 20538 - Posted: 01.31.2015

Alison Abbott If you have to make a complex decision, will you do a better job if you absorb yourself in, say, a crossword puzzle instead of ruminating about your options? The idea that unconscious thought is sometimes more powerful than conscious thought is attractive, and echoes ideas popularized by books such as writer Malcolm Gladwell’s best-selling Blink. But within the scientific community, ‘unconscious-thought advantage’ (UTA) has been controversial. Now Dutch psychologists have carried out the most rigorous study yet of UTA — and find no evidence for it. Their conclusion, published this week in Judgment and Decision Making, is based on a large experiment that they designed to provide the best chance of capturing the effect should it exist, along with a sophisticated statistical analysis of previously published data [1]. The report adds to broader concerns about the quality of psychology studies and to an ongoing controversy about the extent to which unconscious thought in general can influence behaviour. “The bigger debate is about how clever our unconscious is,” says cognitive psychologist David Shanks of University College London. “This carefully constructed paper makes a great contribution.” Shanks published a review last year that questioned research claiming that various unconscious influences, including UTA, affect decision making [2]. © 2015 Nature Publishing Group

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 20528 - Posted: 01.28.2015

By Christof Koch Faces are the glue that holds us together and that gives us our identity. All of us but the visually impaired and blind are experts at recognizing people's identity, gender, age and ethnicity from looking at their faces. First impressions of attractiveness or competence take but a brief glimpse of somebody's face. Newly born infants already tend to fixate on faces. This bias also turns up in art. Paintings and movies are filled with faces staring at the viewer. Who can forget the endless close-ups of the feuding husband and wife in Ingmar Bergman's Cimmerian masterpiece Scenes from a Marriage? Because recognizing a face is so vital to our social lives, it comes as no surprise that a lot of real estate in the cerebral cortex—the highly convoluted region that makes up the bulk of our brain—is devoted to the crucial task of processing faces and their identity. We note whether someone looks our way or not. We discern emotional expressions, whether they register joy, fear or anger. Indeed, functional brain imaging has identified a set of adjacent regions, referred to as the fusiform face area (FFA), that are situated on the left and the right sides of the brain, at the bottom of the temporal lobe of the cerebral cortex. The FFA turns up its activity when subjects look at portraits or close-ups of faces or even when they just think about these images. Two just-published studies of the brain's visual networks, including the FFA, enlarge what we know about the physical basis of face perception. Both explore the unique access to the brain afforded by patients whose epileptic seizures have proved resistant to drugs. Surgical treatment requires finding the locations in the brain where the hypersynchronized activity that characterizes a seizure begins, before it spreads from its point of origin to engulf one or sometimes both hemispheres. If a single point—a focus where the seizure begins—can be found, it can be removed. After this procedure, a patient usually has significantly fewer seizures—and some remain seizure-free. To triangulate the location of the focus, neurosurgeons insert electrodes into the brain to monitor electrical activity that occurs during a seizure. © 2015 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20523 - Posted: 01.27.2015

By Stephen L. Macknik and Susana Martinez-Conde To a neuroscientist, the trouble with cocktail parties is not that we do not love cocktails or parties (many neuroscientists do). Instead what we call “the cocktail party problem” is the mystery of how anyone can have a conversation at a cocktail party at all. Consider a typical scene: You have a dozen or more lubricated and temporarily uninhibited adults telling loud, improbable stories at increasing volumes. Interlocutors guffaw and slap backs. Given the decibel level, it is a minor neural miracle that any one of these revelers can hear and parse one word from any other. The alcohol does not help, but it is not the main source of difficulties. The cocktail party problem is that there is just too much going on at once: How can our brain filter out the noise to focus on the wanted information? This problem is a central one for perceptual neuroscience—and not just during cocktail parties. The entire world we live in is quite literally too much to take in. Yet the brain does gather all of this information somehow and sorts it in real time, usually seamlessly and correctly. Whereas the physical reality consists of comparable amounts of signal and noise for many of the sounds and sights around you, your perception is that the conversation or object that interests you remains in clear focus. So how does the brain accomplish this feat? One critical component is that our neural circuits simplify the problem by actively ignoring—suppressing—anything that is not task-relevant. Our brain picks its battles. It stomps out irrelevant information so that the good stuff has a better chance of rising to awareness. This process, colloquially called attention, is how the brain sorts the wheat from the chaff. © 2014 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 20440 - Posted: 12.23.2014

Kate Szell “I once asked Clara who she was. It was so embarrassing, but she’d had a haircut, so how was I to know?” That’s Rachel; she’s 14 and counts Clara as one of her oldest and best friends. There’s nothing wrong with Rachel’s sight, yet she struggles to recognise others. Why? Rachel is face blind. Most of us take for granted the fact that we recognise someone after a quick glance at their face. We don’t realise we’re doing something very different when we look at a face compared with when we look at anything else. To get a feeling of how peculiar facial recognition is, try recognising people by looking at their hands instead of their faces. Tricky? That’s exactly how Rachel feels – only she’s not looking at hands, she’s looking straight into someone’s eyes. Specific areas of the brain process facial information. Damage to those areas gives rise to prosopagnosia or “face blindness”: an inability or difficulty with recognising faces. While brain damage-induced prosopagnosia is rare, prosopagnosia itself is not. Studies suggest around 2% of the population could have some form of prosopagnosia. These “developmental” prosopagnosics seem to be born without the ability to recognise faces and don’t acquire it, relying instead on all manner of cues, from gait to hairstyles, to tell people apart. Kirsten Dalrymple from the University of Minnesota is one of a handful of researchers looking into developmental prosopagnosia. Her particular interest is in prosopagnosic children. “Some seem to cope without much of a problem but, for others, it’s a totally different story,” she says. “They can become very socially withdrawn and can also be at risk of walking off with strangers.” © 2014 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20347 - Posted: 11.24.2014

By Meeri Kim Patients suffering from pagophagia compulsively crave and chomp on ice, even scraping buildup off freezer walls for a fix. The disorder appears to be caused by an iron deficiency, and supplements of the mineral tend to ease the cravings. But what is it about ice that makes it so irresistible? A new study proposes that, like a strong cup of coffee, ice may give those with insufficient iron a much-needed mental boost. Fatigue is the most common symptom of iron-deficiency anemia, which occurs when the body can’t produce enough oxygen-carrying hemoglobin because of low iron. “I had a friend who was suffering from iron-deficiency anemia who was just crunching through massive amounts of ice a day,” said study author Melissa Hunt, a clinical psychologist at the University of Pennsylvania. “She said: ‘It’s like a cup of coffee. I don’t feel awake until I have a cup of ice in my hand.’ ” Hunt and her colleagues had both anemic and healthy subjects complete a standardized, 22-minute attention test commonly used to diagnose attention deficit hyperactivity disorder. Just before the test, participants were given either a cup of ice or lukewarm water to consume. Iron-deficient subjects who had sipped on water performed far more sluggishly on the test than controls, as expected. But those who ate ice beforehand did just as well as their healthy counterparts. For healthy subjects, having a cup of ice instead of water appeared to make no difference in test performance. “It’s not like craving a dessert. It’s more like needing a cup of coffee or that cigarette,” Hunt said.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20296 - Posted: 11.10.2014

By Greg Miller People who’ve stared death in the face and lived to tell about it—mountain climbers who’ve made a harrowing descent, say, or survivors of the World Trade Center attacks—sometimes report that just when their situation seemed impossible, a ghostly presence appeared. People with schizophrenia and certain types of neurological damage sometimes report similar experiences, which scientists call, aptly, “feeling of presence.” Now a team of neuroscientists says it has identified a set of brain regions that seems to be involved in generating this illusion. Better yet, they’ve built a robot that can cause ordinary people to experience it in the lab. The team was led by Olaf Blanke, a neurologist and neuroscientist at the Swiss Federal Institute of Technology in Lausanne. Blanke has a long-standing interest in creepy illusions of bodily perception. Studying these bizarre phenomena, he says, could point to clues about the biology of mental illness and the mechanisms of human consciousness. In 2006, for example, Blanke and colleagues published a paper in Nature that had one of the best titles you’ll ever see in a scientific journal: “Induction of an illusory shadow person.” In that study, they stimulated the brain of a young woman who was awaiting brain surgery for severe epilepsy. Surgeons had implanted electrodes on the surface of her brain to monitor her seizures, and when the researchers passed a mild current through the electrodes, stimulating a small region at the intersection of the temporal and parietal lobes of her brain, she experienced what she described as a shadowy presence lurking nearby, mimicking her own posture. © 2014 Condé Nast.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20290 - Posted: 11.08.2014

by Helen Thomson A MAN with the delusional belief that an impostor has taken his wife's place is helping shed light on how we recognise loved ones. Capgras syndrome is a rare condition in which a person insists that a person they are close to – most commonly a spouse – has been replaced by an impostor. Sometimes they even believe that a much-loved pet has also been replaced by a lookalike. Anecdotal evidence suggests that people with Capgras only misidentify the people that they are closest to. Chris Fiacconi at Western University in London, Ontario, Canada, and his team wanted to explore this. They performed recognition tests and brain scans on two male volunteers with dementia – one who had Capgras, and one who didn't – and compared the results with those of 10 healthy men of a similar age. For months, the man with Capgras believed that his wife had been replaced by an impostor and was resistant to any counterargument, often asking his son why he was so convinced that the woman was his mother. First the team tested whether or not the volunteers could recognise celebrities they would have been familiar with throughout their lifetime, such as Marilyn Monroe. Volunteers were presented with celebrities' names, voices or pictures, and asked if they recognised them and, if so, how much information they could recall about that person. The man with Capgras was more likely to misidentify the celebrities by face or voice compared with the volunteer without Capgras, or the 10 healthy men. None of the volunteers had problems identifying celebrities by name (Frontiers in Human Neuroscience, doi.org/wrw). © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20284 - Posted: 11.06.2014

By Christian Jarrett It feels to me like interest in the brain has exploded. I’ve seen huge investments in brain science by the USA and Europe (the BRAIN Initiative and the Human Brain Project), I’ve read about the rise in media coverage of neuroscience, and above all, I’ve noticed how journalists and bloggers now often frame stories as being about the brain as opposed to the person. Look at these recent headlines: “Why your brain loves storytelling” (Harvard Business Review); “How Netflix is changing our brains” (Forbes); and “Why your brain wants to help one child in need — but not millions” (NPR). There are hundreds more, and in each case, the headline could be about “you” but the writer chooses to make it about “your brain”. Consider too the emergence of new fields such as neuroleadership, neuroaesthetics and neuro-law. It was only a matter of time before someone announced that we’re in the midst of a neurorevolution. In 2009 Zach Lynch did that, publishing his The Neuro Revolution: How Brain Science is Changing Our World. Having said all that, I’m conscious that my own perspective is heavily biased. I earn my living writing about neuroscience and psychology. I’m vigilant for all things brain. Maybe the research investment and brain-obsessed media headlines are largely irrelevant to the general public. I looked into this question recently and was surprised by what I found. There’s not a lot of research but that which exists (such as this, on the teen brain) suggests neuroscience has yet to make an impact on most people’s everyday lives. Indeed, I made Myth #20 in my new book Great Myths of the Brain “Neuroscience is transforming human self-understanding”. WIRED.com © 2014 Condé Nast.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 1: An Introduction to Brain and Behavior
Link ID: 20282 - Posted: 11.06.2014

By C. NATHAN DeWALL How many words does it take to know you’re talking to an adult? In “Peter Pan,” J. M. Barrie needed just five: “Do you believe in fairies?” Such belief requires magical thinking. Children suspend disbelief. They trust that events happen with no physical explanation, and they equate an image of something with its existence. Magical thinking was Peter Pan’s key to eternal youth. The ghouls and goblins that will haunt All Hallows’ Eve on Friday also require people to take a leap of faith. Zombies wreak terror because children believe that the once-dead can reappear. At haunted houses, children dip their hands in buckets of cold noodles and spaghetti sauce. Even if you tell them what they touched, they know they felt guts. And children surmise that with the right Halloween makeup, costume and demeanor, they can frighten even the most skeptical adult. We do grow up. We get jobs. We have children of our own. Along the way, we lose our tendencies toward magical thinking. Or at least we think we do. Several streams of research in psychology, neuroscience and philosophy are converging on an uncomfortable truth: We’re more susceptible to magical thinking than we’d like to admit. Consider the quandary facing college students in a clever demonstration of magical thinking. An experimenter hands you several darts and instructs you to throw them at different pictures. Some depict likable objects (for example, a baby), others are neutral (for example, a face-shaped circle). Would your performance differ if you lobbed darts at a baby? It would. Performance plummeted when people threw the darts at the baby. Laura A. King, the psychologist at the University of Missouri who led this investigation, notes that research participants have a “baseless concern that a picture of an object shares an essential relationship with the object itself.” Paul Rozin, a psychology professor at the University of Pennsylvania, argues that these studies demonstrate the magical law of similarity. Our minds subconsciously associate an image with an object. When something happens to the image, we experience a gut-level intuition that the object has changed as well. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20253 - Posted: 10.28.2014

By GABRIELE OETTINGEN MANY people think that the key to success is to cultivate and doggedly maintain an optimistic outlook. This belief in the power of positive thinking, expressed with varying degrees of sophistication, informs everything from affirmative pop anthems like Katy Perry’s “Roar” to the Mayo Clinic’s suggestion that you may be able to improve your health by eliminating “negative self-talk.” But the truth is that positive thinking often hinders us. More than two decades ago, I conducted a study in which I presented women enrolled in a weight-reduction program with several short, open-ended scenarios about future events — and asked them to imagine how they would fare in each one. Some of these scenarios asked the women to imagine that they had successfully completed the program; others asked them to imagine situations in which they were tempted to cheat on their diets. I then asked the women to rate how positive or negative their resulting thoughts and images were. A year later, I checked in on these women. The results were striking: The more positively women had imagined themselves in these scenarios, the fewer pounds they had lost. My colleagues and I have since performed many follow-up studies, observing a range of people, including children and adults; residents of different countries (the United States and Germany); and people with various kinds of wishes — college students wanting a date, hip-replacement patients hoping to get back on their feet, graduate students looking for a job, schoolchildren wishing to get good grades. In each of these studies, the results have been clear: Fantasizing about happy outcomes — about smoothly attaining your wishes — didn’t help. Indeed, it hindered people from realizing their dreams. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 15: Emotions, Aggression, and Stress; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 11: Emotions, Aggression, and Stress; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 20244 - Posted: 10.27.2014

By KONIKA BANERJEE and PAUL BLOOM ON April 15, 2013, James Costello was cheering on a friend near the finish line at the Boston Marathon when the bombs exploded, severely burning his arms and legs and sending shrapnel into his flesh. During the months of surgery and rehabilitation that followed, Mr. Costello developed a relationship with one of his nurses, Krista D’Agostino, and they soon became engaged. Mr. Costello posted a picture of the ring on Facebook. “I now realize why I was involved in the tragedy,” he wrote. “It was to meet my best friend, and the love of my life.” Mr. Costello is not alone in finding meaning in life events. People regularly do so for both terrible incidents, such as being injured in an explosion, and positive ones, like being cured of a serious disease. As the phrase goes, everything happens for a reason. Where does this belief come from? One theory is that it reflects religious teachings — we think that events have meaning because we believe in a God that plans for us, sends us messages, rewards the good and punishes the bad. But research from the Yale Mind and Development Lab, where we work, suggests that this can’t be the whole story. In one series of studies, recently published in the journal Cognition, we asked people to reflect on significant events from their own lives, such as graduations, the births of children, falling in love, the deaths of loved ones and serious illnesses. Unsurprisingly, a majority of religious believers said they thought that these events happened for a reason and that they had been purposefully designed (presumably by God). But many atheists did so as well, and a majority of atheists in a related study also said that they believed in fate — defined as the view that life events happen for a reason and that there is an underlying order to life that determines how events turn out. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20219 - Posted: 10.20.2014

by Laura Starecheski From the self-affirmations of Stuart Smalley on Saturday Night Live to countless videos on YouTube, saying nice things to your reflection in the mirror is a self-help trope that's been around for decades, and seems most often aimed at women. The practice, we're told, can help us like ourselves and our bodies more, and even make us more successful — allow us to chase our dreams! Impressed, but skeptical, I took this self-talk idea to one of the country's leading researchers on body image to see if it's actually part of clinical practice. David Sarwer is a psychologist and clinical director at the Center for Weight and Eating Disorders at the University of Pennsylvania. He says that, in fact, a mirror is one of the first tools he uses with some new patients. He stands them in front of a mirror and coaches them to use gentler, more neutral language as they evaluate their bodies. "Instead of saying, 'My abdomen is disgusting and grotesque,' " Sarwer explains, he'll prompt a patient to say, " 'My abdomen is round, my abdomen is big; it's bigger than I'd like it to be.' " The goal, he says, is to remove "negative and pejorative terms" from the patient's self-talk. The underlying notion is that it's not enough for a patient to lose physical weight — or gain it, as some women need to — if she doesn't also change the way her body looks in her mind's eye. This may sound weird. You're either a size 4 or a size 8, right? Not mentally, apparently. In a 2013 study from the Netherlands, scientists watched women with anorexia walk through doorways in a lab. The women, they noticed, turned their shoulders and squeezed sideways, even when they had plenty of room. © 2014 NPR

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 20178 - Posted: 10.08.2014

By ROBERT KOLKER Reggie Shaw is the man responsible for the most moving portion of “From One Second to the Next,” the director Werner Herzog’s excruciating (even by Werner Herzog standards) 35-minute public service announcement, released last year as part of AT&T’s “It Can Wait” campaign against texting and driving. In the film, Shaw, now in his 20s, recounts the rainy morning in September 2006 that he crossed the line of a Utah highway, knocking into a car containing two scientists, James Furfaro and Keith O’Dell, who were heading to work nearby. Both men were killed. Shaw says he was texting a girlfriend at the time, adding in unmistakable anguish that he can’t even remember what he was texting about. He is next seen taking part in something almost inconceivable: He enters the scene where one of the dead men’s daughters is being interviewed, and receives from that woman a warm, earnest, tearful, cathartic hug. Reggie Shaw’s redemptive journey — from thoughtless, inadvertent killer to denier of his own culpability to one of the nation’s most powerful spokesmen on the dangers of texting while behind the wheel — was first brought to national attention by Matt Richtel, a reporter for The New York Times, whose series of articles about distracted driving won a Pulitzer Prize in 2010. Now, five years later, in “A Deadly Wandering,” Richtel gives Shaw’s story the thorough, emotional treatment it is due, interweaving a detailed chronicle of the science behind distracted driving. As an instructive social parable, Richtel’s densely reported, at times forced yet compassionate and persuasive book deserves a spot next to “Fast Food Nation” and “To Kill a Mockingbird” in America’s high school curriculums. To say it may save lives is self-evident. What makes the deaths in this book so affecting is how ordinary they are. Two men get up in the morning. They get behind the wheel. A stranger loses track of his car. They crash. The two men die. The temptation is to make the tragedy bigger than it is, to invest it with meaning. Which may explain why Richtel wonders early on if Reggie Shaw lied about texting and driving at first because he was in denial, or because technology “can hijack the brain,” polluting his memory. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20124 - Posted: 09.27.2014

By Melissa Dahl If you are the sort of person who has a hard time just watching TV — if you’ve got to be simultaneously using your iPad or laptop or smartphone — here’s some bad news. New research shows a link between juggling multiple digital devices and a lower-than-usual amount of gray matter, the stuff that’s made up of brain cells, in the region of the brain associated with cognitive and emotional control. More details, via the press release: The researchers at the University of Sussex's Sackler Centre for Consciousness used functional magnetic resonance imaging (fMRI) to look at the brain structures of 75 adults, who had all answered a questionnaire regarding their use and consumption of media devices, including mobile phones and computers, as well as television and print media. They found that, independent of individual personality traits, people who used a higher number of media devices concurrently also had smaller grey matter density in the part of the brain known as the anterior cingulate cortex (ACC), the region notably responsible for cognitive and emotional control functions. But a predilection for using several devices at once isn’t necessarily causing a decrease in gray matter, the authors note — this is a purely correlational finding. As Earl Miller, a neuroscientist at MIT who was not involved in this research, wrote in an email, “It could be (in fact, is possibly more likely) that the relationship is the other way around.” In other words, the people who are least content using just one device at a time may have less gray matter in the first place.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 20123 - Posted: 09.27.2014

by Helen Thomson My, what big eyes you have – you must be trying really hard. A study of how pupils dilate with physical effort could allow us to make strenuous tasks seem easier by zapping specific areas of the brain. We know pupils dilate with mental effort, for example when we think about a difficult maths problem. To see if this was also true of physical exertion, Alexandre Zenon at the Catholic University of Louvain in Belgium measured the pupils of 18 volunteers as they squeezed a device that reads grip strength. Sure enough, the more force they exerted, the larger their pupils. To see whether pupil size was related to actual or perceived effort, the volunteers were asked to squeeze the device with four different grip strengths. Various tests enabled the researchers to tell how much effort participants felt they used, from none at all to the most effort possible. Comparing the results from both sets of experiments suggested that pupil dilation correlated more closely with perceived effort than with actual effort. The fact that both mental effort and perceived physical effort are reflected in pupil size suggests there is a common representation of effort in the brain, says Zenon. To see where in the brain this might be, the team looked at which areas were active while similar grip tasks were being performed. Zenon says they were able to identify areas within the supplementary motor cortex – which plays a role in movement – associated with how effortful a task is perceived to be. © Copyright Reed Business Information Ltd.
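
The key comparison in the study is between two correlations: pupil size against the force actually produced, and pupil size against the effort volunteers reported feeling. The Python sketch below illustrates that comparison with invented numbers; the values and variable names are assumptions for demonstration only, not the authors' data or analysis.

# Illustrative only: invented per-trial values standing in for the study's data.
import numpy as np

actual_force     = np.array([5.0, 10.0, 20.0, 40.0, 10.0, 40.0])  # grip force (% of max)
perceived_effort = np.array([1.0,  3.0,  4.0,  8.0,  4.0,  9.0])  # subjective rating
pupil_size       = np.array([3.1,  3.5,  3.6,  4.2,  3.6,  4.3])  # pupil diameter (mm)

# Pearson correlations of pupil size with each candidate predictor.
r_actual    = np.corrcoef(pupil_size, actual_force)[0, 1]
r_perceived = np.corrcoef(pupil_size, perceived_effort)[0, 1]

print(f"pupil vs actual force:     r = {r_actual:.2f}")
print(f"pupil vs perceived effort: r = {r_perceived:.2f}")
# The finding reported above corresponds to r_perceived exceeding r_actual.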

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 20121 - Posted: 09.27.2014