Links for Keyword: Attention
By ERICA GOODE Asked to picture the numbers from one to 10, most people will imagine a straight line with one at the left end and 10 at the right. This “mental number line,” as researchers have termed it, is so pervasive that some scientists have argued that the spatial representation of numbers is hard-wired into the brain, part of a primitive number system that underlies humans’ capacity for higher mathematics. Now a team of Italian researchers has found that newborn chicks, like humans, appear to map numbers spatially, associating smaller amounts with the left side and larger amounts with the right side. The chicks, trained to seek out mealworms behind white plastic panels printed with varying numbers of identical red squares, repeatedly demonstrated a preference for the left when the number of squares was small and for the right when the number was larger. The research, led by Rosa Rugani, a psychologist who at the time was at the University of Padova, will appear in Friday’s issue of the journal Science. In their report, the researchers said the findings supported the idea that the left-right orientation for numbers is innate rather than determined by culture or education — a possibility raised by studies finding that in countries where text is read from right to left, such as Arabic-speaking countries, the mental number scale was reversed. But the new research, Dr. Rugani and her colleagues wrote, indicates that orienting numbers in space may represent “a universal cognitive strategy available soon after birth.” Tyler Marghetis, a doctoral candidate in psychology at the University of California, San Diego, who has published research on the spatial association of numbers, called the researchers’ studies “very cool.” © 2015 The New York Times Company
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 15: Language and Our Divided Brain
Link ID: 20538 - Posted: 01.31.2015
Alison Abbott If you have to make a complex decision, will you do a better job if you absorb yourself in, say, a crossword puzzle instead of ruminating about your options? The idea that unconscious thought is sometimes more powerful than conscious thought is attractive, and echoes ideas popularized by books such as writer Malcolm Gladwell’s best-selling Blink. But within the scientific community, the ‘unconscious-thought advantage’ (UTA) has been controversial. Now Dutch psychologists have carried out the most rigorous study yet of UTA — and find no evidence for it. Their conclusion, published this week in Judgment and Decision Making, is based on a large experiment that they designed to provide the best chance of capturing the effect should it exist, along with a sophisticated statistical analysis of previously published data. The report adds to broader concerns about the quality of psychology studies and to an ongoing controversy about the extent to which unconscious thought in general can influence behaviour. “The bigger debate is about how clever our unconscious is,” says cognitive psychologist David Shanks of University College London. “This carefully constructed paper makes a great contribution.” Shanks published a review last year that questioned research claiming that various unconscious influences, including UTA, affect decision making. © 2015 Nature Publishing Group
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 20528 - Posted: 01.28.2015
|By Christof Koch Faces are the glue that holds us together and that gives us our identity. All of us but the visually impaired and blind are experts at recognizing people's identity, gender, age and ethnicity from looking at their faces. First impressions of attractiveness or competence take but a brief glimpse of somebody's face. Newly born infants already tend to fixate on faces. This bias also turns up in art. Paintings and movies are filled with faces staring at the viewer. Who can forget the endless close-ups of the feuding husband and wife in Ingmar Bergman's Cimmerian masterpiece Scenes from a Marriage? Because recognizing a face is so vital to our social lives, it comes as no surprise that a lot of real estate in the cerebral cortex—the highly convoluted region that makes up the bulk of our brain—is devoted to processing faces and their identity. We note whether someone looks our way or not. We discern emotional expressions, whether they register joy, fear or anger. Indeed, functional brain imaging has identified a set of adjacent regions, referred to as the fusiform face area (FFA), situated on the left and the right sides of the brain, at the bottom of the temporal lobe of the cerebral cortex. The FFA turns up its activity when subjects look at portraits or close-ups of faces or even when they just think about these images. Two just-published studies of the brain's visual networks, including the FFA, enlarge what we know about the physical basis of face perception. Both explore the unique access to the brain afforded by patients whose epileptic seizures have proved resistant to drugs. Surgical treatment depends on finding the location in the brain where the hypersynchronized activity that characterizes a seizure begins, before it spreads from its point of origin to engulf one or sometimes both hemispheres. If a single point—a focus where the seizure begins—can be found, it can be removed.
After this procedure, a patient usually has significantly fewer seizures—and some remain seizure-free. To triangulate the location of the focus, neurosurgeons insert electrodes into the brain to monitor electrical activity that occurs during a seizure. © 2015 Scientific American
|By Stephen L. Macknik and Susana Martinez-Conde To a neuroscientist, the trouble with cocktail parties is not that we do not love cocktails or parties (many neuroscientists do). Instead what we call “the cocktail party problem” is the mystery of how anyone can have a conversation at a cocktail party at all. Consider a typical scene: You have a dozen or more lubricated and temporarily uninhibited adults telling loud, improbable stories at increasing volumes. Interlocutors guffaw and slap backs. Given the decibel level, it is a minor neural miracle that any one of these revelers can hear and parse one word from any other. The alcohol does not help, but it is not the main source of difficulties. The cocktail party problem is that there is just too much going on at once: How can our brain filter out the noise to focus on the wanted information? This problem is a central one for perceptual neuroscience—and not just during cocktail parties. The entire world we live in is quite literally too much to take in. Yet the brain does gather all of this information somehow and sorts it in real time, usually seamlessly and correctly. Whereas the physical reality consists of comparable amounts of signal and noise for many of the sounds and sights around you, your perception is that the conversation or object that interests you remains in clear focus. So how does the brain accomplish this feat? One critical component is that our neural circuits simplify the problem by actively ignoring—suppressing—anything that is not task-relevant. Our brain picks its battles. It stomps out irrelevant information so that the good stuff has a better chance of rising to awareness. This process, colloquially called attention, is how the brain sorts the wheat from the chaff. © 2014 Scientific American
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 20440 - Posted: 12.23.2014
Kate Szell “I once asked Clara who she was. It was so embarrassing, but she’d had a haircut, so how was I to know?” That’s Rachel, she’s 14 and counts Clara as one of her oldest and best friends. There’s nothing wrong with Rachel’s sight, yet she struggles to recognise others. Why? Rachel is face blind. Most of us take for granted the fact that we recognise someone after a quick glance at their face. We don’t realise we’re doing something very different when we look at a face compared with when we look at anything else. To get a feeling of how peculiar facial recognition is, try recognising people by looking at their hands, instead of their faces. Tricky? That’s exactly how Rachel feels – only she’s not looking at hands, she’s looking straight into someone’s eyes. Specific areas of the brain process facial information. Damage to those areas gives rise to prosopagnosia or “face blindness”: an inability or difficulty with recognising faces. While brain damage-induced prosopagnosia is rare, prosopagnosia itself is not. Studies suggest around 2% of the population could have some form of prosopagnosia. These “developmental” prosopagnosics seem to be born without the ability to recognise faces and don’t acquire it, relying instead on all manner of cues, from gait to hairstyles, to tell people apart. Kirsten Dalrymple from the University of Minnesota is one of a handful of researchers looking into developmental prosopagnosia. Her particular interest is in prosopagnosic children. “Some seem to cope without much of a problem but, for others, it’s a totally different story,” she says. “They can become very socially withdrawn and can also be at risk of walking off with strangers.” © 2014 Guardian News and Media Limited
By Meeri Kim Patients suffering from pagophagia compulsively crave and chomp on ice, even scraping buildup off freezer walls for a fix. The disorder appears to be caused by an iron deficiency, and supplements of the mineral tend to ease the cravings. But what is it about ice that makes it so irresistible? A new study proposes that, like a strong cup of coffee, ice may give those with insufficient iron a much-needed mental boost. Fatigue is the most common symptom of iron-deficiency anemia, which occurs when the body can’t produce enough oxygen-carrying hemoglobin because of low iron. “I had a friend who was suffering from iron-deficiency anemia who was just crunching through massive amounts of ice a day,” said study author Melissa Hunt, a clinical psychologist at the University of Pennsylvania. “She said: ‘It’s like a cup of coffee. I don’t feel awake until I have a cup of ice in my hand.’ ” Hunt and her colleagues had both anemic and healthy subjects complete a standardized, 22-minute attention test commonly used to diagnose attention deficit hyperactivity disorder. Just before the test, participants were given either a cup of ice or lukewarm water to consume. Iron-deficient subjects who had sipped on water performed far more sluggishly on the test than controls, as expected. But those who ate ice beforehand did just as well as their healthy counterparts. For healthy subjects, having a cup of ice instead of water appeared to make no difference in test performance. “It’s not like craving a dessert. It’s more like needing a cup of coffee or that cigarette,” Hunt said.
By Greg Miller People who’ve stared death in the face and lived to tell about it—mountain climbers who’ve made a harrowing descent, say, or survivors of the World Trade Center attacks—sometimes report that just when their situation seemed impossible, a ghostly presence appeared. People with schizophrenia and certain types of neurological damage sometimes report similar experiences, which scientists call, aptly, “feeling of presence.” Now a team of neuroscientists says it has identified a set of brain regions that seems to be involved in generating this illusion. Better yet, they’ve built a robot that can cause ordinary people to experience it in the lab. The team was led by Olaf Blanke, a neurologist and neuroscientist at the Swiss Federal Institute of Technology in Lausanne. Blanke has a long-standing interest in creepy illusions of bodily perception. Studying these bizarre phenomena, he says, could point to clues about the biology of mental illness and the mechanisms of human consciousness. In 2006, for example, Blanke and colleagues published a paper in Nature that had one of the best titles you’ll ever see in a scientific journal: “Induction of an illusory shadow person.” In that study, they stimulated the brain of a young woman who was awaiting brain surgery for severe epilepsy. Surgeons had implanted electrodes on the surface of her brain to monitor her seizures, and when the researchers passed a mild current through the electrodes, stimulating a small region at the intersection of the temporal and parietal lobes of her brain, she experienced what she described as a shadowy presence lurking nearby, mimicking her own posture. © 2014 Condé Nast.
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20290 - Posted: 11.08.2014
by Helen Thomson A MAN with the delusional belief that an impostor has taken his wife's place is helping shed light on how we recognise loved ones. Capgras syndrome is a rare condition in which a person insists that someone they are close to – most commonly a spouse – has been replaced by an impostor. Sometimes they even believe that a much-loved pet has been replaced by a lookalike. Anecdotal evidence suggests that people with Capgras only misidentify the people that they are closest to. Chris Fiacconi at Western University in London, Ontario, Canada, and his team wanted to explore this. They performed recognition tests and brain scans on two male volunteers with dementia – one who had Capgras, and one who didn't – and compared the results with those of 10 healthy men of a similar age. For months, the man with Capgras believed that his wife had been replaced by an impostor and was resistant to any counterargument, often asking his son why he was so convinced that the woman was his mother. First the team tested whether or not the volunteers could recognise celebrities they would have been familiar with throughout their lifetime, such as Marilyn Monroe. Volunteers were presented with celebrities' names, voices or pictures, and asked if they recognised them and, if so, how much information they could recall about that person. The man with Capgras was more likely to misidentify the celebrities by face or voice compared with the volunteer without Capgras, or the 10 healthy men. None of the volunteers had problems identifying celebrities by name (Frontiers in Human Neuroscience, doi.org/wrw). © Copyright Reed Business Information Ltd.
By Christian Jarrett It feels to me like interest in the brain has exploded. I’ve seen huge investments in brain science by the USA and Europe (the BRAIN Initiative and the Human Brain Project), I’ve read about the rise in media coverage of neuroscience, and above all, I’ve noticed how journalists and bloggers now often frame stories as being about the brain as opposed to the person. Look at these recent headlines: “Why your brain loves storytelling” (Harvard Business Review); “How Netflix is changing our brains” (Forbes); and “Why your brain wants to help one child in need — but not millions” (NPR). There are hundreds more, and in each case, the headline could be about “you” but the writer chooses to make it about “your brain”. Consider too the emergence of new fields such as neuroleadership, neuroaesthetics and neuro-law. It was only a matter of time before someone announced that we’re in the midst of a neurorevolution. In 2009 Zach Lynch did just that, publishing The Neuro Revolution: How Brain Science is Changing Our World. Having said all that, I’m conscious that my own perspective is heavily biased. I earn my living writing about neuroscience and psychology. I’m vigilant for all things brain. Maybe the research investment and brain-obsessed media headlines are largely irrelevant to the general public. I looked into this question recently and was surprised by what I found. There’s not a lot of research, but what exists (such as this, on the teen brain) suggests neuroscience has yet to make an impact on most people’s everyday lives. Indeed, I made Myth #20 in my new book Great Myths of the Brain: “Neuroscience is transforming human self-understanding”. WIRED.com © 2014 Condé Nast.
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 1: An Introduction to Brain and Behavior
Link ID: 20282 - Posted: 11.06.2014
By C. NATHAN DeWALL How many words does it take to know you’re talking to an adult? In “Peter Pan,” J. M. Barrie needed just five: “Do you believe in fairies?” Such belief requires magical thinking. Children suspend disbelief. They trust that events happen with no physical explanation, and they equate an image of something with its existence. Magical thinking was Peter Pan’s key to eternal youth. The ghouls and goblins that will haunt All Hallows’ Eve on Friday also require people to take a leap of faith. Zombies wreak terror because children believe that the once-dead can reappear. At haunted houses, children dip their hands in buckets of cold noodles and spaghetti sauce. Even if you tell them what they touched, they know they felt guts. And children surmise that with the right Halloween makeup, costume and demeanor, they can frighten even the most skeptical adult. We do grow up. We get jobs. We have children of our own. Along the way, we lose our tendencies toward magical thinking. Or at least we think we do. Several streams of research in psychology, neuroscience and philosophy are converging on an uncomfortable truth: We’re more susceptible to magical thinking than we’d like to admit. Consider the quandary facing college students in a clever demonstration of magical thinking. An experimenter hands you several darts and instructs you to throw them at different pictures. Some depict likable objects (for example, a baby); others are neutral (for example, a face-shaped circle). Would your performance differ if you lobbed darts at a baby? It would. Performance plummeted when people threw the darts at the baby. Laura A. King, the psychologist at the University of Missouri who led this investigation, notes that research participants have a “baseless concern that a picture of an object shares an essential relationship with the object itself.” Paul Rozin, a psychology professor at the University of Pennsylvania, argues that these studies demonstrate the magical law of similarity. Our minds subconsciously associate an image with an object. When something happens to the image, we experience a gut-level intuition that the object has changed as well. © 2014 The New York Times Company
By GABRIELE OETTINGEN MANY people think that the key to success is to cultivate and doggedly maintain an optimistic outlook. This belief in the power of positive thinking, expressed with varying degrees of sophistication, informs everything from affirmative pop anthems like Katy Perry’s “Roar” to the Mayo Clinic’s suggestion that you may be able to improve your health by eliminating “negative self-talk.” But the truth is that positive thinking often hinders us. More than two decades ago, I conducted a study in which I presented women enrolled in a weight-reduction program with several short, open-ended scenarios about future events — and asked them to imagine how they would fare in each one. Some of these scenarios asked the women to imagine that they had successfully completed the program; others asked them to imagine situations in which they were tempted to cheat on their diets. I then asked the women to rate how positive or negative their resulting thoughts and images were. A year later, I checked in on these women. The results were striking: The more positively women had imagined themselves in these scenarios, the fewer pounds they had lost. My colleagues and I have since performed many follow-up studies, observing a range of people, including children and adults; residents of different countries (the United States and Germany); and people with various kinds of wishes — college students wanting a date, hip-replacement patients hoping to get back on their feet, graduate students looking for a job, schoolchildren wishing to get good grades. In each of these studies, the results have been clear: Fantasizing about happy outcomes — about smoothly attaining your wishes — didn’t help. Indeed, it hindered people from realizing their dreams. © 2014 The New York Times Company
Related chapters from BP7e: Chapter 15: Emotions, Aggression, and Stress; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 11: Emotions, Aggression, and Stress; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 20244 - Posted: 10.27.2014
By KONIKA BANERJEE and PAUL BLOOM ON April 15, 2013, James Costello was cheering on a friend near the finish line at the Boston Marathon when the bombs exploded, severely burning his arms and legs and sending shrapnel into his flesh. During the months of surgery and rehabilitation that followed, Mr. Costello developed a relationship with one of his nurses, Krista D’Agostino, and they soon became engaged. Mr. Costello posted a picture of the ring on Facebook. “I now realize why I was involved in the tragedy,” he wrote. “It was to meet my best friend, and the love of my life.” Mr. Costello is not alone in finding meaning in life events. People regularly do so for both terrible incidents, such as being injured in an explosion, and positive ones, like being cured of a serious disease. As the phrase goes, everything happens for a reason. Where does this belief come from? One theory is that it reflects religious teachings — we think that events have meaning because we believe in a God that plans for us, sends us messages, rewards the good and punishes the bad. But research from the Yale Mind and Development Lab, where we work, suggests that this can’t be the whole story. In one series of studies, recently published in the journal Cognition, we asked people to reflect on significant events from their own lives, such as graduations, the births of children, falling in love, the deaths of loved ones and serious illnesses. Unsurprisingly, a majority of religious believers said they thought that these events happened for a reason and that they had been purposefully designed (presumably by God). But many atheists did so as well, and a majority of atheists in a related study also said that they believed in fate — defined as the view that life events happen for a reason and that there is an underlying order to life that determines how events turn out. © 2014 The New York Times Company
by Laura Starecheski From the self-affirmations of Stuart Smalley on Saturday Night Live to countless videos on YouTube, saying nice things to your reflection in the mirror is a self-help trope that's been around for decades, and seems most often aimed at women. The practice, we're told, can help us like ourselves and our bodies more, and even make us more successful — allow us to chase our dreams! Impressed, but skeptical, I took this self-talk idea to one of the country's leading researchers on body image to see if it's actually part of clinical practice. David Sarwer is a psychologist and clinical director at the Center for Weight and Eating Disorders at the University of Pennsylvania. He says that, in fact, a mirror is one of the first tools he uses with some new patients. He stands them in front of a mirror and coaches them to use gentler, more neutral language as they evaluate their bodies. "Instead of saying, 'My abdomen is disgusting and grotesque,' " Sarwer explains, he'll prompt a patient to say, " 'My abdomen is round, my abdomen is big; it's bigger than I'd like it to be.' " The goal, he says, is to remove "negative and pejorative terms" from the patient's self-talk. The underlying notion is that it's not enough for a patient to lose physical weight — or gain it, as some women need to — if she doesn't also change the way her body looks in her mind's eye. This may sound weird. You're either a size 4 or a size 8, right? Not mentally, apparently. In a 2013 study from the Netherlands, scientists watched women with anorexia walk through doorways in a lab. The women, they noticed, turned their shoulders and squeezed sideways, even when they had plenty of room. © 2014 NPR
Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 20178 - Posted: 10.08.2014
By ROBERT KOLKER Reggie Shaw is the man responsible for the most moving portion of “From One Second to the Next,” the director Werner Herzog’s excruciating (even by Werner Herzog standards) 35-minute public service announcement, released last year as part of AT&T’s “It Can Wait” campaign against texting and driving. In the film, Shaw, now in his 20s, recounts the rainy morning in September 2006 that he crossed the line of a Utah highway, knocking into a car containing two scientists, James Furfaro and Keith O’Dell, who were heading to work nearby. Both men were killed. Shaw says he was texting a girlfriend at the time, adding in unmistakable anguish that he can’t even remember what he was texting about. He is next seen taking part in something almost inconceivable: He enters the scene where one of the dead men’s daughters is being interviewed, and receives from that woman a warm, earnest, tearful, cathartic hug. Reggie Shaw’s redemptive journey — from thoughtless, inadvertent killer to denier of his own culpability to one of the nation’s most powerful spokesmen on the dangers of texting while behind the wheel — was first brought to national attention by Matt Richtel, a reporter for The New York Times, whose series of articles about distracted driving won a Pulitzer Prize in 2010. Now, five years later, in “A Deadly Wandering,” Richtel gives Shaw’s story the thorough, emotional treatment it is due, interweaving a detailed chronicle of the science behind distracted driving. As an instructive social parable, Richtel’s densely reported, at times forced yet compassionate and persuasive book deserves a spot next to “Fast Food Nation” and “To Kill a Mockingbird” in America’s high school curriculums. To say it may save lives is self-evident. What makes the deaths in this book so affecting is how ordinary they are. Two men get up in the morning. They get behind the wheel. A stranger loses track of his car. They crash. The two men die. 
The temptation is to make the tragedy bigger than it is, to invest it with meaning. Which may explain why Richtel wonders early on if Reggie Shaw lied about texting and driving at first because he was in denial, or because technology “can hijack the brain,” polluting his memory. © 2014 The New York Times Company
By Melissa Dahl If you are the sort of person who has a hard time just watching TV — if you’ve got to be simultaneously using your iPad or laptop or smartphone — here’s some bad news. New research shows a link between juggling multiple digital devices and a lower-than-usual amount of gray matter, the brain tissue composed largely of neurons’ cell bodies, in the region of the brain associated with cognitive and emotional control. More details, via the press release: The researchers at the University of Sussex's Sackler Centre for Consciousness used functional magnetic resonance imaging (fMRI) to look at the brain structures of 75 adults, who had all answered a questionnaire regarding their use and consumption of media devices, including mobile phones and computers, as well as television and print media. They found that, independent of individual personality traits, people who used a higher number of media devices concurrently also had smaller grey matter density in the part of the brain known as the anterior cingulate cortex (ACC), the region notably responsible for cognitive and emotional control functions. But a predilection for using several devices at once isn’t necessarily causing a decrease in gray matter, the authors note — this is a purely correlational finding. As Earl Miller, a neuroscientist at MIT who was not involved in this research, wrote in an email, “It could be (in fact, is possibly more likely) that the relationship is the other way around.” In other words, the people who are least content using just one device at a time may have less gray matter in the first place.
Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 20123 - Posted: 09.27.2014
by Helen Thomson My, what big eyes you have – you must be trying really hard. A study of how pupils dilate with physical effort could allow us to make strenuous tasks seem easier by zapping specific areas of the brain. We know pupils dilate with mental effort, when we think about a difficult maths problem, for example. To see if this was also true of physical exertion, Alexandre Zenon at the Catholic University of Louvain in Belgium measured the pupils of 18 volunteers as they squeezed a device that reads grip strength. Sure enough, the more force they exerted, the larger their pupils. To see whether pupil size was related to actual or perceived effort, the volunteers were asked to squeeze the device with four different grip strengths. Various tests enabled the researchers to tell how much effort participants felt they used, from none at all to the most effort possible. Comparing the results from both sets of experiments suggested that pupil dilation correlated more closely with perceived effort than actual effort. The fact that both mental effort and perceived physical effort are reflected in pupil size suggests there is a common representation of effort in the brain, says Zenon. To see where in the brain this might be, the team looked at which areas were active while similar grip tasks were being performed. Zenon says they were able to identify areas within the supplementary motor cortex – which plays a role in movement – associated with how effortful a task is perceived to be. © Copyright Reed Business Information Ltd.
Some people don't just work — they text, Snapchat, check Facebook and Tinder, listen to music and work. And a new study reveals those multitaskers have brains that look different from those of people who stick to one task. Researchers at the University of Sussex scanned 75 adults using fMRI to examine their gray matter. Those who admitted to multitasking with a variety of electronic devices at once had less dense gray matter in their anterior cingulate cortexes (ACC). This region controls executive function, such as working memory, reasoning, planning and execution. There is no way of knowing if people with smaller anterior cingulate cortexes are more likely to multitask or if multitaskers are shrinking their gray matter. It could even show that our brains become more efficient from multitasking, said Dr. Gary Small, director of UCLA’s Memory and Aging Research Center at the Semel Institute for Neuroscience and Human Behavior, who was not involved in the study. “When you exercise the brain … it becomes effective at performing a mental task,” he said. While previous research has shown that multitasking leads to more mistakes, Small said such research remains important to our understanding of something we’re all guilty of doing.
By Katy Waldman In the opening chapter of Book 1 of My Struggle, by Karl Ove Knausgaard, the 8-year-old narrator sees a ghost in the waves. He is watching a televised report of a rescue effort at sea—“the sky is overcast, the gray-green swell heavy but calm”—when suddenly, on the surface of the water, “the outline of a face emerges.” We might guess from this anecdote that Karl, our protagonist, is both creative and troubled. His limber mind discerns patterns in chaos, but the patterns are illusions. “The lunatic, the lover, and the poet,” Shakespeare wrote, “have such seething brains, such shaping fantasies.” Their imaginations give “to airy nothing a local habitation and a name.” A seething brain can be a great asset for an artist, but, like Knausgaard’s churning, gray-green swell, it can be dangerous too. Inspired metaphors, paranormal beliefs, conspiracy theories, and delusional episodes may all exist on a single spectrum, recent research suggests. The name for the concept that links them is apophenia. A German scientist named Klaus Conrad coined apophanie (from the Greek apo, away, and phaenein, to show) in 1958. He was describing the acute stage of schizophrenia, during which unrelated details seem saturated in connections and meaning. Unlike an epiphany—a true intuition of the world’s interconnectedness—an apophany is a false realization. Swiss psychologist Peter Brugger introduced the term into English when he penned a chapter in a 2001 book on hauntings and poltergeists. Apophenia, he said, was a weakness of human cognition: the “pervasive tendency … to see order in random configurations,” an “unmotivated seeing of connections,” the experience of “delusion as revelation.” On the phone he unveiled his favorite formulation yet: “the tendency to be overwhelmed by meaningful coincidences.” © 2014 The Slate Group LLC.
By Jena McGregor We've all heard the conventional wisdom for better managing our time and organizing our professional and personal lives. Don't try to multitask. Turn the email and Facebook alerts off to help stay focused. Make separate to-do lists for tasks that require a few minutes, a few hours and long-term planning. But what's grounded in real evidence and what's not? In his new book The Organized Mind, Daniel Levitin — a McGill University professor of psychology and behavioral neuroscience — explores how having a basic understanding of the way the brain works can help us think about organizing our homes, our businesses, our time and even our schools in an age of information overload. We spoke with Levitin about why multitasking never works, what images of good leaders' brains actually look like, and why email and Twitter are so incredibly addictive. The following transcript of our conversation has been edited for length and clarity. Q. What was your goal in writing this book? A. Neuroscientists have learned a lot in the last 10 or 15 years about how the brain organizes information, and why we pay attention to some things and forget others. But most of this information hasn't trickled down to the average reader. There are a lot of books about how to get organized and a lot of books about how to be better and more productive at business, but I don't know of one that grounds any of these in the science.
By Gary Stix A gamma wave is a rapid, electrical oscillation in the brain. A scan of the academic literature shows that gamma waves may be involved with learning, memory and attention — and, when perturbed, may play a part in schizophrenia, epilepsy, Alzheimer's, autism and ADHD. Quite a list, and one of the reasons that these brainwaves, cycling at 25 to 80 times per second, persist as an object of fascination to neuroscientists. Despite this enduring interest, much remains elusive about how gamma waves are produced by specific molecules within neurons — and about what the oscillations do to facilitate communication along the brain's trillions and trillions of connections. A group of researchers at the Salk Institute in La Jolla, California, has looked beyond the preeminent brain cell — the neuron — to achieve new insights about gamma waves. At one time, neuroscience textbooks depicted astrocytes as a kind of pit crew for neurons, providing metabolic support and other functions for the brain's rapid-firing information-processing components. In recent years, that picture has changed as new studies have found that astrocytes, like neurons, also have an alternate identity as information processors. This research demonstrates astrocytes' ability to spritz chemicals known as neurotransmitters that communicate with other brain cells. Given that both neurons and astrocytes perform some of the same functions, it has been difficult to tease out what specifically astrocytes are up to. Hard evidence for what these nominal cellular support players might contribute to forming memories or focusing attention has been lacking. © 2014 Scientific American
Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 19939 - Posted: 08.12.2014