Links for Keyword: Attention



Links 1–20 of 420

By Meeri Kim Patients suffering from pagophagia compulsively crave and chomp on ice, even scraping buildup off freezer walls for a fix. The disorder appears to be caused by an iron deficiency, and supplements of the mineral tend to ease the cravings. But what is it about ice that makes it so irresistible? A new study proposes that, like a strong cup of coffee, ice may give those with insufficient iron a much-needed mental boost. Fatigue is the most common symptom of iron-deficiency anemia, which occurs when the body can’t produce enough oxygen-carrying hemoglobin because of low iron. “I had a friend who was suffering from iron-deficiency anemia who was just crunching through massive amounts of ice a day,” said study author Melissa Hunt, a clinical psychologist at the University of Pennsylvania. “She said: ‘It’s like a cup of coffee. I don’t feel awake until I have a cup of ice in my hand.’” Hunt and her colleagues had both anemic and healthy subjects complete a standardized, 22-minute attention test commonly used to diagnose attention deficit hyperactivity disorder. Just before the test, participants were given either a cup of ice or lukewarm water to consume. Iron-deficient subjects who had sipped on water performed far more sluggishly on the test than controls, as expected. But those who ate ice beforehand did just as well as their healthy counterparts. For healthy subjects, having a cup of ice instead of water appeared to make no difference in test performance. “It’s not like craving a dessert. It’s more like needing a cup of coffee or that cigarette,” Hunt said.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20296 - Posted: 11.10.2014

By Greg Miller People who’ve stared death in the face and lived to tell about it—mountain climbers who’ve made a harrowing descent, say, or survivors of the World Trade Center attacks—sometimes report that just when their situation seemed impossible, a ghostly presence appeared. People with schizophrenia and certain types of neurological damage sometimes report similar experiences, which scientists call, aptly, “feeling of presence.” Now a team of neuroscientists says it has identified a set of brain regions that seems to be involved in generating this illusion. Better yet, they’ve built a robot that can cause ordinary people to experience it in the lab. The team was led by Olaf Blanke, a neurologist and neuroscientist at the Swiss Federal Institute of Technology in Lausanne. Blanke has a long-standing interest in creepy illusions of bodily perception. Studying these bizarre phenomena, he says, could point to clues about the biology of mental illness and the mechanisms of human consciousness. In 2006, for example, Blanke and colleagues published a paper in Nature that had one of the best titles you’ll ever see in a scientific journal: “Induction of an illusory shadow person.” In that study, they stimulated the brain of a young woman who was awaiting brain surgery for severe epilepsy. Surgeons had implanted electrodes on the surface of her brain to monitor her seizures, and when the researchers passed a mild current through the electrodes, stimulating a small region at the intersection of the temporal and parietal lobes of her brain, she experienced what she described as a shadowy presence lurking nearby, mimicking her own posture. © 2014 Condé Nast.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20290 - Posted: 11.08.2014

by Helen Thomson A MAN with the delusional belief that an impostor has taken his wife's place is helping shed light on how we recognise loved ones. Capgras syndrome is a rare condition in which a person insists that someone they are close to – most commonly a spouse – has been replaced by an impostor. Sometimes they even believe that a much-loved pet has been replaced by a lookalike. Anecdotal evidence suggests that people with Capgras only misidentify the people they are closest to. Chris Fiacconi at Western University in London, Ontario, Canada, and his team wanted to explore this. They performed recognition tests and brain scans on two male volunteers with dementia – one who had Capgras, and one who didn't – and compared the results with those of 10 healthy men of a similar age. For months, the man with Capgras believed that his wife had been replaced by an impostor and was resistant to any counterargument, often asking his son why he was so convinced that the woman was his mother. First the team tested whether or not the volunteers could recognise celebrities they would have been familiar with throughout their lifetime, such as Marilyn Monroe. Volunteers were presented with celebrities' names, voices or pictures, and asked if they recognised them and, if so, how much information they could recall about that person. The man with Capgras was more likely to misidentify the celebrities by face or voice compared with the volunteer without Capgras or the 10 healthy men. None of the volunteers had problems identifying celebrities by name (Frontiers in Human Neuroscience, doi.org/wrw). © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20284 - Posted: 11.06.2014

By Christian Jarrett It feels to me like interest in the brain has exploded. I’ve seen huge investments in brain science by the USA and Europe (the BRAIN Initiative and the Human Brain Project), I’ve read about the rise in media coverage of neuroscience, and above all, I’ve noticed how journalists and bloggers now often frame stories as being about the brain as opposed to the person. Look at these recent headlines: “Why your brain loves storytelling” (Harvard Business Review); “How Netflix is changing our brains” (Forbes); and “Why your brain wants to help one child in need — but not millions” (NPR). There are hundreds more, and in each case, the headline could be about “you” but the writer chooses to make it about “your brain”. Consider too the emergence of new fields such as neuroleadership, neuroaesthetics and neuro-law. It was only a matter of time before someone announced that we’re in the midst of a neurorevolution. In 2009 Zach Lynch did just that, publishing The Neuro Revolution: How Brain Science Is Changing Our World. Having said all that, I’m conscious that my own perspective is heavily biased. I earn my living writing about neuroscience and psychology. I’m vigilant for all things brain. Maybe the research investment and brain-obsessed media headlines are largely irrelevant to the general public. I looked into this question recently and was surprised by what I found. There’s not a lot of research, but that which exists (such as one study on the teen brain) suggests neuroscience has yet to make an impact on most people’s everyday lives. Indeed, I made Myth #20 in my new book Great Myths of the Brain: “Neuroscience is transforming human self-understanding”. WIRED.com © 2014 Condé Nast.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 1: An Introduction to Brain and Behavior
Link ID: 20282 - Posted: 11.06.2014

By C. NATHAN DeWALL How many words does it take to know you’re talking to an adult? In “Peter Pan,” J. M. Barrie needed just five: “Do you believe in fairies?” Such belief requires magical thinking. Children suspend disbelief. They trust that events happen with no physical explanation, and they equate an image of something with its existence. Magical thinking was Peter Pan’s key to eternal youth. The ghouls and goblins that will haunt All Hallows’ Eve on Friday also require people to take a leap of faith. Zombies wreak terror because children believe that the once-dead can reappear. At haunted houses, children dip their hands in buckets of cold noodles and spaghetti sauce. Even if you tell them what they touched, they know they felt guts. And children surmise that with the right Halloween makeup, costume and demeanor, they can frighten even the most skeptical adult. We do grow up. We get jobs. We have children of our own. Along the way, we lose our tendencies toward magical thinking. Or at least we think we do. Several streams of research in psychology, neuroscience and philosophy are converging on an uncomfortable truth: We’re more susceptible to magical thinking than we’d like to admit. Consider the quandary facing college students in a clever demonstration of magical thinking. An experimenter hands you several darts and instructs you to throw them at different pictures. Some depict likable objects (for example, a baby); others are neutral (for example, a face-shaped circle). Would your performance differ if you lobbed darts at a baby? It would. Performance plummeted when people threw the darts at the baby. Laura A. King, the psychologist at the University of Missouri who led this investigation, notes that research participants have a “baseless concern that a picture of an object shares an essential relationship with the object itself.” Paul Rozin, a psychology professor at the University of Pennsylvania, argues that these studies demonstrate the magical law of similarity. Our minds subconsciously associate an image with an object. When something happens to the image, we experience a gut-level intuition that the object has changed as well. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20253 - Posted: 10.28.2014

By GABRIELE OETTINGEN MANY people think that the key to success is to cultivate and doggedly maintain an optimistic outlook. This belief in the power of positive thinking, expressed with varying degrees of sophistication, informs everything from affirmative pop anthems like Katy Perry’s “Roar” to the Mayo Clinic’s suggestion that you may be able to improve your health by eliminating “negative self-talk.” But the truth is that positive thinking often hinders us. More than two decades ago, I conducted a study in which I presented women enrolled in a weight-reduction program with several short, open-ended scenarios about future events — and asked them to imagine how they would fare in each one. Some of these scenarios asked the women to imagine that they had successfully completed the program; others asked them to imagine situations in which they were tempted to cheat on their diets. I then asked the women to rate how positive or negative their resulting thoughts and images were. A year later, I checked in on these women. The results were striking: The more positively women had imagined themselves in these scenarios, the fewer pounds they had lost. My colleagues and I have since performed many follow-up studies, observing a range of people, including children and adults; residents of different countries (the United States and Germany); and people with various kinds of wishes — college students wanting a date, hip-replacement patients hoping to get back on their feet, graduate students looking for a job, schoolchildren wishing to get good grades. In each of these studies, the results have been clear: Fantasizing about happy outcomes — about smoothly attaining your wishes — didn’t help. Indeed, it hindered people from realizing their dreams. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 15: Emotions, Aggression, and Stress; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 11: Emotions, Aggression, and Stress; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 20244 - Posted: 10.27.2014

By KONIKA BANERJEE and PAUL BLOOM ON April 15, 2013, James Costello was cheering on a friend near the finish line at the Boston Marathon when the bombs exploded, severely burning his arms and legs and sending shrapnel into his flesh. During the months of surgery and rehabilitation that followed, Mr. Costello developed a relationship with one of his nurses, Krista D’Agostino, and they soon became engaged. Mr. Costello posted a picture of the ring on Facebook. “I now realize why I was involved in the tragedy,” he wrote. “It was to meet my best friend, and the love of my life.” Mr. Costello is not alone in finding meaning in life events. People regularly do so for both terrible incidents, such as being injured in an explosion, and positive ones, like being cured of a serious disease. As the phrase goes, everything happens for a reason. Where does this belief come from? One theory is that it reflects religious teachings — we think that events have meaning because we believe in a God that plans for us, sends us messages, rewards the good and punishes the bad. But research from the Yale Mind and Development Lab, where we work, suggests that this can’t be the whole story. In one series of studies, recently published in the journal Cognition, we asked people to reflect on significant events from their own lives, such as graduations, the births of children, falling in love, the deaths of loved ones and serious illnesses. Unsurprisingly, a majority of religious believers said they thought that these events happened for a reason and that they had been purposefully designed (presumably by God). But many atheists did so as well, and a majority of atheists in a related study also said that they believed in fate — defined as the view that life events happen for a reason and that there is an underlying order to life that determines how events turn out. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20219 - Posted: 10.20.2014

by Laura Starecheski From the self-affirmations of Stuart Smalley on Saturday Night Live to countless videos on YouTube, saying nice things to your reflection in the mirror is a self-help trope that's been around for decades, and seems most often aimed at women. The practice, we're told, can help us like ourselves and our bodies more, and even make us more successful — allow us to chase our dreams! Impressed, but skeptical, I took this self-talk idea to one of the country's leading researchers on body image to see if it's actually part of clinical practice. David Sarwer is a psychologist and clinical director at the Center for Weight and Eating Disorders at the University of Pennsylvania. He says that, in fact, a mirror is one of the first tools he uses with some new patients. He stands them in front of a mirror and coaches them to use gentler, more neutral language as they evaluate their bodies. "Instead of saying, 'My abdomen is disgusting and grotesque,' " Sarwer explains, he'll prompt a patient to say, " 'My abdomen is round, my abdomen is big; it's bigger than I'd like it to be.' " The goal, he says, is to remove "negative and pejorative terms" from the patient's self-talk. The underlying notion is that it's not enough for a patient to lose physical weight — or gain it, as some women need to — if she doesn't also change the way her body looks in her mind's eye. This may sound weird. You're either a size 4 or a size 8, right? Not mentally, apparently. In a 2013 study from the Netherlands, scientists watched women with anorexia walk through doorways in a lab. The women, they noticed, turned their shoulders and squeezed sideways, even when they had plenty of room. © 2014 NPR

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 20178 - Posted: 10.08.2014

By ROBERT KOLKER Reggie Shaw is the man responsible for the most moving portion of “From One Second to the Next,” the director Werner Herzog’s excruciating (even by Werner Herzog standards) 35-minute public service announcement, released last year as part of AT&T’s “It Can Wait” campaign against texting and driving. In the film, Shaw, now in his 20s, recounts the rainy morning in September 2006 that he crossed the line of a Utah highway, knocking into a car containing two scientists, James Furfaro and Keith O’Dell, who were heading to work nearby. Both men were killed. Shaw says he was texting a girlfriend at the time, adding in unmistakable anguish that he can’t even remember what he was texting about. He is next seen taking part in something almost inconceivable: He enters the scene where one of the dead men’s daughters is being interviewed, and receives from that woman a warm, earnest, tearful, cathartic hug. Reggie Shaw’s redemptive journey — from thoughtless, inadvertent killer to denier of his own culpability to one of the nation’s most powerful spokesmen on the dangers of texting while behind the wheel — was first brought to national attention by Matt Richtel, a reporter for The New York Times, whose series of articles about distracted driving won a Pulitzer Prize in 2010. Now, five years later, in “A Deadly Wandering,” Richtel gives Shaw’s story the thorough, emotional treatment it is due, interweaving a detailed chronicle of the science behind distracted driving. As an instructive social parable, Richtel’s densely reported, at times forced yet compassionate and persuasive book deserves a spot next to “Fast Food Nation” and “To Kill a Mockingbird” in America’s high school curriculums. To say it may save lives is self-evident. What makes the deaths in this book so affecting is how ordinary they are. Two men get up in the morning. They get behind the wheel. A stranger loses track of his car. They crash. The two men die. The temptation is to make the tragedy bigger than it is, to invest it with meaning. Which may explain why Richtel wonders early on if Reggie Shaw lied about texting and driving at first because he was in denial, or because technology “can hijack the brain,” polluting his memory. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20124 - Posted: 09.27.2014

By Melissa Dahl If you are the sort of person who has a hard time just watching TV — if you’ve got to be simultaneously using your iPad or laptop or smartphone — here’s some bad news. New research shows a link between juggling multiple digital devices and a lower-than-usual amount of gray matter, the stuff that’s made up of brain cells, in the region of the brain associated with cognitive and emotional control. More details, via the press release: The researchers at the University of Sussex’s Sackler Centre for Consciousness Science used functional magnetic resonance imaging (fMRI) to look at the brain structures of 75 adults, who had all answered a questionnaire regarding their use and consumption of media devices, including mobile phones and computers, as well as television and print media. They found that, independent of individual personality traits, people who used a higher number of media devices concurrently also had lower grey matter density in the part of the brain known as the anterior cingulate cortex (ACC), the region notably responsible for cognitive and emotional control functions. But a predilection for using several devices at once isn’t necessarily causing a decrease in gray matter, the authors note — this is a purely correlational finding. As Earl Miller, a neuroscientist at MIT who was not involved in this research, wrote in an email, “It could be (in fact, is possibly more likely) that the relationship is the other way around.” In other words, the people who are least content using just one device at a time may have less gray matter in the first place.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 15: Language and Our Divided Brain; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 20123 - Posted: 09.27.2014

by Helen Thomson My, what big eyes you have – you must be trying really hard. A study of how pupils dilate with physical effort could allow us to make strenuous tasks seem easier by zapping specific areas of the brain. We know pupils dilate with mental effort, when we think about a difficult maths problem, for example. To see if this was also true of physical exertion, Alexandre Zenon at the Catholic University of Louvain in Belgium measured the pupils of 18 volunteers as they squeezed a device which reads grip strength. Sure enough, the more force they exerted, the larger their pupils. To see whether pupil size was related to actual or perceived effort, the volunteers were asked to squeeze the device with four different grip strengths. Various tests enabled the researchers to tell how much effort participants felt they used, from none at all to the most effort possible. Comparing the results from both sets of experiments suggested that pupil dilation correlated more closely with perceived effort than actual effort. The fact that both mental effort and perceived physical effort are reflected in pupil size suggests there is a common representation of effort in the brain, says Zenon. To see where in the brain this might be, the team looked at which areas were active while similar grip tasks were being performed. Zenon says they were able to identify areas within the supplementary motor cortex – which plays a role in movement – associated with how effortful a task is perceived to be. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 20121 - Posted: 09.27.2014

Some people don't just work — they text, Snapchat, check Facebook and Tinder, listen to music and work. And a new study reveals those multitaskers have brains that look different from those of people who stick to one task. Researchers at the University of Sussex scanned the brains of 75 adults using fMRI to examine their gray matter. Those who admitted to multitasking with a variety of electronic devices at once had less dense gray matter in their anterior cingulate cortexes (ACC). This region controls executive functions such as working memory, reasoning, planning and execution. There is no way of knowing if people with smaller anterior cingulate cortexes are more likely to multitask or if multitaskers are shrinking their gray matter. It could even show that our brains become more efficient from multitasking, said Dr. Gary Small, director of UCLA’s Memory and Aging Research Center at the Semel Institute for Neuroscience and Human Behavior, who was not involved in the study. “When you exercise the brain … it becomes effective at performing a mental task,” he said. While previous research has shown that multitasking leads to more mistakes, Small said such research remains important to our understanding of something we’re all guilty of doing.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20115 - Posted: 09.25.2014

By Katy Waldman In the opening chapter of Book 1 of My Struggle, by Karl Ove Knausgaard, the 8-year-old narrator sees a ghost in the waves. He is watching a televised report of a rescue effort at sea—“the sky is overcast, the gray-green swell heavy but calm”—when suddenly, on the surface of the water, “the outline of a face emerges.” We might guess from this anecdote that Karl, our protagonist, is both creative and troubled. His limber mind discerns patterns in chaos, but the patterns are illusions. “The lunatic, the lover, and the poet,” Shakespeare wrote, “have such seething brains, such shaping fantasies.” Their imaginations give “to airy nothing a local habitation and a name.” A seething brain can be a great asset for an artist, but, like Knausgaard’s churning, gray-green swell, it can be dangerous too. Inspired metaphors, paranormal beliefs, conspiracy theories, and delusional episodes may all exist on a single spectrum, recent research suggests. The name for the concept that links them is apophenia. A German scientist named Klaus Conrad coined apophanie (from the Greek apo, away, and phaenein, to show) in 1958. He was describing the acute stage of schizophrenia, during which unrelated details seem saturated in connections and meaning. Unlike an epiphany—a true intuition of the world’s interconnectedness—an apophany is a false realization. Swiss psychologist Peter Brugger introduced the term into English when he penned a chapter in a 2001 book on hauntings and poltergeists. Apophenia, he said, was a weakness of human cognition: the “pervasive tendency … to see order in random configurations,” an “unmotivated seeing of connections,” the experience of “delusion as revelation.” On the phone he unveiled his favorite formulation yet: “the tendency to be overwhelmed by meaningful coincidences.” © 2014 The Slate Group LLC.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20088 - Posted: 09.18.2014

By Jena McGregor We've all heard the conventional wisdom for better managing our time and organizing our professional and personal lives. Don't try to multitask. Turn the email and Facebook alerts off to help stay focused. Make separate to-do lists for tasks that require a few minutes, a few hours and long-term planning. But what's grounded in real evidence and what's not? In his new book The Organized Mind, Daniel Levitin — a McGill University professor of psychology and behavioral neuroscience — explores how having a basic understanding of the way the brain works can help us think about organizing our homes, our businesses, our time and even our schools in an age of information overload. We spoke with Levitin about why multi-tasking never works, what images of good leaders' brains actually look like, and why email and Twitter are so incredibly addicting. The following transcript of our conversation has been edited for length and clarity. Q. What was your goal in writing this book? A. Neuroscientists have learned a lot in the last 10 or 15 years about how the brain organizes information, and why we pay attention to some things and forget others. But most of this information hasn't trickled down to the average reader. There are a lot of books about how to get organized and a lot of books about how to be better and more productive at business, but I don't know of one that grounds any of these in the science.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 20049 - Posted: 09.09.2014

By Gary Stix A gamma wave is a rapid, electrical oscillation in the brain. A scan of the academic literature shows that gamma waves may be involved with learning, memory and attention—and, when perturbed, may play a part in schizophrenia, epilepsy, Alzheimer’s, autism and ADHD. Quite a list, and one of the reasons that these brainwaves, cycling at 25 to 80 times per second, persist as an object of fascination to neuroscientists. Despite lingering interest, much remains elusive when trying to figure out how gamma waves are produced by specific molecules within neurons—and what the oscillations do to facilitate communication along the brain’s trillions and trillions of connections. A group of researchers at the Salk Institute in La Jolla, California, has looked beyond the preeminent brain cell—the neuron—to achieve new insights about gamma waves. At one time, neuroscience textbooks depicted astrocytes as a kind of pit crew for neurons, providing metabolic support and other functions for the brain’s rapid-firing information-processing components. In recent years, that picture has changed as new studies have found that astrocytes, like neurons, also have an alternate identity as information processors. This research demonstrates astrocytes’ ability to spritz chemicals known as neurotransmitters that communicate with other brain cells. Given that both neurons and astrocytes perform some of the same functions, it has been difficult to tease out what specifically astrocytes are up to. Hard evidence for what these nominal cellular support players might contribute in forming memories or focusing attention has been lacking. © 2014 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 19939 - Posted: 08.12.2014

By DANIEL J. LEVITIN THIS month, many Americans will take time off from work to go on vacation, catch up on household projects and simply be with family and friends. And many of us will feel guilty for doing so. We will worry about all of the emails piling up at work, and in many cases continue to compulsively check email during our precious time off. But beware the false break. Make sure you have a real one. The summer vacation is more than a quaint tradition. Along with family time, mealtime and weekends, it is an important way that we can make the most of our beautiful brains. Every day we’re assaulted with facts, pseudofacts, news feeds and jibber-jabber, coming from all directions. According to a 2011 study, on a typical day, we take in the equivalent of about 174 newspapers’ worth of information, five times as much as we did in 1986. As the world’s 21,274 television stations produce some 85,000 hours of original programming every day (by 2003 figures), we watch an average of five hours of television per day. For every hour of YouTube video you watch, there are 5,999 hours of new video just posted! If you’re feeling overwhelmed, there’s a reason: The processing capacity of the conscious mind is limited. This is a result of how the brain’s attentional system evolved. Our brains have two dominant modes of attention: the task-positive network and the task-negative network (they’re called networks because they comprise distributed networks of neurons, like electrical circuits within the brain). The task-positive network is active when you’re actively engaged in a task, focused on it, and undistracted; neuroscientists have taken to calling it the central executive. The task-negative network is active when your mind is wandering; this is the daydreaming mode. These two attentional networks operate like a seesaw in the brain: when one is active the other is not. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 19936 - Posted: 08.11.2014

Ian Sample, science correspondent The human brain can judge the apparent trustworthiness of a face from a glimpse so fleeting, the person has no idea they have seen it, scientists claim. Researchers in the US found that brain activity changed in response to how trustworthy a face appeared to be when the face in question had not been consciously perceived. Scientists made the surprise discovery during a series of experiments that were designed to shed light on the neural processes that underpin the snap judgments people make about others. The findings suggest that parts of our brains are doing more complex subconscious processing of the outside world than many researchers thought. Jonathan Freeman at New York University said the results built on previous work that shows “we form spontaneous judgments of other people that can be largely outside awareness.” The study focused on the activity of the amygdala, a small almond-shaped region deep inside the brain. The amygdala is intimately involved with processing strong emotions, such as fear. Its central nucleus sends out the signals responsible for the famous and evolutionarily crucial “fight-or-flight” response. Prior to the study, Freeman asked a group of volunteers to rate the trustworthiness of a series of faces. People tend to agree when they rank trustworthiness – faces with several key features, such as more furrowed brows and shallower cheekbones, are consistently rated as less trustworthy. Freeman then invited a different group of people to take part in the experiments. Each lay in an MRI scanner while images of faces flashed up on a screen before them. Each trustworthy or untrustworthy face flashed up for a matter of milliseconds. Though their eyes had glimpsed the images, the participants were not aware they had seen the faces. © 2014 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19924 - Posted: 08.07.2014

David Robson It’s not often that you look at your meal to find it staring back at you. But when Diane Duyser picked up her cheese toastie, she was in for a shock. “I went to take a bite out of it, and then I saw this lady looking back at me,” she told the Chicago Tribune. “It scared me at first.” As word got around, it soon began to spark more attention, and eventually a casino paid Duyser $28,000 to exhibit the toasted sandwich. For many, the woman’s soft, full features and serene expression recall famous depictions of the Virgin Mary. But I’ve always thought the curled hair, parted lips and heavy eyelids evoke a more modern idol. Whichever Madonna you think you can see, she joins good company; Jesus has also been seen in toast, as well as a taco, a pancake and a banana peel, while Buzzfeed recently ran photos of peppers that look like British politicians. “If someone reports seeing Jesus in a piece of toast, you’d think they must be nuts,” says Kang Lee, at the University of Toronto, Canada. “But it’s very pervasive... We are primed to see faces in every corner of the visual world.” Lee has shown that rather than being a result of divine intervention, these experiences reflect the powerful influence of our imagination over our perception. Indeed, his explanation may mean that you never trust your eyes again. Pareidolia, as this experience is known, is by no means a recent phenomenon. Leonardo da Vinci described seeing characters in natural markings on stone walls, which he believed could help inspire his artworks. In the 1950s, the Bank of Canada had to withdraw a series of banknotes because a grinning devil leapt from the random curls of the Queen’s hair (although I can’t, for the life of me, see the merest hint of a horn in Her Majesty’s locks). The Viking 1 spacecraft, meanwhile, appeared to photograph a carved face in the rocky landscape of Mars. BBC © 2014

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19912 - Posted: 08.02.2014

By Nathan Collins Time, space and social relationships share a common language of distance: we speak of faraway places, close friends and the remote past. Maybe that is because all three share common patterns of brain activity, according to a January study in the Journal of Neuroscience. Curious to understand why the distance metaphor works across conceptual domains, Dartmouth College psychologists used functional MRI scans to analyze the brains of 15 people as they viewed pictures of household objects taken at near or far distances, looked at photographs of friends or acquaintances, and read phrases such as “in a few seconds” or “a year from now.” Patterns of activity in the right inferior parietal lobule, a region thought to handle distance information, robustly predicted whether a participant was thinking about near versus far in any of the categories—indicating that certain aspects of time, space and relationships are all processed in a similar way in the brain. The results, the researchers say, suggest that higher-order brain functions are organized more around computations such as near versus far than conceptual domains such as time or social relationships. © 2014 Scientific American

Related chapters from BP7e: Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 15: Language and Our Divided Brain
Link ID: 19860 - Posted: 07.21.2014

By Ferris Jabr You know the exit is somewhere along this stretch of highway, but you have never taken it before and do not want to miss it. As you carefully scan the side of the road for the exit sign, numerous distractions intrude on your visual field: billboards, a snazzy convertible, a cell phone buzzing on the dashboard. How does your brain focus on the task at hand? To answer this question, neuroscientists generally study the way the brain strengthens its response to what you are looking for—jolting itself with an especially large electrical pulse when you see it. Another mental trick may be just as important, according to a study published in April in the Journal of Neuroscience: the brain deliberately weakens its reaction to everything else so that the target seems more important in comparison. Cognitive neuroscientists John Gaspar and John McDonald, both at Simon Fraser University in British Columbia, arrived at the conclusion after asking 48 college students to take attention tests on a computer. The volunteers had to quickly spot a lone yellow circle among an array of green circles without being distracted by an even more eye-catching red circle. All the while the researchers monitored electrical activity in the students' brains using a net of electrodes attached to their scalps. The recorded patterns revealed that their brains consistently suppressed reactions to all circles except the one they were looking for—the first direct evidence of this particular neural process in action. © 2014 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 19788 - Posted: 07.03.2014