Links for Keyword: Attention



Links 21 - 40 of 517

By Melissa Dahl A rule that spans time and space and morning routines: It is entirely too easy to underestimate the time it takes to get to work. Maybe once — one time — it took just 20 minutes to get to work, but it typically takes 25 to 30, and you know that, but still you leave late and, thus, arrive late. It’s dumb. It is also, maybe, human nature. As Christian Jarrett at BPS Research Digest reports, a team of neuroscientists has just uncovered a very handy if rather complicated excuse for tardiness — it seems people tend to underestimate how long it will take to travel familiar routes. The laws of time and space do not actually bend in order to transport you to work or school more quickly, but at least part of you believes that they will. And yet the oddest part of this new study, published in the journal Hippocampus, is that the participants tended to overestimate the physical length of those routes, even as they underestimated how long it would take to travel them. It does make a certain amount of sense that people would exaggerate the breadth of familiar distances, because of the level of detail about them stored in your memory. If you remember every Starbucks and street corner you pass on the way you usually walk to school, for instance, the walking route will likely feel longer when you recall it than one you don’t know as well. As Jarrett explains, the researchers “thought a more detailed neural representation would make that space seem larger.” And when they asked a group of students — all of whom had been living in the same building in London for nine months — to draw a little map of their neighborhood, this is indeed what they found. The students exaggerated the physical distance of the routes they walked the most, drawing their maps a little bigger than they should have. © 2016, New York Media LLC.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22797 - Posted: 10.28.2016

By Kensy Cooperrider, Rafael Núñez “What is the difference between yesterday and tomorrow?” The Yupno man we were interviewing, Danda, paused to consider his answer. A group of us sat on a hillside in the Yupno Valley, a remote nook high in the mountains of Papua New Guinea. Only days earlier we had arrived on a single-engine plane. After a steep hike from the grass airstrip, we found ourselves in the village of Gua, one of about 20 Yupno villages dotting the rugged terrain. We came all the way here because we are interested in time—in how Yupno people understand concepts such as past, present and future. Are these ideas universal, or are they products of our language, our culture and our environment? As we interviewed Danda and others in the village, we listened to what they said about time, but we paid even closer attention to what they did with their hands as they spoke. Gestures can be revealing. Ask English speakers about the difference between yesterday and tomorrow, and they might thrust a hand over the shoulder when referring to the past and then forward when referring to the future. Such unreflective movements reveal a fundamental way of thinking in which the past is at our backs, something that we “leave behind,” and the future is in front of us, something to “look forward” to. Would a Yupno speaker do the same? Danda was making just the kinds of gestures we were hoping for. As he explained the Yupno word for “yesterday,” his hand swept backward; as he mentioned “tomorrow,” it leaped forward. We all sat looking up a steep slope toward a jagged ridge, but as the light faded, we changed the camera angle, spinning around so that we and Danda faced in the opposite direction, downhill. With our backs now to the ridge, we looked over the Yupno River meandering toward the Bismarck Sea. “Let's go over that one more time,” we suggested. © 2016 Scientific American,

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22778 - Posted: 10.22.2016

By Catherine Caruso Imagine you are faced with the classic thought experiment dilemma: You can take a pile of money now or wait and get an even bigger stash of cash later on. Which option do you choose? Your level of self-control, researchers have found, may have to do with a region of the brain that lets us take the perspective of others—including that of our future self. A study, published today in Science Advances, found that when scientists used noninvasive brain stimulation to disrupt a brain region called the temporoparietal junction (TPJ), people appeared less able to see things from the point of view of their future selves or of another person, and consequently were less likely to share money with others and more inclined to opt for immediate cash instead of waiting for a larger bounty at a later date. The TPJ, which is located where the temporal and parietal lobes meet, plays an important role in social functioning, particularly in our ability to understand situations from the perspectives of other people. However, according to Alexander Soutschek, an economist at the University of Zurich and lead author on the study, previous research on self-control and delayed gratification has focused instead on the prefrontal brain regions involved in impulse control. “When you have a closer look at the literature, you sometimes find in the neuroimaging data that the TPJ is also active during delay of gratification,” Soutschek says, “but it's never interpreted.” © 2016 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22772 - Posted: 10.20.2016

Emily Badger One of the newest chew toys in the presidential campaign is “implicit bias,” a term Mike Pence repeatedly took exception to in the vice-presidential debate on Tuesday. Police officers hear all this badmouthing, said Mr. Pence, Donald J. Trump’s running mate, in response to a question about whether society demands too much of law enforcement. They hear politicians painting them with one broad brush, with disdain, with automatic cries of implicit bias. He criticized Hillary Clinton for saying, in the first presidential debate, that everyone experiences implicit bias. He suggested a black police officer who shoots a black civilian could not logically experience such bias. “Senator, please,” Mr. Pence said, addressing his Democratic opponent, Tim Kaine, “enough of this seeking every opportunity to demean law enforcement broadly by making the accusation of implicit bias every time tragedy occurs.” The concept, in his words, came across as an insult, a put-down on par with branding police as racists. Many Americans may hear it as academic code for “racist.” But that connotation does not line up with scientific research on what implicit bias is and how it really operates. Researchers in this growing field say it isn’t just white police officers, but all of us, who have biases that are subconscious, hidden even to ourselves. Implicit bias is the mind’s way of making uncontrolled and automatic associations between two concepts very quickly. In many forms, implicit bias is a healthy human adaptation — it’s among the mental tools that help you mindlessly navigate your commute each morning. It crops up in contexts far beyond policing and race (if you make the rote assumption that fruit stands have fresher produce, that’s implicit bias). But the same process can also take the form of unconsciously associating certain identities, like African-American, with undesirable attributes, like violence. © 2016 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22730 - Posted: 10.08.2016

André Corrêa d’Almeida and Amanda Sue Grossi Development. Poverty. Africa. These are just three words on a page – almost no information at all – but how many realities did our readers just conjure? And how many thoughts filled the spaces in-between? Cover yourselves. Your biases are showing. In the last few decades, groundbreaking work by psychologists and behavioural economists has exposed unconscious biases in the way we think. And as the World Bank’s 2015 World Development Report points out, development professionals are not immune to these biases. There is a real possibility that seemingly unbiased and well-intentioned development professionals are capable of making consequential mistakes, with significant impacts upon the lives of others, namely the poor. The problem arises when mindsets are just that – set. As the work of Daniel Kahneman and Amos Tversky has shown, development professionals – like people generally – have two systems of thinking: the automatic and the deliberative. For the automatic, instead of performing complex rational calculations every time we need to make a decision, much of our thinking relies on pre-existing mental models and shortcuts. These are based on assumptions we create throughout our lives and that stem from our experiences and education. More often than not, these mental models are incomplete and shortcuts can lead us down the wrong path. Thinking automatically then becomes thinking harmfully. © 2016 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22653 - Posted: 09.15.2016

Laura Sanders By sneakily influencing brain activity, scientists changed people’s opinions of faces. This covert neural sculpting relied on a sophisticated brain training technique in which people learn to direct their thoughts in specific ways. The results, published September 8 in PLOS Biology, support the idea that neurofeedback methods could help reveal how the brain’s behavior gives rise to perceptions and emotions. What’s more, the technique may ultimately prove useful for easing traumatic memories and treating disorders such as depression. The research is still at an early stage, says neurofeedback researcher Michelle Hampson of Yale University, but, she notes, “I think it has great promise.” Takeo Watanabe of Brown University and colleagues used functional MRI to measure people’s brain activity in an area called the cingulate cortex as participants saw pictures of faces. After participants had rated each face, a computer algorithm sorted their brain responses into patterns that corresponded to faces they liked and faces they disliked. With this knowledge in hand, the researchers then attempted to change people’s face preferences by subtly nudging brain activity in the cingulate cortex. In the second step of the experiment, participants returned to the fMRI scanner and saw an image of a face that they had previously rated as neutral. Just after that, they were shown a disk. The goal, the participants were told, was simple: make the disk bigger by using their brains. They had no idea that the only way to make the disk grow was to think in a very particular way. © Society for Science & the Public 2000–2016
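The decoding step described above (sorting fMRI response patterns into "liked" and "disliked" classes, then using the decoder's output to drive a feedback display) is at its core a pattern classification loop. The sketch below illustrates that idea on simulated data. It is a hypothetical toy in Python with NumPy and scikit-learn, not the study's actual pipeline, and every number in it (trial counts, voxel counts, the feedback rule) is invented.

```python
# Illustrative sketch only: simulated "voxel patterns" stand in for
# cingulate-cortex fMRI data, and a simple classifier learns to separate
# patterns recorded while viewing liked vs. disliked faces.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels = 200, 50                       # hypothetical counts

# Simulate patterns: "liked" trials get a small mean shift in some voxels.
labels = rng.integers(0, 2, n_trials)              # 0 = disliked, 1 = liked
signal = np.zeros(n_voxels)
signal[:10] = 0.5                                  # 10 informative voxels
patterns = rng.normal(size=(n_trials, n_voxels)) + np.outer(labels, signal)

clf = LogisticRegression(max_iter=1000)
acc = cross_val_score(clf, patterns, labels, cv=5).mean()
print(f"cross-validated decoding accuracy: {acc:.2f}")  # above 0.5 chance

# In closed-loop neurofeedback, the trained classifier's output on each new
# scan would drive the feedback stimulus (here, the size of the disk).
clf.fit(patterns, labels)
new_scan = rng.normal(size=(1, n_voxels)) + signal  # one fresh trial
disk_size = clf.predict_proba(new_scan)[0, 1]       # probability -> disk size
print(f"disk size this trial: {disk_size:.2f}")
```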

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 22646 - Posted: 09.12.2016

Chris Chambers One of the most compelling impressions in everyday life is that wherever we look, we “see” everything that is happening in front of us – much like a camera. But this impression is deceiving. In reality our senses are bombarded by continual waves of stimuli, triggering an avalanche of sensations that far exceed the brain’s capacity. To make sense of the world, the brain needs to determine which sensations are the most important for our current goals, focusing resources on the ones that matter and throwing away the rest. These computations are astonishingly complex, and what makes attention even more remarkable is just how effortless it is. The mammalian attention system is perhaps the most efficient and precisely tuned junk filter we know of, refined through millions of years of annoying siblings (and some evolution). Attention is amazing but no system is ever perfect. Our brain’s computational reserves are large but not infinite, and under the right conditions we can “break it” and peek behind the curtain. This isn’t just a fun trick – understanding these limits can yield important insights into psychology and neurobiology, helping us to diagnose and treat impairments that follow brain injury and disease. Thanks to over a hundred years of psychology research, it’s relatively easy to reveal attention in action. One way is through the phenomenon of change blindness. Try it yourself by following the instructions in the short video below (no sound). When we think of the term “blindness” we tend to assume a loss of vision caused by damage to the eye or optic nerves. But as you saw in the video, change blindness is completely normal and is caused by maxing out your attentional capacity. © 2016 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 22633 - Posted: 09.06.2016

A new study by investigators at Brigham and Women's Hospital in collaboration with researchers at the Universities of York and Leeds in the UK and MD Anderson Cancer Center in Texas puts to the test anecdotes about experienced radiologists' ability to sense when a mammogram is abnormal. In a paper published August 29 in the Proceedings of the National Academy of Sciences, visual attention researchers showed radiologists mammograms for half a second and found that they could identify abnormal mammograms at better than chance levels. They further tested this ability through a series of experiments to explore what signal may alert radiologists to the presence of a possible abnormality, in the hopes of using these insights to improve breast cancer screening and early detection. "Radiologists can have 'hunches' after a first look at a mammogram. We found that these hunches are based on something real in the images. It's really striking that in the blink of an eye, an expert can pick up on something about that mammogram that indicates abnormality," said Jeremy Wolfe, PhD, senior author of the study and director of the Visual Attention Laboratory at BWH. "Not only that, but they can detect something abnormal in the other breast, the breast that does not contain a lesion." In the clinic, radiologists carefully evaluate mammograms and may use computer automated systems to help screen the images. Although they would never assess an image in half a second in the clinic, the ability of experts to extract the "gist" of an image quickly suggests that there may be detectable signs of breast cancer that radiologists are rapidly picking up. Copyright 2016 ScienceDaily
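For readers wondering what "better than chance" means operationally: vision scientists commonly quantify detection performance with signal detection theory's sensitivity measure d', which compares the hit rate on abnormal images against the false-alarm rate on normal ones. The snippet below is a minimal sketch with made-up numbers, not the study's reported data or its exact analysis.

```python
# Minimal sketch of quantifying above-chance detection with d' (sensitivity).
# The two proportions below are hypothetical, not the study's results.
from statistics import NormalDist

hits = 0.62          # proportion of abnormal mammograms called "abnormal"
false_alarms = 0.38  # proportion of normal mammograms called "abnormal"

z = NormalDist().inv_cdf          # inverse of the standard normal CDF
d_prime = z(hits) - z(false_alarms)
print(f"d' = {d_prime:.2f}")      # 0 means chance; here ~0.61, above chance
```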

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 22627 - Posted: 09.05.2016

By Jessica Hamzelou Feel like you’ve read this before? Most of us have experienced the eerie familiarity of déjà vu, and now the first brain scans of this phenomenon have revealed why – it’s a sign of our brain checking its memory. Déjà vu was thought to be caused by the brain making false memories, but research by Akira O’Connor at the University of St Andrews, UK, and his team now suggests this is wrong. Exactly how déjà vu works has long been a mystery, partly because its fleeting and unpredictable nature makes it difficult to study. To get around this, O’Connor and his colleagues developed a way to trigger the sensation of déjà vu in the lab. The team’s technique uses a standard method to trigger false memories. It involves telling a person a list of related words – such as bed, pillow, night, dream – but not the key word linking them together, in this case, sleep. When the person is later quizzed on the words they’ve heard, they tend to believe they have also heard “sleep” – a false memory. To create the feeling of déjà vu, O’Connor’s team first asked people if they had heard any words beginning with the letter “s”. The volunteers replied that they hadn’t. This meant that when they were later asked if they had heard the word sleep, they were able to remember that they couldn’t have, but at the same time, the word felt familiar. “They report having this strange experience of déjà vu,” says O’Connor. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 22565 - Posted: 08.17.2016

By EUGENE M. CARUSO, ZACHARY C. BURNS and BENJAMIN A. CONVERSE Watching slow-motion footage of an event can certainly improve our judgment of what happened. But can it also impair judgment? This question arose in the 2009 murder trial of a man named John Lewis, who killed a police officer during an armed robbery of a Dunkin’ Donuts in Philadelphia. Mr. Lewis pleaded guilty; the only question for the jury was whether the murder resulted from a “willful, deliberate and premeditated” intent to kill or — as Mr. Lewis argued — from a spontaneous, panicked reaction to seeing the officer enter the store unexpectedly. The key piece of evidence was a surveillance video of the shooting, which the jury saw both in real time and in slow motion. The jury found that Mr. Lewis had acted with premeditation, and he was sentenced to death. Mr. Lewis appealed the decision, arguing that the slow-motion video was prejudicial. Specifically, he claimed that watching the video in slow motion artificially stretched the relevant time period and created a “false impression of premeditation.” Did it? We recently conducted a series of experiments whose results are strikingly consistent with that claim. Our studies, published this week in the Proceedings of the National Academy of Sciences, show that seeing replays of an action in slow motion leads viewers to believe that the actor had more time to think before acting than he actually did. The result is that slow motion makes actions seem more intentional, more premeditated. In one of our studies, participants watched surveillance video of a fatal shooting that occurred outside a convenience store during an armed robbery. We gave them a set of instructions similar to those given to the jurors in Mr. Lewis’s case, asking them to decide whether the crime was premeditated or not. We assigned half our participants to watch the video in slow motion and the other half to watch it at regular speed. © 2016 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22525 - Posted: 08.08.2016

By BENEDICT CAREY Solving a hairy math problem might send a shudder of exultation along your spinal cord. But scientists have historically struggled to deconstruct the exact mental alchemy that occurs when the brain successfully leaps the gap from “Say what?” to “Aha!” Now, using an innovative combination of brain-imaging analyses, researchers have captured four fleeting stages of creative thinking in math. In a paper published in Psychological Science, a team led by John R. Anderson, a professor of psychology and computer science at Carnegie Mellon University, demonstrated a method for reconstructing how the brain moves from understanding a problem to solving it, including the time the brain spends in each stage. The imaging analysis found four stages in all: encoding (downloading), planning (strategizing), solving (performing the math), and responding (typing out an answer). “I’m very happy with the way the study worked out, and I think this precision is about the limit of what we can do” with the brain imaging tools available, said Dr. Anderson, who wrote the report with Aryn A. Pyke and Jon M. Fincham, both also at Carnegie Mellon. To capture these quicksilver mental operations, the team first taught 80 men and women how to interpret a set of math symbols and equations they had not seen before. The underlying math itself wasn’t difficult, mostly addition and subtraction, but manipulating the newly learned symbols required some thinking. The research team could vary the problems to burden specific stages of the thinking process — some were hard to encode, for instance, while others extended the length of the planning stage. The scientists used two techniques of M.R.I. data analysis to sort through what the participants’ brains were doing. One technique tracked the neural firing patterns during the solving of each problem; the other identified significant shifts from one kind of mental state to another. The subjects solved 88 problems each, and the research team analyzed the imaging data from those solved successfully. © 2016 The New York Times Company
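The second analysis technique, identifying significant shifts from one mental state to another, is at heart a change-point problem: find the moment where a signal's statistics switch. The toy sketch below locates a single mean shift in a simulated one-dimensional trace. It is a deliberately simplified illustration in Python, not the authors' actual method, and the signal, time points, and segment-length choices are all hypothetical.

```python
# Toy illustration of locating a shift from one "state" to another in a
# noisy signal: find the split that best separates two stable means.
import numpy as np

rng = np.random.default_rng(1)
# Simulate a 1-D activity trace with a state change at t = 60 (hypothetical).
trace = np.concatenate([rng.normal(0.0, 1.0, 60),    # state A
                        rng.normal(1.5, 1.0, 40)])   # state B

def best_change_point(x):
    """Return the split index minimizing within-segment squared error."""
    costs = []
    for t in range(5, len(x) - 5):                   # avoid tiny segments
        left, right = x[:t], x[t:]
        cost = ((left - left.mean()) ** 2).sum() + \
               ((right - right.mean()) ** 2).sum()
        costs.append((cost, t))
    return min(costs)[1]

print("estimated state change at t =", best_change_point(trace))  # near 60
```

Real analyses of this kind operate on multivoxel activity patterns and must recover several successive stages of unknown duration, which is why the study's actual approach is considerably more elaborate than this single-shift sketch.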

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 22496 - Posted: 07.30.2016

By ERICA GOODE You are getting sleepy. Very sleepy. You will forget everything you read in this article. Hypnosis has become a common medical tool, used to reduce pain, help people stop smoking and cure them of phobias. But scientists have long argued about whether the hypnotic “trance” is a separate neurophysiological state or simply a product of a hypnotized person’s expectations. A study published on Thursday by Stanford researchers offers some evidence for the first explanation, finding that some parts of the brain function differently under hypnosis than during normal consciousness. The study was conducted with functional magnetic resonance imaging, a scanning method that measures blood flow in the brain. It found changes in activity in brain areas that are thought to be involved in focused attention, the monitoring and control of the body’s functioning, and the awareness and evaluation of a person’s internal and external environments. “I think we have pretty definitive evidence here that the brain is working differently when a person is in hypnosis,” said Dr. David Spiegel, a professor of psychiatry and behavioral sciences at Stanford who has studied the effectiveness of hypnosis. Functional imaging is a blunt instrument and the findings can be difficult to interpret, especially when a study is looking at activity levels in many brain areas. Still, Dr. Spiegel said, the findings might help explain the intense absorption, lack of self-consciousness and suggestibility that characterize the hypnotic state. © 2016 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 22493 - Posted: 07.30.2016

James M. Broadway “Where did the time go?” middle-aged and older adults often remark. Many of us feel that time passes more quickly as we age, a perception that can lead to regrets. According to psychologist and BBC columnist Claudia Hammond, “the sensation that time speeds up as you get older is one of the biggest mysteries of the experience of time.” Fortunately, our attempts to unravel this mystery have yielded some intriguing findings. In 2005, for instance, psychologists Marc Wittmann and Sandra Lenhoff, both then at Ludwig Maximilian University of Munich, surveyed 499 participants, ranging in age from 14 to 94 years, about the pace at which they felt time moving—from “very slowly” to “very fast.” For shorter durations—a week, a month, even a year—the subjects' perception of time did not appear to increase with age. Most participants felt that the clock ticked by quickly. But for longer durations, such as a decade, a pattern emerged: older people tended to perceive time as moving faster. When asked to reflect on their lives, the participants older than 40 felt that time elapsed slowly in their childhood but then accelerated steadily through their teenage years into early adulthood. There are good reasons why older people may feel that way. When it comes to how we perceive time, humans can estimate the length of an event from two very different perspectives: a prospective vantage, while an event is still occurring, or a retrospective one, after it has ended. In addition, our experience of time varies with whatever we are doing and how we feel about it. In fact, time does fly when we are having fun. Engaging in a novel exploit makes time appear to pass more quickly in the moment. But if we remember that activity later on, it will seem to have lasted longer than more mundane experiences. © 2016 Scientific American,

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 22447 - Posted: 07.16.2016

By SUNITA SAH A POPULAR remedy for a conflict of interest is disclosure — informing the buyer (or the patient, etc.) of the potential bias of the seller (or the doctor, etc.). Disclosure is supposed to act as a warning, alerting consumers to their adviser’s stake in the matter so they can process the advice accordingly. But as several recent studies I conducted show, there is an underappreciated problem with disclosure: It often has the opposite of its intended effect, not only increasing bias in advisers but also making advisees more likely to follow biased advice. When I worked as a physician, I witnessed how bias could arise from numerous sources: gifts or sponsorships from the pharmaceutical industry; compensation for performing particular procedures; viewing our own specialties as delivering more effective treatments than others’ specialties. Although most physicians, myself included, tend to believe that we are invulnerable to bias, thus making disclosures unnecessary, regulators insist on them, assuming that they work effectively. To some extent, they do work. Disclosing a conflict of interest — for example, a financial adviser’s commission or a physician’s referral fee for enrolling patients into clinical trials — often reduces trust in the advice. But my research has found that people are still more likely to follow this advice because the disclosure creates increased pressure to follow the adviser’s recommendation. It turns out that people don’t want to signal distrust to their adviser or insinuate that the adviser is biased, and they also feel pressure to help satisfy their adviser’s self-interest. Instead of functioning as a warning, disclosure can become a burden on advisees, increasing pressure to take advice they now trust less. © 2016 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22416 - Posted: 07.09.2016

By Emily Rosenzweig Life deals most of us a consistent stream of ego blows, be they failures at work, social slights, or unrequited love. Social psychology has provided decades of insight into just how adept we are at defending ourselves against these psychic threats. We discount negative feedback, compare ourselves favorably to those who are worse off than us, attribute our failures to others, place undue value on our own strengths, and devalue opportunities denied to us – all in service of protecting and restoring our sense of self-worth. As a group, this array of motivated mental processes that support mood repair and ego defense has been called the “psychological immune system.” Particularly striking to social psychologists is our ability to remain blind to our use of these motivated strategies, even when it is apparent to others just how biased we are. However there are times when we either cannot remain blind to our own psychological immune processes, or where we may find ourselves consciously wanting to use them expressly for the purpose of restoring our ego or our mood. What then? Can we believe a conclusion we reach even when we know that we arrived at it in a biased way? For example, imagine you’ve recently gone through a breakup and want to get over your ex. You decide to make a mental list of all of their character flaws in an effort to feel better about the relationship ending. A number of prominent social psychologists have suggested you’re out of luck—knowing that you’re focusing only on your ex’s worst qualities prevents you from believing the conclusion you’ve come to that you’re better off without him or her. In essence, they argue that we must remain blind to our own biased mental processes in order to reap their ego-restoring benefits. And in many ways this closely echoes the position that philosophers like Alfred Mele have taken about the possibility of agentic self-deception. © 2016 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22404 - Posted: 07.07.2016

Mo Costandi There’s much more to visual perception than meets the eye. What we see is not merely a matter of patterns of light falling on the retina, but rather is heavily influenced by so-called ‘top-down’ brain mechanisms, which can alter the visual information, and other types of sensory information, that enters the brain before it even reaches our conscious awareness. A striking example of this is a phenomenon called inattentional blindness, whereby narrowly focusing one’s attention on one visual stimulus makes us oblivious to other stimuli, even though they otherwise may be glaringly obvious, as demonstrated by the infamous ‘Invisible Gorilla’ study. Now researchers say they have discovered another extreme form of blindness, in which people fail to notice an unexpected image, even when shown by itself and staring them in the face. Marjan Persuh and Robert Melara of the City University of New York designed two experiments to investigate whether people’s prior expectations could block their awareness of meaningful and important visual stimuli. In the first, they recruited 20 student volunteers and asked them to perform a visual discrimination task. They were shown a series of images, consisting of successive pairs of faces, each of which was presented for half a second on a computer screen, and asked to indicate whether each pair showed faces of people of the same or different sex. Towards the end of each session, the participants were presented with a simple shape, which flashed onto the screen for one tenth of a second. They were then asked if they had seen anything new and, after replying, were told that a shape had indeed appeared, and asked to select the correct one from a display of four. This shape recognition task was then repeated in one final control trial. © 2016 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22394 - Posted: 07.04.2016

By MOSHE BAR A FRIEND of mine has a bad habit of narrating his experiences as they are taking place. I tease him for being a bystander in his own life. To be fair, we all fail to experience life to the fullest. Typically, our minds are too occupied with thoughts to allow complete immersion even in what is right in front of us. Sometimes, this is O.K. I am happy not to remember passing a long stretch of my daily commute because my mind has wandered and my morning drive can be done on autopilot. But I do not want to disappear from too much of life. Too often we eat meals without tasting them, look at something beautiful without seeing it. An entire exchange with my daughter (please forgive me) can take place without my being there at all. Recently, I discovered how much we overlook, not just about the world, but also about the full potential of our inner life, when our mind is cluttered. In a study published in this month’s Psychological Science, the graduate student Shira Baror and I demonstrate that the capacity for original and creative thinking is markedly stymied by stray thoughts, obsessive ruminations and other forms of “mental load.” Many psychologists assume that the mind, left to its own devices, is inclined to follow a well-worn path of familiar associations. But our findings suggest that innovative thinking, not routine ideation, is our default cognitive mode when our minds are clear. In a series of experiments, we gave participants a free-association task while simultaneously taxing their mental capacity to different degrees. In one experiment, for example, we asked half the participants to keep in mind a string of seven digits, and the other half to remember just two digits. While the participants maintained these strings in working memory, they were given a word (e.g., shoe) and asked to respond as quickly as possible with the first word that came to mind (e.g., sock). © 2016 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22360 - Posted: 06.25.2016

By Nancy Szokan Let’s begin by defining something psychologists call “ego depletion.” This is the idea that all of us have only a certain amount of self-control, and if we use up too much in one part of our lives, we will have less to use in others. An early example came from a 1998 study in which participants were tempted with a chocolate treat before being given a difficult puzzle: Those who resisted the temptation seemed to have used up some of their willpower, because they gave up on the puzzle faster than the treat eaters. There have been many subsequent studies about ego depletion, including its apparent effects on physical performance: In 2012, athletes who were given a difficult mental task before a physical challenge exhibited less determination to do well on the sports test than those who hadn’t done the puzzle. But recently a replication study (in which researchers repeat a published experiment to see if they come up with the same results) tested more than 2,000 participants at 24 labs and found the ego depletion effect to be very small or nonexistent. This, as Lea Winerman reports, has led such psychologists as Michael Inzlicht of the University of Toronto to a crisis of confidence. Maybe, he thinks, ego depletion and the other social psychological effects he has made a career of studying are “proven” by unreliable research. “I used to think there were errors, but that the errors were minor and it was fine,” Winerman quotes Inzlicht as saying in the June issue of Monitor on Psychology, a publication of the American Psychological Association. “But as I started surveying the field, I started thinking we’ve been making some major mistakes.”

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22337 - Posted: 06.20.2016

Alva Noë Sometimes the mind wanders. Thoughts pop into consciousness. Ideas or images are present when just a moment before they were not. Scientists recently have been turning their attention to making sense of this. One natural picture of the phenomenon goes something like this. Typically, our thoughts and feelings are shaped by what we are doing, by what there is around us. The world captures our attention and compels our minds this way or that. What explains the fact that you think of a red car when there is a red car in front of you is, well, the red car. And similarly, it is that loud noise that causes you to orient yourself to the commotion that is producing it. In such cases, we might say, the mind is coupled to the world around it and the world, in a way, plays us the way a person might play a piano. But sometimes, even without going to sleep, we turn away from the world. We turn inward. We are contemplative or detached. We decouple ourselves from the environment and we are set free, as it were, to let our minds play themselves. This natural picture has gained some support from the discovery of the so-called Default Mode Network. The DMN is a network of neural systems whose activation seems to be suppressed by active engagement with the world around us; DMN, in contrast, is activated (or rather, it tends to return to baseline levels of activity) precisely when we detach ourselves from what's going on around us. The DMN is the brain running in neutral. One of the leading hypotheses to explain mind-wandering and the emergence of spontaneous thoughts is that this is the result of the operation of the brain's Default Mode Network. © 2016 npr

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22331 - Posted: 06.18.2016

By Rachel Feltman Archerfish are already stars of the animal kingdom for their stunning spit-takes. They shoot high-powered water jets from their mouths to stun prey, making them one of just a few fish species known to use tools. But by training Toxotes chatareus to direct those jets of spit at certain individuals, scientists have shown that the little guys have another impressive skill: They seem to be able to distinguish one human face from another, something never before witnessed in fish and spotted just a few times in non-human animals. The results, published Tuesday in the Nature journal Scientific Reports, could help us understand how humans got so good at telling each other apart. Or how most people got to be good at that, anyway. I'm terrible at it. It's generally accepted that the fusiform gyrus, a brain structure located in the neocortex, allows humans to tell one another apart with a speed and accuracy that other species can't manage. But there's some debate over whether human faces are so innately complex — and that distinguishing them is more difficult than other tricks of memory or pattern recognition — that this region of the brain is a necessary facilitator of the skill that evolved especially for it. Birds, which have been shown to distinguish humans from one another, have the same structure. But some researchers still think that facial recognition might be something that humans learn — it's not an innate skill — and that the fusiform gyrus is just the spot where we happen to process all the necessary information.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22299 - Posted: 06.08.2016