Links for Keyword: Attention



Links 1 - 20 of 506

Rosie Mestel The 2016 US election was a powerful reminder that beliefs tend to come in packages: socialized medicine is bad, gun ownership is a fundamental right, and climate change is a myth — or the other way around. Stances that may seem unrelated can cluster because they have become powerful symbols of membership of a group, says Dan Kahan, who teaches law and psychology at Yale Law School in New Haven, Connecticut. And the need to keep believing can further distort people’s perceptions and their evaluation of evidence. Here, Kahan tells Nature about the real-world consequences of group affinity and cognitive bias, and about research that may point to remedies. This interview has been edited for length and clarity. One measure is how individualistic or communitarian people are, and how egalitarian or hierarchical. Hierarchical and individualistic people tend to have confidence in markets and industry: those represent human ingenuity and power. People who are egalitarian and communitarian are suspicious of markets and industry. They see them as responsible for social disparity. It’s natural to see things you consider honourable as good for society, and things that are base, as bad. Such associations will motivate people’s assessment of evidence. Can you give an example? In a study, we showed people data from gun-control experiments and varied the results [1]. People who were high in numeracy always saw when a study supported their view. If it didn’t support their view, they didn’t notice — or argued their way out of it. © 2016 Macmillan Publishers Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 22946 - Posted: 12.03.2016

By Alison Howell What could once only be imagined in science fiction is now increasingly coming to fruition: Drones can be flown by human brains' thoughts. Pharmaceuticals can help soldiers forget traumatic experiences or produce feelings of trust to encourage confession in interrogation. DARPA-funded research is working on everything from implanting brain chips to "neural dust" in an effort to alleviate the effects of traumatic experience in war. Invisible microwave beams produced by military contractors and tested on U.S. prisoners can produce the sensation of burning at a distance. What all these techniques and technologies have in common is that they're recent neuroscientific breakthroughs propelled by military research within a broader context of rapid neuroscientific development, driven by massive government-funded projects in both America and the European Union. Even while much about the brain remains mysterious, this research has contributed to the rapid and startling development of neuroscientific technology. And while we might marvel at these developments, it is also undeniably true that this state of affairs raises significant ethical questions. What is the proper role – if any – of neuroscience in national defense or war efforts? My research addresses these questions in the broader context of looking at how international relations, and specifically warfare, are shaped by scientific and medical expertise and technology. 2016 © U.S. News & World Report L.P.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 14: Biological Rhythms, Sleep, and Dreaming
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 10: Biological Rhythms and Sleep
Link ID: 22944 - Posted: 12.03.2016

By Melissa Dahl Considering its origin story, it’s not so surprising that hypnosis and serious medical science have often seemed at odds. The man typically credited with creating hypnosis, albeit in a rather primitive form, is Franz Mesmer, a doctor in 18th-century Vienna. (Mesmer, mesmerize. Get it?) Mesmer developed a general theory of disease he called “animal magnetism,” which held that every living thing carries within it an internal magnetic force, in liquid form. Illness arises when this fluid becomes blocked, and can be cured if it can be coaxed to flow again, or so Mesmer’s thinking went. To get that fluid flowing, as science journalist Jo Marchant describes in her recent book, Cure, Mesmer “simply waved his hands to direct it through his patients’ bodies” — the origin of those melodramatic hand motions that stage hypnotists use today. After developing a substantial following — “mesmerism” became “the height of fashion” in late 1780s Paris, writes Marchant — Mesmer became the subject of what was essentially the world’s first clinical trial. King Louis XVI pulled together a team of the world’s top scientists, including Benjamin Franklin, who tested mesmerism and found its capacity to “cure” was, essentially, a placebo effect. “Not a shred of evidence exists for any fluid,” Franklin wrote. “The practice … is the art of increasing the imagination by degrees.” Maybe so. But that doesn’t mean it doesn’t work. © 2016, New York Media LLC.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 5: The Sensorimotor System
Link ID: 22931 - Posted: 11.30.2016

By Yasemin Saplakoglu Even if you don’t have rhythm, your pupils do. In a new study, neuroscientists played drumming patterns from Western music, including beats typical in pop and rock, while asking volunteers to focus on computer screens for an unrelated fast-paced task that involved pressing the space bar as quickly as possible in response to a signal on the screen. Unbeknownst to the participants, the music omitted strong and weak beats at random times. (In one example clip the researchers used, bass and hi-hat beats are omitted throughout.) Eye scanners tracked the dilations of the subjects’ pupils as the music played. Their pupils enlarged when the rhythms dropped certain beats, even though the participants weren’t paying attention to the music. The biggest dilations matched the omissions of the beats in the most prominent locations in the music, usually the important first beat in a repeated set of notes. The results suggest that we may have an automatic sense of “hierarchical meter”—a pattern of strong and weak beats—that governs our expectations of music, the researchers write in the February 2017 issue of Brain and Cognition. Perhaps, the authors say, our eyes reveal clues into the importance that music and rhythm play in our lives. © 2016 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 22920 - Posted: 11.29.2016

Hannah Devlin The human brain is predisposed to learn negative stereotypes, according to research that offers clues as to how prejudice emerges and spreads through society. The study found that the brain responds more strongly to information about groups who are portrayed unfavourably, adding weight to the view that the negative depiction of ethnic or religious minorities in the media can fuel racial bias. Hugo Spiers, a neuroscientist at University College London, who led the research, said: “The newspapers are filled with ghastly things people do ... You’re getting all these news stories and the negative ones stand out. When you look at Islam, for example, there’s so many more negative stories than positive ones and that will build up over time.” The scientists also uncovered a characteristic brain signature seen when participants were told a member of a “bad” group had done something positive - an observation that is likely to tally with the subjective experience of minorities. “Whenever someone from a really bad group did something nice they were like, ‘Oh, weird,’” said Spiers. Previous studies have identified brain areas involved in gender or racial stereotyping, but this is the first attempt to investigate how the brain learns to link undesirable traits to specific groups and how this is converted into prejudice over time. © 2016 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22821 - Posted: 11.02.2016

Laura Sanders The eyes may reveal whether the brain’s internal stopwatch runs fast or slow. Pupil size predicted whether a monkey would over- or underestimate a second, scientists report in the Nov. 2 Journal of Neuroscience. Scientists knew that pupils get bigger when a person is paying attention. They also knew that paying attention can influence how people perceive the passage of time. Using monkeys, the new study links pupil size and timing directly. “What they’ve done here is connect those dots,” says neuroscientist Thalia Wheatley of Dartmouth College. More generally, the study shows how the eyes are windows into how the brain operates. “There’s so much information coming out of the eyes,” Wheatley says. Neuroscientist Masaki Tanaka of Hokkaido University School of Medicine in Japan and colleagues trained three Japanese macaques to look at a spot on a computer screen after precisely one second had elapsed. The study measured the monkeys’ subjective timing abilities: The monkeys had to rely on themselves to count the milliseconds. Just before each trial, the researchers measured pupil diameters. When the monkeys underestimated a second by looking too soon, their pupil sizes were slightly larger than in trials in which the monkeys overestimated a second, the researchers found. That means that when pupils were large, the monkeys felt time zoom by. But when pupils were small, time felt slower. © Society for Science & the Public 2000 - 2016.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22818 - Posted: 11.02.2016

Nicola Davis The proficiency of elite football referees could be down to their eagle eyes, say researchers. A study of elite and sub-elite referees has found that a greater tendency to predict and watch contact zones between players contributes to the greater accuracy of top-level referees. “Over the years they develop so much experience that they now can anticipate, very well, future events so that they can already direct their attention to those pieces of information where they expect something to happen,” said lead author Werner Helsen from the University of Leuven. Keith Hackett, a former football referee and former general manager of the Professional Game Match Officials Limited, said the research chimed with his own experiences. “In working with elite referees for a number of years I have recognised their ability to see, recognise, think and then act in a seamless manner,” he said. “They develop skill sets that enable them to see and this means good game-reading and cognitive skills to be in the right place at the right time.” Mistakes, he believes, often come down to poor visual perception. “Last week, we saw an elite referee fail to detect the violent act of [Moussa] Sissoko using his arm/elbow, putting his opponent’s safety at risk,” he said. “The review panel, having received confirmation from the referee that he failed to see the incident despite looking in the direction of the foul challenge, were able to act.” Writing in the journal Cognitive Research, researchers from the University of Leuven in Belgium and Brunel University in west London say they recruited 39 referees, 20 of whom were elite referees and 19 of whom were experienced but had never refereed at a professional level. © 2016 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22817 - Posted: 11.01.2016

By Jesse Singal For a long time, the United States’ justice system has been notorious for its proclivity for imprisoning children. Because of laws that grant prosecutors and judges discretion to bump juveniles up to the category of “adult” when they commit crimes deemed serious enough by the authorities, the U.S. is an outlier in locking up kids, with some youthful defendants even getting life sentences. Naturally, this has attracted a great deal of outrage and advocacy from human-rights organizations, who argue that kids, by virtue of lacking certain judgment, foresight, and decision-making abilities, should be treated a bit more leniently. Writing for the Marshall Project and drawing on some interesting brain science, Dana Goldstein takes the argument about youth incarceration even further: We should also rethink our treatment of offenders who are young adults. As Goldstein explains, the more researchers study the brain, the more they realize that it takes decades for the organ to develop fully and to impart to its owners their full, adult capacities for reasoning. “Altogether,” she writes, “the research suggests that brain maturation continues into one’s twenties and even thirties.” Many of these insights come from the newest generation of neuroscience research. “Everyone has always known that there are behavioral changes throughout the lifespan,” Catherine Lebel, an assistant professor of radiology at the University of Calgary who has conducted research into brain development, told Goldstein. “It’s only with new imaging techniques over the last 15 years that we’ve been able to get at some of these more subtle changes.” © 2016, New York Media LLC.

Related chapters from BP7e: Chapter 7: Life-Span Development of the Brain and Behavior; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 14: Attention and Consciousness
Link ID: 22813 - Posted: 11.01.2016

By Diana Kwon Can you feel your heart beating? Most people cannot, unless they are agitated or afraid. The brain masks the sensation of the heart in a delicate balancing act—we need to be able to feel our pulse racing occasionally as an important signal of fear or excitement, but most of the time the constant rhythm would be distracting or maddening. A growing body of research suggests that because of the way the brain compensates for our heartbeat, it may be vulnerable to perceptual illusions—if they are timed just right. In a study published in May in the Journal of Neuroscience, a team at the Swiss Federal Institute of Technology in Lausanne conducted a series of studies on 143 participants and found that subjects took longer to identify a flashing object when it appeared in sync with the rhythm of their heartbeats. Using functional MRI, they also found that activity in the insula, a brain area associated with self-awareness, was suppressed when people viewed these synchronized images. The authors suggest that the flashing object was suppressed by the brain because it got lumped in with all the other bodily changes that occur with each heartbeat—the eyes make tiny movements, eye pressure changes slightly, the chest expands and contracts. “The brain knows that the heartbeat is coming from the self, so it doesn't want to be bothered by the sensory consequences of these signals,” says Roy Salomon, one of the study's co-authors. © 2016 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 22810 - Posted: 10.31.2016

By Melissa Dahl A rule that spans time and space and morning routines: It is entirely too easy to underestimate the time it takes to get to work. Maybe once — one time — it took just 20 minutes to get to work, but it typically takes 25 to 30, and you know that, but still you leave late and, thus, arrive late. It’s dumb. It is also, maybe, human nature. As Christian Jarrett at BPS Research Digest reports, a team of neuroscientists has just uncovered a very handy if rather complicated excuse for tardiness — it seems people tend to underestimate how long it will take to travel familiar routes. The laws of time and space do not actually bend in order to transport you to work or school more quickly, but at least part of you believes that they will. And yet the oddest part of this new study, published in the journal Hippocampus, is that the participants tended to overestimate the physical length of those routes, even as they underestimated how long it would take to travel them. It does make a certain amount of sense that people would exaggerate the breadth of familiar distances, because the level of detail you’ve stored about them matters to your memory. If you remember every Starbucks and street corner you pass on the way you usually walk to school, for instance, the walking route will likely feel longer when you recall it than one you don’t know as well. As Jarrett explains, the researchers “thought a more detailed neural representation would make that space seem larger.” And when they asked a group of students — all of whom had been living in the same building in London for 9 months — to draw a little map of their neighborhood, this is indeed what they found. The students exaggerated the physical distance of the routes they walked the most, drawing their maps a little bigger than they should have. © 2016, New York Media LLC.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22797 - Posted: 10.28.2016

By Kensy Cooperrider, Rafael Núñez “What is the difference between yesterday and tomorrow?” The Yupno man we were interviewing, Danda, paused to consider his answer. A group of us sat on a hillside in the Yupno Valley, a remote nook high in the mountains of Papua New Guinea. Only days earlier we had arrived on a single-engine plane. After a steep hike from the grass airstrip, we found ourselves in the village of Gua, one of about 20 Yupno villages dotting the rugged terrain. We came all the way here because we are interested in time—in how Yupno people understand concepts such as past, present and future. Are these ideas universal, or are they products of our language, our culture and our environment? As we interviewed Danda and others in the village, we listened to what they said about time, but we paid even closer attention to what they did with their hands as they spoke. Gestures can be revealing. Ask English speakers about the difference between yesterday and tomorrow, and they might thrust a hand over the shoulder when referring to the past and then forward when referring to the future. Such unreflective movements reveal a fundamental way of thinking in which the past is at our backs, something that we “leave behind,” and the future is in front of us, something to “look forward” to. Would a Yupno speaker do the same? Danda was making just the kinds of gestures we were hoping for. As he explained the Yupno word for “yesterday,” his hand swept backward; as he mentioned “tomorrow,” it leaped forward. We all sat looking up a steep slope toward a jagged ridge, but as the light faded, we changed the camera angle, spinning around so that we and Danda faced in the opposite direction, downhill. With our backs now to the ridge, we looked over the Yupno River meandering toward the Bismarck Sea. “Let's go over that one more time,” we suggested. © 2016 Scientific American,

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22778 - Posted: 10.22.2016

By Catherine Caruso Imagine you are faced with the classic thought experiment dilemma: You can take a pile of money now or wait and get an even bigger stash of cash later on. Which option do you choose? Your level of self-control, researchers have found, may have to do with a region of the brain that lets us take the perspective of others—including that of our future self. A study, published today in Science Advances, found that when scientists used noninvasive brain stimulation to disrupt a brain region called the temporoparietal junction (TPJ), people appeared less able to see things from the point of view of their future selves or of another person, and consequently were less likely to share money with others and more inclined to opt for immediate cash instead of waiting for a larger bounty at a later date. The TPJ, which is located where the temporal and parietal lobes meet, plays an important role in social functioning, particularly in our ability to understand situations from the perspectives of other people. However, according to Alexander Soutschek, an economist at the University of Zurich and lead author on the study, previous research on self-control and delayed gratification has focused instead on the prefrontal brain regions involved in impulse control. “When you have a closer look at the literature, you sometimes find in the neuroimaging data that the TPJ is also active during delay of gratification,” Soutschek says, “but it's never interpreted.” © 2016 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22772 - Posted: 10.20.2016

Emily Badger One of the newest chew toys in the presidential campaign is “implicit bias,” a term Mike Pence repeatedly took exception to in the vice-presidential debate on Tuesday. Police officers hear all this badmouthing, said Mr. Pence, Donald J. Trump’s running mate, in response to a question about whether society demands too much of law enforcement. They hear politicians painting them with one broad brush, with disdain, with automatic cries of implicit bias. He criticized Hillary Clinton for saying, in the first presidential debate, that everyone experiences implicit bias. He suggested a black police officer who shoots a black civilian could not logically experience such bias. “Senator, please,” Mr. Pence said, addressing his Democratic opponent, Tim Kaine, “enough of this seeking every opportunity to demean law enforcement broadly by making the accusation of implicit bias every time tragedy occurs.” The concept, in his words, came across as an insult, a put-down on par with branding police as racists. Many Americans may hear it as academic code for “racist.” But that connotation does not line up with scientific research on what implicit bias is and how it really operates. Researchers in this growing field say it isn’t just white police officers, but all of us, who have biases that are subconscious, hidden even to ourselves. Implicit bias is the mind’s way of making uncontrolled and automatic associations between two concepts very quickly. In many forms, implicit bias is a healthy human adaptation — it’s among the mental tools that help you mindlessly navigate your commute each morning. It crops up in contexts far beyond policing and race (if you make the rote assumption that fruit stands have fresher produce, that’s implicit bias). But the same process can also take the form of unconsciously associating certain identities, like African-American, with undesirable attributes, like violence. © 2016 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22730 - Posted: 10.08.2016

André Corrêa d’Almeida and Amanda Sue Grossi Development. Poverty. Africa. These are just three words on a page – almost no information at all – but how many realities did our readers just conjure? And how many thoughts filled the spaces in-between? Cover yourselves. Your biases are showing. In the last few decades, groundbreaking work by psychologists and behavioural economists has exposed unconscious biases in the way we think. And as the World Bank’s 2015 World Development Report points out, development professionals are not immune to these biases. There is a real possibility that seemingly unbiased and well-intentioned development professionals are capable of making consequential mistakes, with significant impacts upon the lives of others, namely the poor. The problem arises when mindsets are just that – set. As the work of Daniel Kahneman and Amos Tversky has shown, development professionals – like people generally – have two systems of thinking; the automatic and the deliberative. For the automatic, instead of performing complex rational calculations every time we need to make a decision, much of our thinking relies on pre-existing mental models and shortcuts. These are based on assumptions we create throughout our lives and that stem from our experiences and education. More often than not, these mental models are incomplete and shortcuts can lead us down the wrong path. Thinking automatically then becomes thinking harmfully. © 2016 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22653 - Posted: 09.15.2016

Laura Sanders By sneakily influencing brain activity, scientists changed people’s opinions of faces. This covert neural sculpting relied on a sophisticated brain training technique in which people learn to direct their thoughts in specific ways. The results, published September 8 in PLOS Biology, support the idea that neurofeedback methods could help reveal how the brain’s behavior gives rise to perceptions and emotions. What’s more, the technique may ultimately prove useful for easing traumatic memories and treating disorders such as depression. The research is still at an early stage, says neurofeedback researcher Michelle Hampson of Yale University, but, she notes, “I think it has great promise.” Takeo Watanabe of Brown University and colleagues used functional MRI to measure people’s brain activity in an area called the cingulate cortex as participants saw pictures of faces. After participants had rated each face, a computer algorithm sorted their brain responses into patterns that corresponded to faces they liked and faces they disliked. With this knowledge in hand, the researchers then attempted to change people’s face preferences by subtly nudging brain activity in the cingulate cortex. In step 2 of the experiment, participants returned to the fMRI scanner and saw an image of a face that they had previously rated as neutral. Just after that, they were shown a disk. The goal, the participants were told, was simple: make the disk bigger by using their brains. They had no idea that the only way to make the disk grow was to think in a very particular way. © Society for Science & the Public 2000 - 2016

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 22646 - Posted: 09.12.2016

Chris Chambers One of the most compelling impressions in everyday life is that wherever we look, we “see” everything that is happening in front of us – much like a camera. But this impression is deceiving. In reality our senses are bombarded by continual waves of stimuli, triggering an avalanche of sensations that far exceed the brain’s capacity. To make sense of the world, the brain needs to determine which sensations are the most important for our current goals, focusing resources on the ones that matter and throwing away the rest. These computations are astonishingly complex, and what makes attention even more remarkable is just how effortless it is. The mammalian attention system is perhaps the most efficient and precisely tuned junk filter we know of, refined through millions of years of annoying siblings (and some evolution). Attention is amazing but no system is ever perfect. Our brain’s computational reserves are large but not infinite, and under the right conditions we can “break it” and peek behind the curtain. This isn’t just a fun trick – understanding these limits can yield important insights into psychology and neurobiology, helping us to diagnose and treat impairments that follow brain injury and disease. Thanks to over a hundred years of psychology research, it’s relatively easy to reveal attention in action. One way is through the phenomenon of change blindness, which you can experience for yourself in a simple demonstration. When we think of the term “blindness” we tend to assume a loss of vision caused by damage to the eye or optic nerves. But change blindness is completely normal and is caused by maxing out your attentional capacity. © 2016 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 22633 - Posted: 09.06.2016

A new study by investigators at Brigham and Women's Hospital in collaboration with researchers at the universities of York and Leeds in the UK and the MD Anderson Cancer Center in Texas puts to the test anecdotes about experienced radiologists' ability to sense when a mammogram is abnormal. In a paper published August 29 in the Proceedings of the National Academy of Sciences, visual attention researchers showed radiologists mammograms for half a second and found that they could identify abnormal mammograms at better than chance levels. They further tested this ability through a series of experiments to explore what signal may alert radiologists to the presence of a possible abnormality, in the hopes of using these insights to improve breast cancer screening and early detection. "Radiologists can have 'hunches' after a first look at a mammogram. We found that these hunches are based on something real in the images. It's really striking that in the blink of an eye, an expert can pick up on something about that mammogram that indicates abnormality," said Jeremy Wolfe, PhD, senior author of the study and director of the Visual Attention Laboratory at BWH. "Not only that, but they can detect something abnormal in the other breast, the breast that does not contain a lesion." In the clinic, radiologists carefully evaluate mammograms and may use computer automated systems to help screen the images. Although they would never assess an image in half a second in the clinic, the ability of experts to extract the "gist" of an image quickly suggests that there may be detectable signs of breast cancer that radiologists are rapidly picking up. Copyright 2016 ScienceDaily

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 22627 - Posted: 09.05.2016

By Jessica Hamzelou Feel like you’ve read this before? Most of us have experienced the eerie familiarity of déjà vu, and now the first brain scans of this phenomenon have revealed why – it’s a sign of our brain checking its memory. Déjà vu was thought to be caused by the brain making false memories, but research by Akira O’Connor at the University of St Andrews, UK, and his team now suggests this is wrong. Exactly how déjà vu works has long been a mystery, partly because its fleeting and unpredictable nature makes it difficult to study. To get around this, O’Connor and his colleagues developed a way to trigger the sensation of déjà vu in the lab. The team’s technique uses a standard method to trigger false memories. It involves telling a person a list of related words – such as bed, pillow, night, dream – but not the key word linking them together, in this case, sleep. When the person is later quizzed on the words they’ve heard, they tend to believe they have also heard “sleep” – a false memory. To create the feeling of déjà vu, O’Connor’s team first asked people if they had heard any words beginning with the letter “s”. The volunteers replied that they hadn’t. This meant that when they were later asked if they had heard the word sleep, they were able to remember that they couldn’t have, but at the same time, the word felt familiar. “They report having this strange experience of déjà vu,” says O’Connor. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 22565 - Posted: 08.17.2016

By EUGENE M. CARUSO, ZACHARY C. BURNS and BENJAMIN A. CONVERSE Watching slow-motion footage of an event can certainly improve our judgment of what happened. But can it also impair judgment? This question arose in the 2009 murder trial of a man named John Lewis, who killed a police officer during an armed robbery of a Dunkin’ Donuts in Philadelphia. Mr. Lewis pleaded guilty; the only question for the jury was whether the murder resulted from a “willful, deliberate and premeditated” intent to kill or — as Mr. Lewis argued — from a spontaneous, panicked reaction to seeing the officer enter the store unexpectedly. The key piece of evidence was a surveillance video of the shooting, which the jury saw both in real time and in slow motion. The jury found that Mr. Lewis had acted with premeditation, and he was sentenced to death. Mr. Lewis appealed the decision, arguing that the slow-motion video was prejudicial. Specifically, he claimed that watching the video in slow motion artificially stretched the relevant time period and created a “false impression of premeditation.” Did it? We recently conducted a series of experiments whose results are strikingly consistent with that claim. Our studies, published this week in the Proceedings of the National Academy of Sciences, show that seeing replays of an action in slow motion leads viewers to believe that the actor had more time to think before acting than he actually did. The result is that slow motion makes actions seem more intentional, more premeditated. In one of our studies, participants watched surveillance video of a fatal shooting that occurred outside a convenience store during an armed robbery. We gave them a set of instructions similar to those given to the jurors in Mr. Lewis’s case, asking them to decide whether the crime was premeditated or not. We assigned half our participants to watch the video in slow motion and the other half to watch it at regular speed. © 2016 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22525 - Posted: 08.08.2016

By BENEDICT CAREY Solving a hairy math problem might send a shudder of exultation along your spinal cord. But scientists have historically struggled to deconstruct the exact mental alchemy that occurs when the brain successfully leaps the gap from “Say what?” to “Aha!” Now, using an innovative combination of brain-imaging analyses, researchers have captured four fleeting stages of creative thinking in math. In a paper published in Psychological Science, a team led by John R. Anderson, a professor of psychology and computer science at Carnegie Mellon University, demonstrated a method for reconstructing how the brain moves from understanding a problem to solving it, including the time the brain spends in each stage. The imaging analysis found four stages in all: encoding (downloading), planning (strategizing), solving (performing the math), and responding (typing out an answer). “I’m very happy with the way the study worked out, and I think this precision is about the limit of what we can do” with the brain imaging tools available, said Dr. Anderson, who wrote the report with Aryn A. Pyke and Jon M. Fincham, both also at Carnegie Mellon. To capture these quicksilver mental operations, the team first taught 80 men and women how to interpret a set of math symbols and equations they had not seen before. The underlying math itself wasn’t difficult, mostly addition and subtraction, but manipulating the newly learned symbols required some thinking. The research team could vary the problems to burden specific stages of the thinking process — some were hard to encode, for instance, while others extended the length of the planning stage. The scientists used two techniques of M.R.I. data analysis to sort through what the participants’ brains were doing. One technique tracked the neural firing patterns during the solving of each problem; the other identified significant shifts from one kind of mental state to another. The subjects solved 88 problems each, and the research team analyzed the imaging data from those solved successfully. © 2016 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 22496 - Posted: 07.30.2016