Links for Keyword: Attention



Links 1 - 20 of 526

Laura Sanders Not too long ago, the internet was stationary. Most often, we’d browse the Web from a desktop computer in our living room or office. If we were feeling really adventurous, maybe we’d cart our laptop to a coffee shop. Looking back, those days seem quaint. Today, the internet moves through our lives with us. We hunt Pokémon as we shuffle down the sidewalk. We text at red lights. We tweet from the bathroom. We sleep with a smartphone within arm’s reach, using the device as both lullaby and alarm clock. Sometimes we put our phones down while we eat, but usually faceup, just in case something important happens. Our iPhones, Androids and other smartphones have led us to effortlessly adjust our behavior. Portable technology has overhauled our driving habits, our dating styles and even our posture. Despite the occasional headlines claiming that digital technology is rotting our brains, not to mention what it’s doing to our children, we’ve welcomed this alluring life partner with open arms and swiping thumbs. Scientists suspect that these near-constant interactions with digital technology influence our brains. Small studies are turning up hints that our devices may change how we remember, how we navigate and how we create happiness — or not. Somewhat limited, occasionally contradictory findings illustrate how science has struggled to pin down this slippery, fast-moving phenomenon. Laboratory studies hint that technology, and its constant interruptions, may change our thinking strategies. Like our husbands and wives, our devices have become “memory partners,” allowing us to dump information there and forget about it — an off-loading that comes with benefits and drawbacks. Navigational strategies may be shifting in the GPS era, a change that might be reflected in how the brain maps its place in the world. Constant interactions with technology may even raise anxiety in certain settings. © Society for Science & the Public 2000 - 2017

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 14: Attention and Consciousness
Link ID: 23385 - Posted: 03.21.2017

By Nicole Mortillaro, CBC News Have you ever witnessed an event with a friend, only to find that the two of you had different accounts of what occurred? This is known as perception bias. Our views and beliefs can cloud the way we perceive things — and perception bias can take on many forms. New research published in the Journal of Personality and Social Psychology found that people tend to perceive young black men as larger, stronger and more threatening than white men of the same size. This, the authors say, could place them at risk in encounters with police. The research was prompted by recent police shootings of black men in the United States — particularly cases involving descriptions of the men that didn't correspond with reality. Take, for example, the case of Dontre Hamilton. In 2014, the unarmed Hamilton was shot 14 times and killed by police in Milwaukee. The officer involved testified that he believed he would have been easily overpowered by Hamilton, whom he described as having a muscular build. But the autopsy report found that Hamilton was just five foot seven and weighed 169 pounds. Looking at the Hamilton case, as well as many other examples, the researchers sought to determine whether people hold psychologically driven preconceived notions about black men that they do not hold about white men. ©2017 CBC/Radio-Canada.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 23359 - Posted: 03.15.2017

Laurel Hamers Mistakes can be learning opportunities, but the brain needs time for lessons to sink in. When facing a fast and furious stream of decisions, even the momentary distraction of noting an error can decrease accuracy on the next choice, researchers report in the March 15 Journal of Neuroscience. “We have a brain region that monitors and says ‘you messed up’ so that we can correct our behavior,” says psychologist George Buzzell, now at the University of Maryland in College Park. But sometimes, that monitoring system can backfire, distracting us from the task at hand and causing us to make another error. “There does seem to be a little bit of time for people, after mistakes, where you're sort of offline,” says Jason Moser, a psychologist at Michigan State University in East Lansing, who wasn’t part of the study. To test people’s response to making mistakes, Buzzell and colleagues at George Mason University in Fairfax, Va., monitored 23 participants’ brain activity while they worked through a challenging task. Concentric circles flashed briefly on a screen, and participants had to respond with one hand if the two circles were the same color and the other hand if the circles were subtly different shades. After making a mistake, participants generally answered the next question correctly if they had a second or so to recover. But when the next challenge came very quickly after an error, as little as 0.2 seconds, accuracy dropped by about 10 percent. Electrical activity recorded from the visual cortex showed that participants paid less attention to the next trial if they had just made a mistake than if they had responded correctly. © Society for Science & the Public 2000 - 2017

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 23358 - Posted: 03.15.2017

By PHILIP FERNBACH and STEVEN SLOMAN How can so many people believe things that are demonstrably false? The question has taken on new urgency as the Trump administration propagates falsehoods about voter fraud, climate change and crime statistics that large swaths of the population have bought into. But collective delusion is not new, nor is it the sole province of the political right. Plenty of liberals believe, counter to scientific consensus, that G.M.O.s are poisonous, and that vaccines cause autism. The situation is vexing because it seems so easy to solve. The truth is obvious if you bother to look for it, right? This line of thinking leads to explanations of the hoodwinked masses that amount to little more than name calling: “Those people are foolish” or “Those people are monsters.” Such accounts may make us feel good about ourselves, but they are misguided and simplistic: They reflect a misunderstanding of knowledge that focuses too narrowly on what goes on between our ears. Here is the humbler truth: On their own, individuals are not well equipped to separate fact from fiction, and they never will be. Ignorance is our natural state; it is a product of the way the mind works. What really sets human beings apart is not our individual mental capacity. The secret to our success is our ability to jointly pursue complex goals by dividing cognitive labor. Hunting, trade, agriculture, manufacturing — all of our world-altering innovations — were made possible by this ability. Chimpanzees can surpass young children on numerical and spatial reasoning tasks, but they cannot come close on tasks that require collaborating with another individual to achieve a goal. Each of us knows only a little bit, but together we can achieve remarkable feats. © 2017 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 23316 - Posted: 03.06.2017

By Ruth Williams Scientists at New York University’s School of Medicine have probed the deepest layers of the cerebral cortices of mice to record the activities of inhibitory interneurons when the animals are alert and perceptive. The team’s findings reveal that these cells exhibit different activities depending on the cortical layer they occupy, suggesting a level of complexity not previously appreciated. In their paper published in Science today (March 2), the researchers also described the stimulatory and inhibitory inputs that regulate these cells, adding further details to the picture of interneuron operations within the cortical circuitry. “It is an outstanding example of circuit analysis and a real experimental tour de force,” said neuroscientist Massimo Scanziani of the University of California, San Diego, who was not involved in the work. Christopher Moore of Brown University in Providence, Rhode Island, who also did not participate in the research, echoed Scanziani’s sentiments. “It’s just a beautiful paper,” he said. “They do really hard experiments and come up with what seem to be really valid [observations]. It’s a well-done piece of work.” The mammalian cerebral cortex is a melting pot of information, where signals from sensory inputs, emotions, and memories are combined and processed to produce a coherent perception of the world. Excitatory cells are the most abundant type of cortical neurons and are thought to be responsible for the relay and integration of this information, while the rarer interneurons inhibit the excitatory cells to suppress information flow. Interneurons are “a sort of gatekeeper in the cortex,” said Scanziani. © 1986-2017 The Scientist

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 23314 - Posted: 03.04.2017

By Catherine Caruso When a football player clocks an opponent on the field, it often does not look so bad—until we see it in slow motion. Suddenly, a clean, fair tackle becomes a dirty play, premeditated to maim (as any bar full of indignant fans will loudly confirm). But why? A study published last August in the Proceedings of the National Academy of Sciences USA suggests that slow motion leads us to believe that the people involved were acting with greater intent. Researchers designed experiments based on a place where slow-motion video comes up a lot: the courtroom. They asked subjects to imagine themselves as jurors and watch a video of a convenience store robbery and shooting, either in slow motion or in real time. Those who watched the slow-motion video reported thinking the robber had more time to act and was acting with greater intent. The effect persisted even when the researchers displayed a timer on the screen to emphasize exactly how much time was passing, and it was reduced yet still present when subjects watched a combination of real-time and slow-motion videos of the crime (as they might in an actual courtroom). Participants also ascribed greater intent to a football player ramming an opponent when they viewed the play in slow motion. Werner Helsen, a kinesiologist at the University of Leuven in Belgium, who was not involved in the study, says the findings are in line with his own research on perception and decision making in crime scene interventions and violent soccer plays. One possible explanation for this slo-mo effect stems from our sense of time, which author Benjamin Converse, a psychologist at the University of Virginia, describes as “quite malleable.” He explains that when we watch footage in slow motion, we cannot help but assume that because we as viewers have more time to think through the events as they unfold, the same holds true for the people in the video. © 2017 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 23311 - Posted: 03.04.2017

By Drake Baer If you’re going to get any sort of science done, an experiment needs a control group: the unaffected, possibly placebo-ed population that didn’t take part in whatever intervention it is you’re trying to study. Back in the earlier days of cognitive neuroscience, the control condition was intuitive enough: Just let the person in the brain scanner lie in repose, awake yet quiet, contemplating the tube they’re inside of. But in 1997, 2001, and beyond, studies kept coming out saying that it wasn’t much of a control at all. When the brain is “at rest,” it’s doing anything but resting. When you don’t give its human anything to do, brain areas related to processing emotions, recalling memory, and thinking about what’s to come become quietly active. These self-referential streams of thought are so pervasive that in a formative paper Marcus Raichle, a Washington University neurologist who helped found the field, declared it “the default mode of brain function”; the constellation of brain areas that carries it out is the default mode network, or DMN. The name fits: when given nothing else to do, the brain defaults to thinking about the person it’s embedded in. Since then, the DMN has been implicated in everything from depression to creativity. People who daydream more tend to have a more active DMN; relatedly, dreaming itself appears to be an amplified version of mind-wandering. In Buddhist traditions, this chattering described by neuroscientists as the default mode is a dragon to be tamed, if not slain. Chögyam Trungpa, who was instrumental in bringing Tibetan Buddhism to the U.S., wrote in Cutting Through Spiritual Materialism that meditation practice is “necessary generally because our thinking pattern, our conceptualized way of conducting our life in the world, is either too manipulative, imposing itself upon the world, or else runs completely wild and uncontrolled. Therefore, our meditation practice must begin with ego’s outermost layer, the discursive thoughts which continually run through our minds, our mental gossip.” © 2017, New York Media LLC.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 23296 - Posted: 03.01.2017

By JOANNA KLEIN The good news is, the human brain is flexible and efficient. This helps us make sense of the world. But the bad news is, the human brain is flexible and efficient. This means the brain can sometimes make mistakes. You can watch this tension play out when the brain tries to connect auditory and visual speech. It’s why we may find a poorly dubbed kung fu movie hard to believe, and why we love believing the gibberish in those Bad Lip Reading Videos on YouTube. “By dubbing speech that is reasonably consistent with the available mouth movements, we can utterly change the meaning of what the original talker was saying,” said John Magnotti, a neuroscientist at Baylor College of Medicine in Texas. “Sometimes we can detect that something is a little off, but the illusion is usually quite compelling.” In a study published Thursday in PLOS Computational Biology, Dr. Magnotti and Michael Beauchamp, also a neuroscientist at Baylor College of Medicine, tried to pin down why our brains are susceptible to these kinds of perceptual mistakes by looking at a well-known speech illusion called the McGurk effect. By comparing mathematical models for how the brain integrates senses important in detecting speech, they found that the brain uses vision, hearing and experience when making sense of speech. If the mouth and voice are likely to come from the same person, the brain combines them; otherwise, they are kept separate. “You may think that when you’re talking to someone you’re just listening to their voice,” said Dr. Beauchamp, who led the study. “But it turns out that what their face is doing is actually profoundly influencing what you are perceiving.” © 2017 The New York Times Company
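The "combine them if they likely come from the same person, otherwise keep them separate" logic described here is the core idea behind causal-inference models of multisensory perception. The sketch below is a generic toy version of such a model, written purely for illustration; the Gaussian cue model, the decision rule, and every number in it are assumptions, not the specific model or parameters from the PLOS Computational Biology paper.

```python
import math

def gaussian(x, mu, sigma):
    """Likelihood of observing cue value x given a source at mu with sensory noise sigma."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def infer_speech(auditory_cue, visual_cue, sigma_a=1.0, sigma_v=1.0,
                 prior_common=0.5, candidate_sources=(-1.0, 1.0)):
    """Toy causal inference: decide whether mouth and voice share one cause.
    Cues sit on an arbitrary one-dimensional 'speech feature' axis
    (say, 'ba' near -1 and 'ga' near +1); all values are illustrative."""
    # Likelihood that both cues came from a single common source
    like_common = sum(gaussian(auditory_cue, s, sigma_a) * gaussian(visual_cue, s, sigma_v)
                      for s in candidate_sources) / len(candidate_sources)
    # Likelihood that the cues came from two independent sources
    like_a = sum(gaussian(auditory_cue, s, sigma_a) for s in candidate_sources) / len(candidate_sources)
    like_v = sum(gaussian(visual_cue, s, sigma_v) for s in candidate_sources) / len(candidate_sources)
    like_separate = like_a * like_v

    # Posterior probability that mouth and voice have a common cause
    p_common = (prior_common * like_common) / (
        prior_common * like_common + (1 - prior_common) * like_separate)

    if p_common > 0.5:
        # Fuse: reliability-weighted average of the two cues
        w_a = (1 / sigma_a ** 2) / (1 / sigma_a ** 2 + 1 / sigma_v ** 2)
        estimate = w_a * auditory_cue + (1 - w_a) * visual_cue
    else:
        # Segregate: rely on the auditory cue alone for the heard syllable
        estimate = auditory_cue
    return p_common, estimate

# Nearby cues: probably one talker, so the percept is a fused blend (McGurk-style)
print(infer_speech(auditory_cue=0.8, visual_cue=1.2))
# Distant cues: probably different sources, so the voice is heard on its own
print(infer_speech(auditory_cue=-1.0, visual_cue=1.0))
```

With these made-up settings, nearby auditory and visual cues are judged to share a cause and are fused into a single percept, while widely separated cues are kept apart, mirroring the behavior the article describes.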

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 15: Brain Asymmetry, Spatial Cognition, and Language
Link ID: 23261 - Posted: 02.21.2017

By Sam Wong Can a mouse be mindful? Researchers believe they have created the world’s first mouse model of meditation by using light to trigger brain activity similar to what meditation induces. The mice involved appeared less anxious, too. Human experiments show that meditation reduces anxiety, lowers levels of stress hormones and improves attention and cognition. In one study of the effects of two to four weeks of meditation training, Michael Posner of the University of Oregon and colleagues discovered changes in the white matter in volunteers’ brains, related to the efficiency of communication between different brain regions. The changes, picked up in scans, were particularly noticeable between the anterior cingulate cortex (ACC) and other areas. Since the ACC regulates activity in the amygdala, a region that controls fearful responses, Posner’s team concluded that the changes in white matter could be responsible for meditation’s effects on anxiety. The mystery was how meditation could alter the white matter in this way. Posner’s team figured that it was related to changes in theta brainwaves, measured using electrodes on the scalp. Meditation increases theta wave activity, even when people are no longer meditating. To test the theory, the team used optogenetics – they genetically engineered certain cells to be switched on by light. In this way, they were able to use pulses of light on mice to stimulate theta brainwave-like activity in the ACC. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 23260 - Posted: 02.21.2017

by Bethany Brookshire Gender bias works in subtle ways, even in the scientific process. The latest illustration of that: Scientists recommend women less often than men as reviewers for scientific papers, a new analysis shows. That seemingly minor oversight is yet another missed opportunity for women that might end up having an impact on hiring, promotions and more. Peer review is one of the bricks in the foundation supporting science. A researcher’s results don’t get published in a journal until they successfully pass through a gauntlet of scientific peers, who scrutinize the paper for faulty findings, gaps in logic or less-than-meticulous methods. The scientist submitting the paper gets to suggest names for those potential reviewers. Scientific journal editors may contact some of the recommended scientists, and then reach out to a few more. But peer review isn’t just about the paper (and scientist) being examined. Being the one doing the reviewing “has a number of really positive benefits,” says Brooks Hanson, an earth scientist and director of publications at the American Geophysical Union in Washington, D.C. “You read papers differently as a reviewer than you do as a reader or author. You look at issues differently. It’s a learning experience in how to write papers and how to present research.” Serving as a peer reviewer can also be a networking tool for scientific collaborations, as reviewers seek out authors whose work they admire. And of course, scientists put the journals they review for on their resumes when they apply for faculty positions, research grants and awards. © Society for Science & the Public 2000 - 2017.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 8: Hormones and Sex
Link ID: 23185 - Posted: 02.04.2017

Noah Charney The Chinese government just arrested a group of people associated with a sham tourist attraction that had lured hundreds of sightseers to a fake Terracotta Warriors exhibit composed entirely of modern replicas. Sotheby’s recently hired Jamie Martin of Orion Analytical, a forensic specialist in testing art, who then discovered that a recently sold Parmigianino painting is actually a modern forgery (Sotheby’s returned the buyer’s money and then sued the person on whose behalf it had sold the work). And the Ringling Museum in Sarasota, Florida, is hoping that a painting of Philip IV of Spain in its collection will be definitively determined to be by Velazquez, and not a copy in the style of Velazquez. And that’s just in the last week or so. Art forgery and authenticity seem to be in the news just about every week (to my publicist’s delight). But I’m on a bit of a brainstorm. After my interview with Nobel Prize winner Dr. Eric Kandel on the neuroscience behind how we humans understand art, I’ve developed a keen interest in art and the mind. I tackled selfies, self-portraits and facial recognition recently, as well as what happens when the brain fails to function properly and neglects to recognize the value of art. Since my last book was a history of forgery, it was perhaps inevitable that I would wonder about the neurology of the recognition of originals versus copies. But while I looked into forgery from a wide variety of angles for the book, neuroscience was not one of them. © 2017 Salon Media Group, Inc.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 7: Vision: From Eye to Brain
Link ID: 23166 - Posted: 01.30.2017

By ADAM BEAR and JOSHUA KNOBE What’s normal? Perhaps the answer seems obvious: What’s normal is what’s typical — what is average. But in a recent paper in the journal Cognition, we argue that the situation is more complicated than that. After conducting a series of experiments that examined how people decide whether something is normal or not, we found that when people think about what is normal, they combine their sense of what is typical with their sense of what is ideal. Normal, in other words, turns out to be a blend of statistical and moral notions. Our key finding can be illustrated with a simple example. Ask yourself, “What is the average number of hours of TV that people watch in a day?” Then ask yourself a question that might seem very similar: “What is the normal number of hours of TV for a person to watch in a day?” If you are like most of our experimental participants, you will not give the same answer to the second question that you give to the first. Our participants said the “average” number was about four hours and the “normal” number was about three hours. In addition, they said that the “ideal” number was about 2.5 hours. This has an interesting implication. It suggests that people’s conception of the normal deviates from the average in the direction of what they think ought to be so. Our studies found this same pattern in numerous other cases: the normal grandmother, the normal salad, the normal number of students to be bullied in a middle school. Again and again, our participants did not take the normal to be the same as the average. Instead, what people picked out as the “normal thing to do” or a “normal such-and-such” tended to be intermediate between what they thought was typical and what they thought was ideal. © 2017 The New York Times Company
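The TV example can be turned into a back-of-the-envelope calculation. The snippet below assumes, purely for illustration, that a "normal" judgment behaves like a weighted blend of the statistical average and the ideal; the article reports only that normality judgments fall between the two, and the weighting here is invented, not estimated from the study.

```python
def blended_normal(average, ideal, weight_on_ideal=0.6):
    """Illustrative model: a "normal" judgment sits between the statistical
    average and the ideal. The weight is a made-up parameter, not a value
    reported by the researchers."""
    return (1 - weight_on_ideal) * average + weight_on_ideal * ideal

# Judgments reported in the article: average ~4 hours of TV, ideal ~2.5 hours,
# "normal" ~3 hours.
print(blended_normal(average=4.0, ideal=2.5))  # 3.1, close to the reported "normal" of about 3
```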

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 23165 - Posted: 01.30.2017

By Carl Bialik A woman has never come closer to the presidency than Hillary Clinton did in winning the popular vote in November. Yet as women march in Washington on Saturday, many of them to protest the presidency of Donald Trump, an important obstacle to the first woman president remains: the hidden, internalized bias many people hold against career advancement by women. And perhaps surprisingly, there is evidence that women hold more of this bias, on average, than men do. There has been lots of discussion of the role that overt sexism played in both Trump’s campaign and at the ballot box. A YouGov survey conducted two weeks before the election, for example, found that Trump voters had much higher levels of sexism, on average, than Clinton voters, as measured by their level of agreement with statements such as “women seek to gain power by getting control over men.” An analysis of the survey found that sexism played a big role in explaining people’s votes, after controlling for other factors, including gender and political ideology. Other research has reached similar conclusions. Two recent studies of voters, however, suggest that another, subtler form of bias may also have been a factor in the election. These studies looked at what’s known as “implicit bias,” the unconscious tendency to associate certain qualities with certain groups — in this case, the tendency to associate men with careers and women with family. Researchers have found that this kind of bias is stronger on average in women than in men, and, among women, it is particularly strong among political conservatives. And at least according to one study, this unconscious bias was especially strong among one group in 2016: women who supported Trump.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 23134 - Posted: 01.23.2017

By Jordan Axt Imagine playing a game where you’re seated in front of four decks of cards. On the back of two decks are pictures of puppies; on the other two are pictures of spiders. Each deck has some cards that win points and others that lose points. In general, the puppy decks are “good” in that they win you more points than they lose, while the spider decks are “bad” in that they lose you more points than they win. You repeatedly select cards in hopes of winning as many points as possible. This game seems pretty easy — and it is. Most players favor the puppy decks from the start and quickly learn to continue favoring them because they produce more points. However, if the pictures on the decks are reversed, the game becomes a little harder. People may have a tougher time initially favoring spider decks because it’s difficult to learn that something people fear, like spiders, brings positive outcomes and something people enjoy, like puppies, brings negative outcomes. Performance on this learning task is best when one’s attitudes and motivations are aligned. For instance, when puppies earn you more points than spiders, people’s preference for puppies can lead them to select more puppies initially, and a motivation to earn as many points as possible leads them to select more and more puppies over time. But when spiders earn you more points than puppies, people have to overcome their initial aversion to spiders in order to perform well. © 2017 Scientific American
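The deck setup described above is easy to make concrete in code. This is a minimal sketch of the task structure, not the researchers' actual experiment: the point values, trial counts, and the simple explore-then-exploit player are all invented for illustration.

```python
import random

# Hypothetical payoff scheme: "good" decks win more than they lose on average,
# "bad" decks lose more than they win. The values are illustrative, not from the study.
DECKS = {
    "puppy_1": (+10, -5),   # (points on a winning card, points on a losing card)
    "puppy_2": (+10, -5),
    "spider_1": (+5, -10),
    "spider_2": (+5, -10),
}

def draw(deck_name):
    """Draw one card: half the cards in each deck win points, half lose points."""
    win, loss = DECKS[deck_name]
    return win if random.random() < 0.5 else loss

def play(n_trials=100):
    """A naive learner: pick decks at random at first, then favor the deck
    with the best average payoff seen so far."""
    totals = {name: 0.0 for name in DECKS}
    counts = {name: 0 for name in DECKS}
    score = 0
    for t in range(n_trials):
        if t < 20:  # explore during the early trials
            choice = random.choice(list(DECKS))
        else:       # then exploit the best-looking deck
            choice = max(DECKS, key=lambda d: totals[d] / counts[d] if counts[d] else 0)
        points = draw(choice)
        totals[choice] += points
        counts[choice] += 1
        score += points
    return score

print(play())
```

Swapping the payoff tuples between the puppy and spider decks reproduces the harder version of the game, where the feared decks are the profitable ones.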

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 23130 - Posted: 01.21.2017

By Rachael Lallensack A video game is helping researchers learn more about how tiny European starlings keep predators at bay. Their massive flocks, consisting of hundreds to thousands of birds, fly together in a mesmerizing, pulsating pattern called a murmuration. For a long time, researchers have suspected that the bigger the flock, the harder it is for predators like falcons and hawks to take down any one member, something known as the “confusion effect.” Now, researchers have analyzed that effect—in human hunters. Using the first 3D computer program to simulate a murmuration, scientists tested how well 25 players, acting as flying predators, could target and pursue virtual starlings, whose movements were simulated based on data from real starling flocks. The team’s findings reaffirmed the confusion effect: The larger the simulated flocks, the harder it was for the “predators” to single out and catch individual prey, the researchers report this week in Royal Society Open Science. So maybe sometimes, it’s not so bad to get lost in a crowd. © 2017 American Association for the Advancement of Science.

Related chapters from BP7e: Chapter 10: Vision: From Eye to Brain; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 7: Vision: From Eye to Brain; Chapter 14: Attention and Consciousness
Link ID: 23115 - Posted: 01.18.2017

By Alan Burdick Some nights—more than I like, lately—I wake to the sound of the bedside clock. The room is dark, without detail, and it expands in such a way that it seems as if I’m outdoors, under an empty sky, or underground, in a cavern. I might be falling through space. I might be dreaming. I could be dead. Only the clock moves, its tick steady, unhurried. At these moments I have the most chilling understanding that time moves in only one direction. I’m tempted to look at the clock, but I already know that it’s the same time it always is: 4 A.M., or 4:10 A.M., or once, for a disconcerting stretch of days, 4:27 A.M. Even without looking, I could deduce the time from the ping of the bedroom radiator gathering steam in winter or the infrequency of the cars passing by on the street outside. In 1917, the psychologist Edwin G. Boring and his wife, Lucy, described an experiment in which they woke people at intervals to see if they knew what time it was; the average estimate was accurate to within fifty minutes, although almost everyone thought it was later than it actually was. They found that subjects were relying on internal or external signals: their degree of sleepiness or indigestion (“The dark brown taste in your mouth is never bad when you have been asleep only a short time”), the moonlight, “bladder cues,” the sounds of cars or roosters. “When a man is asleep, he has in a circle round him the chain of the hours, the sequence of the years, the order of the heavenly bodies,” Proust wrote. “Instinctively he consults them when he awakes, and in an instant reads off his own position on the earth’s surface and the time that has elapsed during his slumbers.” © 2017 Condé Nast.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 14: Biological Rhythms, Sleep, and Dreaming
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 10: Biological Rhythms and Sleep
Link ID: 23109 - Posted: 01.16.2017

By Maggie Koerth-Baker “The president can’t have a conflict of interest,” Donald Trump told The New York Times in November. He appears to have meant that in the legal sense — the president isn’t bound by the same conflict-of-interest laws that loom over other executive branch officials and employees. But that doesn’t mean the president’s interests can’t be in conflict. When he takes office Jan. 20, Trump will be tangled in a wide array of situations in which his personal connections and business coffers are pulling him in one direction while the interests of the American presidency and people pull him in another. For example, Trump is the president of a vineyard in Virginia that’s requesting foreign worker visas from the government he’ll soon lead. He’s also involved in an ongoing business partnership with the Philippines’ diplomatic trade envoy — a relationship that could predispose Trump to accepting deals that are more favorable to that country than he otherwise might. Once he’s in office, he will appoint some members of the labor board that could hear disputes related to his hotels. Neither Trump nor his transition team replied to interview requests for this article, but his comments to the Times suggest that he genuinely believes he can be objective and put the country first, despite financial and social pressures to do otherwise. Unfortunately, science says he’s probably wrong.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 23108 - Posted: 01.16.2017

By Victoria Gill Science reporter, BBC News Direct recordings have revealed what is happening in our brains as we make sense of speech in a noisy room. Focusing on one conversation in a loud, distracting environment is called "the cocktail party effect". It is a common festive phenomenon and of interest to researchers seeking to improve speech recognition technology. Neuroscientists recorded from people's brains during a test that recreated the moment when unintelligible speech suddenly makes sense. A team measured people's brain activity as the words of a previously unintelligible sentence suddenly became clear when a subject was told the meaning of the "garbled speech". The findings are published in the journal Nature Communications. Lead researcher Christopher Holdgraf from the University of California, Berkeley, and his colleagues were able to work with epilepsy patients, who had had a portion of their skull removed and electrodes placed on the brain surface to track their seizures. First, the researchers played a very distorted, garbled sentence to each subject, which almost no-one was able to understand. They then played a normal, easy-to-understand version of the same sentence and immediately repeated the garbled version. "After hearing the intact sentence," the researchers explained in their paper, all the subjects understood the subsequent "noisy version". The recordings captured this moment of recognition as patterns of activity in brain areas known to be associated with processing sound and understanding speech. When the subjects heard the very garbled sentence, the scientists reported that they saw little activity in those parts of the brain. Hearing the clearly understandable sentence then triggered patterns of activity in those brain areas. © 2016 BBC.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 9: Hearing, Vestibular Perception, Taste, and Smell
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 23004 - Posted: 12.22.2016

Answer by Paul King, Director of Data Science, on Quora: There are hundreds of surprising, perspective-shifting insights about the nature of reality that come from neuroscience. Every bizarre neurological syndrome, every visual illusion, and every clever psychological experiment reveals something entirely unexpected about our experience of the world that we take for granted. Here are a few to give a flavor:
1. Perceptual reality is entirely generated by our brain. We hear voices and meaning from air pressure waves. We see colors and objects, yet our brain only receives signals about reflected photons. The objects we perceive are a construct of the brain, which is why optical illusions can fool the brain.
2. We see the world in narrow disjoint fragments. We think we see the whole world, but we are looking through a narrow visual portal onto a small region of space. You have to move your eyes when you read because most of the page is blurry. We don't see this, because as soon as we become curious about part of the world, our eyes move there to fill in the detail before we see it was missing. While our eyes are in motion, we should see a blank blur, but our brain edits this out.
3. Body image is dynamic and flexible. Our brain can be fooled into thinking a rubber arm or a virtual reality hand is actually a part of our body. In one syndrome, people believe one of their limbs does not belong to them. One man thought a cadaver limb had been sewn onto his body as a practical joke by doctors.
4. Our behavior is mostly automatic, even though we think we are controlling it.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Consciousness
Link ID: 22980 - Posted: 12.13.2016

By DANIEL A. YUDKIN and JAY VAN BAVEL During the first presidential debate, Hillary Clinton argued that “implicit bias is a problem for everyone, not just police.” Her comment moved to the forefront of public conversation an issue that scientists have been studying for decades: namely, that even well-meaning people frequently harbor hidden prejudices against members of other racial groups. Studies have shown that these subtle biases are widespread and associated with discrimination in legal, economic and organizational settings. Critics of this notion, however, protest what they see as a character smear — a suggestion that everybody, deep down, is racist. Vice President-elect Mike Pence has said that an “accusation of implicit bias” in cases where a white police officer shoots a black civilian serves to “demean law enforcement.” Writing in National Review, David French claimed that the concept of implicit bias lets people “indict entire communities as bigoted.” But implicit bias is not about bigotry per se. As new research from our laboratory suggests, implicit bias is grounded in a basic human tendency to divide the social world into groups. In other words, what may appear as an example of tacit racism may actually be a manifestation of a broader propensity to think in terms of “us versus them” — a prejudice that can apply, say, to fans of a different sports team. This doesn’t make the effects of implicit bias any less worrisome, but it does mean people should be less defensive about it. Furthermore, our research gives cause for optimism: Implicit bias can be overcome with rational deliberation. In a series of experiments whose results were published in The Journal of Experimental Psychology: General, we set out to determine how severely people would punish someone for stealing. Our interest was in whether a perpetrator’s membership in a particular group would influence the severity of the punishment he or she received. © 2016 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 22979 - Posted: 12.12.2016