Links for Keyword: Attention




by Helen Thompson Paper wasps have a knack for recognizing faces, and a new study adds to our understanding of what that means in a wasp’s brain. Most wasps of a given species look the same, but some species of paper wasp (Polistes spp.) display varied colors and markings. Recognizing these patterns is at the core of the wasps’ social interactions. One species, Polistes fuscatus, is especially good at detecting differences in faces — even better than it is at detecting other patterns. To zero in on the roots of this ability, biologist Ali Berens of Georgia Tech and her colleagues set up recognition exercises of faces and basic patterns for P. fuscatus wasps and P. metricus wasps — a species that doesn’t naturally recognize faces but can be trained to do so in the lab. After the training, scientists extracted DNA from the wasps’ brains and looked at which genes were active. The researchers found 237 genes that were at play only in P. fuscatus during facial recognition tests. A few of the genes have been linked to honeybee visual learning, and some correspond to brain signaling with the neurotransmitters serotonin and tachykinin. In the brain, picking up on faces goes beyond basic pattern learning, the researchers conclude June 14 in the Journal of Experimental Biology. It’s possible that some of the same genes also play a broader role in how organisms such as humans and sheep tell one face from another. © Society for Science & the Public 2000 - 2017

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 23742 - Posted: 06.15.2017

Laurel Hamers A monkey’s brain builds a picture of a human face somewhat like a Mr. Potato Head — piecing it together bit by bit. The code that a monkey’s brain uses to represent faces relies not on groups of nerve cells tuned to specific faces — as has been previously proposed — but on a population of about 200 cells that code for different sets of facial characteristics. Added together, the information contributed by each nerve cell lets the brain efficiently capture any face, researchers report June 1 in Cell. “It’s a turning point in neuroscience — a major breakthrough,” says Rodrigo Quian Quiroga, a neuroscientist at the University of Leicester in England who wasn’t part of the work. “It’s a very simple mechanism to explain something as complex as recognizing faces.” Until now, Quiroga says, the leading explanation for the way the primate brain recognizes faces proposed that individual nerve cells, or neurons, respond to certain types of faces (SN: 6/25/05, p. 406). A system like that might work for the few dozen people with whom you regularly interact. But accounting for all of the peripheral people encountered in a lifetime would require a lot of neurons. It now seems that the brain might have a more efficient strategy, says Doris Tsao, a neuroscientist at Caltech. Tsao and coauthor Le Chang used statistical analyses to identify 50 variables that accounted for the greatest differences between 200 face photos. Those variables represented somewhat complex changes in the face — for instance, the hairline rising while the face becomes wider and the eyes become more widely set. © Society for Science & the Public 2000 - 2017.
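The axis-coding idea described above lends itself to a small numerical sketch. The Python snippet below is a minimal illustration, not the authors' analysis code: the synthetic face vectors, their dimensions, and the random tuning axes are all assumptions made for demonstration. It shows how 50 directions of greatest variation can be derived from 200 faces, how roughly 200 linearly tuned model cells would encode a face, and why any face is then recoverable from the population response.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 200 "face photos", each flattened to a
# 1,000-dimensional vector (the real study used measured face features).
faces = rng.normal(size=(200, 1000))

# Step 1: find the 50 directions of greatest variation across the faces,
# analogous to the 50 shape/appearance variables Chang and Tsao derived.
centered = faces - faces.mean(axis=0)
# SVD-based PCA: the first rows of Vt are the top principal axes.
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
axes50 = Vt[:50]                    # (50, 1000) principal axes
codes = centered @ axes50.T         # each face becomes a 50-number code

# Step 2: model ~200 face cells, each tuned to a random axis in the
# 50-D face space; a cell's "firing rate" is the projection of the
# face's code onto its preferred axis.
preferred = rng.normal(size=(200, 50))
responses = codes @ preferred.T     # (200 faces, 200 cells)

# Step 3: because the code is linear, each face's 50-number code (and
# hence the face) can be recovered from the population response alone.
decoded, *_ = np.linalg.lstsq(preferred, responses.T, rcond=None)
print("max reconstruction error:", np.abs(decoded.T - codes).max())
```

Run as written, the reconstruction error sits at machine precision, which is the gist of the reported finding: a modest population of linearly tuned cells is enough to capture any face in the space.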

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 23701 - Posted: 06.02.2017

By Mitch Leslie Colin Wahl, a market research consultant in Chapel Hill, North Carolina, was recovering nicely from triple bypass surgery last year when he noticed a white spot on the incision. It proved to be an obstinate infection that required three further surgeries to eradicate. Wahl, now 61, says his mind hasn't been as sharp since. "It's little things mostly related to memory." An avid recreational hockey player, he would forget to bring his skates or sticks to the rink. Certain words became elusive. Just hours after talking to a colleague about Tasmania, he couldn't recall the word. Instead, he says, the phrase "Outback Australia" was stuck in his mind. "I'm trying to remember something and something else slips into that memory slot." Many of us can recount a similar story about a friend, colleague, or loved one—usually elderly—whose mental condition deteriorated after a visit to an operating room. "The comment that ‘So-and-so has never been the same after the operation’ is pervasive," says anesthesiologist Roderic Eckenhoff of the University of Pennsylvania. Often, surgical patients are beset by postoperative delirium—delusions, confusion, and hallucinations—but that usually fades quickly. Other people develop what has been dubbed postoperative cognitive dysfunction (POCD), suffering problems with memory, attention, and concentration that can last months or even a lifetime. POCD not only disrupts patients' lives, but may also augur worse to come. According to a 2008 study, people who have POCD 3 months after they leave the hospital are nearly twice as likely to die within a year as are surgical patients who report no mental setbacks. With the ballooning senior population needing more surgeries, "this is going to become an epidemic," says anesthesiologist Mervyn Maze of the University of California, San Francisco. © 2017 American Association for the Advancement of Science.

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 14: Biological Rhythms, Sleep, and Dreaming
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 10: Biological Rhythms and Sleep
Link ID: 23691 - Posted: 06.01.2017

Giuseppe Gangarossa Could we lead a normal existence without a social life? Sociability is an essential aspect of individual life, and social interaction shapes our existence: it enhances quality of life and improves the stability of communities. Impaired sociability is a classical symptom observed in many neuropsychiatric disorders including autism, schizophrenia, depression, anxiety and generalized fear. Interestingly, many studies have pointed to the medial prefrontal cortex (mPFC), a brain area located in the ventromedial part of the frontal lobe, as a key region involved in the neural bases of sociability (Valk et al., 2015; Treadway et al., 2015; Frith et al., 2007). The prelimbic cortex (PL) and the infralimbic cortex (IL), two subregions of the mPFC, have been strongly suggested to play an important role in the neural mechanisms underlying sociability, as isolation rearing in rats results in impaired social behavior and structural modifications in the PL and IL. Isolation rearing is a neurodevelopmental manipulation that produces neurochemical, structural, and behavioral alterations in rodents that are in many ways consistent with psychiatric disorders such as schizophrenia, anxiety and depression. In particular, it has been shown that isolation rearing can alter the volume of the mPFC, the dendritic length and the spine density of pyramidal neurons. However, the detailed mechanisms involved in sociability disorders remain elusive and poorly understood. A recent article published in PLOS ONE by Minami and colleagues aimed at measuring neural activity in the PL and IL of control and isolated rats during social interaction, in order to determine whether neural activity in these areas relates to social behavior.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 15: Brain Asymmetry, Spatial Cognition, and Language
Link ID: 23688 - Posted: 06.01.2017

by Angela Chen What happens when you look up and see a ball headed toward you? Without even thinking about it, you flinch. That might be because our brains are constantly living our lives in fast-forward, playing out the action in our head before it happens. Humans have to navigate, and respond to, an environment that is always changing. Our brain compensates for this by constantly making predictions about what’s going to happen, says Mattias Ekman, a researcher at Radboud University Nijmegen in the Netherlands. We’ve known this for a while, but these predictions are usually associative. An example: if you see a hamburger, your brain might predict that there will be fries nearby. In a study published today in the journal Nature Communications, Ekman and other scientists focused instead on how the brain predicts motion. So they used brain scans to track what happened as participants observed a moving dot. First, 29 volunteers looked at a white dot the size of a ping-pong ball. The dot went from left to right and then reversed direction. The volunteers watched the dot for about five minutes while scientists scanned their brains with ultra-fast fMRI. This way, the researchers knew which pattern of brain activity was evoked in the visual cortex while the volunteers watched the dot. After these five minutes, the researchers showed only the beginning of the sequence to the volunteers. Here, the scans showed that the brain “autocompletes” the full sequence — and it does so at twice the rate of the actual event. So if a dot took two seconds to go across the screen, the brain predicted the entire sequence in one second. “You’re actually already trying to predict what’s going to happen,” says Ekman. “These predictions are hypothetical, so in a way you’re trying to generate new memories that match the future.” © 2017 Vox Media, Inc.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 23653 - Posted: 05.24.2017

Jon Hamilton It took an explosion and 13 pounds of iron to usher in the modern era of neuroscience. In 1848, a 25-year-old railroad worker named Phineas Gage was blowing up rocks to clear the way for a new rail line in Cavendish, Vt. He would drill a hole, place an explosive charge, then pack in sand using a 13-pound metal bar known as a tamping iron. But in this instance, the metal bar created a spark that touched off the charge. That, in turn, "drove this tamping iron up and out of the hole, through his left cheek, behind his eye socket, and out of the top of his head," says Jack Van Horn, an associate professor of neurology at the Keck School of Medicine at the University of Southern California. Gage didn't die. But the tamping iron destroyed much of his brain's left frontal lobe, and Gage's once even-tempered personality changed dramatically. "He is fitful, irreverent, indulging at times in the grossest profanity, which was not previously his custom," wrote John Martyn Harlow, the physician who treated Gage after the accident. This sudden personality transformation is why Gage shows up in so many medical textbooks, says Malcolm Macmillan, an honorary professor at the Melbourne School of Psychological Sciences and the author of An Odd Kind of Fame: Stories of Phineas Gage. "He was the first case where you could say fairly definitely that injury to the brain produced some kind of change in personality," Macmillan says. © 2017 npr

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 23643 - Posted: 05.22.2017

By MARTIN E. P. SELIGMAN and JOHN TIERNEY We are misnamed. We call ourselves Homo sapiens, the “wise man,” but that’s more of a boast than a description. What makes us wise? What sets us apart from other animals? Various answers have been proposed — language, tools, cooperation, culture, tasting bad to predators — but none is unique to humans. What best distinguishes our species is an ability that scientists are just beginning to appreciate: We contemplate the future. Our singular foresight created civilization and sustains society. It usually lifts our spirits, but it’s also the source of most depression and anxiety, whether we’re evaluating our own lives or worrying about the nation. Other animals have springtime rituals for educating the young, but only we subject them to “commencement” speeches grandly informing them that today is the first day of the rest of their lives. A more apt name for our species would be Homo prospectus, because we thrive by considering our prospects. The power of prospection is what makes us wise. Looking into the future, consciously and unconsciously, is a central function of our large brain, as psychologists and neuroscientists have discovered — rather belatedly, because for the past century most researchers have assumed that we’re prisoners of the past and the present. Behaviorists thought of animal learning as the ingraining of habit by repetition. Psychoanalysts believed that treating patients was a matter of unearthing and confronting the past. Even when cognitive psychology emerged, it focused on the past and present — on memory and perception. But it is increasingly clear that the mind is mainly drawn to the future, not driven by the past. Behavior, memory and perception can’t be understood without appreciating the central role of prospection. We learn not by storing static records but by continually retouching memories and imagining future possibilities. Our brain sees the world not by processing every pixel in a scene but by focusing on the unexpected. © 2017 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 23641 - Posted: 05.20.2017

By Bret Stetka For many hours a day they pluck dirt, debris and bugs from each other’s fur. Between grooming sessions they travel in troops to search for food. When ignored by mom, they throw tantrums; when not ignored by zoo-goers, they throw feces. Through these behaviors, monkeys demonstrate they understand the meaning of social interactions with other monkeys. They recognize when their peers are grooming one another and infer social rank from seeing such actions within their group. But it has long been unclear how the brains of our close evolutionary relatives actually process what they observe of these social situations. New findings published Thursday in Science offer a clue. A team of researchers from The Rockefeller University has identified a network in the monkey brain dedicated exclusively to analyzing social interactions. And they believe this network could be akin to human brains’ social circuitry. In the new work—led by Winrich Freiwald, an associate professor of neurosciences and behavior—four rhesus macaques viewed videos of various social and physical interactions while undergoing functional magnetic resonance imaging. (Monkeys love watching TV, so they paid attention.) They were shown clips of monkeys interacting, as well as performing tasks on their own. They also watched videos of various physical interactions among inanimate objects. © 2017 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 15: Brain Asymmetry, Spatial Cognition, and Language
Link ID: 23637 - Posted: 05.19.2017

By Agata Blaszczak-Boxe We tend to be worse at telling apart faces of other races than those of our own race, studies have found. Now research shows some people are completely blind to features that make other-race faces distinct. Such an impairment could have important implications for eyewitness testimony in situations involving other-race suspects. The ability to distinguish among members of one's own race varies wildly: some people can tell strangers apart effortlessly, whereas others cannot even recognize the faces of their own family and friends (a condition known as prosopagnosia). Psychologist Lulu Wan of the Australian National University and her colleagues wanted to quantify the distribution of abilities for recognizing other-race faces. They asked 268 Caucasians born and raised in Australia to memorize a series of six Asian faces and conducted the same experiment, involving Caucasian faces, with a group of 176 Asians born and raised in Asia who moved to Australia to attend university. In 72 trials, every participant was then shown sets of three faces and had to point to the one he or she had learned in the memorization task. The authors found that 26 Caucasian and 10 Asian participants—8 percent of the collective study population—did so badly on the test that they met the criteria for clinical-level impairment. “We know that we are poor at recognizing other-race faces,” says Jim Tanaka, a professor of psychology at the University of Victoria in British Columbia, who was not involved in the research. “This study shows just how poor some people are.” Those individuals “would be completely useless in terms of their legal value as an eyewitness,” says study co-author Elinor McKone, a professor of psychology at the Australian National University. The world's legal systems do not, however, take into account individual differences in other-race face recognition, she notes. © 2017 Scientific American

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 15: Brain Asymmetry, Spatial Cognition, and Language
Link ID: 23602 - Posted: 05.11.2017

Ian Sample Science editor It isn’t big and it isn’t clever. But the benefits, known to anyone who has moved home, climbed a mountain, or pushed a broken-down car, have finally been confirmed: according to psychologists, swearing makes you stronger. The upside of letting profanities fly emerged from a series of experiments with people who repeated either a swear word or a neutral word as they pounded away on an exercise bike, or performed a simple hand-grip test. When people cursed their way through the half-minute bike challenge, their peak power rose by 24 watts on average, according to the study. In the 10-second grip task, swearers boosted their strength by the equivalent of 2.1kg, researchers found. “In the short period of time we looked at there are benefits from swearing,” said Richard Stephens, a psychologist at Keele University, who presented the results at the British Psychological Society meeting in Brighton. Stephens enrolled 29 people aged about 21 for the cycling test, and 52 people with a typical age of 19 for the hand-grip test. All were asked to choose a swearword to repeat in the studies, based on a term they might utter if they banged their head. For the neutral word, the volunteers were asked to pick a word they might use to describe a table, such as “wooden” or “brown”. © 2017 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 15: Brain Asymmetry, Spatial Cognition, and Language
Link ID: 23575 - Posted: 05.05.2017

Long assumed to be a mere “relay,” an often-overlooked egg-like structure in the middle of the brain also turns out to play a pivotal role in tuning up thinking circuitry. A trio of studies in mice funded by the National Institutes of Health are revealing that the thalamus sustains the ability to distinguish categories and hold thoughts in mind. By manipulating the activity of thalamus neurons, scientists were able to control an animal’s ability to remember how to find a reward. In the future, the thalamus might even become a target for interventions to reduce cognitive deficits in psychiatric disorders such as schizophrenia, researchers say. “If the brain works like an orchestra, our results suggest the thalamus may be its conductor,” explained Michael Halassa, M.D., Ph.D., of New York University (NYU) Langone Medical Center, a BRAINS Award grantee of the NIH’s National Institute of Mental Health (NIMH), and also a grantee of the National Institute of Neurological Disorders and Stroke (NINDS). “It helps ensembles play in sync by boosting their functional connectivity.” Three independent teams of investigators, led by Halassa; by Joshua Gordon, M.D., Ph.D., formerly of Columbia University, New York City, now NIMH director, in collaboration with Christoph Kellendonk, Ph.D., of Columbia; and by Karel Svoboda, Ph.D., at the Howard Hughes Medical Institute Janelia Research Campus, Ashburn, Virginia, in collaboration with Charles Gerfen, Ph.D., of the NIMH Intramural Research Program, report on the newfound role for the thalamus online May 3, 2017 in the journals Nature and Nature Neuroscience.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 23571 - Posted: 05.04.2017

By Thomas MacMillan “Time” is the most common noun in the English language, Dean Buonomano tells us on the first page of his new book, Your Brain Is a Time Machine: The Neuroscience and Physics of Time. But despite our fixation with time, and its obvious centrality in our lives, we still struggle to fully understand it. From a psychology perspective, for instance, time seems to flow by, sometimes slowly — like when we’re stuck in line at the DMV — and sometimes quickly — like when we’re lost in an engrossing novel. But from a physics perspective, time may be simply another dimension in the universe, like length, height, or width. Buonomano, a professor of neuroscience at UCLA, lays out the latest, best theories about how we understand time, illuminating a fundamental aspect of being human. The human brain, he writes, is a time machine that allows us to mentally travel backward and forward, to plan for the future and agonizingly regret the past like no other animal. And, he argues, our brains are time machines like clocks are time machines: constantly tracking the passage of time, whether it’s circadian rhythms that tell us when to go to sleep, or microsecond calculations that allow us to hear the difference between “They gave her cat-food” and “They gave her cat food.” In an interview with Science of Us, Buonomano spoke about planning for the future as a basic human activity, the limits of be-here-now mindfulness, and the inherent incompatibility between physicists’ and neuroscientists’ understanding of the nature of time.

I finished reading your book late last night and went to bed sort of planning our interview today, and then woke up at about 3:30 a.m. ready to do the interview, with my head full of insistent thoughts about questions that I should ask you. So was that my brain being a — maybe malfunctioning — time machine?

I think this is consistent with the notion that the brain is an organ that’s future-oriented. As far as survival goes, the evolutionary value of the brain is to act in the present to ensure survival in the future, whether survival is figuring out a good place to get food, or doing an interview, I suppose. © New York Media LLC

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 23537 - Posted: 04.26.2017

Nicola Davis Apes are on a par with human infants in being able to tell when people have an accurate belief about a situation or are actually mistaken, researchers say. While previous work has shown that great apes understand the goals, desires and perceptions of others, scientists say the latest finding reveals an important cognitive ability. “For the last 30 or more years people thought that belief understanding is the key marker of humans and really differentiates us from other species – and this does not seem to be the case,” said David Buttelmann, co-author of the research from the Max Planck Institute for Evolutionary Anthropology in Germany. The results follow on the heels of a study published last year which also suggests that apes understand the concept of false beliefs – after research that used eye-tracking technology to monitor the gaze of apes exposed to various pranks carried out by an actor dressed in a King Kong suit. But the new study, says Buttelmann, is an important step forward, showing that apes not only understand false belief in others, but apply that understanding to their own actions. Writing in the journal PLOS ONE, Buttelmann and colleagues described exploring the understanding of false belief in 34 great apes, including bonobos, chimpanzees and orangutans, using a test that can be passed by human infants at one to two years of age. © 2017 Guardian News and Media Limited

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 23457 - Posted: 04.06.2017

Laurel Hamers SAN FRANCISCO — When faced with simple math problems, people who get jittery about the subject may rely more heavily on certain brain circuitry than math-savvy people do. The different mental approach could help explain why people with math anxiety struggle on more complicated problems, researchers reported March 25 at the Cognitive Neuroscience Society’s annual meeting. While in fMRI machines, adults with and without math anxiety evaluated whether simple arithmetic problems, such as 9+2=11, were correct or incorrect. Both groups had similar response times and accuracy on the problems, but brain scans turned up differences. Specifically, in people who weren’t anxious about math, lower activation of the frontoparietal attention network was linked to better performance. That brain network is involved in working memory and problem solving. Math-anxious people showed no correlation between performance and frontoparietal network activity. People who used the circuit less were probably getting ahead by automating simple arithmetic, said Hyesang Chang, a cognitive neuroscientist at the University of Chicago. Because math-anxious people showed more variable brain activity overall, Chang speculated that they might instead be using a variety of computationally demanding strategies. This scattershot approach works fine for simple math, she said, but might get maxed out when the math is more challenging.

Citation: H. Chang et al. Simple arithmetic: Not so simple for highly math anxious individuals. Cognitive Neuroscience Society Annual Meeting, San Francisco, March 25, 2017. © Society for Science & the Public 2000 - 2017.

Related chapters from BP7e: Chapter 15: Emotions, Aggression, and Stress; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 11: Emotions, Aggression, and Stress; Chapter 14: Attention and Consciousness
Link ID: 23411 - Posted: 03.28.2017

By Scott Barry Kaufman Rarely do I read a scientific paper that overwhelms me with so much excitement, awe, and reverence. Well, a new paper in Psychological Science has really got me revved up, and I am bursting to share their findings with you! Most research on mind-wandering and daydreaming draws on one of two methods: strict laboratory conditions that ask people to complete boring cognitive tasks, and retrospective surveys that ask people to recall how often they daydream in daily life. It has been rather difficult to compare these results to each other; laboratory tasks aren't representative of how we normally go about our day, and surveys are prone to memory distortion. In this new, exciting study, Michael Kane and colleagues directly compared laboratory mind-wandering with real-life mind-wandering within the same person, and used an important methodology called "experience-sampling" that allows the researcher to capture people's ongoing stream of consciousness. For 7 days, 8 times a day, the researchers randomly asked 274 undergraduates at the University of North Carolina at Greensboro whether they were mind-wandering and about the quality of their daydreams. They also asked them to engage in a range of tasks in the laboratory that assessed their rates of mind-wandering, the contents of their off-task thoughts, and their "executive functioning" (a set of skills that helps keep things in memory despite distractions and focus on the relevant details). What did they find? © 2017 Scientific American
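Experience-sampling is, at bottom, a scheduling protocol: prompts must land at unpredictable times so participants cannot prepare their answers. As a rough sketch of what such a schedule could look like (the 9 a.m. to 9 p.m. waking window and the uniform random draws are assumptions; the excerpt does not give the paper's actual scheduling rules), here is a minimal Python generator:

```python
import random
from datetime import datetime, time, timedelta

def sample_schedule(days=7, prompts_per_day=8,
                    start=time(9, 0), end=time(21, 0), seed=1):
    """Draw `prompts_per_day` random prompt times per day within a
    waking window, for `days` consecutive days (assumed parameters)."""
    rng = random.Random(seed)
    day0 = datetime.combine(datetime.today().date(), start)
    # Length of the daily waking window, in seconds.
    window = int((datetime.combine(datetime.today().date(), end)
                  - day0).total_seconds())
    schedule = []
    for day in range(days):
        # Uniform random offsets within the window, sorted into order.
        offsets = sorted(rng.randrange(window)
                         for _ in range(prompts_per_day))
        schedule.append([day0 + timedelta(days=day, seconds=s)
                         for s in offsets])
    return schedule

for day_num, prompts in enumerate(sample_schedule(), start=1):
    print(f"day {day_num}:",
          ", ".join(t.strftime("%H:%M") for t in prompts))
```

Each run yields 56 prompt times (7 days x 8 prompts); in a real study each prompt would trigger the mind-wandering questionnaire on the participant's device.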

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 23409 - Posted: 03.27.2017

Laura Sanders Not too long ago, the internet was stationary. Most often, we’d browse the Web from a desktop computer in our living room or office. If we were feeling really adventurous, maybe we’d cart our laptop to a coffee shop. Looking back, those days seem quaint. Today, the internet moves through our lives with us. We hunt Pokémon as we shuffle down the sidewalk. We text at red lights. We tweet from the bathroom. We sleep with a smartphone within arm’s reach, using the device as both lullaby and alarm clock. Sometimes we put our phones down while we eat, but usually faceup, just in case something important happens. Our iPhones, Androids and other smartphones have led us to effortlessly adjust our behavior. Portable technology has overhauled our driving habits, our dating styles and even our posture. Despite the occasional headlines claiming that digital technology is rotting our brains, not to mention what it’s doing to our children, we’ve welcomed this alluring life partner with open arms and swiping thumbs. Scientists suspect that these near-constant interactions with digital technology influence our brains. Small studies are turning up hints that our devices may change how we remember, how we navigate and how we create happiness — or not. Somewhat limited, occasionally contradictory findings illustrate how science has struggled to pin down this slippery, fast-moving phenomenon. Laboratory studies hint that technology, and its constant interruptions, may change our thinking strategies. Like our husbands and wives, our devices have become “memory partners,” allowing us to dump information there and forget about it — an off-loading that comes with benefits and drawbacks. Navigational strategies may be shifting in the GPS era, a change that might be reflected in how the brain maps its place in the world. Constant interactions with technology may even raise anxiety in certain settings. © Society for Science & the Public 2000 - 2017

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 14: Attention and Consciousness
Link ID: 23385 - Posted: 03.21.2017

By Nicole Mortillaro, CBC News Have you ever witnessed an event with a friend, only to find that the two of you gave different accounts of what occurred? This is known as perception bias. Our views and beliefs can cloud the way we perceive things — and perception bias can take on many forms. New research published in the Journal of Personality and Social Psychology found that people tend to perceive young black men as larger, stronger and more threatening than white men of the same size. This, the authors say, could place them at risk in situations with police. The research was prompted by recent police shootings of black men in the United States — particularly those involving descriptions of the men that didn't correspond with reality. Take, for example, the case of Dontre Hamilton. In 2014, the unarmed Hamilton was shot 14 times and killed by police in Milwaukee. The officer involved testified that he believed he would have been easily overpowered by Hamilton, whom he described as having a muscular build. But the autopsy report found that Hamilton was just five foot seven and weighed 169 pounds. Looking at the Hamilton case, as well as many other examples, the researchers sought to determine whether there are psychologically driven preconceptions about black men as compared with white men. ©2017 CBC/Radio-Canada.

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 11: Emotions, Aggression, and Stress
Link ID: 23359 - Posted: 03.15.2017

Laurel Hamers Mistakes can be learning opportunities, but the brain needs time for lessons to sink in. When facing a fast and furious stream of decisions, even the momentary distraction of noting an error can decrease accuracy on the next choice, researchers report in the March 15 Journal of Neuroscience. “We have a brain region that monitors and says ‘you messed up’ so that we can correct our behavior,” says psychologist George Buzzell, now at the University of Maryland in College Park. But sometimes, that monitoring system can backfire, distracting us from the task at hand and causing us to make another error. “There does seem to be a little bit of time for people, after mistakes, where you're sort of offline,” says Jason Moser, a psychologist at Michigan State University in East Lansing, who wasn’t part of the study. To test people’s response to making mistakes, Buzzell and colleagues at George Mason University in Fairfax, Va., monitored 23 participants’ brain activity while they worked through a challenging task. Concentric circles flashed briefly on a screen, and participants had to respond with one hand if the two circles were the same color and the other hand if the circles were subtly different shades. After making a mistake, participants generally answered the next question correctly if they had a second or so to recover. But when the next challenge came very quickly after an error, as little as 0.2 seconds, accuracy dropped by about 10 percent. Electrical activity recorded from the visual cortex showed that participants paid less attention to the next trial if they had just made a mistake than if they had responded correctly. © Society for Science & the Public 2000 - 2017

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 13: Memory, Learning, and Development
Link ID: 23358 - Posted: 03.15.2017

By PHILIP FERNBACH and STEVEN SLOMAN How can so many people believe things that are demonstrably false? The question has taken on new urgency as the Trump administration propagates falsehoods about voter fraud, climate change and crime statistics that large swaths of the population have bought into. But collective delusion is not new, nor is it the sole province of the political right. Plenty of liberals believe, counter to scientific consensus, that G.M.O.s are poisonous, and that vaccines cause autism. The situation is vexing because it seems so easy to solve. The truth is obvious if you bother to look for it, right? This line of thinking leads to explanations of the hoodwinked masses that amount to little more than name calling: “Those people are foolish” or “Those people are monsters.” Such accounts may make us feel good about ourselves, but they are misguided and simplistic: They reflect a misunderstanding of knowledge that focuses too narrowly on what goes on between our ears. Here is the humbler truth: On their own, individuals are not well equipped to separate fact from fiction, and they never will be. Ignorance is our natural state; it is a product of the way the mind works. What really sets human beings apart is not our individual mental capacity. The secret to our success is our ability to jointly pursue complex goals by dividing cognitive labor. Hunting, trade, agriculture, manufacturing — all of our world-altering innovations — were made possible by this ability. Chimpanzees can surpass young children on numerical and spatial reasoning tasks, but they cannot come close on tasks that require collaborating with another individual to achieve a goal. Each of us knows only a little bit, but together we can achieve remarkable feats. © 2017 The New York Times Company

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 14: Attention and Consciousness
Link ID: 23316 - Posted: 03.06.2017

By Ruth Williams Scientists at New York University’s School of Medicine have probed the deepest layers of the cerebral cortices of mice to record the activities of inhibitory interneurons when the animals are alert and perceptive. The team’s findings reveal that these cells exhibit different activities depending on the cortical layer they occupy, suggesting a level of complexity not previously appreciated. In their paper published in Science today (March 2), the researchers also described the stimulatory and inhibitory inputs that regulate these cells, adding further details to the picture of interneuron operations within the cortical circuitry. “It is an outstanding example of circuit analysis and a real experimental tour de force,” said neuroscientist Massimo Scanziani of the University of California, San Diego, who was not involved in the work. Christopher Moore of Brown University in Providence, Rhode Island, who also did not participate in the research, echoed Scanziani’s sentiments. “It’s just a beautiful paper,” he said. “They do really hard experiments and come up with what seem to be really valid [observations]. It’s a well-done piece of work.” The mammalian cerebral cortex is a melting pot of information, where signals from sensory inputs, emotions, and memories are combined and processed to produce a coherent perception of the world. Excitatory cells are the most abundant type of cortical neurons and are thought to be responsible for the relay and integration of this information, while the rarer interneurons inhibit the excitatory cells to suppress information flow. Interneurons are “a sort of gatekeeper in the cortex,” said Scanziani. © 1986-2017 The Scientist

Related chapters from BP7e: Chapter 18: Attention and Higher Cognition; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Related chapters from MM:Chapter 14: Attention and Consciousness; Chapter 3: Neurophysiology: The Generation, Transmission, and Integration of Neural Signals
Link ID: 23314 - Posted: 03.04.2017