Links for Keyword: Attention



Links 1 - 20 of 644

By Kashmir Hill and Jeremy White There are now businesses that sell fake people. On the website Generated.Photos, you can buy a “unique, worry-free” fake person for $2.99, or 1,000 people for $1,000. If you just need a couple of fake people — for characters in a video game, or to make your company website appear more diverse — you can get their photos for free on ThisPersonDoesNotExist.com. Adjust their likeness as needed; make them old or young or the ethnicity of your choosing. If you want your fake person animated, a company called Rosebud.AI can do that and can even make them talk. These simulated people are starting to show up around the internet, used as masks by real people with nefarious intent: spies who don an attractive face in an effort to infiltrate the intelligence community; right-wing propagandists who hide behind fake profiles, photo and all; online harassers who troll their targets with a friendly visage. The A.I. system sees each face as a complex mathematical figure, a range of values that can be shifted. Choosing different values — like those that determine the size and shape of eyes — can alter the whole image. For other qualities, our system used a different approach. Instead of shifting values that determine specific parts of the image, the system first generated two images to establish starting and end points for all of the values, and then created images in between. The creation of these types of fake images only became possible in recent years thanks to a new type of artificial intelligence called a generative adversarial network. In essence, you feed a computer program a bunch of photos of real people. It studies them and tries to come up with its own photos of people, while another part of the system tries to detect which of those photos are fake. The back-and-forth makes the end product ever more indistinguishable from the real thing. 
The portraits in this story were created by The Times using GAN software that was made publicly available by the computer graphics company Nvidia. © 2020 The New York Times Company
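The adversarial back-and-forth the article describes — a generator trying to produce convincing fakes while a detector tries to catch them — can be sketched in miniature. The following is a hypothetical toy (1-D Gaussian "data," an affine generator, and a logistic-regression discriminator trained by hand-derived gradient ascent), not the Nvidia StyleGAN software the Times used:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data: samples from N(4, 1). The generator maps noise z ~ N(0, 1)
# through an affine map g(z) = a*z + b; fooling the discriminator should
# drive the fake distribution N(b, a^2) toward the real one.
a, b = 1.0, 0.0      # generator parameters
w, c = 0.0, 0.0      # discriminator (logistic regression) parameters
lr, batch = 0.05, 64

for step in range(2000):
    z = rng.normal(size=batch)
    fake = a * z + b
    real = rng.normal(4.0, 1.0, size=batch)

    # Discriminator ascent on log D(real) + log(1 - D(fake)).
    d_real = sigmoid(w * real + c)
    d_fake = sigmoid(w * fake + c)
    w += lr * (np.mean((1 - d_real) * real) - np.mean(d_fake * fake))
    c += lr * (np.mean(1 - d_real) - np.mean(d_fake))

    # Generator ascent on log D(fake) (the non-saturating GAN loss).
    d_fake = sigmoid(w * fake + c)
    a += lr * np.mean((1 - d_fake) * w * z)
    b += lr * np.mean((1 - d_fake) * w)

samples = a * rng.normal(size=1000) + b
print(f"generated mean after training: {np.mean(samples):.2f} (real mean is 4.0)")
```

After training, the generator's output distribution should have drifted toward the real data. Production GANs play exactly this game, but with deep networks over millions of pixels instead of two scalar parameters.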

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27589 - Posted: 11.21.2020

Diana Kwon It all began with a cough. Three years ago Tracey McNiven, a Scottish woman in her mid-30s, caught a bad chest infection that left her with a persistent cough that refused to subside, even after medication. A few months later strange symptoms started to appear. McNiven noticed numbness spreading through her legs and began to feel that their movement was out of her control. When she walked, she felt like a marionette, with someone else pulling the strings. Over the course of two weeks the odd loss of sensation progressively worsened. Then, one evening at home, McNiven's legs collapsed beneath her. “I was lying there, and I felt like I couldn't breathe,” she recalls. “I couldn't feel below my waist.” McNiven's mother rushed her to the hospital where she remained for more than half a year. During her first few weeks in the hospital, McNiven endured a barrage of tests as doctors tried to uncover the cause of her symptoms. It could be a progressive neurodegenerative condition such as motor neuron disease, they thought. Or maybe it was multiple sclerosis, a disease in which the body's own immune cells attack the nervous system. Bafflingly, however, the brain scans, blood tests, spinal taps and everything else came back normal. McNiven's predicament is not uncommon. According to one of the most comprehensive assessments of neurology clinics to date, roughly a third of patients have neurological symptoms that are deemed to be either partially or entirely unexplained. These may include tremor, seizures, blindness, deafness, pain, paralysis and coma and can parallel those of almost any neurological disease. In some patients, such complications can persist for years or even decades; some people require wheelchairs or cannot get out of bed. Although women are more often diagnosed than men, such seemingly inexplicable illness can be found in anyone and across the life span. © 2020 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 11: Emotions, Aggression, and Stress
Link ID: 27586 - Posted: 11.18.2020

By Benedict Carey Merriam-Webster’s defines a time warp as a “discontinuity, suspension or anomaly” in the otherwise normal passage of time; this year all three terms could apply. It seems like March happened 10 years ago; every day may as well be Wednesday, and still, somehow, here come the holidays — fast, just like every year. Some bard or novelist may yet come forth to help explain the paradoxes of pandemic time, both its Groundhog Days and the blurs of stress and fear for those on the front lines, or who had infectious people in their household. But brain science also has something to say about the relationship between perceived time and the Greenwich Mean variety, and why the two may slip out of sync. In a new study, a research team based in Dallas reported the first strong evidence to date of so-called “time cells” in the human brain. The finding, posted by the journal PNAS, was not unexpected: In recent years, several research groups have isolated neurons in rodents that track time intervals. It’s where the scientists look for these cells, and how they identified them, that provide some insight into the subjective experiences of time. “The first thing to say is that, strictly speaking, there is no such thing as ‘time cells’ in the brain,” said Gyorgy Buzsaki, a neuroscientist at New York University who was not involved in the new research. “There is no neural clock. What happens in the brain is neurons change in response to other neurons.” He added, “Having said that, it’s a useful concept to talk about how this neural substrate represents the passage of what we call time.” In the new study, a team led by Dr. Bradley Lega, a neurosurgeon at UT Southwestern Medical Center, analyzed the firing of cells in the medial temporal area, a region deep in the brain that is essential for memory formation and retrieval. It’s a natural place to look: Memories must be somehow “time-stamped” to retain some semblance of sequence, or chronological order.
© 2020 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27576 - Posted: 11.10.2020

By Scott Barry Kaufman Do you get excited and energized by the possibility of learning something new and complex? Do you get turned on by nuance? Do you get really stimulated by new ideas and imaginative scenarios? If so, you may have an influx of dopamine in your synapses, but not where we traditionally think of this neurotransmitter flowing. In general, the potential for growth from disorder has been encoded deeply into our DNA. We didn’t only evolve the capacity to regulate our defensive and destructive impulses, but we also evolved the capacity to make sense of the unknown. Engaging in exploration allows us to integrate novel or unexpected events with existing knowledge and experiences, a process necessary for growth. Dopamine production is essential for growth. But there are so many misconceptions about the role of dopamine in cognition and behavior. Dopamine is often labeled the “feel-good molecule,” but this is a gross mischaracterization of this neurotransmitter. As personality neuroscientist Colin DeYoung (a close colleague of mine) notes, dopamine is actually the “neuromodulator of exploration.” Dopamine’s primary role is to make us want things, not necessarily like things. We get the biggest rush of dopamine coursing through our brains at the possibility of reward, but this rush is no guarantee that we’ll actually like or even enjoy the thing once we get it. Dopamine is a huge energizing force in our lives, driving our motivation to explore and facilitating the cognitive and behavioral processes that allow us to extract the most delights from the unknown. If dopamine is not all about feeling good, then why does the feel-good myth persist in the public imagination? I think it’s because so much research on dopamine has been conducted with regard to its role in motivating exploration toward our more primal “appetitive” rewards, such as chocolate, social attention, social status, sexual partners, gambling or drugs like cocaine. © 2020 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 4: Development of the Brain
Link ID: 27549 - Posted: 10.26.2020

Shawna Williams In Greek mythology, Orpheus descends to the underworld and persuades Hades to allow him to take his dead wife, Eurydice, back to the realm of the living. Hades agrees, but tells Orpheus that he must not look back until he has exited the underworld. Despite the warning, Orpheus glances behind him on his way out to check whether Eurydice is indeed following him—and loses her forever. The story hints at a dark side to curiosity, a drive to seek certain kinds of knowledge even when doing so is risky—and even if the information serves no practical purpose at the time. In fact, the way people pursue information they’re curious about can resemble the drive to attain more tangible rewards such as food—a parallel that hasn’t been lost on scientists. To investigate the apparent similarity between curiosity and hunger, researchers led by Kou Murayama of the University of Reading in the UK recently devised an experiment to compare how the brain processes desires for food and knowledge, and the risks people are willing to take to satisfy those desires. Beginning in 2016, the team recruited 32 volunteers and instructed them not to eat for at least two hours before coming into the lab. After they arrived, the volunteers’ fingers were hooked up to electrodes that could deliver a weak current, and researchers calibrated the level of electricity to what each participant reported was uncomfortable, but not painful. Then, still hooked up to the electrodes, the volunteers were asked to gamble: they viewed either a photo of a food item or a video of a magician performing a trick, followed by a visual depiction of their odds of “winning” that round (which ranged from 1:6 to 5:6). © 1986–2020 The Scientist.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 27535 - Posted: 10.21.2020

Jon Hamilton Mental illness can run in families. And Dr. Kafui Dzirasa grew up in one of these families. His close relatives include people with schizophrenia, bipolar disorder and depression. As a medical student, he learned about the ones who'd been committed to psychiatric hospitals or who "went missing" and were discovered in alleyways. Dzirasa decided to dedicate his career to "figuring out how to make science relevant to ultimately help my own family." He became a psychiatrist and researcher at Duke University and began to study the links between genes and brain disorders. Then Dzirasa realized something: "I was studying genes that were specifically related to illness in folks of European ancestry." His family had migrated from West Africa, which meant anything he discovered might not apply to them. Dzirasa also realized that people with his ancestry were missing not only from genetics research but from the entire field of brain science. "It was a really crushing moment for me," he says. So when a group in Baltimore asked Dzirasa to help do something about the problem, he said yes. The group is the African Ancestry Neuroscience Research Initiative. It's a partnership between community leaders and the Lieber Institute for Brain Development, an independent, nonprofit research organization on the medical campus of Johns Hopkins University. © 2020 npr

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 1: Introduction: Scope and Outlook
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 27491 - Posted: 09.28.2020

Jordana Cepelewicz Our sense of time may be the scaffolding for all of our experience and behavior, but it is an unsteady and subjective one, expanding and contracting like an accordion. Emotions, music, events in our surroundings and shifts in our attention all have the power to speed time up for us or slow it down. When presented with images on a screen, we perceive angry faces as lasting longer than neutral ones, spiders as lasting longer than butterflies, and the color red as lasting longer than blue. The watched pot never boils, and time flies when we’re having fun. Last month in Nature Neuroscience, a trio of researchers at the Weizmann Institute of Science in Israel presented some important new insights into what stretches and compresses our experience of time. They found evidence for a long-suspected connection between time perception and the mechanism that helps us learn through rewards and punishments. They also demonstrated that the perception of time is wedded to our brain’s constantly updated expectations about what will happen next. “Everyone knows the saying that ‘time flies when you’re having fun,’” said Sam Gershman, a cognitive neuroscientist at Harvard University who was not involved in the study. “But the full story might be more nuanced: Time flies when you’re having more fun than you expected.” “Time” doesn’t mean just one thing to the brain. Different brain regions rely on varied neural mechanisms to track its passage, and the mechanisms that govern our experience seem to change from one situation to the next. All Rights Reserved © 2020

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27488 - Posted: 09.25.2020

Neuroskeptic Why do particular brain areas tend to adopt particular roles? Is the brain "wired" by genetics to organize itself in a certain way, or does brain organization emerge from experience? One part of the brain has been the focus of a great deal of nature-vs-nurture debate. It's called the fusiform face area (FFA) and, as the name suggests, it seems to be most active during perception of faces. It's broadly accepted that the FFA responds most strongly to faces in most people, but there's controversy over why this is. Is the FFA somehow innately devoted to faces, or does its face specialization arise through experience? In the latest contribution to this debate, a new study argues that the FFA doesn't need any kind of visual experience to be face selective. The researchers, N. Apurva Ratan Murty et al., show that the FFA activates in response to touching faces, even in people who were born blind and have never seen a face. Murty et al. designed an experiment in which participants — 15 sighted and 15 congenitally blind people — could touch objects while their brain activity was recorded with fMRI. A 3D printer was used to create models of faces and other objects, and the participants could explore these with their hands, thanks to a rotating turntable. The key result was that touching the faces produced a similar pattern of activity in both the blind and sighted people, and this activity was also similar to when sighted people viewed faces visually. In a follow-up experiment with n=7 of the congenitally blind participants, Murty et al. found that the same face-selective areas in these individuals also responded to "face-related" sounds, such as laughing or chewing sounds, more than other sounds. (This replicates earlier work.) © 2020 Kalmbach Media Co.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 27459 - Posted: 09.07.2020

Katherine May Sunday morning. I walk down to the beach with the dog straining at her lead. I’m already on high alert. It’s the moment in the week when people are most likely to be wandering along the seafront, feeling chatty. I’m mentally priming myself, sorting through the categories I might encounter: parents from the schoolyard (hopefully with their children), people I’ve worked with (increasingly hopeless), neighbours from the surrounding streets (no chance). I should have gone to the woods today. It’s too risky. I cross the road and hear, “Katherine! Hello!” I wonder if I can get away with pretending I didn’t notice. I’m wearing earbuds, which is usually a good precaution, but this woman is determined. She crosses the road diagonally, waving. “How the hell are you?” she says. Straight hair, mousy blonde. No glasses, no tattoos. Jeans, a grey sweatshirt. For God’s sake, why are these people so studiedly ordinary? I fidget with my phone, trying to buy time. Her face is plain. I don’t mean plain as in “ugly”. I mean plain as in vanilla: bland, unremarkable. There’s nothing here that I might have stored in words. Her nose is straight. Her eyes are blue. Her teeth are orderly. And she knows me. “Hi!” I say, as warmly as possible. “How are you?” This can sometimes elicit clues. Not today. One of the many side-effects of being face-blind is that you become uncomfortably aware of the ordinariness of most interactions. We have stopped in the street to say absolutely nothing to each other. And only one of us knows the context. The dog lunges to her feet and pulls in the direction of the sea. “Looks like she’s desperate to get going!” I say, laughing, “So sorry! Lovely to see you!” And I’m off at a gallop before this woman, whoever the hell she is, can think about joining me. I didn’t always know I was face-blind. I grew up thinking that I just didn’t remember people. This, as a friend once told me, seemed a lot like arrogance – an aloof lack of interest in others. 
But that’s not how it felt on the inside. © 2020 Guardian News & Media Limited

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27447 - Posted: 09.02.2020

By Elizabeth Preston We’re all getting used to face masks, either wearing them or figuring out who we’re looking at. They can even trip up those of us who are experts in faces. “Actually, I just had an experience today,” said Marlene Behrmann, a cognitive neuroscientist at Carnegie Mellon University who has spent decades studying the science of facial recognition. She went to meet a colleague outside the hospital where they collaborate, and didn’t realize the person was sitting right in front of her, wearing a mask. In fairness, “She’s cut her hair very short,” Dr. Behrmann said. Scientists have some ideas about why masks make recognizing others’ faces difficult, based on studying the brains of average people, as well as people who struggle to recognize anyone at all. But even when everyone around us is incognito, we still have ways to find each other. “We use face recognition in every aspect of our social interaction,” said Erez Freud, a psychologist with the Centre for Vision Research at York University in Toronto. In the faces of others, we find clues about their personality, gender and emotions. “This is something very fundamental to our perception. And suddenly, faces do not look the same,” Dr. Freud said. That’s why Dr. Freud and co-authors decided to study how masks impair people’s facial recognition skills. They recruited nearly 500 adults to complete a common face memory task online. Participants viewed unfamiliar faces and then tried to recognize them under increasingly difficult conditions. Half the participants saw faces with surgical-style masks covering their mouths and noses. People scored substantially worse on the test when faces were masked. The authors posted their findings, which have not yet completed peer review, online last month. © 2020 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27446 - Posted: 09.02.2020

By Jillian Kramer We spend a substantial part of our days visually scanning an area for something we want—our keys or ketchup, for example. For scientists the way we do so “provides a window into how our minds sift through the information that arrives at our eyes,” says Jason Fischer, a cognitive neuroscientist at Johns Hopkins University. Past research has focused on readily apparent visual characteristics such as color, shape and size. But an object's intrinsic physical properties—things we know from experience but cannot see, such as hardness—also come into play. “You may not be able to immediately see that a brick is heavier than a soda can and harder than a piece of cake, but you know it. And that knowledge guides how you act on a brick as compared with those other objects,” says Fischer, senior author on a new study led by graduate student Li Guo. “We asked whether that knowledge about objects' hidden physical properties is, in itself, something you can use to locate objects faster.” The study was published online in May in the Journal of Experimental Psychology: General. Researchers asked study participants to pick out the image of an item in a grid of other objects as quickly as possible. Each grid was controlled for the color, size and shape of the objects presented, so participants could not use easy visual cues. For example, when they were asked to find a cutting board, the grid also included softer but similarly colored items such as a croissant and a bandage and similarly shaped items, among them a sponge, pillow and paper bag. © 2020 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27423 - Posted: 08.18.2020

Alison Abbott Two years ago, Jennifer Li and Drew Robson were trawling through terabytes of data from a zebrafish-brain experiment when they came across a handful of cells that seemed to be psychic. The two neuroscientists had planned to map brain activity while zebrafish larvae were hunting for food, and to see how the neural chatter changed. It was their first major test of a technological platform they had built at Harvard University in Cambridge, Massachusetts. The platform allowed them to view every cell in the larvae’s brains while the creatures — barely the size of an eyelash — swam freely in a 35-millimetre-diameter dish of water, snacking on their microscopic prey. Out of the scientists’ mountain of data emerged a handful of neurons that predicted when a larva was next going to catch and swallow a morsel. Some of these neurons even became activated many seconds before the larva fixed its eyes on the prey [1]. Something else was strange. Looking in more detail at the data, the researchers realized that the ‘psychic’ cells were active for an unusually long time — not seconds, as is typical for most neurons, but many minutes. In fact, more or less the duration of the larvae’s hunting bouts. “It was spooky,” says Li. “None of it made sense.” Li and Robson turned to the literature and slowly realized that the cells must be setting an overall ‘brain state’ — a pattern of prolonged brain activity that primed the larvae to engage with the food in front of them. The pair learnt that, in the past few years, other scientists using various approaches and different species had also found internal brain states that alter how an animal behaves, even when nothing has changed in its external environment. © 2020 Springer Nature Limited

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 27417 - Posted: 08.12.2020

By Serena Puang When I was in elementary school, I occasionally had trouble falling asleep, and people would tell me to count sheep. I had seen the activity graphically depicted in cartoons, but when I tried it, I never saw anything — just black. I’ve been counting silently into the darkness for years. There were other puzzling comments about visualizing things. My dad would poke fun at my bad sense of direction and reference a “mental map” of the city that he used for navigation. I thought he had superhuman powers. But then, in my freshman year of college, I was struggling through Chinese, while my friend Shayley found it easy. I asked her how she did it, and she told me she was just “visualizing the characters.” That’s when I discovered I had aphantasia, the inability to conjure mental images. Little is known about the condition, but its impact on my education led me to wonder about how it might be impacting others. Aphantasia was first described by Sir Francis Galton in 1880 but remained largely neglected until Dr. Adam Zeman, a cognitive neurologist at the University of Exeter in England, began his work in the early 2000s and coined the name from the Greek word “phantasia,” which means “imagination.” “My interest in it was sparked by a patient who had lost the ability to visualize following a cardiac procedure,” Dr. Zeman said. “He gave a very compelling account. His dreams became avisual; he ceased to enter a visual world when he read a novel.” Dr. Zeman wrote about the case, calling the patient MX, and in 2010, the science journalist Carl Zimmer wrote about it in Discover magazine, and later, in The Times. Hundreds of people started contacting Dr. Zeman, saying they were just like MX, except that they had never had the ability to visualize. © 2020 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27371 - Posted: 07.16.2020

By Courtney Linder Perception is certainly not always reality. Some people might see an ambiguous image as a rabbit, for example, while others see it as a raven. But what if your brain just stopped recognizing numbers one day? That's precisely the basis for a recent Johns Hopkins University study about a man with a rare brain anomaly that prevents him from seeing certain numbers. Instead, the man told doctors, he sees squiggles that look like spaghetti. And it's not just a matter of perception for him—not an optical illusion, nor something a Rorschach test could psychoanalyze away. It's actually proof that our brains can process the world around us, and yet we could have no awareness of those sights. "We present neurophysiological evidence of complex cognitive processing in the absence of awareness, raising questions about the conditions necessary for visual awareness," the scientists note in a new paper published in the journal Proceedings of the National Academy of Sciences. RFS—the name researchers use to refer to the man in the study—has been diagnosed with a rare degenerative brain disease that has led to extensive atrophy in his cortex and basal ganglia. Atrophy is basically a loss of neurons and connective tissue, so you can think of it as the brain shrinking, in a sense. The cortex is the gray matter in your brain that controls things like attention, perception, awareness, and consciousness, while the basal ganglia are responsible for motor learning, executive functions, and emotional behaviors. ©2020 Hearst Magazine Media, Inc.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 27338 - Posted: 07.01.2020

Béatrice Pudelko Fear, anxiety, worry, lack of motivation and difficulty concentrating — students cite all sorts of reasons for opposing distance learning. But are these excuses or real concerns? What does science say? At the beginning of the pandemic, when universities and CEGEPs, Québec’s junior colleges, were putting in place scenarios to continue teaching at a distance, students expressed their opposition by noting that the context was “not conducive to learning.” Teachers also felt that the students were “simply not willing to continue learning in such conditions.” A variety of negative emotions were reported in opinion columns, letters and surveys. A petition was even circulated calling for a suspension of the winter session, which Education Minister Jean-François Roberge refused. Students are not the only ones who have difficulty concentrating on intellectual tasks. In a column published in La Presse, Chantal Guy says that like many of her colleagues, she can’t devote herself to in-depth reading. “After a few pages, my mind wanders and just wants to go check out Dr. Arruda’s damn curve,” Guy wrote, referring to Horacio Arruda, the province’s public health director. In short: “It’s not the time that’s lacking in reading, it’s the concentration,” she said. “People don’t have the head for that.” Why do students feel they don’t have the ability for studies? Recent advances in cognitive science provide insights into the links between negative emotions and cognition in tasks that require sustained intellectual investment. © 2010–2020, The Conversation US, Inc.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 11: Emotions, Aggression, and Stress
Link ID: 27293 - Posted: 06.09.2020

Rebecca Schiller When behavioural scientist Dr Pragya Agarwal moved from Delhi to York more than 20 years ago, her first priority was to blend in. As a single parent, a woman of colour and an academic, she worked hard to “water down” the things that made her different from those around her. Yet the more she tried to fit in, the more Agarwal began to ask herself why humans appear programmed to create “in groups” and distrust those on the outside. “Unconscious bias has become a buzzword in recent years,” explains Agarwal. “We are all biased and, though some biases can be harmless, many aren’t.” These are the issues she unravels in her book Sway: Unravelling Unconscious Bias, and she confronts some uncomfortable truths along the way. Agarwal argues that humans aren’t naturally rational creatures, and with our brains constantly bombarded with information, we rely on cognitive short cuts: patterns of learned thinking based on what has worked for us in the past, the messages we receive from others and our evolutionary programming. “Cognitive short cuts evolved to help us survive,” she says. “The problem is that we still have these responses and they don’t work well in the modern world.” In our tribal past, the consequences of wrongly assuming that an outsider was peaceful or free from disease could be so damaging that being overcautious became a human evolutionary strategy. The result is the tendency to generalise: speedily assigning those around us to groups based on race, academic status, social class or gender and ignoring details that contradict our existing beliefs. Once we’ve placed a person in a box, Agarwal suggests we are more inclined to choose the dehumanising and dangerous approach of treating them according to the stereotypes we associate with that box rather than as an individual. It’s an experience the author has had herself. © 2020 Guardian News & Media Limited

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27186 - Posted: 04.14.2020

By Pragya Agarwal If you have seen the documentary Free Solo, you will be familiar with Alex Honnold. He ascends without protective equipment of any kind in treacherous landscapes where, above about 15 meters, any slip is generally lethal. Even just watching him pressed against the rock with barely any handholds makes me nauseous. In a functional magnetic resonance imaging (fMRI) test with Honnold, neurobiologist Jane Joseph found there was near zero activation in his amygdala. This is a highly unusual brain reaction and may explain why Honnold feels no threat in free solo climbs that others wouldn’t dare attempt. But it also shows how our amygdala activates in that split second to warn us, and why it plays an important role in our unconscious biases. Having spent many years researching unconscious bias for my book, I have realized that it remains difficult to pinpoint, as it is hidden and often in complete contrast to our expressed beliefs. Neuroimaging research is beginning to give us more insight into the formation of our unconscious biases. Recent fMRI studies demonstrate that people use different areas of the brain when reasoning about familiar and unfamiliar situations. The neural zones that respond to stereotypes primarily include the amygdala, the prefrontal cortex, the posterior cingulate and the anterior temporal cortex; these are described as all “lighting up like a Christmas tree” when stereotypes are activated (certain parts of the brain become more activated than others during certain tasks). When we meet someone new, we are not merely focusing on our verbal interaction. © 2020 Scientific American.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 2: Functional Neuroanatomy: The Cells and Structure of the Nervous System
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 27184 - Posted: 04.13.2020

By Douglas Starr When Jennifer Eberhardt appeared on The Daily Show with Trevor Noah in April 2019, she had a hard time keeping a straight face. But some of the laughs were painful. Discussing unconscious racial bias, which she has studied for years, the Stanford University psychologist mentioned the “other-race effect,” in which people have trouble recognizing faces of other racial groups. Criminals have learned to exploit the effect, she told Noah. In Oakland, California, a gang of black teenagers caused a mini–crime wave of purse snatchings among middle-aged women in Chinatown. When police asked the teens why they targeted that neighborhood, they said the Asian women, when faced with a lineup, “couldn’t tell the brothers apart.” “That is one of the most horrible, fantastic stories ever!” said Noah, a black South African. But it was true. Eberhardt has written that the phrase “they all look alike,” long the province of the bigot, “is actually a function of biology and exposure.” There’s no doubt plenty of overt bigotry exists, Eberhardt says, but she has found that most of us also harbor bias without knowing it. It stems from our brain’s tendency to categorize things, a useful function in a world of infinite stimuli, but one that can lead to discrimination, baseless assumptions, and worse, particularly in times of hurry or stress. Over the decades, Eberhardt and her Stanford team have explored the roots and ramifications of unconscious bias, from the level of the neuron to that of society. In cleverly designed experiments, she has shown how social conditions can interact with the workings of our brain to determine our responses to other people, especially in the context of race. Eberhardt’s studies are “strong methodologically and also super real-world relevant,” says Dolly Chugh of New York University’s Stern School of Business, a psychologist who studies decision-making. © 2020 American Association for the Advancement of Science.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 11: Emotions, Aggression, and Stress
Link ID: 27145 - Posted: 03.27.2020

As we get older, we become more easily distracted, but it isn't always a disadvantage, according to researchers. Tarek Amer, a psychology postdoctoral research fellow at Columbia University, says that although our ability to focus our attention on specific things worsens as we get older, our ability to take in broad swaths of information remains strong. So in general, older adults are able to retain information that a more focused person could not. For the last few years, Amer's research has focused mainly on cognitive control, a loose term that describes a person's ability to focus their attention. His work at the University of Toronto, where he received his PhD in 2018, looked specifically at older adults aged 60 to 80. Amer joined Spark host Nora Young to discuss his research and how it could be implemented in practical ways.

What happens to our ability to concentrate as we get older?

There's a lot of research showing that this ability tends to decline with age. Essentially, what we see is that relative to younger adults, older adults have a harder time focusing on one thing while ignoring distractions. This distraction can come from the external world, or it can be internally based, such as our own thoughts, which are usually not related to the task at hand. With respect to mind wandering specifically, the literature is ... mixed. [The] typical finding is that older adults tend to, at least in lab-based tasks, mind wander less.

So I know that you've been looking, in your own research, at concentration and memory formation. So what exactly are you studying?

One of the things I was interested in is whether this [decline in the ability to concentrate] could be associated with any benefits in old age. For example, one thing that we showed is that when older and younger adults perform a task that includes both task-relevant and task-irrelevant information, older adults actually process both types of information. So if we give them a memory task at the end that tests memory for the irrelevant information … we see that older adults actually outperform younger adults. ©2020 CBC/Radio-Canada.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 27116 - Posted: 03.14.2020

Dori Grijseels In 2016, three neuroscientists wrote a commentary article arguing that, to truly understand the brain, neuroscience needed to change. From that paper, the International Brain Laboratory (IBL) was born. The IBL, now a collaboration between 22 labs across the world, is unique in biology. It is modeled on physics collaborations, like the ATLAS experiment at CERN, where thousands of scientists work together on a common problem, sharing data and resources along the way. This was in response to the main criticism that the paper’s authors, Zachary Mainen, Michael Häusser and Alexandre Pouget, had of existing neuroscience collaborations: labs came together to discuss generalities, but all the experiments were done separately. They wanted to create a collaboration in which scientists worked together throughout the process, even though their labs might be distributed all over the globe. The IBL decided to focus on one brain function only: decision-making. Decision-making engages the whole brain, since it requires using both input from the senses and information about previous experiences. If someone is thinking about bringing a sweater when they go out, they will use their senses to determine whether it looks and feels cold outside, but they might also remember that, yesterday, they were cold without a sweater. For its first experiment, published in preprint form, seven of the 22 collaborating labs tested 101 mice on their decision-making ability. The mice saw a black-and-white grating either to their right or to their left. They then had to twist a little Lego wheel to move the grating to the middle. Rewarded with sugary water whenever they did the task correctly, the mice gradually learned. It is easy for them to decide which way to twist the wheel if the grating has a high contrast, because it stands out against the background of their visual field. However, the mice were also presented with more ambiguously patterned gratings that were not easily distinguishable from the background, making the decision of which way to turn the wheel more difficult. In some cases, the grating was even indistinguishable from the background. Across all seven labs, which were spread over three countries, the mice completed this task three million times. © 2017 – 2019 Massive Science Inc.
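The logic of the task above can be illustrated with a minimal signal-detection simulation. This is a sketch only, not the IBL's actual analysis code: the `simulate_trial` and `accuracy` helpers and the `noise_sd` value are illustrative assumptions, chosen to show why high-contrast gratings yield near-perfect choices while low-contrast gratings push performance toward chance.

```python
import random

def simulate_trial(signed_contrast, noise_sd=0.15, rng=random):
    """One simulated trial. signed_contrast > 0 means the grating is on
    the right, < 0 on the left. The 'mouse' perceives the contrast
    corrupted by Gaussian sensory noise and turns the wheel toward
    where it thinks the grating is. Returns True if correct."""
    percept = signed_contrast + rng.gauss(0.0, noise_sd)
    choice = "right" if percept > 0 else "left"
    correct_side = "right" if signed_contrast > 0 else "left"
    return choice == correct_side

def accuracy(signed_contrast, n_trials=10_000, seed=0):
    """Fraction of correct choices over many simulated trials."""
    rng = random.Random(seed)
    hits = sum(simulate_trial(signed_contrast, rng=rng) for _ in range(n_trials))
    return hits / n_trials

# A high-contrast grating is almost always judged correctly; an
# ambiguous, low-contrast one lands only slightly above chance.
print(accuracy(0.5))   # easy trial: well above 0.9
print(accuracy(0.05))  # hard trial: closer to 0.5 (chance)
```

Plotting `accuracy` against signed contrast would trace out the familiar psychometric curve that the IBL used to compare behavior across its labs.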

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 27102 - Posted: 03.07.2020