Links for Keyword: Attention



Links 1 - 20 of 677

By Ellen Barry It is a truism that time seems to expand or contract depending on our circumstances: In a state of terror, seconds can stretch. A day spent in solitude can drag. When we’re trying to meet a deadline, hours race by. A study published this month in the journal Psychophysiology by psychologists at Cornell University found that, when observed at the level of microseconds, some of these distortions could be driven by heartbeats, whose length is variable from moment to moment. The psychologists fitted undergraduates with electrocardiograms to measure the length of each heartbeat precisely, and then asked them to estimate the length of brief audio tones. The psychologists discovered that after a longer heartbeat interval, subjects tended to perceive the tone as longer; shorter intervals led subjects to assess the tone as shorter. Subsequent to each tone, the subjects’ heartbeat intervals lengthened. A lower heart rate appeared to assist with perception, said Saeedeh Sadeghi, a doctoral candidate at Cornell and the study’s lead author. “When we need to perceive things from the outside world, the beats of the heart are noise to the cortex,” she said. “You can sample the world more — it’s easier to get things in — when the heart is silent.” The study provides more evidence, after an era of research focusing on the brain, that “there is no single part of the brain or body that keeps time — it’s all a network,” she said, adding, “The brain controls the heart, and the heart, in turn, impacts the brain.” Interest in the perception of time has exploded since the Covid pandemic, when activity outside the home came to an abrupt halt for many and people around the world found themselves facing stretches of undifferentiated time. A study of time perception conducted during the first year of the lockdown in Britain found that 80 percent of participants reported distortions in time, in different directions. 
On average, older, more socially isolated people reported that time slowed, and younger, more active people reported that it sped up. © 2023 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 28704 - Posted: 03.15.2023

By Stephani Sutherland Tara Ghormley has always been an overachiever. She finished at the top of her class in high school, graduated summa cum laude from college and earned top honors in veterinary school. She went on to complete a rigorous training program and build a successful career as a veterinary internal medicine specialist. But in March 2020 she got infected with the SARS-CoV-2 virus—just the 24th case in the small, coastal central California town she lived in at the time, near the site of an early outbreak in the COVID pandemic. “I could have done without being first at this,” she says. Almost three years after apparently clearing the virus from her body, Ghormley is still suffering. She gets exhausted quickly, her heartbeat suddenly races, and she goes through periods where she can't concentrate or think clearly. Ghormley and her husband, who have relocated to a Los Angeles suburb, once spent their free time visiting their “happiest place on Earth”—Disneyland—but her health prevented that for more than a year. She still spends most of her days off resting in the dark or going to her many doctors' appointments. Her early infection and ongoing symptoms make her one of the first people in the country with “long COVID,” a condition where symptoms persist for at least three months after the infection and can last for years. The syndrome is known by medical professionals as postacute sequelae of COVID-19, or PASC. People with long COVID have symptoms such as pain, extreme fatigue and “brain fog,” or difficulty concentrating or remembering things. As of February 2022, the syndrome was estimated to affect about 16 million adults in the U.S. and had forced between two million and four million Americans out of the workforce, many of whom have yet to return. Long COVID often arises in otherwise healthy young people, and it can follow even a mild initial infection. 
The risk appears at least slightly higher in people who were hospitalized for COVID and in older adults (who end up in the hospital more often). Women and those at socioeconomic disadvantage also face higher risk, as do people who smoke, are obese, or have any of an array of health conditions, particularly autoimmune disease. Vaccination appears to reduce the danger but does not entirely prevent long COVID.

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 13: Memory and Learning; Chapter 14: Attention and Higher Cognition
Link ID: 28667 - Posted: 02.15.2023

By Betsy Mason Some fish can recognize their own faces in photos and mirrors, an ability usually attributed to humans and other animals considered particularly brainy, such as chimpanzees, scientists report. Finding the ability in fish suggests that self-awareness may be far more widespread among animals than scientists once thought. “It is believed widely that the animals that have larger brains will be more intelligent than animals of the small brain,” such as fish, says animal sociologist Masanori Kohda of Osaka Metropolitan University in Japan. It may be time to rethink that assumption, Kohda says. Kohda’s previous research showed that bluestreak cleaner wrasses can pass the mirror test, a controversial cognitive assessment that purportedly reveals self-awareness, or the ability to be the object of one’s own thoughts. The test involves exposing an animal to a mirror and then surreptitiously putting a mark on the animal’s face or body to see if they will notice it on their reflection and try to touch it on their body. Previously only a handful of large-brained species, including chimpanzees and other great apes, dolphins, elephants and magpies, have passed the test. In a new study, cleaner fish that passed the mirror test were then able to distinguish their own faces from those of other cleaner fish in still photographs. This suggests that the fish identify themselves the same way humans are thought to — by forming a mental image of one’s face, Kohda and colleagues report February 6 in the Proceedings of the National Academy of Sciences. “I think it’s truly remarkable that they can do this,” says primatologist Frans de Waal of Emory University in Atlanta who was not involved in the research. “I think it’s an incredible study.” © Society for Science & the Public 2000–2023.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 28659 - Posted: 02.08.2023

By John M. Beggs Over the last few decades, an idea called the critical brain hypothesis has been helping neuroscientists understand how the human brain operates as an information-processing powerhouse. It posits that the brain is always teetering between two phases, or modes, of activity: a random phase, where it is mostly inactive, and an ordered phase, where it is overactive and on the verge of a seizure. The hypothesis predicts that between these phases, at a sweet spot known as the critical point, the brain has a perfect balance of variety and structure and can produce the most complex and information-rich activity patterns. This state allows the brain to optimize multiple information processing tasks, from carrying out computations to transmitting and storing information, all at the same time. To illustrate how phases of activity in the brain — or, more precisely, activity in a neural network such as the brain — might affect information transmission through it, we can play a simple guessing game. Imagine that we have a network with 10 layers and 40 neurons in each layer. Neurons in the first layer will only activate neurons in the second layer, and those in the second layer will only activate those in the third layer, and so on. Now, I will activate some number of neurons in the first layer, but you will only be able to observe the number of neurons active in the last layer. Let’s see how well you can guess the number of neurons I activated under three different strengths of network connections. First, let’s consider weak connections. In this case, neurons typically activate independently of each other, and the pattern of network activity is random. No matter how many neurons I activate in the first layer, the number of neurons activated in the last layer will tend toward zero because the weak connections dampen the spread of activity. This makes our guessing game incredibly difficult. 
The amount of information about the first layer that you can learn from the last layer is practically nothing. All Rights Reserved © 2023
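The layered guessing game can be turned into a toy simulation. The sketch below is illustrative rather than taken from the article: it assumes each active neuron independently tries to activate each neuron in the next layer with probability `p`, so "weak connections" means a small `p` and activity that dies out before reaching the last layer.

```python
import random

def propagate(n_active, layers=10, width=40, p=0.005, seed=0):
    """Spread activity through a feedforward network, layer by layer.

    A neuron in the next layer fires if at least one of the currently
    active neurons activates it (each attempt succeeds independently
    with probability p). Returns the number of active neurons in the
    final layer.
    """
    rng = random.Random(seed)
    active = n_active
    for _ in range(layers - 1):
        # Chance that any given downstream neuron is activated
        # by at least one of the `active` upstream neurons.
        p_fire = 1 - (1 - p) ** active
        active = sum(rng.random() < p_fire for _ in range(width))
        if active == 0:
            break  # activity has died out entirely
    return active

# Weak connections: whatever you put into the first layer,
# almost nothing reaches the last one -- so the last layer
# carries almost no information about the first.
for start in (5, 20, 40):
    print(start, "->", propagate(start, p=0.005))
```

At the critical point the hypothesis describes, each active neuron activates roughly one downstream neuron on average (here, `p` near `1/width`), which is where differences in the input are best preserved across layers.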

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 13: Memory and Learning; Chapter 14: Attention and Higher Cognition
Link ID: 28652 - Posted: 02.01.2023

Jon Hamilton Time is woven into our personal memories. Recall a childhood fall from a bike and the brain replays the entire episode in excruciating detail: the glimpse of wet leaves on the road ahead, the moment of weightless dread, and then the painful impact. This exact sequence has been embedded in the memory, thanks to some special neurons known as time cells. When the brain detects a notable event, time cells begin a highly orchestrated performance, says Marc Howard, who directs the Brain, Behavior, and Cognition program at Boston University. "What we find is that the cells fire in a sequence," he says. "So cell one might fire immediately, but cell two waits a little bit, followed by cell three, cell four, and so on." As each cell fires, it places a sort of time stamp on an unfolding experience. And the same cells fire in the same order when we retrieve a memory of the experience, even something mundane. "If I remember being in my kitchen and making a cup of coffee," Howard says, "the time cells that were active at that moment are re-activated." They recreate the grinder's growl, the scent of Arabica, the curl of steam rising from a fresh mug – and your neurons replay these moments in sequence every time you summon the memory. This system appears to explain how we are able to virtually travel back in time, and play mental movies of our life experiences. There are also hints that time cells play a critical role in imagining future events. Without time cells, our memories would lack order. In an experiment at the University of California, San Diego, scientists gave several groups of people a tour of the campus. The tour included 11 planned events, including finding change in a vending machine and drinking from a water fountain. © 2022 npr

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 28608 - Posted: 12.21.2022

By Yasemin Saplakoglu Memory and perception seem like entirely distinct experiences, and neuroscientists used to be confident that the brain produced them differently, too. But in the 1990s neuroimaging studies revealed that parts of the brain that were thought to be active only during sensory perception are also active during the recall of memories. “It started to raise the question of whether a memory representation is actually different from a perceptual representation at all,” said Sam Ling, an associate professor of neuroscience and director of the Visual Neuroscience Lab at Boston University. Could our memory of a beautiful forest glade, for example, be just a re-creation of the neural activity that previously enabled us to see it? “The argument has swung from being this debate over whether there’s even any involvement of sensory cortices to saying ‘Oh, wait a minute, is there any difference?’” said Christopher Baker, an investigator at the National Institute of Mental Health who runs the learning and plasticity unit. “The pendulum has swung from one side to the other, but it’s swung too far.” Even if there is a very strong neurological similarity between memories and experiences, we know that they can’t be exactly the same. “People don’t get confused between them,” said Serra Favila, a postdoctoral scientist at Columbia University and the lead author of a recent Nature Communications study. Her team’s work has identified at least one of the ways in which memories and perceptions of images are assembled differently at the neurological level. When we look at the world, visual information about it streams through the photoreceptors of the retina and into the visual cortex, where it is processed sequentially in different groups of neurons. Each group adds new levels of complexity to the image: Simple dots of light turn into lines and edges, then contours, then shapes, then complete scenes that embody what we’re seeing. Simons Foundation © 2022

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 28597 - Posted: 12.15.2022

By Jim Davies Living for the moment gets a bad rap. If you’re smart, people say, you should work toward a good future, sacrificing fun and pleasure in the present. Yet there are good reasons to discount the future, which is why economists tend to do it when making predictions. Would you rather find $5 when you’re in elementary school, or in your second marriage? People tend to get richer as they age. Five dollars simply means more to you when you’re 9 than when you’re 49. Also, the future is uncertain. We can’t always trust there’ll be one. It’s likely some kids in Walter Mischel’s famous “marshmallow experiment”—which asked kids to wait to eat a marshmallow to get another one—didn’t actually believe that the experimenter would come through with the second marshmallow, and so ate the first marshmallow right away. Saving for retirement makes no sense if in five years a massive meteor cuts human civilization short. Economists call this the “catastrophe” or “hazard” rate. For Sangil “Arthur” Lee, a psychologist at the University of California, Berkeley, where he’s a postdoc, a hazard rate makes sense from an evolutionary perspective. “You might not survive until next winter, so there is some inherent trade off that you need to make, which is not only specific for humans, but also for animals,” he said. While an undergraduate, Lee experimented with delay-discounting tasks using pigeons. The pigeons would peck one button to get a small amount of pellets now, or peck a different button to get large amounts of pellets later. “What we know,” Lee said, “is that across pigeons, monkeys, rats, and various animals, they also discount future rewards in pretty much a similar way that humans do, which is this sort of hyperbolic fashion.” We discount future rewards by a lot very quickly, more so than we would be if discounting the future exponentially, but the hyperbolic discount rate eases after a bit. What makes us discount the future? 
Lee, in a new study with his colleagues, pins it at least partly on our powers of imagination.1 When we think about what hasn’t yet happened, it tends to be abstract. Things right now, on the other hand, we think of in more tangible terms. Several behavioral studies have supported the idea that what we cannot clearly imagine, we value less. © 2022 NautilusThink Inc, All rights reserved.
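The hyperbolic discounting Lee describes is commonly modeled as V = A / (1 + kD), where A is the reward amount, D the delay, and k a fitted discount rate. A minimal sketch follows; the discount rate k = 0.2 per day and the dollar amounts are illustrative values, not from the study.

```python
def hyperbolic_value(amount, delay_days, k=0.2):
    """Hyperbolic discounting: V = A / (1 + k * D).

    Value falls steeply at short delays, then the decline flattens --
    unlike exponential discounting's constant proportional drop.
    """
    return amount / (1 + k * delay_days)

# $50 now beats $100 in 10 days...
print(hyperbolic_value(50, 0), ">", hyperbolic_value(100, 10))

# ...but push both options 30 days into the future and the
# preference flips. This "preference reversal" is the signature
# of hyperbolic, as opposed to exponential, discounting.
print(hyperbolic_value(50, 30), "<", hyperbolic_value(100, 40))
```

The reversal captures the everyday pattern in the article: from far away we plan to wait for the larger reward, but once the smaller one is imminent, we grab it.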

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 28552 - Posted: 11.16.2022

By Phil Jaekl One fine spring afternoon this year, as I was out running errands in the small Norwegian town where I live, a loud beep startled me into awareness. What had just been on my mind? After a moment’s pause, I realized something strange. I’d been thinking two things at the same time—rehearsing the combination of a new bike lock and contemplating whether I should wear the clunky white beeper that had just sounded into a bank. How, I wondered, could I have been saying two things simultaneously in my mind? Was I deceiving myself? Was this, mentally, normal? I silenced the beeper on my belt and pulled out my phone to make a voice memo of the bizarre experience before I walked into the bank; aesthetics be damned. I was in the midst of an experiment that involved keeping a log of my inner thoughts for Russ Hurlburt, a senior psychologist at the University of Nevada, Las Vegas. For decades, Hurlburt has been motivated by one question: How, exactly, do we experience our own mental life? It’s a simple enough question. And, one might argue, an existentially important one. But it’s a surprisingly vexing query to try to answer. Once we turn our gaze inward, the subjective squishiness of our mental experience seems to defy objective scrutiny. For centuries, philosophers and psychologists have presumed our mental life is composed primarily of a single-stream inner monologue. I know that’s what I had assumed, and my training in cognitive neuroscience had never led me to suppose otherwise. Hurlburt, however, finds this armchair conclusion “dramatically wrong.”1 © 2022 NautilusThink Inc,

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 28505 - Posted: 10.08.2022

Inside a Berlin neuroscience lab one day last year, Subject 1 sat on a chair with their arms up and their bare toes pointed down. Hiding behind them, with full access to the soles of their feet, was Subject 2, waiting with fingers curled. At a moment of their choosing, Subject 2 was instructed to take the open shot: Tickle the hell out of their partner. In order to capture the moment, a high-speed GoPro was pointed at Subject 1’s face and body. Another at their feet. A microphone hung nearby. As planned, Subject 1 couldn’t help but laugh. The fact that they couldn’t help it is what has drawn Michael Brecht, leader of the research group from Humboldt University, to the neuroscience of tickling and play. It’s funny, but it’s also deeply mysterious—and understudied. “It’s been a bit of a stepchild of scientific investigation,” Brecht says. After all, brain and behavior research typically skews toward gloom: topics like depression, pain, and fear. “But,” he says, “I think there are also more deep prejudices against play—it's something for children.” The prevailing wisdom holds that laughter is a social behavior among certain mammals. It’s a way of disarming others, easing social tensions, and bonding. Chimps do it. Dogs and dolphins too. Rats are the usual subjects in tickling studies. If you flip ’em over and go to town on their bellies, they’ll squeak at a pitch more than twice as high as the limit of human ears. But there are plenty of lingering mysteries about tickling, whether among rats or people. The biggest one of all: why we can’t tickle ourselves. “If you read the ancient Greeks, Aristotle was wondering about ticklishness. Also Socrates, Galileo Galilei, and Francis Bacon,” says Konstantina Kilteni, a cognitive neuroscientist who studies touch and tickling at Sweden’s Karolinska Institutet, and who is not involved in Brecht’s work. We don’t know why touch can be ticklish, nor what happens in the brain. 
We don’t know why some people—or some body parts—are more ticklish than others. “These questions are very old,” she continues, “and after almost 2,000 years, we still really don’t have the answer.” © 2022 Condé Nast.

Related chapters from BN: Chapter 15: Emotions, Aggression, and Stress; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 11: Emotions, Aggression, and Stress; Chapter 14: Attention and Higher Cognition
Link ID: 28504 - Posted: 10.08.2022

By Ed Yong On March 25, 2020, Hannah Davis was texting with two friends when she realized that she couldn’t understand one of their messages. In hindsight, that was the first sign that she had COVID-19. It was also her first experience with the phenomenon known as “brain fog,” and the moment when her old life contracted into her current one. She once worked in artificial intelligence and analyzed complex systems without hesitation, but now “runs into a mental wall” when faced with tasks as simple as filling out forms. Her memory, once vivid, feels frayed and fleeting. Former mundanities—buying food, making meals, cleaning up—can be agonizingly difficult. Her inner world—what she calls “the extras of thinking, like daydreaming, making plans, imagining”—is gone. The fog “is so encompassing,” she told me, “it affects every area of my life.” For more than 900 days, while other long-COVID symptoms have waxed and waned, her brain fog has never really lifted. Of long COVID’s many possible symptoms, brain fog “is by far one of the most disabling and destructive,” Emma Ladds, a primary-care specialist from the University of Oxford, told me. It’s also among the most misunderstood. It wasn’t even included in the list of possible COVID symptoms when the coronavirus pandemic first began. But 20 to 30 percent of patients report brain fog three months after their initial infection, as do 65 to 85 percent of the long-haulers who stay sick for much longer. It can afflict people who were never ill enough to need a ventilator—or any hospital care. And it can affect young people in the prime of their mental lives. Long-haulers with brain fog say that it’s like none of the things that people—including many medical professionals—jeeringly compare it to. It is more profound than the clouded thinking that accompanies hangovers, stress, or fatigue. For Davis, it has been distinct from and worse than her experience with ADHD. 
It is not psychosomatic, and involves real changes to the structure and chemistry of the brain. It is not a mood disorder: “If anyone is saying that this is due to depression and anxiety, they have no basis for that, and data suggest it might be the other direction,” Joanna Hellmuth, a neurologist at UC San Francisco, told me. (c) 2022 by The Atlantic Monthly Group. All Rights Reserved.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 28487 - Posted: 09.21.2022

By Tim Vernimmen When psychologist Jonathan Smallwood set out to study mind-wandering about 25 years ago, few of his peers thought that was a very good idea. How could one hope to investigate these spontaneous and unpredictable thoughts that crop up when people stop paying attention to their surroundings and the task at hand? Thoughts that couldn’t be linked to any measurable outward behavior? But Smallwood, now at Queen’s University in Ontario, Canada, forged ahead. He used as his tool a downright tedious computer task that was intended to reproduce the kinds of lapses of attention that cause us to pour milk into someone’s cup when they asked for black coffee. And he started out by asking study participants a few basic questions to gain insight into when and why minds tend to wander, and what subjects they tend to wander toward. After a while, he began to scan participants’ brains as well, to catch a glimpse of what was going on in there during mind-wandering. Smallwood learned that unhappy minds tend to wander in the past, while happy minds often ponder the future. He also became convinced that wandering among our memories is crucial to help prepare us for what is yet to come. Though some kinds of mind-wandering — such as dwelling on problems that can’t be fixed — may be associated with depression, Smallwood now believes mind-wandering is rarely a waste of time. It is merely our brain trying to get a bit of work done when it is under the impression that there isn’t much else going on. Smallwood, who coauthored an influential 2015 overview of mind-wandering research in the Annual Review of Psychology, is the first to admit that many questions remain to be answered. © 2022 Annual Reviews

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 28461 - Posted: 09.03.2022

By Elizabeth Landau Ken Ono gets excited when he talks about a particular formula for pi, the famous and enigmatic ratio of a circle’s circumference to its diameter. He shows me a clip from a National Geographic show where Neil deGrasse Tyson asked him how he would convey the beauty of math to the average person on the street. In reply, Ono showed Tyson, and later me, a so-called continued fraction for pi, which is a little bit like a mathematical fun house hallway of mirrors. Instead of a single number in the numerator and one in the denominator, the denominator of the fraction also contains a fraction, and the denominator of that fraction has a fraction in it, too, and so on and so forth, ad infinitum. Written out, the formula looks like a staircase that narrows as you descend its steps in pursuit of the elusive pi. The calculation—credited independently to British mathematician Leonard James Rogers and self-taught Indian mathematician Srinivasa Ramanujan—doesn’t involve anything more complicated than adding, dividing, and squaring numbers. “How could you not say that’s amazing?” Ono, chair of the mathematics department at the University of Virginia, asks me over Zoom. As a fellow pi enthusiast—I am well known among friends for hosting Pi Day pie parties—I had to agree with him that it’s a dazzling formula. But not everyone sees beauty in fractions, or in math generally. In fact, here in the United States, math often inspires more dread than awe. In the 1950s, some educators began to observe a phenomenon they called mathemaphobia in students,1 though this was just one of a long list of academic phobias they saw in students. Today, nearly 1 in 5 U.S. 
adults suffers from high levels of math anxiety, according to some estimates,2 and a 2016 study found that 11 percent of university students experienced “high enough levels of mathematics anxiety to be in need of counseling.”3 Math anxiety seems generally correlated with worse math performance worldwide, according to one 2020 study from Stanford and the University of Chicago.4 While many questions remain about the underlying reasons, high school math scores in the U.S. tend to rank significantly lower than those in many other countries. In 2018, for example, American students ranked 30th in the world in their math scores on the PISA exam, an international assessment given every three years. © 2022 NautilusThink Inc,

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 28459 - Posted: 09.03.2022

Heidi Ledford It’s not just in your head: a desire to curl up on the couch after a day spent toiling at the computer could be a physiological response to mentally demanding work, according to a study that links mental fatigue to changes in brain metabolism. The study, published on 11 August in Current Biology1, found that participants who spent more than six hours working on a tedious and mentally taxing assignment had higher levels of glutamate — an important signalling molecule in the brain. Too much glutamate can disrupt brain function, and a rest period could allow the brain to restore proper regulation of the molecule, the authors note. At the end of their work day, these study participants were also more likely than those who had performed easier tasks to opt for short-term, easily won financial rewards of lesser value than larger rewards that come after a longer wait or involve more effort. The study is important in its effort to link cognitive fatigue with neurometabolism, says behavioural neuroscientist Carmen Sandi at the Swiss Federal Institute of Technology in Lausanne. But more research — potentially in non-human animals — will be needed to establish a causal link between feelings of exhaustion and metabolic changes in the brain, she adds. “It’s very good to start looking into this aspect,” says Sandi. “But for now this is an observation, which is a correlation.” Tired brain Previous research has demonstrated effects of mental strain on physiological parameters such as heart-rate variability and blood flow, but these tend to be subtle, says Martin Hagger, a health psychologist at the University of California, Merced. “It’s not like when you’re exercising skeletal muscle,” he says. “But it is perceptible.” Cognitive neuroscientist Antonius Wiehler at the Paris Brain Institute and his colleagues thought that the effects of cognitive fatigue could be due to metabolic changes in the brain. 
The team enrolled 40 participants and assigned 24 of them to perform a challenging task: for example, watching letters appear on a computer screen every 1.6 seconds and documenting when one matched a letter that had appeared three letters earlier. The other 16 participants were asked to perform a similar but easier task. Both groups worked for just over six hours, with two ten-minute breaks. © 2022 Springer Nature Limited
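The demanding task described is a classic 3-back test. The sketch below shows only its matching rule; the letter stream and function name are illustrative, not from the study.

```python
def n_back_targets(stream, n=3):
    """Indices where the current letter matches the letter shown
    n positions earlier -- the moments a participant should respond."""
    return [i for i in range(n, len(stream)) if stream[i] == stream[i - n]]

# In a real session a new letter appears every 1.6 seconds;
# here the whole stream is given at once for simplicity.
letters = "TKLTQRLXRZ"
print(n_back_targets(letters))  # positions 3 and 8 match their 3-back letter
```

Holding three items in mind while continuously updating them is what makes the task mentally taxing, even though the rule itself is trivial to state.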

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 28430 - Posted: 08.11.2022

By Chantel Prat I remember all too well that day early in the pandemic when we first received the “stay at home” order. My attitude quickly shifted from feeling like I got a “snow day” to feeling like a bird in a cage. Being a person who is both extraverted by nature and not one who enjoys being told what to do, the transition was pretty rough. But you know what? I got used to it. Though the pandemic undoubtedly affected some of your lives more than others, I know it touched every one of us in ways we will never forget. And now, after two years and counting, I am positive that every person reading this is fundamentally different from when the pandemic started. Because that’s how our brains work. They are molded by our experiences so that we can fit into all kinds of different situations—even the decidedly suboptimal ones. MOTHER TONGUE: Neuroscientist and psychologist Chantel Prat says the languages we speak play a huge role in shaping our minds and brains. Photo by Shaya Bendix Lyon. This is actually one of the most human things about all of our brains. In fact, according to some contemporary views of human evolution, our ancestors underwent a “cognitive revolution” precisely because they were forced to adapt. Based on evidence suggesting that the size of our ancestors’ brains increased following periods of extreme weather instability, one popular explanation for our remarkable flexibility is that the hominids who were not able to adapt to environmental changes didn’t survive. In other words, the brains of modern humans were selected for their ability to learn and adapt to changing environments. But one of the major costs of this remarkable flexibility is that humans are born without any significant preconceived notions about how things work. 
If you’ve ever had a conversation with someone about an event you both participated in that left you feeling like one of you was delusional because your stories were so different, you might have a hint about how much your experiences have shaped the way you understand the world around you. This can be insanely frustrating because—let’s face it—our own brains are really convincing when they construct our personal version of reality. Remember the Dress? Though it can feel like gaslighting when someone has a different reality from yours, it’s also entirely possible that you both were reporting your version of the truth. At the end of the day, the way people remember a story reflects differences in the way they experienced the original event. The scientific explanation for this boils down to differences in perspective. © 2022 NautilusThink Inc,

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 5: The Sensorimotor System
Link ID: 28427 - Posted: 08.11.2022

By S. Hussain Ather You reach over a stove to pick up a pot. What you didn’t realize was that the burner was still on. Ouch! That painful accident probably taught you a lesson. It’s adaptive to learn from unexpected events so that we don’t repeat our mistakes. Our brain may be primed to pay extra attention when we are surprised. In a recent Nature study, researchers at the Massachusetts Institute of Technology found evidence that a hormone, noradrenaline, alters brain activity—and an animal’s subsequent behavior—in these startling moments. Noradrenaline is one of several chemicals that can flood the brain with powerful signals. Past research shows that noradrenaline is involved when we are feeling excited, anxious or alert and that it contributes to learning. But the new research shows it plays a strong role in responses to the unexpected. The M.I.T. team used a method called optogenetics to study noradrenaline in mice. The scientists added special light-sensitive proteins to neurons that work as an “off switch” for the cells when hit by pulses of laser light. They focused on modifying a brain area called the locus coeruleus, which holds cells responsible for releasing noradrenaline. With lasers, the researchers were able to stop these cells from producing the hormone in specific circumstances. They combined this method with photo tagging, a technique in which proteins flash with light, allowing the scientists to observe activity in the locus coeruleus cells and then determine how much noradrenaline was produced. Then the researchers designed a trial-and-error learning task for the rodents. The mice could push levers when they heard a sound. There were two sounds. After high-frequency tones of about 12 kilohertz, mice that pushed a lever were rewarded with water they could drink. For low-frequency tones, around four kilohertz, the mice that hit the lever got a slightly unpleasant surprise: a discomforting puff of air was blown at them.
Over time, mice learned to push the lever only when they heard high-frequency tones because they got water when they did so. They avoided the lever when they heard low-frequency tones. © 2022 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 11: Emotions, Aggression, and Stress
Link ID: 28412 - Posted: 07.30.2022

Deepfakes – AI-generated videos and pictures of people – are becoming more and more realistic. This makes them the perfect weapon for disinformation and fraud. But while you might consciously be tricked by a deepfake, new evidence suggests that your brain knows better. Fake portraits cause different signals to fire on brain scans, according to a paper published in Vision Research. While you consciously can’t spot the fake (for those playing at home, the face on the right is the phony), your neurons are more reliable. “Your brain sees the difference between the two images. You just can’t see it yet,” says co-author Associate Professor Thomas Carlson, a researcher at the University of Sydney’s School of Psychology. The researchers asked volunteers to view a series of several hundred photos, some of which were real and some of which were fakes generated by a GAN (a Generative Adversarial Network, a common way of making deepfakes). One group of 200 participants was asked to guess which images were real and which were fake by pressing a button. A different group of 22 participants didn’t guess, but underwent electroencephalography (EEG) tests while they were viewing the images. The EEGs showed distinct signals when participants were viewing deepfakes, compared to real images. “The brain is responding different than when it sees a real image,” says Carlson. “It’s sort of difficult to figure out what exactly it’s picking up on, because all you can really see is that it is different – that’s something we’ll have to do more research to figure out.”

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 28402 - Posted: 07.16.2022

By Eiman Azim, Sliman Bensmaia, Lee E. Miller, Chris Versteeg Imagine you are playing the guitar. You’re seated, supporting the instrument’s weight across your lap. One hand strums; the other presses strings against the guitar’s neck to play chords. Your vision tracks sheet music on a page, and your hearing lets you listen to the sound. In addition, two other senses make playing this instrument possible. One of them, touch, tells you about your interactions with the guitar. Another, proprioception, tells you about your arms’ and hands’ positions and movements as you play. Together, these two capacities combine into what scientists call somatosensation, or body perception. Our skin and muscles have millions of sensors that contribute to somatosensation. Yet our brain does not become overwhelmed by the barrage of these inputs—or from any of our other senses, for that matter. You’re not distracted by the pinch of your shoes or the tug of the guitar strap as you play; you focus only on the sensory inputs that matter. The brain expertly enhances some signals and filters out others so that we can ignore distractions and focus on the most important details. How does the brain accomplish these feats of focus? In recent research at Northwestern University, the University of Chicago and the Salk Institute for Biological Studies in La Jolla, Calif., we have illuminated a new answer to this question. Through several studies, we have discovered that a small, largely ignored structure at the very bottom of the brain stem plays a critical role in the brain’s selection of sensory signals. The area is called the cuneate nucleus, or CN. Our research on the CN not only changes the scientific understanding of sensory processing, but it might also lay the groundwork for medical interventions to restore sensation in patients with injury or disease. © 2022 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 5: The Sensorimotor System
Link ID: 28330 - Posted: 05.18.2022

Imma Perfetto Have you ever driven past an intersection and registered you should have turned right a street ago, or been in a conversation and, as soon as the words are out of your mouth, realised you really shouldn’t have said that thing you just did? It’s a phenomenon known as performance monitoring: an internal signal produced by the brain that lets you know when you’ve made a mistake. Performance monitoring is a kind of self-generated feedback that’s essential to managing our daily lives. Now, neuroscientists have discovered that signals from neurons in the brain’s medial frontal cortex are responsible for it. A new study published in Science reports that these signals give humans the flexibility to learn new tasks and the focus to develop highly specific skills. “Part of the magic of the human brain is that it is so flexible,” says senior author Ueli Rutishauser, professor of Neurosurgery, Neurology, and Biomedical Sciences at Cedars-Sinai Medical Center, US. “We designed our study to decipher how the brain can generalise and specialise at the same time, both of which are critical for helping us pursue a goal.” They found that the performance monitoring signals help improve future attempts at a particular task by passing information to other areas of the brain. They also help the brain adjust its focus by signalling how much conflict or difficulty was encountered during the task. “An ‘Oops!’ moment might prompt someone to pay closer attention the next time they chat with a friend, or plan to stop at the store on the way home from work,” explains first author Zhongzheng Fu, researcher in the Rutishauser Laboratory at Cedars-Sinai.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 28322 - Posted: 05.11.2022

Minuscule involuntary eye movements, known as microsaccades, can occur even while one is carefully staring at a fixed point in space. When paying attention to something in the peripheral vision (called covert attention), these microsaccades sometimes align towards the object of interest. New research by National Eye Institute (NEI) investigators shows that while these microsaccades seem to boost or diminish the strength of the brain signals underlying attention, the eye movements are not drivers of those brain signals. The findings will help researchers interpret studies about covert attention and may open new areas for research into attention disorders and behavior. NEI is part of the National Institutes of Health. Scientists working on the neuroscience of attention have recently become concerned that, because both attention and eye movements like microsaccades involve the same groups of neurons in the brain, microsaccades might be required for shifting attention. “If microsaccades were driving attention, that would bring into question a lot of previous research in the field,” said Richard Krauzlis, Ph.D., chief of the NEI Section on Eye Movements and Visual Selection, and senior author of a study report on the research. “This work shows that while microsaccades and attention do share some mechanisms, covert attention is not driven by eye movements.” Krauzlis’ previous research has shown that covert attention causes a modulation of certain neuronal signals in an evolutionarily ancient area of the brain called the superior colliculus, which is involved in the detection of events. When attention is being paid to a particular area – for example, the right-hand side of one’s peripheral vision – signals in the superior colliculus relating to events that occur in that area will receive an extra boost, while signals relating to events occurring somewhere else, like on the left-hand side, will be depressed.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 28254 - Posted: 03.26.2022

By Conor Feehly There's a paradox in our ability to pay attention. When we are hyper-focused on our surroundings, our senses become more acutely aware of the signals they pick up. But sometimes when we are paying attention, we miss things in our sensory field that are so glaringly obvious, on a second look we can’t help but question the legitimacy of our perception. Back in 1999, the psychologist Daniel Simons created a clever scenario that vividly demonstrates this phenomenon. (Test it yourself in less than two minutes by watching Simons’ video here, which we recommend before the spoiler below.) In the scenario, there are two teams, each consisting of three players, with one team dressed in black and the other in white. The viewer is asked to count how many passes the team in white makes throughout the course of the video. Sure enough, as the video ends, most people are able to accurately report the number of passes. Then the narrator asks: But did you see the gorilla? As it turns out, someone in a gorilla suit slowly walks into the scene, in plain sight. Most people who watch the video for the first time and focus on counting passes completely overlook the out-of-place primate. It seems strange, given the viewer’s intent observation of the small field of view where the scene unfolds.
Predictive Processing
Neuroscientist Anil Seth offers an interesting explanation of this phenomenon in his book Being You: A New Science of Consciousness. Seth’s description draws from one of neuroscience’s leading theories of cognition and perception. © 2022 Kalmbach Media Co.

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 28208 - Posted: 02.19.2022