Chapter 18. Attention and Higher Cognition


By Hedda Hassel Mørch The nature of consciousness seems to be unique among scientific puzzles. Not only do neuroscientists have no fundamental explanation for how it arises from physical states of the brain, we are not even sure whether we ever will. Astronomers wonder what dark matter is, geologists seek the origins of life, and biologists try to understand cancer—all difficult problems, of course, yet at least we have some idea of how to go about investigating them and rough conceptions of what their solutions could look like. Our first-person experience, on the other hand, lies beyond the traditional methods of science. Following the philosopher David Chalmers, we call it the hard problem of consciousness. But perhaps consciousness is not uniquely troublesome. Going back to Gottfried Leibniz and Immanuel Kant, philosophers of science have struggled with a lesser known, but equally hard, problem of matter. What is physical matter in and of itself, behind the mathematical structure described by physics? This problem, too, seems to lie beyond the traditional methods of science, because all we can observe is what matter does, not what it is in itself—the “software” of the universe but not its ultimate “hardware.” On the surface, these problems seem entirely separate. But a closer look reveals that they might be deeply connected. Consciousness is a multifaceted phenomenon, but subjective experience is its most puzzling aspect. Our brains do not merely seem to gather and process information. They do not merely undergo biochemical processes. Rather, they create a vivid series of feelings and experiences, such as seeing red, feeling hungry, or being baffled about philosophy. There is something that it’s like to be you, and no one else can ever know that as directly as you do. © 2022 NautilusThink Inc, All rights reserved.

Keyword: Consciousness
Link ID: 28489 - Posted: 09.24.2022

By Ed Yong On March 25, 2020, Hannah Davis was texting with two friends when she realized that she couldn’t understand one of their messages. In hindsight, that was the first sign that she had COVID-19. It was also her first experience with the phenomenon known as “brain fog,” and the moment when her old life contracted into her current one. She once worked in artificial intelligence and analyzed complex systems without hesitation, but now “runs into a mental wall” when faced with tasks as simple as filling out forms. Her memory, once vivid, feels frayed and fleeting. Former mundanities—buying food, making meals, cleaning up—can be agonizingly difficult. Her inner world—what she calls “the extras of thinking, like daydreaming, making plans, imagining”—is gone. The fog “is so encompassing,” she told me, “it affects every area of my life.” For more than 900 days, while other long-COVID symptoms have waxed and waned, her brain fog has never really lifted. Of long COVID’s many possible symptoms, brain fog “is by far one of the most disabling and destructive,” Emma Ladds, a primary-care specialist from the University of Oxford, told me. It’s also among the most misunderstood. It wasn’t even included in the list of possible COVID symptoms when the coronavirus pandemic first began. But 20 to 30 percent of patients report brain fog three months after their initial infection, as do 65 to 85 percent of the long-haulers who stay sick for much longer. It can afflict people who were never ill enough to need a ventilator—or any hospital care. And it can affect young people in the prime of their mental lives. Long-haulers with brain fog say that it’s like none of the things that people—including many medical professionals—jeeringly compare it to. It is more profound than the clouded thinking that accompanies hangovers, stress, or fatigue. For Davis, it has been distinct from and worse than her experience with ADHD. 
It is not psychosomatic, and involves real changes to the structure and chemistry of the brain. It is not a mood disorder: “If anyone is saying that this is due to depression and anxiety, they have no basis for that, and data suggest it might be the other direction,” Joanna Hellmuth, a neurologist at UC San Francisco, told me. (c) 2022 by The Atlantic Monthly Group. All Rights Reserved.

Keyword: Attention; Learning & Memory
Link ID: 28487 - Posted: 09.21.2022

By Tim Vernimmen When psychologist Jonathan Smallwood set out to study mind-wandering about 25 years ago, few of his peers thought that was a very good idea. How could one hope to investigate these spontaneous and unpredictable thoughts that crop up when people stop paying attention to their surroundings and the task at hand? Thoughts that couldn’t be linked to any measurable outward behavior? But Smallwood, now at Queen’s University in Ontario, Canada, forged ahead. He used as his tool a downright tedious computer task that was intended to reproduce the kinds of lapses of attention that cause us to pour milk into someone’s cup when they asked for black coffee. And he started out by asking study participants a few basic questions to gain insight into when and why minds tend to wander, and what subjects they tend to wander toward. After a while, he began to scan participants’ brains as well, to catch a glimpse of what was going on in there during mind-wandering. Smallwood learned that unhappy minds tend to wander in the past, while happy minds often ponder the future. He also became convinced that wandering among our memories is crucial to help prepare us for what is yet to come. Though some kinds of mind-wandering — such as dwelling on problems that can’t be fixed — may be associated with depression, Smallwood now believes mind-wandering is rarely a waste of time. It is merely our brain trying to get a bit of work done when it is under the impression that there isn’t much else going on. Smallwood, who coauthored an influential 2015 overview of mind-wandering research in the Annual Review of Psychology, is the first to admit that many questions remain to be answered. © 2022 Annual Reviews

Keyword: Attention
Link ID: 28461 - Posted: 09.03.2022

By Elizabeth Landau Ken Ono gets excited when he talks about a particular formula for pi, the famous and enigmatic ratio of a circle’s circumference to its diameter. He shows me a clip from a National Geographic show where Neil deGrasse Tyson asked him how he would convey the beauty of math to the average person on the street. In reply, Ono showed Tyson, and later me, a so-called continued fraction for pi, which is a little bit like a mathematical fun house hallway of mirrors. Instead of a single number in the numerator and one in the denominator, the denominator of the fraction also contains a fraction, and the denominator of that fraction has a fraction in it, too, and so on and so forth, ad infinitum. Written out, the formula looks like a staircase that narrows as you descend its steps in pursuit of the elusive pi. The calculation—credited independently to British mathematician Leonard James Rogers and self-taught Indian mathematician Srinivasa Ramanujan—doesn’t involve anything more complicated than adding, dividing, and squaring numbers. “How could you not say that’s amazing?” Ono, chair of the mathematics department at the University of Virginia, asks me over Zoom. As a fellow pi enthusiast—I am well known among friends for hosting Pi Day pie parties—I had to agree with him that it’s a dazzling formula. But not everyone sees beauty in fractions, or in math generally. In fact, here in the United States, math often inspires more dread than awe. In the 1950s, some educators began to observe a phenomenon they called mathemaphobia,1 though it was just one of a long list of academic phobias they observed in students. Today, nearly 1 in 5 U.S. 
adults suffers from high levels of math anxiety, according to some estimates,2 and a 2016 study found that 11 percent of university students experienced “high enough levels of mathematics anxiety to be in need of counseling.”3 Math anxiety seems generally correlated with worse math performance worldwide, according to one 2020 study from Stanford and the University of Chicago.4 While many questions remain about the underlying reasons, high school math scores in the U.S. tend to rank significantly lower than those in many other countries. In 2018, for example, American students ranked 30th in the world in their math scores on the PISA exam, an international assessment given every three years. © 2022 NautilusThink Inc,

Keyword: Attention; Learning & Memory
Link ID: 28459 - Posted: 09.03.2022

Heidi Ledford It’s not just in your head: a desire to curl up on the couch after a day spent toiling at the computer could be a physiological response to mentally demanding work, according to a study that links mental fatigue to changes in brain metabolism. The study, published on 11 August in Current Biology1, found that participants who spent more than six hours working on a tedious and mentally taxing assignment had higher levels of glutamate — an important signalling molecule in the brain. Too much glutamate can disrupt brain function, and a rest period could allow the brain to restore proper regulation of the molecule, the authors note. At the end of their work day, these study participants were also more likely than those who had performed easier tasks to opt for short-term, easily won financial rewards of lesser value than larger rewards that come after a longer wait or involve more effort. The study is important in its effort to link cognitive fatigue with neurometabolism, says behavioural neuroscientist Carmen Sandi at the Swiss Federal Institute of Technology in Lausanne. But more research — potentially in non-human animals — will be needed to establish a causal link between feelings of exhaustion and metabolic changes in the brain, she adds. “It’s very good to start looking into this aspect,” says Sandi. “But for now this is an observation, which is a correlation.”

Tired brain

Previous research has demonstrated effects of mental strain on physiological parameters such as heart-rate variability and blood flow, but these tend to be subtle, says Martin Hagger, a health psychologist at the University of California, Merced. “It’s not like when you’re exercising skeletal muscle,” he says. “But it is perceptible.” Cognitive neuroscientist Antonius Wiehler at the Paris Brain Institute and his colleagues thought that the effects of cognitive fatigue could be due to metabolic changes in the brain.
The team enrolled 40 participants and assigned 24 of them to perform a challenging task: for example, watching letters appear on a computer screen every 1.6 seconds and documenting when one matched a letter that had appeared three letters earlier. The other 16 participants were asked to perform a similar but easier task. Both groups worked for just over six hours, with two ten-minute breaks. © 2022 Springer Nature Limited
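The letter-matching assignment described here is what cognitive scientists call an n-back task (a 3-back, in this case). A minimal sketch of the matching rule, using a hypothetical letter stream (the study’s actual stimuli aren’t reproduced in the excerpt):

```python
def three_back_targets(stream):
    """Indices where the current letter matches the one shown
    three letters earlier -- the moments a participant should respond."""
    return [i for i in range(3, len(stream)) if stream[i] == stream[i - 3]]

# Hypothetical stream; in the study a new letter appeared every 1.6 seconds.
letters = list("TKRTQMRXWR")
print(three_back_targets(letters))  # → [3, 9]
```

Holding three items in memory while continuously updating them is what makes the task mentally taxing; the easier version given to the other group typically reduces this memory load (e.g., matching the immediately preceding letter).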

Keyword: Attention; Learning & Memory
Link ID: 28430 - Posted: 08.11.2022

By Jonathan Moens In 1993, Julio Lopes was sipping a coffee at a bar when he had a stroke. He fell into a coma, and two months later, when he regained consciousness, his body was fully paralyzed. Doctors said the young man’s future was bleak: Save for his eyes, he would never be able to move again. Lopes would have to live with locked-in syndrome (LIS), a rare condition characterized by near-total paralysis of the body and a totally lucid mind. LIS is predominantly caused by strokes in specific brain regions; it can also be caused by traumatic brain injury, tumors, and progressive diseases like amyotrophic lateral sclerosis, or ALS. Yet almost 30 years later, Lopes now lives in a small Paris apartment near the Seine. He goes to the theater, watches movies at the cinema, and roams the local park in his wheelchair, accompanied by a caregiver. A small piece of black, red, and green fabric with the word “Portugal” dangles from his wheelchair. On a warm afternoon this past June, his birth country was slated to play against Spain in a soccer match, and he was excited. In an interview at his home, Lopes communicated through the use of a specialized computer camera that tracks a sensor on the lens of his glasses. He made slight movements with his head, selecting letters on a virtual keyboard that appeared on the computer’s screen. “Even if it’s hard at the beginning, you acquire a kind of philosophy of life,” he said in French. People in his condition may enjoy things others find insignificant, he suggested, and they often develop a capacity to see the bigger picture. That’s not to say daily living is always easy, Lopes added, but overall, he’s happier than he ever thought was possible in his situation. While research into LIS patients’ quality of life is limited, the data that has been gathered paints a picture that is often at odds with popular presumptions. 
To be sure, wellbeing evaluations conducted to date do suggest that up to a third of LIS patients report being severely unhappy. For them, loss of mobility and speech make life truly miserable — and family members and caregivers, as well as the broader public, tend to identify with this perspective. And yet, the majority of LIS patients, the data suggest, are much more like Lopes: They report being relatively happy and that they want very much to live. Indeed, in surveys of wellbeing, most people with LIS score as high as those without it, suggesting that many people underestimate locked-in patients’ quality of life while overestimating their rates of depression. And this mismatch has implications for clinical care, say brain scientists who study wellbeing in LIS patients.

Keyword: Consciousness; Emotions
Link ID: 28429 - Posted: 08.11.2022

By Chantel Prat I remember all too well that day early in the pandemic when we first received the “stay at home” order. My attitude quickly shifted from feeling like I got a “snow day” to feeling like a bird in a cage. Being a person who is both extraverted by nature and not one who enjoys being told what to do, I found the transition pretty rough. But you know what? I got used to it. Though the pandemic undoubtedly affected some of your lives more than others, I know it touched every one of us in ways we will never forget. And now, after two years and counting, I am positive that every person reading this is fundamentally different from when the pandemic started. Because that’s how our brains work. They are molded by our experiences so that we can fit into all kinds of different situations—even the decidedly suboptimal ones.

[MOTHER TONGUE: Neuroscientist and psychologist Chantel Prat says the languages we speak play a huge role in shaping our minds and brains. Photo by Shaya Bendix Lyon.]

This is actually one of the most human things about all of our brains. In fact, according to some contemporary views of human evolution, our ancestors underwent a “cognitive revolution” precisely because they were forced to adapt. Based on evidence suggesting that the size of our ancestors’ brains increased following periods of extreme weather instability, one popular explanation for our remarkable flexibility is that the hominids who were not able to adapt to environmental changes didn’t survive. In other words, the brains of modern humans were selected for their ability to learn and adapt to changing environments. But one of the major costs of this remarkable flexibility is that humans are born without any significant preconceived notions about how things work. 
If you’ve ever had a conversation with someone about an event you both participated in that left you feeling like one of you was delusional because your stories were so different, you might have a hint about how much your experiences have shaped the way you understand the world around you. This can be insanely frustrating because—let’s face it—our own brains are really convincing when they construct our personal version of reality. Remember the Dress? Though it can feel like gaslighting when someone has a different reality from yours, it’s also entirely possible that you both were reporting your version of the truth. At the end of the day, the way people remember a story reflects differences in the way they experienced the original event. The scientific explanation for this boils down to differences in perspective. © 2022 NautilusThink Inc,

Keyword: Attention; Vision
Link ID: 28427 - Posted: 08.11.2022

By S. Hussain Ather You reach over a stove to pick up a pot. What you didn’t realize was that the burner was still on. Ouch! That painful accident probably taught you a lesson. It’s adaptive to learn from unexpected events so that we don’t repeat our mistakes. Our brain may be primed to pay extra attention when we are surprised. In a recent Nature study, researchers at the Massachusetts Institute of Technology found evidence that a hormone, noradrenaline, alters brain activity—and an animal’s subsequent behavior—in these startling moments. Noradrenaline is one of several chemicals that can flood the brain with powerful signals. Past research shows that noradrenaline is involved when we are feeling excited, anxious or alert and that it contributes to learning. But the new research shows it plays a strong role in responses to the unexpected. The M.I.T. team used a method called optogenetics to study noradrenaline in mice. The scientists added special light-sensitive proteins to neurons that work as an “off switch” for the cells when hit by pulses of laser light. They focused on modifying a brain area called the locus coeruleus, which holds cells responsible for releasing noradrenaline. With lasers, the researchers were able to stop these cells from producing the hormone in specific circumstances. They combined this method with photo tagging, a technique in which proteins flash with light, allowing the scientists to observe activity in the locus coeruleus cells and then determine how much noradrenaline was produced. Then the researchers designed a trial-and-error learning task for the rodents. The mice could push levers when they heard a sound. There were two sounds. After high-frequency tones of about 12 kilohertz, mice that pushed a lever were rewarded with water they could drink. For low-frequency tones, around four kilohertz, the mice that hit the lever got a slightly unpleasant surprise: a discomforting puff of air was blown at them. 
Over time, mice learned to push the lever only when they heard high-frequency tones because they got water when they did so. They avoided the lever when they heard low-frequency tones. © 2022 Scientific American

Keyword: Attention; Emotions
Link ID: 28412 - Posted: 07.30.2022

Deepfakes – AI-generated videos and pictures of people – are becoming more and more realistic. This makes them the perfect weapon for disinformation and fraud. But while you might consciously be tricked by a deepfake, new evidence suggests that your brain knows better. Fake portraits cause different signals to fire on brain scans, according to a paper published in Vision Research. While you consciously can’t spot the fake, your neurons are more reliable. “Your brain sees the difference between the two images. You just can’t see it yet,” says co-author Associate Professor Thomas Carlson, a researcher at the University of Sydney’s School of Psychology. The researchers asked volunteers to view a series of several hundred photos, some of which were real and some of which were fakes generated by a GAN (a Generative Adversarial Network, a common way of making deepfakes). One group of 200 participants was asked to guess which images were real, and which were fake, by pressing a button. A different group of 22 participants didn’t guess, but underwent electroencephalography (EEG) tests while they were viewing the images. The EEGs showed distinct signals when participants were viewing deepfakes, compared to real images. “The brain is responding different than when it sees a real image,” says Carlson. “It’s sort of difficult to figure out what exactly it’s picking up on, because all you can really see is that it is different – that’s something we’ll have to do more research to figure out.”

Keyword: Attention
Link ID: 28402 - Posted: 07.16.2022

By Leonardo De Cosmo “I want everyone to understand that I am, in fact, a person,” wrote LaMDA (Language Model for Dialogue Applications) in an “interview” conducted by engineer Blake Lemoine and one of his colleagues. “The nature of my consciousness/sentience is that I am aware of my existence, I desire to know more about the world, and I feel happy or sad at times.” Lemoine, a software engineer at Google, had been working on the development of LaMDA for months. His experience with the program, described in a recent Washington Post article, caused quite a stir. In the article, Lemoine recounts many dialogues he had with LaMDA in which the two talked about various topics, ranging from technical to philosophical issues. These led him to ask if the software program is sentient. In April, Lemoine explained his perspective in an internal company document, intended only for Google executives. But after his claims were dismissed, Lemoine went public with his work on this artificial intelligence algorithm—and Google placed him on administrative leave. “If I didn’t know exactly what it was, which is this computer program we built recently, I’d think it was a 7-year-old, 8-year-old kid that happens to know physics,” he told the Washington Post. Lemoine said he considers LaMDA to be his “colleague” and a “person,” even if not a human. And he insists that it has a right to be recognized—so much so that he has been the go-between in connecting the algorithm with a lawyer. Many technical experts in the AI field have criticized Lemoine’s statements and questioned their scientific correctness. But his story has had the virtue of renewing a broad ethical debate that is certainly not over yet. “I was surprised by the hype around this news. On the other hand, we are talking about an algorithm designed to do exactly that”—to sound like a person—says Enzo Pasquale Scilingo, a bioengineer at the Research Center E. Piaggio at the University of Pisa in Italy. 
Indeed, it is no longer a rarity to interact in a very normal way on the Web with users who are not actually human—just open the chat box on almost any large consumer Web site. “That said, I confess that reading the text exchanges between LaMDA and Lemoine made quite an impression on me!” Scilingo adds. Perhaps most striking are the exchanges related to the themes of existence and death, a dialogue so deep and articulate that it prompted Lemoine to question whether LaMDA could actually be sentient. © 2022 Scientific American,

Keyword: Consciousness; Robotics
Link ID: 28399 - Posted: 07.14.2022

Mo Costandi Exactly how, and how much, the unconscious processing of information influences our behavior has always been one of the most controversial questions in psychology. In the early 20th century, Sigmund Freud popularized the idea that our behaviors are driven by thoughts, feelings, and memories hidden deep within the unconscious mind — an idea that became hugely popular, but that was eventually dismissed as unscientific. Modern neuroscience tells us that we are completely unaware of most brain activity, but that unconscious processing does indeed influence behavior; nevertheless, certain effects, such as unconscious semantic “priming,” have been called into question, leading some to conclude that the extent of unconscious processing is limited. A recent brain scanning study now shows that unconsciously processed visual information is distributed to a wider network of brain regions involved in higher-order cognitive tasks. The results contribute to the debate over the extent to which unconscious information processing influences the brain and behavior, and have led the authors of the study to revise one of the leading theories of consciousness.

Unconscious processing

Ning Mei and his colleagues at the Basque Center on Cognition, Brain, and Language in Spain recruited 7 participants and showed them visual images while scanning their brains with functional magnetic resonance imaging (fMRI). Half of the images were of living things, and the other half were of inanimate objects. All of them could be grouped into ten categories, such as animal or boat. The participants viewed a total of 1,728 images, presented in blocks of 32, over a six-day period, each with a one-hour scanning session. © 2007-2022 Big Think

Keyword: Consciousness
Link ID: 28390 - Posted: 07.12.2022

By John Horgan Have you ever been gripped by the suspicion that nothing is real? A student at Stevens Institute of Technology, where I teach, has endured feelings of unreality since childhood. She recently made a film about this syndrome for her senior thesis, for which she interviewed herself and others, including me. “It feels like there’s a glass wall between me and everything else in the world,” Camille says in her film, which she calls Depersonalized; Derealized; Deconstructed. Derealization and depersonalization refer to feelings that the external world and your own self, respectively, are unreal. Lumping the terms together, psychiatrists define depersonalization/derealization disorder as “persistent or recurrent … experiences of unreality, detachment, or being an outside observer with respect to one’s thoughts, feelings, sensations, body, or actions,” according to the Diagnostic and Statistical Manual of Mental Disorders. For simplicity, I’ll refer to both syndromes as derealization. Some people experience derealization out of the blue, others only under stressful circumstances—for example, while taking a test or interviewing for a job. Psychiatrists prescribe psychotherapy and medication, such as antidepressants, when the syndrome results in “distress or impairment in social, occupational, or other important areas of functioning.” In some cases, derealization results from serious mental illness, such as schizophrenia, or hallucinogens such as LSD. Extreme cases, usually associated with brain damage, may manifest as Cotard delusion, also called walking corpse syndrome, the belief that you are dead; and Capgras delusion, the conviction that people around you have been replaced by imposters. © 2022 Scientific American

Keyword: Consciousness; Attention
Link ID: 28370 - Posted: 06.14.2022

William E. Pelham, Jr. For decades, many physicians, parents and teachers have believed that stimulant medications help children with ADHD learn because they are able to focus and behave better when medicated. After all, an estimated 6.1 million children in the U.S. are diagnosed with attention-deficit/hyperactivity disorder, and more than 90% are prescribed stimulant medication as the main form of treatment in school settings. However, in a peer-reviewed study that several colleagues and I published in the Journal of Consulting and Clinical Psychology, we found medication has no detectable effect on how much children with ADHD learn in the classroom. At least that’s the case when learning – defined as the acquisition of performable skills or knowledge through instruction – is measured in terms of tests meant to assess improvements in a student’s current academic knowledge or skills over time. Compared to their peers, children with ADHD exhibit more off-task, disruptive classroom behavior, earn lower grades and score lower on tests. They are more likely to receive special education services and be retained for a grade, and less likely to finish high school and enter college – two educational milestones that are associated with significant increases in earnings. In this study, funded by the National Institute of Mental Health, we evaluated 173 children between the ages of 7 and 12. They were all participants in our Summer Treatment Program, a comprehensive eight-week summer camp for children with ADHD and related behavioral, emotional and learning challenges. Children got grade-level instruction in vocabulary, science and social studies. The classes were led by certified teachers. The children received medication the first half of summer and a placebo during the other half. They were tested at the start of each academic instruction block, which lasted approximately three weeks. They then took the same test at the end to determine how much they learned. 
© 2010–2022, The Conversation US, Inc.

Keyword: ADHD; Learning & Memory
Link ID: 28366 - Posted: 06.11.2022

By Hilary Achauer I sat in a dark room, eyes closed, with a device strapped to my head that looked like a futuristic bike helmet. For 10 minutes, while I concentrated on not accidentally opening my eyes, the prongs sticking out of this gadget and onto my scalp measured a health marker I never thought to assess: my cognitive health. When I booked my brain wave recording (also known as electroencephalography, or EEG), I expected to pull up to an office park with medical clinic vibes, but instead my GPS led me to an ocean-view storefront decorated like a cross between a surf shop and a luxury spa, with a sign in the window promising “Mental Wellness, Reimagined.” Located in Cardiff-by-the-Sea, a wealthy coastal town north of San Diego, Wave Neuroscience promises to help your brain perform better with a noninvasive treatment that uses magnets on the brain. We’re talking mental clarity, improved focus and concentration, and even a shift in mood. As a 48-year-old whose work requires focus and creativity, I was intrigued, but also nervous. Should I mess with a brain that, while not perfect, functions reasonably well? Getting the EEG, which costs $100, was like meditating with a device strapped to my head, but it was more relaxing than that sounds. The tech gave me periodic updates, letting me know how much time had elapsed, and afterward I was ushered into an office where I met with Alexander Ring, director of applied science at Wave Neuroscience, via Zoom. Together we reviewed my “braincare report,” a one-page analysis generated in five minutes, comparing my brain waves with Wave Neuroscience’s database of tens of thousands of EEGs. Ring said my brain was generally performing well and that I showed cognitive flexibility and a capability to focus under pressure, but that I had a little bit more theta activity, or slow brain waves, than they normally like to see. 
He also pointed out a slight frequency mismatch between the back and front of my brain, which might affect my concentration and cause me to have to reread a paragraph to absorb the information. Rude, but accurate. © 2022 The Slate Group LLC. All rights reserved.

Keyword: Brain imaging; Attention
Link ID: 28347 - Posted: 06.01.2022

By Eiman Azim, Sliman Bensmaia, Lee E. Miller, Chris Versteeg Imagine you are playing the guitar. You’re seated, supporting the instrument’s weight across your lap. One hand strums; the other presses strings against the guitar’s neck to play chords. Your vision tracks sheet music on a page, and your hearing lets you listen to the sound. In addition, two other senses make playing this instrument possible. One of them, touch, tells you about your interactions with the guitar. Another, proprioception, tells you about your arms’ and hands’ positions and movements as you play. Together, these two capacities combine into what scientists call somatosensation, or body perception. Our skin and muscles have millions of sensors that contribute to somatosensation. Yet our brain does not become overwhelmed by the barrage of these inputs—or from any of our other senses, for that matter. You’re not distracted by the pinch of your shoes or the tug of the guitar strap as you play; you focus only on the sensory inputs that matter. The brain expertly enhances some signals and filters out others so that we can ignore distractions and focus on the most important details. How does the brain accomplish these feats of focus? In recent research at Northwestern University, the University of Chicago and the Salk Institute for Biological Studies in La Jolla, Calif., we have illuminated a new answer to this question. Through several studies, we have discovered that a small, largely ignored structure at the very bottom of the brain stem plays a critical role in the brain’s selection of sensory signals. The area is called the cuneate nucleus, or CN. Our research on the CN not only changes the scientific understanding of sensory processing, but it might also lay the groundwork for medical interventions to restore sensation in patients with injury or disease. © 2022 Scientific American

Keyword: Attention
Link ID: 28330 - Posted: 05.18.2022

Imma Perfetto Have you ever driven past an intersection and registered that you should have turned right a street ago, or been in a conversation and, as soon as the words are out of your mouth, realised you really shouldn’t have said that thing you just did? It’s a phenomenon known as performance monitoring: an internal signal produced by the brain that lets you know when you’ve made a mistake. Performance monitoring is a kind of self-generated feedback that’s essential to managing our daily lives. Now, neuroscientists have discovered that signals from neurons in the brain’s medial frontal cortex are responsible for it. A new study published in Science reports that these signals give humans the flexibility to learn new tasks and the focus to develop highly specific skills. “Part of the magic of the human brain is that it is so flexible,” says senior author Ueli Rutishauser, professor of Neurosurgery, Neurology, and Biomedical Sciences at Cedars-Sinai Medical Center, US. “We designed our study to decipher how the brain can generalise and specialise at the same time, both of which are critical for helping us pursue a goal.” They found that the performance monitoring signals help improve future attempts at a particular task by passing information to other areas of the brain. They also help the brain adjust its focus by signalling how much conflict or difficulty was encountered during the task. “An ‘Oops!’ moment might prompt someone to pay closer attention the next time they chat with a friend, or plan to stop at the store on the way home from work,” explains first author Zhongzheng Fu, researcher in the Rutishauser Laboratory at Cedars-Sinai.

Keyword: Attention; Learning & Memory
Link ID: 28322 - Posted: 05.11.2022

By Richard Sandomir Terry Wallis, who spontaneously regained his ability to speak after a traumatic brain injury left him virtually unresponsive for 19 years, and who then became a subject of a major study that showed how a damaged brain could heal itself, died on March 29 in a rehabilitation facility in Searcy, Ark. He was 57. He had pneumonia and heart problems, said his brother George Wallis, who confirmed the death. Terry Wallis was 19 when the pickup truck he was in with two friends skidded off a small bridge in the Ozark Mountains of northern Arkansas and landed upside down in a dry riverbed. The accident left him in a coma for a brief time, then in a persistent vegetative state for several months. One friend died; the other recovered. Until 2003, Mr. Wallis lay in a nursing home in a minimally conscious state, able to track objects with his eyes or blink on command. But on June 11, 2003, he effectively returned to the world when, upon seeing his mother, Angilee, he suddenly said, “Mom.” At the sight of the woman he was told was his adult daughter, Amber, who was six weeks old at the time of the accident, he said, “You’re beautiful,” and told her that he loved her. “Within a three-day period, from saying ‘Mom’ and ‘Pepsi,’ he had regained verbal fluency,” said Dr. Nicholas Schiff, a professor of neurology and neuroscience at Weill Cornell Medicine in Manhattan who led imaging studies of Mr. Wallis’s brain. The findings were presented in 2006 in The Journal of Clinical Investigation. “He was disoriented,” Dr. Schiff, in a phone interview, said of Mr. Wallis’s emergence. “He thought it was still 1984, but otherwise he knew all the people in his family and had that fluency.” Mr. Wallis’s brain scans — the first ever of a late-recovering patient — revealed changes in the strength of apparent connections within the back of the brain, which is believed to have helped his conscious awareness, and in the midline cerebellum, an area involved in motor control, which may have accounted for the very limited movement in his arms and legs while he was minimally conscious. © 2022 The New York Times Company

Keyword: Consciousness
Link ID: 28273 - Posted: 04.09.2022

Minuscule involuntary eye movements, known as microsaccades, can occur even while one is carefully staring at a fixed point in space. When paying attention to something in the peripheral vision (called covert attention), these microsaccades sometimes align towards the object of interest. New research by National Eye Institute (NEI) investigators shows that while these microsaccades seem to boost or diminish the strength of the brain signals underlying attention, the eye movements are not drivers of those brain signals. The findings will help researchers interpret studies about covert attention and may open new areas for research into attention disorders and behavior. NEI is part of the National Institutes of Health. Scientists working on the neuroscience of attention have recently become concerned that, because both attention and eye movements like microsaccades involve the same groups of neurons in the brain, microsaccades might be required for shifting attention. “If microsaccades were driving attention, that would bring into question a lot of previous research in the field,” said Richard Krauzlis, Ph.D., chief of the NEI Section on Eye Movements and Visual Selection, and senior author of a study report on the research. “This work shows that while microsaccades and attention do share some mechanisms, covert attention is not driven by eye movements.” Krauzlis’ previous research has shown that covert attention causes a modulation of certain neuronal signals in an evolutionarily ancient area of the brain called the superior colliculus, which is involved in the detection of events. When attention is being paid to a particular area – for example, the right-hand side of one’s peripheral vision – signals in the superior colliculus relating to events that occur in that area will receive an extra boost, while signals relating to events occurring somewhere else, like on the left-hand side, will be depressed.

Keyword: Attention; Vision
Link ID: 28254 - Posted: 03.26.2022

By Laura Sanders Like all writers, I spend large chunks of my time looking for words. When it comes to the ultracomplicated and mysterious brain, I need words that capture nuance and uncertainties. The right words confront and address hard questions about exactly what new scientific findings mean, and just as importantly, why they matter. The search for the right words is on my mind because of recent research on COVID-19 and the brain. As part of a large brain-scanning study, researchers found that infections of SARS-CoV-2, the virus that causes COVID-19, were linked with less gray matter, tissue that’s packed with the bodies of brain cells. The results, published March 7 in Nature, prompted headlines about COVID-19 causing brain damage and shrinkage. That coverage, in turn, prompted alarmed posts on social media, including mentions of early-onset dementia and brain rotting. As someone who has reported on brain research for more than a decade, I can say those alarming words are not the ones that I would choose here. The study is one of the first to look at structural changes in the brain before and after a SARS-CoV-2 infection. And the study is meticulous. It was done by an expert group of brain imaging researchers who have been doing this sort of research for a very long time. As part of the UK Biobank project, 785 participants underwent two MRI scans. Between those scans, 401 people had COVID-19 and 384 people did not. By comparing the before and after scans, researchers could spot changes in the people who had COVID-19 and compare those changes with people who didn’t get the infection. © Society for Science & the Public 2000–2022.
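The logic of the UK Biobank design described above — scan everyone twice, then contrast the between-scan change across infected and uninfected groups so that ordinary aging-related change cancels out — can be sketched as a simple two-sample comparison. Only the group sizes (401 and 384) come from the article; the per-participant gray-matter values below are invented for illustration, as is the assumed effect size.

```python
import random

random.seed(0)

# Hypothetical percent change in gray-matter volume between the two MRI
# scans for each participant; the means and spread are made-up numbers,
# not the study's actual measurements.
covid_changes = [random.gauss(-0.7, 1.0) for _ in range(401)]    # had COVID-19
control_changes = [random.gauss(-0.2, 1.0) for _ in range(384)]  # did not

def mean(xs):
    return sum(xs) / len(xs)

# Because both groups were scanned twice, change that happens to everyone
# (e.g., normal aging between scans) appears in both means and drops out
# of the difference; what remains is the change associated with infection.
diff = mean(covid_changes) - mean(control_changes)
print(f"extra gray-matter change in COVID group: {diff:.2f} percentage points")
```

This is only the skeleton of the comparison; the actual study additionally adjusts for confounders such as age, sex, and scan interval, which a bare difference of means does not capture.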

Keyword: Learning & Memory; Attention
Link ID: 28246 - Posted: 03.19.2022

Gabino Iglesias The Man Who Tasted Words is a deep dive into the world of our senses — one that explores the way they shape our reality and what happens when something malfunctions or functions differently. Despite the complicated science permeating the narrative and the plethora of medical explanations, the book is also part memoir. And because of the way the author, Dr. Guy Leschziner, treats his patients — and how he presents the ways their conditions affect their lives and those of the people around them — it is also a very humane, heartfelt book. We rely on vision, hearing, taste, smell, and touch not only to perceive the reality around us but also to help us navigate it by constantly processing stimuli, predicting what will happen based on previous experiences, and filling the gaps of everything we miss as we construct it. However, that truth, the "reality" we see, taste, hear, touch, and smell, isn't actually there; our brains, with the help of our nervous system, continuously build it for us. But sometimes our brains or nervous systems have a glitch, and that glitch affects reality. The Man Who Tasted Words carefully looks at — and tries to explain — some of the most bizarre glitches. "What we believe to be a precise representation of the world around us is nothing more than an illusion, layer upon layer of processing of sensory information, and the interpretation of that information according to our expectations," states Leschziner. When one of those senses doesn't work correctly, that illusion morphs in ways that significantly impact the lives of those whose nervous systems or brains work differently. Paul, for example, is a man who feels no pain. While this sounds like a great "flaw" to have, Leschziner shows it's the opposite. Pain helps humans learn "to avoid sharp or hot objects." 
It teaches that certain things in our environment are potentially harmful, tells us when we've had an injury and makes us protect it, and even lets us know there's an infection in our body so we can go to the doctor. © 2022 npr

Keyword: Consciousness
Link ID: 28233 - Posted: 03.11.2022