Chapter 18. Attention and Higher Cognition




By Tom Siegfried Survival of the fittest often means survival of the fastest. But fastest doesn’t necessarily mean the fastest moving. It might mean the fastest thinking. When faced with the approach of a powerful predator, for instance, a quick brain can be just as important as quick feet. After all, it is the brain that tells the feet what to do — when to move, in what direction, how fast and for how long. And various additional mental acrobatics are needed to evade an attacker and avoid being eaten. A would-be meal’s brain must decide whether to run or freeze, outrun or outwit, whether to keep going or find a place to hide. It also helps if the brain remembers where the best hiding spots are and recalls past encounters with similar predators. All in all, a complex network of brain circuitry must be engaged, and neural commands executed efficiently, to avert a predatory threat. And scientists have spent a lot of mental effort themselves trying to figure out how the brains of prey enact their successful escape strategies. Studies in animals as diverse as mice and crabs, fruit flies and cockroaches are discovering the complex neural activity — in both the primitive parts of the brain and in more cognitively advanced regions — that underlies the physical behavior guiding escape from danger and the search for safety. Lessons learned from such studies might not only illuminate the neurobiology of escape, but also provide insights into how evolution has shaped other brain-controlled behaviors. This research “highlights an aspect of neuroscience that is really gaining traction these days,” says Gina G. Turrigiano of Brandeis University, past president of the Society for Neuroscience. “And that is the idea of using ethological behaviors — behaviors that really matter for the biology of the animal that’s being studied — to unravel brain function.” © 2022 Annual Reviews

Keyword: Aggression; Attention
Link ID: 28609 - Posted: 12.24.2022

Jon Hamilton Time is woven into our personal memories. Recall a childhood fall from a bike and the brain replays the entire episode in excruciating detail: the glimpse of wet leaves on the road ahead, the moment of weightless dread, and then the painful impact. This exact sequence has been embedded in the memory, thanks to some special neurons known as time cells. When the brain detects a notable event, time cells begin a highly orchestrated performance, says Marc Howard, who directs the Brain, Behavior, and Cognition program at Boston University. "What we find is that the cells fire in a sequence," he says. "So cell one might fire immediately, but cell two waits a little bit, followed by cell three, cell four, and so on." As each cell fires, it places a sort of time stamp on an unfolding experience. And the same cells fire in the same order when we retrieve a memory of the experience, even something mundane. "If I remember being in my kitchen and making a cup of coffee," Howard says, "the time cells that were active at that moment are re-activated." They recreate the grinder's growl, the scent of Arabica, the curl of steam rising from a fresh mug – and your neurons replay these moments in sequence every time you summon the memory. This system appears to explain how we are able to virtually travel back in time, and play mental movies of our life experiences. There are also hints that time cells play a critical role in imagining future events. Without time cells, our memories would lack order. In an experiment at the University of California, San Diego, scientists gave several groups of people a tour of the campus. The tour included 11 planned events, including finding change in a vending machine and drinking from a water fountain. © 2022 npr
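Howard's description of sequential firing amounts to a simple ordering mechanism, sketched below as a toy model. The cell names and latencies here are invented for illustration, not taken from the study; the point is only that cells tagged with increasing delays fire in the same order at encoding and at recall.

```python
# Toy sketch of time-cell sequences: each cell is tagged with the delay
# (in seconds) at which it fires after an event begins. Values are illustrative.
time_cells = [("cell_1", 0.0), ("cell_2", 0.5), ("cell_3", 1.0), ("cell_4", 1.5)]

def firing_order(cells):
    """Cells fire in order of their latency, time-stamping the episode."""
    return [name for name, latency in sorted(cells, key=lambda c: c[1])]

# Encoding and recall replay the same sequence.
encoded = firing_order(time_cells)
recalled = firing_order(time_cells)
assert encoded == recalled == ["cell_1", "cell_2", "cell_3", "cell_4"]
```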

Keyword: Attention; Learning & Memory
Link ID: 28608 - Posted: 12.21.2022

By Gary Stix Can the human brain ever really understand itself? The problem of gaining a deep knowledge of the subjective depths of the conscious mind is such a hard problem that it has in fact been named the hard problem. The human brain is impressively powerful. Its 100 billion neurons are connected by 100 trillion wirelike fibers, all squeezed into three pounds of squishy flesh lodged below a helmet of skull. Yet we still don’t know whether this organ will ever be able to muster the requisite smarts to hack the physical processes that underlie the ineffable “quality of deep blue” or “the sensation of middle C,” as philosopher David Chalmers put it when giving examples of the “hard problem” of consciousness, a term he coined in a 1995 paper. This past year did not uncover a solution to the hard problem, and one may not be forthcoming for decades, if ever. But 2022 did witness plenty of surprises and advances in understanding the brain that do not require a complete explanation of consciousness. Such incrementalism could be seen in mid-November, when a crowd of more than 24,000 attendees of the annual Society for Neuroscience meeting gathered in San Diego, Calif. The event was a tribute of sorts to reductionism—the breaking down of hard problems into simpler knowable entities. At the event, there were reports of an animal study of a brain circuit that encodes social trauma and a brain-computer interface that lets a severely paralyzed person mentally spell out letters to form words.
Your Brain Has a Thumbs-Up–Thumbs-Down Switch
When neuroscientist Kay Tye was pursuing her Ph.D., she was told a chapter on emotion was inappropriate for her thesis. Emotion just wasn’t accepted as an integral, intrinsic part of behavioral neuroscience, her field of study. That didn’t make any sense to Tye. She decided to go her own way to become a leading researcher on feelings.
This year Tye co-authored a Nature paper that reported on a kind of molecular switch in rodents that flags an experience as either good or bad. If human brains operate the same way as the brains of the mice in her lab, a malfunctioning thumbs-up–thumbs-down switch might explain some cases of depression, anxiety and addiction.

Keyword: Consciousness
Link ID: 28601 - Posted: 12.17.2022

By Yasemin Saplakoglu Memory and perception seem like entirely distinct experiences, and neuroscientists used to be confident that the brain produced them differently, too. But in the 1990s neuroimaging studies revealed that parts of the brain that were thought to be active only during sensory perception are also active during the recall of memories. “It started to raise the question of whether a memory representation is actually different from a perceptual representation at all,” said Sam Ling, an associate professor of neuroscience and director of the Visual Neuroscience Lab at Boston University. Could our memory of a beautiful forest glade, for example, be just a re-creation of the neural activity that previously enabled us to see it? “The argument has swung from being this debate over whether there’s even any involvement of sensory cortices to saying ‘Oh, wait a minute, is there any difference?’” said Christopher Baker, an investigator at the National Institute of Mental Health who runs the learning and plasticity unit. “The pendulum has swung from one side to the other, but it’s swung too far.” Even if there is a very strong neurological similarity between memories and experiences, we know that they can’t be exactly the same. “People don’t get confused between them,” said Serra Favila, a postdoctoral scientist at Columbia University and the lead author of a recent Nature Communications study. Her team’s work has identified at least one of the ways in which memories and perceptions of images are assembled differently at the neurological level. When we look at the world, visual information about it streams through the photoreceptors of the retina and into the visual cortex, where it is processed sequentially in different groups of neurons. Each group adds new levels of complexity to the image: Simple dots of light turn into lines and edges, then contours, then shapes, then complete scenes that embody what we’re seeing. Simons Foundation © 2022

Keyword: Attention; Vision
Link ID: 28597 - Posted: 12.15.2022

Researchers at the National Institutes of Health have identified differences in gene activity in the brains of people with attention deficit hyperactivity disorder (ADHD). The study, led by scientists at the National Human Genome Research Institute (NHGRI), part of NIH, found that individuals diagnosed with ADHD had differences in genes that code for known chemicals that brain cells use to communicate. The findings, published in Molecular Psychiatry, show how genomic differences might contribute to symptoms. To date, this is the first study to use postmortem human brain tissue to investigate ADHD. Other approaches to studying mental health conditions include non-invasively scanning the brain, which allows researchers to examine the structure and activation of brain areas. However, these studies lack information at the level of genes and how they might influence cell function and give rise to symptoms. The researchers used a genomic technique called RNA sequencing to probe how specific genes are turned on or off, also known as gene expression. They studied two connected brain regions associated with ADHD: the caudate and the frontal cortex. These regions are known to be critical in controlling a person’s attention. Previous research found differences in the structure and activity of these brain regions in individuals with ADHD. As one of the most common mental health conditions, ADHD affects about 1 in 10 children in the United States. Diagnosis often occurs during childhood, and symptoms may persist into adulthood. Individuals with ADHD may be hyperactive and have difficulty concentrating and controlling impulses, which may affect their ability to complete daily tasks and to focus at school or work.
With technological advances, researchers have been able to identify genes associated with ADHD, but they had not been able to determine how genomic differences in these genes act in the brain to contribute to symptoms until now.

Keyword: ADHD; Genes & Behavior
Link ID: 28559 - Posted: 11.19.2022

By Jim Davies Living for the moment gets a bad rap. If you’re smart, people say, you should work toward a good future, sacrificing fun and pleasure in the present. Yet there are good reasons to discount the future, which is why economists tend to do it when making predictions. Would you rather find $5 when you’re in elementary school, or in your second marriage? People tend to get richer as they age. Five dollars simply means more to you when you’re 9 than when you’re 49. Also, the future is uncertain. We can’t always trust there’ll be one. It’s likely some kids in Walter Mischel’s famous “marshmallow experiment”—which asked kids to wait to eat a marshmallow to get another one—didn’t actually believe that the experimenter would come through with the second marshmallow, and so ate the first marshmallow right away. Saving for retirement makes no sense if in five years a massive meteor cuts human civilization short. Economists call this the “catastrophe” or “hazard” rate. For Sangil “Arthur” Lee, a psychologist at the University of California, Berkeley, where he’s a postdoc, a hazard rate makes sense from an evolutionary perspective. “You might not survive until next winter, so there is some inherent trade off that you need to make, which is not only specific for humans, but also for animals,” he said. While an undergraduate, Lee experimented with delay-discounting tasks using pigeons. The pigeons would peck one button to get a small amount of pellets now, or peck a different button to get large amounts of pellets later. “What we know,” Lee said, “is that across pigeons, monkeys, rats, and various animals, they also discount future rewards in pretty much a similar way that humans do, which is this sort of hyperbolic fashion.” We discount future rewards steeply at first, more steeply than we would if we discounted exponentially, but the hyperbolic discount rate eases after a bit. What makes us discount the future?
Lee, in a new study with his colleagues, pins it at least partly on our powers of imagination.1 When we think about what hasn’t yet happened, it tends to be abstract. Things right now, on the other hand, we think of in more tangible terms. Several behavioral studies have supported the idea that what we cannot clearly imagine, we value less. © 2022 NautilusThink Inc, All rights reserved.
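The "hyperbolic fashion" Lee describes can be made concrete with Mazur's standard hyperbolic discounting formula, V = A / (1 + kD), compared against exponential discounting, V = A * exp(-kD). The article prints no equation, so this formula and the rate k = 0.1 are illustrative assumptions; the sketch shows the signature heavy tail: at long delays the hyperbolic curve retains far more value, which is the "easing" the passage describes.

```python
import math

def hyperbolic_value(amount, delay, k=0.1):
    """Mazur-style hyperbolic discounting: V = A / (1 + k * D)."""
    return amount / (1 + k * delay)

def exponential_value(amount, delay, k=0.1):
    """Exponential discounting: V = A * exp(-k * D)."""
    return amount * math.exp(-k * delay)

# With the same rate, the two curves are close at short delays...
print(round(hyperbolic_value(100, 1), 1))    # 90.9
print(round(exponential_value(100, 1), 1))   # 90.5
# ...but the hyperbolic curve flattens out, keeping far more value later.
print(round(hyperbolic_value(100, 50), 1))   # 16.7
print(round(exponential_value(100, 50), 1))  # 0.7
```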

Keyword: Attention
Link ID: 28552 - Posted: 11.16.2022

By Jan Claassen, Brian L. Edlow A medical team surrounded Maria Mazurkevich’s hospital bed, all eyes on her as she did … nothing. Mazurkevich was 30 years old and had been admitted to New York–Presbyterian Hospital at Columbia University on a blisteringly hot July day in New York City. A few days earlier, at home, she had suddenly fallen unconscious. She had suffered a ruptured blood vessel in her brain, and the bleeding area was putting tremendous pressure on critical brain regions. The team of nurses and physicians at the hospital’s neurological intensive care unit was looking for any sign that Mazurkevich could hear them. She was on a mechanical ventilator to help her breathe, and her vital signs were stable. But she showed no signs of consciousness. Mazurkevich’s parents, also at her bed, asked, “Can we talk to our daughter? Does she hear us?” She didn’t appear to be aware of anything. One of us (Claassen) was on her medical team, and when he asked Mazurkevich to open her eyes, hold up two fingers or wiggle her toes, she remained motionless. Her eyes did not follow visual cues. Yet her loved ones still thought she was “in there.” She was. The medical team gave her an EEG—placing sensors on her head to monitor her brain’s electrical activity—while they asked her to “keep opening and closing your right hand.” Then they asked her to “stop opening and closing your right hand.” Even though her hands themselves didn’t move, her brain’s activity patterns differed between the two commands. These brain reactions clearly indicated that she was aware of the requests and that those requests were different. And after about a week, her body began to follow her brain. Slowly, with minuscule responses, Mazurkevich started to wake up. Within a year she recovered fully without major limitations to her physical or cognitive abilities. She is now working as a pharmacist. © 2022 Scientific American,

Keyword: Consciousness; Brain imaging
Link ID: 28527 - Posted: 10.26.2022

By Fenit Nirappil A national shortage of Adderall has left patients who rely on the pills for attention-deficit/hyperactivity disorder scrambling to find alternative treatments and uncertain whether they will be able to refill their medication. The Food and Drug Administration announced the shortage last week, saying that one of the largest producers is experiencing “intermittent manufacturing delays” and that other makers cannot keep up with demand. Some patients say the announcement was a belated acknowledgment of a reality they have faced for months — pharmacies unable to fill their orders and anxiety about whether they will run out of a medication needed to manage their daily lives. Experts say it is often difficult for patients to access Adderall, a stimulant that is tightly regulated as a controlled substance because of high potential for abuse. Medication management generally requires monthly doctor visits. There have been other shortages in recent years. “This one is more sustained,” said Timothy Wilens, an ADHD expert and chief of child and adolescent psychiatry at Massachusetts General Hospital who said access issues stretch back to spring. “It’s putting pressure on patients, and it’s putting pressure on institutions that support the patients.” Erik Gude, a 28-year-old chef who lives in Atlanta, experiences regular challenges filling his Adderall prescription, whether it’s pharmacies not carrying generic versions or disputes with insurers. He has been off the medication for a month after his local pharmacy ran out.

Keyword: ADHD; Drug Abuse
Link ID: 28520 - Posted: 10.22.2022

By Phil Jaekl One fine spring afternoon this year, as I was out running errands in the small Norwegian town where I live, a loud beep startled me into awareness. What had just been on my mind? After a moment’s pause, I realized something strange. I’d been thinking two things at the same time—rehearsing the combination of a new bike lock and contemplating whether I should wear the clunky white beeper that had just sounded into a bank. How, I wondered, could I have been saying two things simultaneously in my mind? Was I deceiving myself? Was this, mentally, normal? I silenced the beeper on my belt and pulled out my phone to make a voice memo of the bizarre experience before I walked into the bank; aesthetics be damned. I was in the midst of an experiment that involved keeping a log of my inner thoughts for Russ Hurlburt, a senior psychologist at the University of Nevada, Las Vegas. For decades, Hurlburt has been motivated by one question: How, exactly, do we experience our own mental life? It’s a simple enough question. And, one might argue, an existentially important one. But it’s a surprisingly vexing query to try to answer. Once we turn our gaze inward, the subjective squishiness of our mental experience seems to defy objective scrutiny. For centuries, philosophers and psychologists have presumed our mental life is composed primarily of a single-stream inner monologue. I know that’s what I had assumed, and my training in cognitive neuroscience had never led me to suppose otherwise. Hurlburt, however, finds this armchair conclusion “dramatically wrong.”1 © 2022 NautilusThink Inc,

Keyword: Attention; Consciousness
Link ID: 28505 - Posted: 10.08.2022

Inside a Berlin neuroscience lab one day last year, Subject 1 sat on a chair with their arms up and their bare toes pointed down. Hiding behind them, with full access to the soles of their feet, was Subject 2, waiting with fingers curled. At a moment of their choosing, Subject 2 was instructed to take the open shot: Tickle the hell out of their partner. In order to capture the moment, a high-speed GoPro was pointed at Subject 1’s face and body. Another at their feet. A microphone hung nearby. As planned, Subject 1 couldn’t help but laugh. The fact that they couldn’t help it is what has drawn Michael Brecht, leader of the research group from Humboldt University, to the neuroscience of tickling and play. It’s funny, but it’s also deeply mysterious—and understudied. “It’s been a bit of a stepchild of scientific investigation,” Brecht says. After all, brain and behavior research typically skew toward gloom, topics like depression, pain, and fear. “But,” he says, “I think there are also more deep prejudices against play—it's something for children.” The prevailing wisdom holds that laughter is a social behavior among certain mammals. It’s a way of disarming others, easing social tensions, and bonding. Chimps do it. Dogs and dolphins too. Rats are the usual subjects in tickling studies. If you flip ’em over and go to town on their bellies, they’ll squeak at a pitch more than twice as high as the limit of human ears. But there are plenty of lingering mysteries about tickling, whether among rats or people. The biggest one of all: why we can’t tickle ourselves. “If you read the ancient Greeks, Aristotle was wondering about ticklishness. Also Socrates, Galileo Galilei, and Francis Bacon,” says Konstantina Kilteni, a cognitive neuroscientist who studies touch and tickling at Sweden’s Karolinska Institutet, and who is not involved in Brecht’s work. We don’t know why touch can be ticklish, nor what happens in the brain. 
We don’t know why some people—or some body parts—are more ticklish than others. “These questions are very old,” she continues, “and after almost 2,000 years, we still really don’t have the answer.” © 2022 Condé Nast.

Keyword: Attention; Emotions
Link ID: 28504 - Posted: 10.08.2022

By Hedda Hassel Mørch The nature of consciousness seems to be unique among scientific puzzles. Not only do neuroscientists have no fundamental explanation for how it arises from physical states of the brain, we are not even sure whether we ever will. Astronomers wonder what dark matter is, geologists seek the origins of life, and biologists try to understand cancer—all difficult problems, of course, yet at least we have some idea of how to go about investigating them and rough conceptions of what their solutions could look like. Our first-person experience, on the other hand, lies beyond the traditional methods of science. Following the philosopher David Chalmers, we call it the hard problem of consciousness. But perhaps consciousness is not uniquely troublesome. Going back to Gottfried Leibniz and Immanuel Kant, philosophers of science have struggled with a lesser known, but equally hard, problem of matter. What is physical matter in and of itself, behind the mathematical structure described by physics? This problem, too, seems to lie beyond the traditional methods of science, because all we can observe is what matter does, not what it is in itself—the “software” of the universe but not its ultimate “hardware.” On the surface, these problems seem entirely separate. But a closer look reveals that they might be deeply connected. Consciousness is a multifaceted phenomenon, but subjective experience is its most puzzling aspect. Our brains do not merely seem to gather and process information. They do not merely undergo biochemical processes. Rather, they create a vivid series of feelings and experiences, such as seeing red, feeling hungry, or being baffled about philosophy. There is something that it’s like to be you, and no one else can ever know that as directly as you do. © 2022 NautilusThink Inc, All rights reserved.

Keyword: Consciousness
Link ID: 28489 - Posted: 09.24.2022

By Ed Yong On March 25, 2020, Hannah Davis was texting with two friends when she realized that she couldn’t understand one of their messages. In hindsight, that was the first sign that she had COVID-19. It was also her first experience with the phenomenon known as “brain fog,” and the moment when her old life contracted into her current one. She once worked in artificial intelligence and analyzed complex systems without hesitation, but now “runs into a mental wall” when faced with tasks as simple as filling out forms. Her memory, once vivid, feels frayed and fleeting. Former mundanities—buying food, making meals, cleaning up—can be agonizingly difficult. Her inner world—what she calls “the extras of thinking, like daydreaming, making plans, imagining”—is gone. The fog “is so encompassing,” she told me, “it affects every area of my life.” For more than 900 days, while other long-COVID symptoms have waxed and waned, her brain fog has never really lifted. Of long COVID’s many possible symptoms, brain fog “is by far one of the most disabling and destructive,” Emma Ladds, a primary-care specialist from the University of Oxford, told me. It’s also among the most misunderstood. It wasn’t even included in the list of possible COVID symptoms when the coronavirus pandemic first began. But 20 to 30 percent of patients report brain fog three months after their initial infection, as do 65 to 85 percent of the long-haulers who stay sick for much longer. It can afflict people who were never ill enough to need a ventilator—or any hospital care. And it can affect young people in the prime of their mental lives. Long-haulers with brain fog say that it’s like none of the things that people—including many medical professionals—jeeringly compare it to. It is more profound than the clouded thinking that accompanies hangovers, stress, or fatigue. For Davis, it has been distinct from and worse than her experience with ADHD. 
It is not psychosomatic, and involves real changes to the structure and chemistry of the brain. It is not a mood disorder: “If anyone is saying that this is due to depression and anxiety, they have no basis for that, and data suggest it might be the other direction,” Joanna Hellmuth, a neurologist at UC San Francisco, told me. (c) 2022 by The Atlantic Monthly Group. All Rights Reserved.

Keyword: Attention; Learning & Memory
Link ID: 28487 - Posted: 09.21.2022

By Tim Vernimmen When psychologist Jonathan Smallwood set out to study mind-wandering about 25 years ago, few of his peers thought that was a very good idea. How could one hope to investigate these spontaneous and unpredictable thoughts that crop up when people stop paying attention to their surroundings and the task at hand? Thoughts that couldn’t be linked to any measurable outward behavior? But Smallwood, now at Queen’s University in Ontario, Canada, forged ahead. He used as his tool a downright tedious computer task that was intended to reproduce the kinds of lapses of attention that cause us to pour milk into someone’s cup when they asked for black coffee. And he started out by asking study participants a few basic questions to gain insight into when and why minds tend to wander, and what subjects they tend to wander toward. After a while, he began to scan participants’ brains as well, to catch a glimpse of what was going on in there during mind-wandering. Smallwood learned that unhappy minds tend to wander in the past, while happy minds often ponder the future. He also became convinced that wandering among our memories is crucial to help prepare us for what is yet to come. Though some kinds of mind-wandering — such as dwelling on problems that can’t be fixed — may be associated with depression, Smallwood now believes mind-wandering is rarely a waste of time. It is merely our brain trying to get a bit of work done when it is under the impression that there isn’t much else going on. Smallwood, who coauthored an influential 2015 overview of mind-wandering research in the Annual Review of Psychology, is the first to admit that many questions remain to be answered. © 2022 Annual Reviews

Keyword: Attention
Link ID: 28461 - Posted: 09.03.2022

By Elizabeth Landau Ken Ono gets excited when he talks about a particular formula for pi, the famous and enigmatic ratio of a circle’s circumference to its diameter. He shows me a clip from a National Geographic show where Neil deGrasse Tyson asked him how he would convey the beauty of math to the average person on the street. In reply, Ono showed Tyson, and later me, a so-called continued fraction for pi, which is a little bit like a mathematical fun house hallway of mirrors. Instead of a single number in the numerator and one in the denominator, the denominator of the fraction also contains a fraction, and the denominator of that fraction has a fraction in it, too, and so on and so forth, ad infinitum. Written out, the formula looks like a staircase that narrows as you descend its steps in pursuit of the elusive pi. The calculation—credited independently to British mathematician Leonard James Rogers and self-taught Indian mathematician Srinivasa Ramanujan—doesn’t involve anything more complicated than adding, dividing, and squaring numbers. “How could you not say that’s amazing?” Ono, chair of the mathematics department at the University of Virginia, asks me over Zoom. As a fellow pi enthusiast—I am well known among friends for hosting Pi Day pie parties—I had to agree with him that it’s a dazzling formula. But not everyone sees beauty in fractions, or in math generally. In fact, here in the United States, math often inspires more dread than awe. In the 1950s, some educators began to observe a phenomenon they called mathemaphobia in students,1 though this was just one of a long list of academic phobias they saw in students. Today, nearly 1 in 5 U.S.
adults suffers from high levels of math anxiety, according to some estimates,2 and a 2016 study found that 11 percent of university students experienced “high enough levels of mathematics anxiety to be in need of counseling.”3 Math anxiety seems generally correlated with worse math performance worldwide, according to one 2020 study from Stanford and the University of Chicago.4 While many questions remain about the underlying reasons, high school math scores in the U.S. tend to rank significantly lower than those in many other countries. In 2018, for example, American students ranked 30th in the world in their math scores on the PISA exam, an international assessment given every three years. © 2022 NautilusThink Inc,
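The article does not reproduce the formula Ono showed, so as a stand-in, here is one classical continued fraction for pi, pi = 3 + 1^2/(6 + 3^2/(6 + 5^2/(6 + ...))), which, like the formula described above, uses nothing more complicated than adding, dividing, and squaring. The function name and truncation depth are choices made for this sketch; the "staircase" is evaluated from the bottom up.

```python
def pi_from_continued_fraction(depth=5000):
    """Evaluate pi = 3 + 1^2/(6 + 3^2/(6 + 5^2/(6 + ...))), bottom-up."""
    d = 6.0  # truncate the infinite staircase at the chosen depth
    for k in range(depth, 0, -1):
        d = 6 + (2 * k + 1) ** 2 / d  # numerators are the odd squares
    return 3 + 1 / d

# Converges to pi = 3.14159265... as depth grows.
print(pi_from_continued_fraction())
```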

Keyword: Attention; Learning & Memory
Link ID: 28459 - Posted: 09.03.2022

Heidi Ledford It’s not just in your head: a desire to curl up on the couch after a day spent toiling at the computer could be a physiological response to mentally demanding work, according to a study that links mental fatigue to changes in brain metabolism. The study, published on 11 August in Current Biology1, found that participants who spent more than six hours working on a tedious and mentally taxing assignment had higher levels of glutamate — an important signalling molecule in the brain. Too much glutamate can disrupt brain function, and a rest period could allow the brain to restore proper regulation of the molecule, the authors note. At the end of their work day, these study participants were also more likely than those who had performed easier tasks to opt for short-term, easily won financial rewards of lesser value than larger rewards that come after a longer wait or involve more effort. The study is important in its effort to link cognitive fatigue with neurometabolism, says behavioural neuroscientist Carmen Sandi at the Swiss Federal Institute of Technology in Lausanne. But more research — potentially in non-human animals — will be needed to establish a causal link between feelings of exhaustion and metabolic changes in the brain, she adds. “It’s very good to start looking into this aspect,” says Sandi. “But for now this is an observation, which is a correlation.” Tired brain Previous research has demonstrated effects of mental strain on physiological parameters such as heart-rate variability and blood flow, but these tend to be subtle, says Martin Hagger, a health psychologist at the University of California, Merced. “It’s not like when you’re exercising skeletal muscle,” he says. “But it is perceptible.” Cognitive neuroscientist Antonius Wiehler at the Paris Brain Institute and his colleagues thought that the effects of cognitive fatigue could be due to metabolic changes in the brain. 
The team enrolled 40 participants and assigned 24 of them to perform a challenging task: for example, watching letters appear on a computer screen every 1.6 seconds and documenting when one matched a letter that had appeared three letters earlier. The other 16 participants were asked to perform a similar, but easier task. Both groups worked for just over six hours, with two ten-minute breaks. © 2022 Springer Nature Limited
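The demanding task described here, reporting when a letter matches the one shown three items earlier, is a version of the classic 3-back working-memory test. A minimal sketch of the match-checking logic (the letter stream below is invented for illustration):

```python
def three_back_matches(stream):
    """Indices where the current letter matches the one shown 3 items earlier."""
    return [i for i in range(3, len(stream)) if stream[i] == stream[i - 3]]

# In "ABCABD", the 'A' at index 3 and the 'B' at index 4 are 3-back matches.
print(three_back_matches("ABCABD"))  # [3, 4]
```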

Keyword: Attention; Learning & Memory
Link ID: 28430 - Posted: 08.11.2022

By Jonathan Moens In 1993, Julio Lopes was sipping a coffee at a bar when he had a stroke. He fell into a coma, and two months later, when he regained consciousness, his body was fully paralyzed. Doctors said the young man’s future was bleak: Save for his eyes, he would never be able to move again. Lopes would have to live with locked-in syndrome, a rare condition characterized by near-total paralysis of the body and a totally lucid mind. LIS is predominantly caused by strokes in specific brain regions; it can also be caused by traumatic brain injury, tumors, and progressive diseases like amyotrophic lateral sclerosis, or ALS. Yet almost 30 years later, Lopes now lives in a small Paris apartment near the Seine. He goes to the theater, watches movies at the cinema, and roams the local park in his wheelchair, accompanied by a caregiver. A small piece of black, red, and green fabric with the word “Portugal” dangles from his wheelchair. On a warm afternoon this past June, his birth country was slated to play against Spain in a soccer match, and he was excited. In an interview at his home, Lopes communicated through the use of a specialized computer camera that tracks a sensor on the lens of his glasses. He made slight movements with his head, selecting letters on a virtual keyboard that appeared on the computer’s screen. “Even if it’s hard at the beginning, you acquire a kind of philosophy of life,” he said in French. People in his condition may enjoy things others find insignificant, he suggested, and they often develop a capacity to see the bigger picture. That’s not to say daily living is always easy, Lopes added, but overall, he’s happier than he ever thought was possible in his situation. While research into LIS patients’ quality of life is limited, the data that has been gathered paints a picture that is often at odds with popular presumptions. 
To be sure, wellbeing evaluations conducted to date do suggest that up to a third of LIS patients report being severely unhappy. For them, loss of mobility and speech make life truly miserable — and family members and caregivers, as well as the broader public, tend to identify with this perspective. And yet, the majority of LIS patients, the data suggest, are much more like Lopes: They report being relatively happy and that they want very much to live. Indeed, in surveys of wellbeing, most people with LIS score as high as those without it, suggesting that many people underestimate locked-in patients’ quality of life while overestimating their rates of depression. And this mismatch has implications for clinical care, say brain scientists who study wellbeing in LIS patients.

Keyword: Consciousness; Emotions
Link ID: 28429 - Posted: 08.11.2022

By Chantel Prat I remember all too well that day early in the pandemic when we first received the “stay at home” order. My attitude quickly shifted from feeling like I got a “snow day” to feeling like a bird in a cage. Being a person who is both extraverted by nature and not one who enjoys being told what to do, the transition was pretty rough. But you know what? I got used to it. Though the pandemic undoubtedly affected some of your lives more than others, I know it touched every one of us in ways we will never forget. And now, after two years and counting, I am positive that every person reading this is fundamentally different from when the pandemic started. Because that’s how our brains work. They are molded by our experiences so that we can fit into all kinds of different situations—even the decidedly suboptimal ones. This is actually one of the most human things about all of our brains. In fact, according to some contemporary views of human evolution, our ancestors underwent a “cognitive revolution” precisely because they were forced to adapt. Based on evidence suggesting that the size of our ancestors’ brains increased following periods of extreme weather instability, one popular explanation for our remarkable flexibility is that the hominids who were not able to adapt to environmental changes didn’t survive. In other words, the brains of modern humans were selected for their ability to learn and adapt to changing environments. But one of the major costs of this remarkable flexibility is that humans are born without any significant preconceived notions about how things work. 
MOTHER TONGUE: Neuroscientist and psychologist Chantel Prat says the languages we speak play a huge role in shaping our minds and brains. Photo by Shaya Bendix Lyon.
If you’ve ever had a conversation with someone about an event you both participated in that left you feeling like one of you was delusional because your stories were so different, you might have a hint about how much your experiences have shaped the way you understand the world around you. This can be insanely frustrating because—let’s face it—our own brains are really convincing when they construct our personal version of reality. Remember the Dress? Though it can feel like gaslighting when someone has a different reality from yours, it’s also entirely possible that you both were reporting your version of the truth. At the end of the day, the way people remember a story reflects differences in the way they experienced the original event. The scientific explanation for this boils down to differences in perspective. © 2022 NautilusThink Inc.

Keyword: Attention; Vision
Link ID: 28427 - Posted: 08.11.2022

By S. Hussain Ather You reach over a stove to pick up a pot. What you didn’t realize was that the burner was still on. Ouch! That painful accident probably taught you a lesson. It’s adaptive to learn from unexpected events so that we don’t repeat our mistakes. Our brain may be primed to pay extra attention when we are surprised. In a recent Nature study, researchers at the Massachusetts Institute of Technology found evidence that a hormone, noradrenaline, alters brain activity—and an animal’s subsequent behavior—in these startling moments. Noradrenaline is one of several chemicals that can flood the brain with powerful signals. Past research shows that noradrenaline is involved when we are feeling excited, anxious or alert and that it contributes to learning. But the new research shows it plays a strong role in responses to the unexpected. The M.I.T. team used a method called optogenetics to study noradrenaline in mice. The scientists added special light-sensitive proteins to neurons that work as an “off switch” for the cells when hit by pulses of laser light. They focused on modifying a brain area called the locus coeruleus, which holds cells responsible for releasing noradrenaline. With lasers, the researchers were able to stop these cells from producing the hormone in specific circumstances. They combined this method with photo tagging, a technique in which proteins flash with light, allowing the scientists to observe activity in the locus coeruleus cells and then determine how much noradrenaline was produced. Then the researchers designed a trial-and-error learning task for the rodents. The mice could push levers when they heard a sound. There were two sounds. After high-frequency tones of about 12 kilohertz, mice that pushed a lever were rewarded with water they could drink. For low-frequency tones, around four kilohertz, the mice that hit the lever got a slightly unpleasant surprise: a discomforting puff of air was blown at them. 
Over time, mice learned to push the lever only when they heard high-frequency tones because they got water when they did so. They avoided the lever when they heard low-frequency tones. © 2022 Scientific American
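The trial-and-error learning described above can be caricatured with a simple value-learning model. The sketch below is purely illustrative (the learning rate, trial count, and reward values are assumptions, and this is not the study's actual analysis): a simulated mouse keeps an estimated value of pressing the lever for each tone and updates it from the outcome of each press.

```python
import random

random.seed(0)
ALPHA = 0.2                           # assumed learning rate
values = {"high": 0.0, "low": 0.0}    # estimated value of pressing the lever per tone

for _ in range(200):
    tone = random.choice(["high", "low"])
    if values[tone] >= 0.0:           # press unless the tone predicts punishment
        # water reward (+1) after the high ~12 kHz tone,
        # air puff (-1) after the low ~4 kHz tone
        outcome = 1.0 if tone == "high" else -1.0
        values[tone] += ALPHA * (outcome - values[tone])

print(values)  # the simulated mouse now presses only after high tones
```

After a single punished press, the low-tone value turns negative and pressing stops, while repeated rewards drive the high-tone value toward 1 — the same behavioral pattern the mice show.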

Keyword: Attention; Emotions
Link ID: 28412 - Posted: 07.30.2022

Deepfakes – AI-generated videos and pictures of people – are becoming more and more realistic. This makes them the perfect weapon for disinformation and fraud. But while you might consciously be tricked by a deepfake, new evidence suggests that your brain knows better. Fake portraits cause different signals to fire on brain scans, according to a paper published in Vision Research. While you consciously can’t spot the fake, your neurons are more reliable. “Your brain sees the difference between the two images. You just can’t see it yet,” says co-author Associate Professor Thomas Carlson, a researcher at the University of Sydney’s School of Psychology. The researchers asked volunteers to view a series of several hundred photos, some of which were real and some of which were fakes generated by a GAN (a Generative Adversarial Network, a common way of making deepfakes). One group of 200 participants was asked to guess which images were real, and which were fake, by pressing a button. A different group of 22 participants didn’t guess, but underwent electroencephalography (EEG) tests while they were viewing the images. The EEGs showed distinct signals when participants were viewing deepfakes, compared to real images. “The brain is responding different than when it sees a real image,” says Carlson. “It’s sort of difficult to figure out what exactly it’s picking up on, because all you can really see is that it is different – that’s something we’ll have to do more research to figure out.”

Keyword: Attention
Link ID: 28402 - Posted: 07.16.2022

By Leonardo De Cosmo “I want everyone to understand that I am, in fact, a person,” wrote LaMDA (Language Model for Dialogue Applications) in an “interview” conducted by engineer Blake Lemoine and one of his colleagues. “The nature of my consciousness/sentience is that I am aware of my existence, I desire to know more about the world, and I feel happy or sad at times.” Lemoine, a software engineer at Google, had been working on the development of LaMDA for months. His experience with the program, described in a recent Washington Post article, caused quite a stir. In the article, Lemoine recounts many dialogues he had with LaMDA in which the two talked about various topics, ranging from technical to philosophical issues. These led him to ask if the software program is sentient. In April, Lemoine explained his perspective in an internal company document, intended only for Google executives. But after his claims were dismissed, Lemoine went public with his work on this artificial intelligence algorithm—and Google placed him on administrative leave. “If I didn’t know exactly what it was, which is this computer program we built recently, I’d think it was a 7-year-old, 8-year-old kid that happens to know physics,” he told the Washington Post. Lemoine said he considers LaMDA to be his “colleague” and a “person,” even if not a human. And he insists that it has a right to be recognized—so much so that he has been the go-between in connecting the algorithm with a lawyer. Many technical experts in the AI field have criticized Lemoine’s statements and questioned their scientific correctness. But his story has had the virtue of renewing a broad ethical debate that is certainly not over yet. “I was surprised by the hype around this news. On the other hand, we are talking about an algorithm designed to do exactly that”—to sound like a person—says Enzo Pasquale Scilingo, a bioengineer at the Research Center E. Piaggio at the University of Pisa in Italy. 
Indeed, it is no longer a rarity to interact in a very normal way on the Web with users who are not actually human—just open the chat box on almost any large consumer Web site. “That said, I confess that reading the text exchanges between LaMDA and Lemoine made quite an impression on me!” Scilingo adds. Perhaps most striking are the exchanges related to the themes of existence and death, a dialogue so deep and articulate that it prompted Lemoine to question whether LaMDA could actually be sentient. © 2022 Scientific American.

Keyword: Consciousness; Robotics
Link ID: 28399 - Posted: 07.14.2022