Chapter 14. Attention and Higher Cognition


By Dana G. Smith Do you: Cut the tags out of your clothes? Relive (and regret) past conversations? Have episodes of burnout and fatigue? Zone out while someone is talking? Become hyper-focused while working on a project? Take on dozens of hobbies? Daydream? Forget things? According to TikTok, you might have attention deficit hyperactivity disorder. Videos about the psychiatric condition are all over the social media app, with the #adhd hashtag receiving more than 17 billion views to date. Many feature young people describing their specific (and sometimes surprising) symptoms, like sensitivity to small sensory annoyances (such as clothing tags) or A.D.H.D. paralysis, a type of extreme procrastination. After viewing these videos, many people who were not diagnosed with A.D.H.D. as children may question whether they would qualify as adults. As with most psychiatric conditions, A.D.H.D. symptoms can range in type and severity. And many of them “are behaviors everyone experiences at some point or another,” said Joel Nigg, a professor of psychiatry at Oregon Health & Science University. Diagnosing the condition, however, requires “determining that it’s serious, it’s extreme” and that it’s interfering with people’s lives, he said. It’s also critical that the symptoms have been present since childhood. Those nuances can be lost on social media, experts say. In fact, one study published earlier this year found that more than half of the A.D.H.D. videos on TikTok were misleading. If a video (or article) has you thinking you may have undiagnosed A.D.H.D., here’s what to consider. Approximately 4 percent of adults in the United States have enough symptoms to qualify for A.D.H.D., but only an estimated one in 10 of them is diagnosed and treated. For comparison, roughly 9 percent of children in the United States have been diagnosed with the condition, and three-quarters have received medication or behavioral therapy for it.
One reason for the lack of diagnoses in adults is that when people think of A.D.H.D., they often imagine a boy who can’t sit still and is disruptive in class, said Dr. Deepti Anbarasan, a clinical associate professor of psychiatry at the NYU Grossman School of Medicine. But those stereotypical hyperactive symptoms are present in just 5 percent of adult cases, she said. © 2023 The New York Times Company

Keyword: ADHD
Link ID: 28646 - Posted: 01.27.2023

By Jennifer Szalai “‘R’s’ are hard,” John Hendrickson writes in his new memoir, “Life on Delay: Making Peace With a Stutter,” committing to paper a string of words that would have caused him trouble had he tried to say them out loud. In November 2019, Hendrickson, an editor at The Atlantic, published an article about then-presidential candidate Joe Biden, who talked frequently about “beating” his childhood stutter — a bit of hyperbole that the article finally laid to rest. Biden insisted on his redemptive narrative, even though Hendrickson, who has stuttered since he was 4, could tell when Biden repeated (“I-I-I-I-I”) or blocked (“…”) on certain sounds. The article went viral, putting Hendrickson in the position of being invited to go on television — a “nightmare,” he said on MSNBC at the time, though it did lead to a flood of letters from fellow stutterers, a number of whom he interviewed for this book. “Life on Delay” traces an arc from frustration and isolation to acceptance and community, recounting a lifetime of bullying and well-meaning but ineffectual interventions and what Hendrickson calls “hundreds of awful first impressions.” When he depicts scenes from his childhood it’s often in a real-time present tense, putting us in the room with the boy he was, more than two decades before. Hendrickson also interviews people: experts, therapists, stutterers, his own parents. He calls up his kindergarten teacher, his childhood best friend and the actress Emily Blunt. 
He reaches out to others who have published personal accounts of stuttering, including The New Yorker’s Nathan Heller and Katharine Preston, the author of a memoir titled “Out With It.” We learn that it’s only been since the turn of the millennium or so that stuttering has been understood as a neurological disorder; that for 75 percent of children who stutter, “the issue won’t follow them to adulthood”; that there’s still disagreement over whether “disfluency” is a matter of language or motor control, because “the research is still a bit of a mess.” © 2023 The New York Times Company

Keyword: Language; Attention
Link ID: 28643 - Posted: 01.27.2023

By Alessandra Buccella, Tomáš Dominik Imagine you are shopping online for a new pair of headphones. There is an array of colors, brands and features to look at. You feel that you can pick any model that you like and are in complete control of your decision. When you finally click the “add to shopping cart” button, you believe that you are doing so out of your own free will. But what if we told you that while you thought that you were still browsing, your brain activity had already highlighted the headphones you would pick? That idea may not be so far-fetched. Though neuroscientists likely could not predict your choice with 100 percent accuracy, research has demonstrated that some information about your upcoming action is present in brain activity several seconds before you even become conscious of your decision. As early as the 1960s, studies found that when people perform a simple, spontaneous movement, their brain exhibits a buildup in neural activity—what neuroscientists call a “readiness potential”—before they move. In the 1980s, neuroscientist Benjamin Libet reported this readiness potential even preceded a person’s reported intention to move, not just their movement. In 2008 a group of researchers found that some information about an upcoming decision is present in the brain up to 10 seconds in advance, long before people reported making the decision of when or how to act. These studies have sparked questions and debates. To many observers, these findings debunked the intuitive concept of free will. After all, if neuroscientists can infer the timing or choice of your movements long before you are consciously aware of your decision, perhaps people are merely puppets, pushed around by neural processes unfolding below the threshold of consciousness. But as researchers who study volition from both a neuroscientific and philosophical perspective, we believe that there’s still much more to this story.
We are part of a collaboration of philosophers and scientists working to provide more nuanced interpretations—including a better understanding of the readiness potential—and a more fruitful theoretical framework in which to place them. The conclusions suggest that “free will” remains a useful concept, although people may need to reexamine how they define it. © 2023 Scientific American

Keyword: Consciousness
Link ID: 28635 - Posted: 01.18.2023

By Dennis Overbye If you could change the laws of nature, what would you change? Maybe it’s that pesky speed-of-light limit on cosmic travel — not to mention war, pestilence and the eventual asteroid that has Earth’s name on it. Maybe you would like the ability to go back in time — to tell your teenage self how to deal with your parents, or to buy Google stock. Couldn’t the universe use a few improvements? That was the question that David Anderson, a computer scientist, enthusiast of the Search for Extraterrestrial Intelligence (SETI), musician and mathematician at the University of California, Berkeley, recently asked his colleagues and friends. In recent years the idea that our universe, including ourselves and all of our innermost thoughts, is a computer simulation, running on a thinking machine of cosmic capacity, has permeated culture high and low. In an influential essay in 2003, Nick Bostrom, a philosopher at the University of Oxford and director of the Future of Humanity Institute, proposed the idea, adding that it was probably an easy accomplishment for “technologically mature” civilizations wanting to explore their histories or entertain their offspring. Elon Musk, who, for all we know, is the star of this simulation, seemed to echo this idea when he once declared that there was only a one-in-a-billion chance that we lived in “base reality.” It’s hard to prove, and not everyone agrees that such a drastic extrapolation of our computing power is possible or inevitable, or that civilization will last long enough to see it through. But we can’t disprove the idea either, so thinkers like Dr. Bostrom contend that we must take the possibility seriously. In some respects, the notion of a Great Simulator is redolent of a recent theory among cosmologists that the universe is a hologram, its margins lined with quantum codes that determine what is going on inside. A couple of years ago, pinned down by the coronavirus pandemic, Dr.
Anderson began discussing the implications of this idea with his teenage son. If indeed everything was a simulation, then making improvements would simply be a matter of altering whatever software program was running everything. “Being a programmer, I thought about exactly what these changes might involve,” he said in an email. © 2023 The New York Times Company

Keyword: Consciousness
Link ID: 28634 - Posted: 01.18.2023

By Oliver Whang Hod Lipson, a mechanical engineer who directs the Creative Machines Lab at Columbia University, has shaped most of his career around what some people in his industry have called the c-word. On a sunny morning this past October, the Israeli-born roboticist sat behind a table in his lab and explained himself. “This topic was taboo,” he said, a grin exposing a slight gap between his front teeth. “We were almost forbidden from talking about it — ‘Don’t talk about the c-word; you won’t get tenure’ — so in the beginning I had to disguise it, like it was something else.” That was back in the early 2000s, when Dr. Lipson was an assistant professor at Cornell University. He was working to create machines that could note when something was wrong with their own hardware — a broken part, or faulty wiring — and then change their behavior to compensate for that impairment without the guiding hand of a programmer, just as a dog that loses a leg in an accident can teach itself to walk again in a different way. This sort of built-in adaptability, Dr. Lipson argued, would become more important as we became more reliant on machines. Robots were being used for surgical procedures, food manufacturing and transportation; the applications for machines seemed pretty much endless, and any error in their functioning, as they became more integrated with our lives, could spell disaster. “We’re literally going to surrender our life to a robot,” he said. “You want these machines to be resilient.” One way to do this was to take inspiration from nature. Animals, and particularly humans, are good at adapting to changes. This ability might be a result of millions of years of evolution, as resilience in response to injury and changing environments typically increases the chances that an animal will survive and reproduce. Dr.
Lipson wondered whether he could replicate this kind of natural selection in his code, creating a generalizable form of intelligence that could learn about its body and function no matter what that body looked like, and no matter what that function was. [Image: Hod Lipson at the double-door entrance to the Creative Machines Lab.] © 2023 The New York Times Company

Keyword: Consciousness; Robotics
Link ID: 28625 - Posted: 01.07.2023

By Tom Siegfried Survival of the fittest often means survival of the fastest. But fastest doesn’t necessarily mean the fastest moving. It might mean the fastest thinking. When faced with the approach of a powerful predator, for instance, a quick brain can be just as important as quick feet. After all, it is the brain that tells the feet what to do — when to move, in what direction, how fast and for how long. And various additional mental acrobatics are needed to evade an attacker and avoid being eaten. A would-be meal’s brain must decide whether to run or freeze, outrun or outwit, whether to keep going or find a place to hide. It also helps if the brain remembers where the best hiding spots are and recalls past encounters with similar predators. All in all, a complex network of brain circuitry must be engaged, and neural commands executed efficiently, to avert a predatory threat. And scientists have spent a lot of mental effort themselves trying to figure out how the brains of prey enact their successful escape strategies. Studies in animals as diverse as mice and crabs, fruit flies and cockroaches are discovering the complex neural activity — in both the primitive parts of the brain and in more cognitively advanced regions — that underlies the physical behavior guiding escape from danger and the search for safety. Lessons learned from such studies might not only illuminate the neurobiology of escape, but also provide insights into how evolution has shaped other brain-controlled behaviors. This research “highlights an aspect of neuroscience that is really gaining traction these days,” says Gina G. Turrigiano of Brandeis University, past president of the Society for Neuroscience. “And that is the idea of using ethological behaviors — behaviors that really matter for the biology of the animal that’s being studied — to unravel brain function.” © 2022 Annual Reviews

Keyword: Aggression; Attention
Link ID: 28609 - Posted: 12.24.2022

By Jon Hamilton Time is woven into our personal memories. Recall a childhood fall from a bike and the brain replays the entire episode in excruciating detail: the glimpse of wet leaves on the road ahead, the moment of weightless dread, and then the painful impact. This exact sequence has been embedded in the memory, thanks to some special neurons known as time cells. When the brain detects a notable event, time cells begin a highly orchestrated performance, says Marc Howard, who directs the Brain, Behavior, and Cognition program at Boston University. "What we find is that the cells fire in a sequence," he says. "So cell one might fire immediately, but cell two waits a little bit, followed by cell three, cell four, and so on." As each cell fires, it places a sort of time stamp on an unfolding experience. And the same cells fire in the same order when we retrieve a memory of the experience, even something mundane. "If I remember being in my kitchen and making a cup of coffee," Howard says, "the time cells that were active at that moment are re-activated." They recreate the grinder's growl, the scent of Arabica, the curl of steam rising from a fresh mug – and your neurons replay these moments in sequence every time you summon the memory. This system appears to explain how we are able to virtually travel back in time, and play mental movies of our life experiences. There are also hints that time cells play a critical role in imagining future events. Without time cells, our memories would lack order. In an experiment at the University of California, San Diego, scientists gave several groups of people a tour of the campus. The tour included 11 planned events, including finding change in a vending machine and drinking from a water fountain. © 2022 npr

Keyword: Attention; Learning & Memory
Link ID: 28608 - Posted: 12.21.2022

By Gary Stix Can the human brain ever really understand itself? The problem of gaining a deep knowledge of the subjective depths of the conscious mind is such a hard problem that it has in fact been named the hard problem. The human brain is impressively powerful. Its 100 billion neurons are connected by 100 trillion wirelike fibers, all squeezed into three pounds of squishy flesh lodged below a helmet of skull. Yet we still don’t know whether this organ will ever be able to muster the requisite smarts to hack the physical processes that underlie the ineffable “quality of deep blue” or “the sensation of middle C,” as philosopher David Chalmers put it when giving examples of the “hard problem” of consciousness, a term he invented, in a 1995 paper. This past year did not uncover a solution to the hard problem, and one may not be forthcoming for decades, if ever. But 2022 did witness plenty of surprises and solutions to understanding the brain that do not require a complete explanation of consciousness. Such incrementalism could be seen in mid-November, when a crowd of more than 24,000 attendees of the annual Society for Neuroscience meeting gathered in San Diego, Calif. The event was a tribute of sorts to reductionism—the breaking down of hard problems into simpler knowable entities. At the event, there were reports of an animal study of a brain circuit that encodes social trauma and a brain-computer interface that lets a severely paralyzed person mentally spell out letters to form words.

Your Brain Has a Thumbs-Up–Thumbs-Down Switch

When neuroscientist Kay Tye was pursuing her Ph.D., she was told a chapter on emotion was inappropriate for her thesis. Emotion just wasn’t accepted as an integral, intrinsic part of behavioral neuroscience, her field of study. That didn’t make any sense to Tye. She decided to go her own way to become a leading researcher on feelings.
This year Tye co-authored a Nature paper that reported on a kind of molecular switch in rodents that flags an experience as either good or bad. If human brains operate the same way as the brains of the mice in her lab, a malfunctioning thumbs-up–thumbs-down switch might explain some cases of depression, anxiety and addiction.

Keyword: Consciousness
Link ID: 28601 - Posted: 12.17.2022

By Yasemin Saplakoglu Memory and perception seem like entirely distinct experiences, and neuroscientists used to be confident that the brain produced them differently, too. But in the 1990s neuroimaging studies revealed that parts of the brain that were thought to be active only during sensory perception are also active during the recall of memories. “It started to raise the question of whether a memory representation is actually different from a perceptual representation at all,” said Sam Ling, an associate professor of neuroscience and director of the Visual Neuroscience Lab at Boston University. Could our memory of a beautiful forest glade, for example, be just a re-creation of the neural activity that previously enabled us to see it? “The argument has swung from being this debate over whether there’s even any involvement of sensory cortices to saying ‘Oh, wait a minute, is there any difference?’” said Christopher Baker, an investigator at the National Institute of Mental Health who runs the learning and plasticity unit. “The pendulum has swung from one side to the other, but it’s swung too far.” Even if there is a very strong neurological similarity between memories and experiences, we know that they can’t be exactly the same. “People don’t get confused between them,” said Serra Favila, a postdoctoral scientist at Columbia University and the lead author of a recent Nature Communications study. Her team’s work has identified at least one of the ways in which memories and perceptions of images are assembled differently at the neurological level. When we look at the world, visual information about it streams through the photoreceptors of the retina and into the visual cortex, where it is processed sequentially in different groups of neurons. Each group adds new levels of complexity to the image: Simple dots of light turn into lines and edges, then contours, then shapes, then complete scenes that embody what we’re seeing. Simons Foundation © 2022

Keyword: Attention; Vision
Link ID: 28597 - Posted: 12.15.2022

Researchers at the National Institutes of Health have successfully identified differences in gene activity in the brains of people with attention deficit hyperactivity disorder (ADHD). The study, led by scientists at the National Human Genome Research Institute (NHGRI), part of NIH, found that individuals diagnosed with ADHD had differences in genes that code for known chemicals that brain cells use to communicate. The findings, published in Molecular Psychiatry, show how genomic differences might contribute to symptoms. To date, this is the first study to use postmortem human brain tissue to investigate ADHD. Other approaches to studying mental health conditions include non-invasively scanning the brain, which allows researchers to examine the structure and activation of brain areas. However, these studies lack information at the level of genes and how they might influence cell function and give rise to symptoms. The researchers used a genomic technique called RNA sequencing to probe how specific genes are turned on or off, also known as gene expression. They studied two connected brain regions associated with ADHD: the caudate and the frontal cortex. These regions are known to be critical in controlling a person’s attention. Previous research found differences in the structure and activity of these brain regions in individuals with ADHD. As one of the most common mental health conditions, ADHD affects about 1 in 10 children in the United States. Diagnosis often occurs during childhood, and symptoms may persist into adulthood. Individuals with ADHD may be hyperactive and have difficulty concentrating and controlling impulses, which may affect their ability to complete daily tasks and to focus at school or work.
With technological advances, researchers have been able to identify genes associated with ADHD, but they had not been able to determine how genomic differences in these genes act in the brain to contribute to symptoms until now.

Keyword: ADHD; Genes & Behavior
Link ID: 28559 - Posted: 11.19.2022

By Jim Davies Living for the moment gets a bad rap. If you’re smart, people say, you should work toward a good future, sacrificing fun and pleasure in the present. Yet there are good reasons to discount the future, which is why economists tend to do it when making predictions. Would you rather find $5 when you’re in elementary school, or in your second marriage? People tend to get richer as they age. Five dollars simply means more to you when you’re 9 than when you’re 49. Also, the future is uncertain. We can’t always trust there’ll be one. It’s likely some kids in Walter Mischel’s famous “marshmallow experiment”—which asked kids to wait to eat a marshmallow to get another one—didn’t actually believe that the experimenter would come through with the second marshmallow, and so ate the first marshmallow right away. Saving for retirement makes no sense if in five years a massive meteor cuts human civilization short. Economists call this the “catastrophe” or “hazard” rate. For Sangil “Arthur” Lee, a psychologist at the University of California, Berkeley, where he’s a postdoc, a hazard rate makes sense from an evolutionary perspective. “You might not survive until next winter, so there is some inherent trade off that you need to make, which is not only specific for humans, but also for animals,” he said. While an undergraduate, Lee experimented with delay-discounting tasks using pigeons. The pigeons would peck one button to get a small amount of pellets now, or peck a different button to get large amounts of pellets later. “What we know,” Lee said, “is that across pigeons, monkeys, rats, and various animals, they also discount future rewards in pretty much a similar way that humans do, which is this sort of hyperbolic fashion.” We discount future rewards steeply at first, more steeply than we would if we discounted the future exponentially, but the hyperbolic discount rate eases after a bit. What makes us discount the future?
Lee, in a new study with his colleagues, pins it at least partly on our powers of imagination. When we think about what hasn’t yet happened, it tends to be abstract. Things right now, on the other hand, we think of in more tangible terms. Several behavioral studies have supported the idea that what we cannot clearly imagine, we value less. © 2022 NautilusThink Inc, All rights reserved.
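The hyperbolic pattern Lee describes can be made concrete with a short numerical sketch. This is not from the article; the two functional forms, V = A / (1 + kD) (Mazur's hyperbolic curve) and V = A·e^(−kD) (exponential), are the standard ones in the delay-discounting literature, and the reward size and discount rate used here are purely illustrative:

```python
# Illustrative comparison of hyperbolic vs. exponential delay discounting.
# Parameter values (A, K) are hypothetical, chosen only for demonstration.
import math

A = 100.0  # reward size
K = 0.1    # discount rate

def hyperbolic(delay):
    """Subjective value of reward A after `delay` time units (V = A / (1 + k*D))."""
    return A / (1 + K * delay)

def exponential(delay):
    """Subjective value of reward A after `delay` time units (V = A * e^(-k*D))."""
    return A * math.exp(-K * delay)

# Fraction of value retained across one extra unit of delay.
# Exponential discounting loses a constant fraction per unit of time,
# while hyperbolic discounting eases: the per-unit loss shrinks as the
# delay grows -- the "discount rate eases after a bit" pattern.
for d in (0, 10, 50):
    print(f"delay {d:2d}: hyperbolic retains "
          f"{hyperbolic(d + 1) / hyperbolic(d):.3f}, "
          f"exponential retains {exponential(d + 1) / exponential(d):.3f}")
```

Under the exponential curve the retained fraction is the same at every delay, but under the hyperbolic curve it climbs toward 1 as the delay lengthens, which is why hyperbolic discounters penalize near-term delays heavily yet treat far-future delays almost interchangeably.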

Keyword: Attention
Link ID: 28552 - Posted: 11.16.2022

By Jan Claassen, Brian L. Edlow A medical team surrounded Maria Mazurkevich’s hospital bed, all eyes on her as she did … nothing. Mazurkevich was 30 years old and had been admitted to New York–Presbyterian Hospital at Columbia University on a blisteringly hot July day in New York City. A few days earlier, at home, she had suddenly fallen unconscious. She had suffered a ruptured blood vessel in her brain, and the bleeding area was putting tremendous pressure on critical brain regions. The team of nurses and physicians at the hospital’s neurological intensive care unit was looking for any sign that Mazurkevich could hear them. She was on a mechanical ventilator to help her breathe, and her vital signs were stable. But she showed no signs of consciousness. Mazurkevich’s parents, also at her bed, asked, “Can we talk to our daughter? Does she hear us?” She didn’t appear to be aware of anything. One of us (Claassen) was on her medical team, and when he asked Mazurkevich to open her eyes, hold up two fingers or wiggle her toes, she remained motionless. Her eyes did not follow visual cues. Yet her loved ones still thought she was “in there.” She was. The medical team gave her an EEG—placing sensors on her head to monitor her brain’s electrical activity—while they asked her to “keep opening and closing your right hand.” Then they asked her to “stop opening and closing your right hand.” Even though her hands themselves didn’t move, her brain’s activity patterns differed between the two commands. These brain reactions clearly indicated that she was aware of the requests and that those requests were different. And after about a week, her body began to follow her brain. Slowly, with minuscule responses, Mazurkevich started to wake up. Within a year she recovered fully without major limitations to her physical or cognitive abilities. She is now working as a pharmacist. © 2022 Scientific American,

Keyword: Consciousness; Brain imaging
Link ID: 28527 - Posted: 10.26.2022

By Fenit Nirappil A national shortage of Adderall has left patients who rely on the pills for attention-deficit/hyperactivity disorder scrambling to find alternative treatments and uncertain whether they will be able to refill their medication. The Food and Drug Administration announced the shortage last week, saying that one of the largest producers is experiencing “intermittent manufacturing delays” and that other makers cannot keep up with demand. Some patients say the announcement was a belated acknowledgment of a reality they have faced for months — pharmacies unable to fill their orders and anxiety about whether they will run out of a medication needed to manage their daily lives. Experts say it is often difficult for patients to access Adderall, a stimulant that is tightly regulated as a controlled substance because of high potential for abuse. Medication management generally requires monthly doctor visits. There have been other shortages in recent years. “This one is more sustained,” said Timothy Wilens, an ADHD expert and chief of child and adolescent psychiatry at Massachusetts General Hospital who said access issues stretch back to spring. “It’s putting pressure on patients, and it’s putting pressure on institutions that support the patients.” Erik Gude, a 28-year-old chef who lives in Atlanta, experiences regular challenges filling his Adderall prescription, whether it’s pharmacies not carrying generic versions or disputes with insurers. He has been off the medication for a month after his local pharmacy ran out.

Keyword: ADHD; Drug Abuse
Link ID: 28520 - Posted: 10.22.2022

By Phil Jaekl One fine spring afternoon this year, as I was out running errands in the small Norwegian town where I live, a loud beep startled me into awareness. What had just been on my mind? After a moment’s pause, I realized something strange. I’d been thinking two things at the same time—rehearsing the combination of a new bike lock and contemplating whether I should wear the clunky white beeper that had just sounded into a bank. How, I wondered, could I have been saying two things simultaneously in my mind? Was I deceiving myself? Was this, mentally, normal? I silenced the beeper on my belt and pulled out my phone to make a voice memo of the bizarre experience before I walked into the bank; aesthetics be damned. I was in the midst of an experiment that involved keeping a log of my inner thoughts for Russ Hurlburt, a senior psychologist at the University of Nevada, Las Vegas. For decades, Hurlburt has been motivated by one question: How, exactly, do we experience our own mental life? It’s a simple enough question. And, one might argue, an existentially important one. But it’s a surprisingly vexing query to try to answer. Once we turn our gaze inward, the subjective squishiness of our mental experience seems to defy objective scrutiny. For centuries, philosophers and psychologists have presumed our mental life is composed primarily of a single-stream inner monologue. I know that’s what I had assumed, and my training in cognitive neuroscience had never led me to suppose otherwise. Hurlburt, however, finds this armchair conclusion “dramatically wrong.” © 2022 NautilusThink Inc.

Keyword: Attention; Consciousness
Link ID: 28505 - Posted: 10.08.2022

Inside a Berlin neuroscience lab one day last year, Subject 1 sat on a chair with their arms up and their bare toes pointed down. Hiding behind them, with full access to the soles of their feet, was Subject 2, waiting with fingers curled. At a moment of their choosing, Subject 2 was instructed to take the open shot: Tickle the hell out of their partner. In order to capture the moment, a high-speed GoPro was pointed at Subject 1’s face and body. Another at their feet. A microphone hung nearby. As planned, Subject 1 couldn’t help but laugh. The fact that they couldn’t help it is what has drawn Michael Brecht, leader of the research group from Humboldt University, to the neuroscience of tickling and play. It’s funny, but it’s also deeply mysterious—and understudied. “It’s been a bit of a stepchild of scientific investigation,” Brecht says. After all, brain and behavior research typically skew toward gloom, topics like depression, pain, and fear. “But,” he says, “I think there are also more deep prejudices against play—it's something for children.” The prevailing wisdom holds that laughter is a social behavior among certain mammals. It’s a way of disarming others, easing social tensions, and bonding. Chimps do it. Dogs and dolphins too. Rats are the usual subjects in tickling studies. If you flip ’em over and go to town on their bellies, they’ll squeak at a pitch more than twice as high as the limit of human ears. But there are plenty of lingering mysteries about tickling, whether among rats or people. The biggest one of all: why we can’t tickle ourselves. “If you read the ancient Greeks, Aristotle was wondering about ticklishness. Also Socrates, Galileo Galilei, and Francis Bacon,” says Konstantina Kilteni, a cognitive neuroscientist who studies touch and tickling at Sweden’s Karolinska Institutet, and who is not involved in Brecht’s work. We don’t know why touch can be ticklish, nor what happens in the brain. 
We don’t know why some people—or some body parts—are more ticklish than others. “These questions are very old,” she continues, “and after almost 2,000 years, we still really don’t have the answer.” © 2022 Condé Nast.

Keyword: Attention; Emotions
Link ID: 28504 - Posted: 10.08.2022

By Hedda Hassel Mørch The nature of consciousness seems to be unique among scientific puzzles. Not only do neuroscientists have no fundamental explanation for how it arises from physical states of the brain, we are not even sure whether we ever will. Astronomers wonder what dark matter is, geologists seek the origins of life, and biologists try to understand cancer—all difficult problems, of course, yet at least we have some idea of how to go about investigating them and rough conceptions of what their solutions could look like. Our first-person experience, on the other hand, lies beyond the traditional methods of science. Following the philosopher David Chalmers, we call it the hard problem of consciousness. But perhaps consciousness is not uniquely troublesome. Going back to Gottfried Leibniz and Immanuel Kant, philosophers of science have struggled with a lesser known, but equally hard, problem of matter. What is physical matter in and of itself, behind the mathematical structure described by physics? This problem, too, seems to lie beyond the traditional methods of science, because all we can observe is what matter does, not what it is in itself—the “software” of the universe but not its ultimate “hardware.” On the surface, these problems seem entirely separate. But a closer look reveals that they might be deeply connected. Consciousness is a multifaceted phenomenon, but subjective experience is its most puzzling aspect. Our brains do not merely seem to gather and process information. They do not merely undergo biochemical processes. Rather, they create a vivid series of feelings and experiences, such as seeing red, feeling hungry, or being baffled about philosophy. There is something that it’s like to be you, and no one else can ever know that as directly as you do. © 2022 NautilusThink Inc, All rights reserved.

Keyword: Consciousness
Link ID: 28489 - Posted: 09.24.2022

By Ed Yong On March 25, 2020, Hannah Davis was texting with two friends when she realized that she couldn’t understand one of their messages. In hindsight, that was the first sign that she had COVID-19. It was also her first experience with the phenomenon known as “brain fog,” and the moment when her old life contracted into her current one. She once worked in artificial intelligence and analyzed complex systems without hesitation, but now “runs into a mental wall” when faced with tasks as simple as filling out forms. Her memory, once vivid, feels frayed and fleeting. Former mundanities—buying food, making meals, cleaning up—can be agonizingly difficult. Her inner world—what she calls “the extras of thinking, like daydreaming, making plans, imagining”—is gone. The fog “is so encompassing,” she told me, “it affects every area of my life.” For more than 900 days, while other long-COVID symptoms have waxed and waned, her brain fog has never really lifted. Of long COVID’s many possible symptoms, brain fog “is by far one of the most disabling and destructive,” Emma Ladds, a primary-care specialist from the University of Oxford, told me. It’s also among the most misunderstood. It wasn’t even included in the list of possible COVID symptoms when the coronavirus pandemic first began. But 20 to 30 percent of patients report brain fog three months after their initial infection, as do 65 to 85 percent of the long-haulers who stay sick for much longer. It can afflict people who were never ill enough to need a ventilator—or any hospital care. And it can affect young people in the prime of their mental lives. Long-haulers with brain fog say that it’s like none of the things that people—including many medical professionals—jeeringly compare it to. It is more profound than the clouded thinking that accompanies hangovers, stress, or fatigue. For Davis, it has been distinct from and worse than her experience with ADHD. 
It is not psychosomatic, and involves real changes to the structure and chemistry of the brain. It is not a mood disorder: “If anyone is saying that this is due to depression and anxiety, they have no basis for that, and data suggest it might be the other direction,” Joanna Hellmuth, a neurologist at UC San Francisco, told me. (c) 2022 by The Atlantic Monthly Group. All Rights Reserved.

Keyword: Attention; Learning & Memory
Link ID: 28487 - Posted: 09.21.2022

By Tim Vernimmen When psychologist Jonathan Smallwood set out to study mind-wandering about 25 years ago, few of his peers thought that was a very good idea. How could one hope to investigate these spontaneous and unpredictable thoughts that crop up when people stop paying attention to their surroundings and the task at hand? Thoughts that couldn’t be linked to any measurable outward behavior? But Smallwood, now at Queen’s University in Ontario, Canada, forged ahead. He used as his tool a downright tedious computer task that was intended to reproduce the kinds of lapses of attention that cause us to pour milk into someone’s cup when they asked for black coffee. And he started out by asking study participants a few basic questions to gain insight into when and why minds tend to wander, and what subjects they tend to wander toward. After a while, he began to scan participants’ brains as well, to catch a glimpse of what was going on in there during mind-wandering. Smallwood learned that unhappy minds tend to wander in the past, while happy minds often ponder the future. He also became convinced that wandering among our memories is crucial to help prepare us for what is yet to come. Though some kinds of mind-wandering — such as dwelling on problems that can’t be fixed — may be associated with depression, Smallwood now believes mind-wandering is rarely a waste of time. It is merely our brain trying to get a bit of work done when it is under the impression that there isn’t much else going on. Smallwood, who coauthored an influential 2015 overview of mind-wandering research in the Annual Review of Psychology, is the first to admit that many questions remain to be answered. © 2022 Annual Reviews

Keyword: Attention
Link ID: 28461 - Posted: 09.03.2022

By Elizabeth Landau Ken Ono gets excited when he talks about a particular formula for pi, the famous and enigmatic ratio of a circle’s circumference to its diameter. He shows me a clip from a National Geographic show where Neil deGrasse Tyson asked him how he would convey the beauty of math to the average person on the street. In reply, Ono showed Tyson, and later me, a so-called continued fraction for pi, which is a little bit like a mathematical fun house hallway of mirrors. Instead of a single number in the numerator and one in the denominator, the denominator of the fraction also contains a fraction, and the denominator of that fraction has a fraction in it, too, and so on and so forth, ad infinitum. Written out, the formula looks like a staircase that narrows as you descend its steps in pursuit of the elusive pi. The calculation—credited independently to British mathematician Leonard James Rogers and self-taught Indian mathematician Srinivasa Ramanujan—doesn’t involve anything more complicated than adding, dividing, and squaring numbers. “How could you not say that’s amazing?” Ono, chair of the mathematics department at the University of Virginia, asks me over Zoom. As a fellow pi enthusiast—I am well known among friends for hosting Pi Day pie parties—I had to agree with him that it’s a dazzling formula. But not everyone sees beauty in fractions, or in math generally. In fact, here in the United States, math often inspires more dread than awe. In the 1950s, some educators began to observe a phenomenon they called mathemaphobia in students,1 though this was just one of a long list of academic phobias they saw in students. Today, nearly 1 in 5 U.S. 
adults suffers from high levels of math anxiety, according to some estimates,2 and a 2016 study found that 11 percent of university students experienced “high enough levels of mathematics anxiety to be in need of counseling.”3 Math anxiety seems generally correlated with worse math performance worldwide, according to one 2020 study from Stanford and the University of Chicago.4 While many questions remain about the underlying reasons, high school math scores in the U.S. tend to rank significantly lower than those in many other countries. In 2018, for example, American students ranked 30th in the world in their math scores on the PISA exam, an international assessment given every three years. © 2022 NautilusThink Inc,
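The article doesn't print the Rogers–Ramanujan formula itself, but the "fraction inside a fraction" idea it describes is easy to see with a different, classic continued fraction for pi, Brouncker's, chosen here purely as an illustration: 4/pi = 1 + 1²/(2 + 3²/(2 + 5²/(2 + ...))). A truncated version can be evaluated from the innermost level outward with nothing more than adding, dividing, and squaring:

```python
import math

def pi_brouncker(depth: int) -> float:
    """Approximate pi by truncating Brouncker's continued fraction:

        4/pi = 1 + 1^2 / (2 + 3^2 / (2 + 5^2 / (2 + ...)))

    The fraction is evaluated from the innermost denominator outward.
    """
    acc = 2.0                               # innermost denominator
    for k in range(depth, 1, -1):           # k = depth, depth-1, ..., 2
        acc = 2.0 + (2 * k - 1) ** 2 / acc  # wrap one more level of the staircase
    return 4.0 / (1.0 + 1.0 / acc)          # outermost level: 1 + 1^2/acc

print(pi_brouncker(200_000))  # close to math.pi; this fraction converges slowly
```

Truncating at depth 2 already gives 26/7.5 ≈ 3.4667; the Rogers–Ramanujan formula Ono showed converges far faster than this one.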

Keyword: Attention; Learning & Memory
Link ID: 28459 - Posted: 09.03.2022

Heidi Ledford It’s not just in your head: a desire to curl up on the couch after a day spent toiling at the computer could be a physiological response to mentally demanding work, according to a study that links mental fatigue to changes in brain metabolism. The study, published on 11 August in Current Biology1, found that participants who spent more than six hours working on a tedious and mentally taxing assignment had higher levels of glutamate — an important signalling molecule in the brain. Too much glutamate can disrupt brain function, and a rest period could allow the brain to restore proper regulation of the molecule, the authors note. At the end of their work day, these study participants were also more likely than those who had performed easier tasks to opt for short-term, easily won financial rewards of lesser value than larger rewards that come after a longer wait or involve more effort. The study is important in its effort to link cognitive fatigue with neurometabolism, says behavioural neuroscientist Carmen Sandi at the Swiss Federal Institute of Technology in Lausanne. But more research — potentially in non-human animals — will be needed to establish a causal link between feelings of exhaustion and metabolic changes in the brain, she adds. “It’s very good to start looking into this aspect,” says Sandi. “But for now this is an observation, which is a correlation.” Tired brain Previous research has demonstrated effects of mental strain on physiological parameters such as heart-rate variability and blood flow, but these tend to be subtle, says Martin Hagger, a health psychologist at the University of California, Merced. “It’s not like when you’re exercising skeletal muscle,” he says. “But it is perceptible.” Cognitive neuroscientist Antonius Wiehler at the Paris Brain Institute and his colleagues thought that the effects of cognitive fatigue could be due to metabolic changes in the brain. 
The team enrolled 40 participants and assigned 24 of them to perform a challenging task: for example, watching letters appear on a computer screen every 1.6 seconds and documenting when one matched the letter that had appeared three letters earlier. The other 16 participants were asked to perform a similar but easier task. Both groups worked for just over six hours, with two ten-minute breaks. © 2022 Springer Nature Limited
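The match rule of the 3-back task described above can be sketched in a few lines (an illustrative sketch only; the study's actual presentation timing, responses, and scoring are more involved):

```python
def three_back_hits(stream: str) -> list[int]:
    """Return the indices where the current letter matches the letter
    shown three positions earlier -- the target rule in a 3-back task."""
    return [i for i, ch in enumerate(stream) if i >= 3 and ch == stream[i - 3]]

print(three_back_hits("ABCABDC"))  # → [3, 4]: 'A' and 'B' repeat three positions later
```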

Keyword: Attention; Learning & Memory
Link ID: 28430 - Posted: 08.11.2022