Chapter 14. Attention and Higher Cognition
Lynne Peeples
Why thinking hard makes us feel tired
Near the end of his first series of chess matches against IBM’s Deep Blue computer in 1996, the Russian grandmaster Garry Kasparov lamented what he saw as an unfair disadvantage: “I’m really tired. These games took a lot of energy. But if I play a normal human match, my opponent would also be exhausted.”
Whereas machine intelligence can keep running as long as it has a power supply, a human brain will become fatigued — and you don’t have to be a chess grandmaster to understand the feeling. Anyone can end up drained after a long day at work, at school or juggling the countless decisions of daily life. This mental exhaustion can sap motivation, dull focus and erode judgement. It can raise the odds of careless mistakes. Especially when combined with sleep loss or circadian disruption, cognitive fatigue can also contribute to deadly medical errors and road traffic accidents.
It was partly Kasparov’s weary comments that inspired Mathias Pessiglione, a cognitive neuroscientist and research director at the Paris Brain Institute, to study the tired brain. He wanted to know: “Why is this cognitive system prone to fatigue?”
Researchers and clinicians have long struggled to define, measure and treat cognitive fatigue — relying mostly on self-reports of how tired someone says they feel. Now, however, scientists from across disciplines are enlisting innovative experimental approaches and biological markers to probe the metabolic roots and consequences of cognitive fatigue. The efforts are getting a boost in attention and funding in large part because of long COVID, which afflicts roughly 6 in every 100 people after infection with the coronavirus SARS-CoV-2, says Vikram Chib, a biomedical engineer at Johns Hopkins University in Baltimore, Maryland. “The primary symptom of long COVID is fatigue,” says Chib. “I think that has opened a lot of people’s eyes.” © 2025 Springer Nature Limited
Keyword: Neuroimmunology; Attention
Link ID: 30049 - Posted: 12.13.2025
By Sara Talpos It’s been more than a decade since scientists first started publishing papers on neural organoids, the small clusters of cells grown in labs and designed to mimic various parts of the human brain. Since then, organoids have been used to study everything from bipolar disorder and Alzheimer’s disease to tumors and parasitic infections. Because these new tools have the potential to reduce the use of animals in research — a goal of the current Trump administration — the field’s future may be more financially secure than other areas of scientific research. In September, for example, the federal government announced a broad $87 million investment in organoid research. Matthew Owen brings a unique perspective to this emerging field. As a philosopher of mind, he focuses on trying to understand both what the mind is and how it relates to the body and the brain. He draws on the work of historical philosophers and applies some of their ideas to modern-day science. In 2020, as a visiting scholar in a neuroscience lab at McGill University, he was introduced to researchers working with organoids. Owen, who also does research in bioethics, wanted to help them address a perhaps unsettling question: Could these miniature cell clusters ever develop consciousness? Some experts believe that organoid consciousness is not likely to happen anytime in the near future, if at all. Still, certain experiments are prompting the question. In 2022, for example, researchers, including Brett Kagan of the Australian start-up Cortical Labs, published a paper explaining how they had taught their lab-grown brain cells to play a ping-pong-like video game. (Because the cells were placed in a single layer, the structures were not technically organoids, though they are expected to have similar capabilities.) 
In the process, the authors wrote, the tiny cell clusters displayed “sentience.” Undark recently spoke with Owen about this particular experiment and about his own writing on organoids.
Keyword: Consciousness; Development of the Brain
Link ID: 30048 - Posted: 12.13.2025
By Claudia López Lloreda A new commentary calls into question a 2024 paper that described a universal pattern of cortical brain oscillations. But that team has provided a more expansive analysis in response and stands by its original conclusions. Both articles were published today in “Matters Arising” in Nature Neuroscience. Ultimately, the back-and-forth suggests that a frequency “motif” may exist, but it may not be as general as the original study proposed, says Aitor Morales-Gregorio, a postdoctoral researcher at Charles University, who was not involved with any of the work. “The [2024] conclusions are way too optimistic about how general and how universal this principle might be.” The 2024 study identified a brain-wave motif in 14 cortical areas in macaques: Alpha and beta rhythms predominated in the deeper layers, whereas gamma bands appeared in the more superficial layers. Because this motif also showed up in marmosets and humans, the researchers speculated that it may be a universal mechanism for cortical computation in primates. “Results typically come with a level of variability, of noise, of uncertainty,” says 2024 study investigator Diego Mendoza-Halliday, assistant professor of neuroscience at the University of Pittsburgh. But this pattern “was just there the whole time, at all times, in many, many of the recordings.” The team leveraged the findings to create an algorithm that detects Layer 4 of the cortex. But the pattern is “by no means universal,” according to the new commentary, which found the motif in about 60 percent of the recordings in an independent monkey dataset. Further, the algorithm trained to identify Layer 4 of the cortex is unreliable, the commentary shows. © 2025 Simons Foundation
Keyword: Attention
Link ID: 30044 - Posted: 12.13.2025
Helen Pearson In some parts of the world, record numbers of people are being diagnosed with attention deficit hyperactivity disorder (ADHD). In the United States, for example, government researchers last year reported that more than 11% of children had received an ADHD diagnosis at some point in their lives1 — a sharp increase from 2003, when around 8% of children had (see ‘ADHD among US boys and girls’). But now, top US health officials argue that diagnoses have spiralled out of control. In May, the Make America Healthy Again Commission — led by US health secretary Robert F. Kennedy Jr — said ADHD was part of a “crisis of overdiagnosis and overtreatment” and suggested that ADHD medications did not help children in the long term. One thing that’s clear is that several factors — including improved detection and greater awareness of ADHD — are leading people with symptoms to receive a diagnosis and treatment when they wouldn’t have years earlier. Clinicians say this is especially true for women and girls, whose pattern of symptoms was often missed in the past. Although some specialists are concerned about the risks of overdiagnosis, many are more worried that too many people go undiagnosed and untreated. At the same time, the rise in awareness and diagnoses of ADHD has fuelled a public debate about how it should be viewed and how best to provide support, including when medication is required. The emergence of the neurodiversity movement is challenging the view of ADHD as a disorder that should be ‘treated’, and instead proposes that it’s a difference that should be better understood and supported — with more focus on adapting schools and workplaces, for instance. “I do have a big problem with ‘disorder’,” says Jeff Karp, a biomedical engineer at Brigham and Women’s Hospital in Boston, Massachusetts, who has ADHD. “It’s the school system that’s disordered. 
It’s not the kids.” But many clinicians and people with ADHD argue that it is associated with difficulties — ranging from academic struggles to an increased chance of injuries and substance misuse — that justify its label as a medical condition, and say that medication is an important and effective part of therapy for many people. © 2025 Springer Nature Limited
Keyword: ADHD
Link ID: 30034 - Posted: 11.29.2025
Mariana Lenharo The obesity drug tirzepatide, sold as Mounjaro or Zepbound, can suppress patterns of brain activity associated with food cravings, a study suggests. Researchers measured the changing electrical signals in the brain of a person with severe obesity who had experienced persistent ‘food noise’ — intrusive, compulsive thoughts about eating — shortly after the individual began taking the medication. The study is the first to use electrodes to directly measure how blockbuster obesity drugs that mimic the hormone GLP-1 affect brain activity in people, and to hint at how they curb extreme food cravings. “It’s a great strategy to try and find a neural signature of food noise, and then try to understand how drugs can manipulate it,” says Amber Alhadeff, a neuroscientist at the Monell Chemical Senses Center in Philadelphia, Pennsylvania. The findings were published today in Nature Medicine1. Casey Halpern, a neurosurgeon-scientist at the University of Pennsylvania in Philadelphia, and his colleagues did not set out to investigate the effects of obesity drugs on the brain. The team’s goal was to test whether a type of deep brain stimulation — a therapy that involves delivering a weak electrical current directly into the brain — can help to reduce compulsive eating in people with obesity for whom treatments such as bariatric surgery haven’t worked. The scientists set up a study in which participants had an electrode implanted into their nucleus accumbens, a region of the brain that is involved in feelings of reward. It also expresses the GLP-1 receptor, notes Christian Hölscher, a neuroscientist at the Henan Academy of Innovations in Medical Science in Zhengzhou, China, “so we know that GLP-1 plays a role in modulating reward here”. This type of electrode, which can both record electrical activity and deliver an electrical current when needed, is already used in people to treat some forms of epilepsy. © 2025 Springer Nature Limited
Keyword: Obesity; Attention
Link ID: 30016 - Posted: 11.19.2025
By Nora Bradford Here are three words: pine, crab, sauce. There’s a fourth word that combines with each of the others to create another common word. What is it? When the answer finally comes to you, it’ll likely feel instantaneous. You might even say “Aha!” This kind of sudden realization is known as insight, and a research team recently uncovered how the brain produces it, which suggests why insightful ideas tend to stick in our memory. Maxi Becker, a cognitive neuroscientist at Duke University, first got interested in insight after reading the landmark 1962 book The Structure of Scientific Revolutions by the historian and philosopher of science Thomas Kuhn. “He describes how some ideas are so powerful that they can completely shift the way an entire field thinks,” she said. “That got me wondering: How does the brain come up with those kinds of ideas? How can a single thought change how we see the world?” Such moments of insight are written across history. According to the Roman architect and engineer Vitruvius, in the third century BCE the Greek mathematician Archimedes suddenly exclaimed “Eureka!” after he slid into a bathtub and saw the water level rise by an amount equal to his submerged volume (although this tale may be apocryphal). In the 17th century, according to lore, Sir Isaac Newton had a breakthrough in understanding gravity after an apple fell on his head. In the early 1900s, Einstein came to a sudden realization that “if a man falls freely, he would not feel his weight,” which led him to his theory of relativity, as he later described in a lecture. Insights are not limited to geniuses: We have these cognitive experiences all the time when solving riddles or dealing with social or intellectual problems. 
They are distinct from analytical problem-solving, such as the process of doing formulaic algebra, in which you arrive at a solution slowly and gradually as if you’re getting warmer. Instead, insights often follow periods of confusion. You never feel as if you’re getting warmer; rather, you go from cold to hot, seemingly in an instant. Or, as the neuropsychologist Donald Hebb, known for his work building neurobiological models of learning, wrote in the 1940s, sometimes “learning occurs as a single jump, an all-or-none affair.” © 2025 Simons Foundation
Keyword: Attention; Learning & Memory
Link ID: 30004 - Posted: 11.08.2025
Ian Sample Science editor It’s never a great look. The morning meeting is in full swing but thanks to a late night out your brain switches off at the precise moment a question comes your way. Such momentary lapses in attention are a common problem for the sleep deprived, but what happens in the brain in these spells of mental shutdown has proved hard to pin down. Now scientists have shed light on the process and found there is more to zoning out than meets the eye. The brief loss of focus coincides with a wave of fluid flowing out of the brain, which returns once attention recovers. “The moment somebody’s attention fails is the moment this wave of fluid starts to pulse,” said Dr Laura Lewis, a senior author on the study at MIT in Cambridge, Massachusetts. “It’s not just that your neurons aren’t paying attention to the world, there’s this big change in fluid in the brain at the same time.” Lewis and her colleague Dr Zinong Yang investigated the sleep-deprived brain to understand the kinds of attention failures that lead drowsy drivers to crash and tired animals to become a predator’s lunch. In the study, 26 volunteers took turns to wear an EEG cap while lying in an fMRI scanner. This enabled the scientists to monitor the brain’s electrical activity and physiological changes during tests in which people had to respond as quickly as possible to hearing a tone or seeing crosshairs on a screen turn into a square. Each volunteer was scanned after a restful night’s sleep at home and after a night of total sleep deprivation supervised by scientists at the laboratory. Unsurprisingly, people performed far worse when sleep deprived, responding more slowly or not at all. © 2025 Guardian News & Media Limited
Keyword: Sleep; Attention
Link ID: 29993 - Posted: 11.01.2025
Imma Perfetto Anyone who has ever struggled through the day following a poor night’s sleep has had to wrench their attention back to the task at hand after their mind drifted off unexpectedly. Now, researchers have pinpointed exactly what causes these momentary failures of attention. The new study in Nature Neuroscience found that the brains of sleep-deprived people initiate waves of cerebrospinal fluid (CSF), the liquid that cushions the brain, and that these waves dramatically impair attention. This process usually happens during sleep. The rhythmic flow of CSF into and out of the brain carries away protein waste which has built up over the course of the day. When this maintenance is interrupted by lack of sleep, it seems the brain attempts to play catch-up during its waking hours. “If you don’t sleep, the CSF waves start to intrude into wakefulness where normally you wouldn’t see them,” says study senior author Laura Lewis of Massachusetts Institute of Technology’s (MIT) Institute for Medical Engineering and Science. “However, they come with an attentional trade-off, where attention fails during the moments that you have this wave of fluid flow. “The results are suggesting that at the moment that attention fails, this fluid is actually being expelled outward away from the brain. And when attention recovers, it’s drawn back in.” © Copyright CSIRO
Keyword: Sleep; Attention
Link ID: 29992 - Posted: 11.01.2025
By Grigori Guitchounts On a mellow spring night, I gazed at the setting desert sun in Joshua Tree National Park in California. The sun glowed a warm blood-orange and the sky shimmered pink and purple. I had just defended my Ph.D. in neuroscience, and my partner and I had flown west to celebrate and exhale. It was early March 2020, and we were hoping to quiet our minds in the desert. I was also hoping to change mine. I had been curious about psychedelics for years, but it wasn’t until I read How to Change Your Mind, Michael Pollan’s book about the new science of psychedelics, that I felt ready. The book made a compelling case that psychedelics provided a fascinating introspective experience. Still, I was nervous. I’d heard stories about bad trips and flashbacks. I knew enough neuroscience to know these were serious drugs—compounds that could temporarily dismantle how the brain makes sense of reality and potentially change it irreversibly. I also knew I was burned out. My Ph.D. had been hard in the way Ph.D.s often are: thrilling, lonely, disorienting. My advisor had left academia halfway through, and I’d spent years without much supervision, never quite sure whether I was on the right track and if I had a future in academia. But I didn’t take LSD seeking healing or clarity. I just wanted to see what the fuss was about. After years of hunkering down, I was craving a freeing experience. What followed was strange, intense, and beautiful. The wooden floorboards of our cabin turned into a bustling cityscape. The mirror in the bathroom showed my face aged beyond recognition: The natural lines in my skin became deep wrinkles, my eyes sunken, as if time had decided to give me a sneak peek of what would come. Later, absorbed with coloring pencils, I watched the marks I was making dissolve in real time, as if the paper were being erased by invisible rain. © 2025 NautilusNext Inc.
Keyword: Drug Abuse; Consciousness
Link ID: 29979 - Posted: 10.22.2025
Asif Ghazanfar Picture someone washing their hands. The water running down the drain is a deep red. How you interpret this scene depends on its setting, and your history. If the person is in a gas station bathroom, and you just saw the latest true-crime series, these are the ablutions of a serial killer. If the person is at a kitchen sink, then perhaps they cut themselves while preparing a meal. If the person is in an art studio, you might find resonance with the struggle to get paint off your hands. If you are naive to crime story tropes, cooking or painting, you would have a different interpretation. If you are present, watching someone wash deep red off their hands into a sink, your response depends on even more variables. How we act in the world is also specific to our species; we all live in an ‘umwelt’, or self-centred world, in the words of the philosopher-biologist Jakob von Uexküll (1864-1944). It’s not as simple as just taking in all the sensory information and then making a decision. First, our particular eyes, ears, nose, tongue and skin already filter what we can see, hear, smell, taste and feel. We don’t take in everything. We don’t see ultraviolet light like a bird, we don’t hear infrasound like elephants and baleen whales do. Second, the size and shape of our bodies determine what possible actions we can take. Parkour athletes – those who run, vault, climb and jump in complex urban environments – are remarkable in their skills and daring, but sustain injuries that a cat doing the exact same thing would not. Every animal comes with a unique bag of tricks to exploit their environment; these tricks are also limitations under different conditions. Third, the world, our environment, changes. Seasons change, what animals can eat therefore also changes. If it’s the rainy season, grass will be abundant. The amount of grass determines who is around to eat it and therefore who is around to eat the grass-eaters. 
Ultimately, the challenge for each of us animals is how to act in this unstable world that we do not fully apprehend with our senses and our body’s limited degrees of freedom. There is a fourth constraint, one that isn’t typically recognised. Most of the time, our intuition tells us that what we are seeing (or hearing or feeling) is an accurate representation of what is out there, and that anyone else would see (or hear or feel) it the same way. But we all know that’s not true and yet are continually surprised by it. It is even more fundamental than that: you know that seemingly basic sensory information that we are able to take in with our eyes and ears? It’s inaccurate. How we perceive elementary colours, ‘red’ for example, always depends on the amount of light, surrounding colours and other factors. In low lighting, the deep red washing down the sink might appear black. A yellow sink will make it look more orange; a blue sink may make it look violet. © Aeon Media Group Ltd. 2012-2025.
Keyword: Vision; Attention
Link ID: 29961 - Posted: 10.08.2025
By Ellen Barry Around the time of the pandemic, I began to notice something happening in my social circle. A close friend, then in her early 50s, got a diagnosis of attention deficit hyperactivity disorder. She described it as a profound relief, releasing her from years of self-blame — about missed deadlines and lost receipts, but also things that were deeper and more complicated, like her sensitivity to injustice. Something similar happened to a co-worker, and a cousin in his 30s, and an increasing number of people I met covering mental health. It wasn’t always A.D.H.D. For some of them, the revelation was a diagnosis of autism spectrum disorder: After years of inarticulate unease in social situations, they felt freed by the framework of neurodivergence, and embraced by the community that came along with it. Since then I’ve heard accounts from people who received midlife diagnoses of binge eating disorder, post-traumatic stress disorder, anxiety. Nearly all of them said the diagnosis provided relief. Sometimes it led to an effective treatment. But sometimes, simply identifying the problem — putting a name to it — seemed to help. Lately, it seems as if we never stop talking about the rising rates of chronic diseases, among them autism, A.D.H.D., depression, anxiety and PTSD. Health Secretary Robert F. Kennedy Jr. has pointed to these trends as evidence that Americans are “the sickest people in the world,” and has set about upending whole swaths of our public health system in search of causes, like vaccines or environmental toxins. But much of what we’re seeing is a change in diagnostic practices, as we apply medical labels to ever milder versions of disease. There are many reasons for this: The shame that once accompanied many disorders has lifted. Screening for mental health problems is now common in schools. Social media gives us the tools to diagnose ourselves. 
And clinicians, in a time of mental health crisis, see an opportunity to treat illnesses early. © 2025 The New York Times Company
By Sujata Gupta Anne-Laure Le Cunff was something of a wild child. As a teenager, she repeatedly disabled the school fire alarm to sneak smoke breaks and helped launch a magazine filled with her teachers’ fictional love lives. Later, as a young adult studying neuroscience, Le Cunff would spend hours researching complex topics but struggled to complete simple administrative tasks. And she often obsessed over random projects before abruptly abandoning them. Then, three years ago, a colleague asked Le Cunff if she might have attention-deficit/hyperactivity disorder, or ADHD, a condition marked by distractibility, hyperactivity and impulsivity. Doctors confirmed her colleague’s suspicions. But fearing professional stigma, Le Cunff — by then a postdoctoral fellow in the ADHD Lab at King’s College London — kept her diagnosis secret until this year. Le Cunff knew all too well about the deficits associated with ADHD. But her research — and personal experience — hinted at an underappreciated upside. “I started seeing … breadcrumbs pointing at a potential association between curiosity and ADHD,” she says. People within the ADHD community have long recognized that the condition can be both harmful and helpful. Researchers, though, have largely focused on the harms. And those studying treatments tend to define success as a reduction in ADHD symptoms, with little regard to possible benefits. That’s starting to change. For instance, Norwegian researchers asked 50 individuals with ADHD to describe their positive experiences with the disorder as part of an effort to develop more holistic treatments. People cited their creativity, energy, adaptability, resilience and curiosity, researchers reported in BMJ Open in October 2023. © Society for Science & the Public 2000–2025.
By Kenneth Chang After decades of brain research, scientists still aren’t sure whether most people see the same way, more or less — especially with colors. Is what I call red also red for you? Or could my red be your blue? Or maybe neon pink? If it were possible to project what I see directly into your mind, would the view be the same, or would it instead resemble a crazy-hued Andy Warhol painting? “That’s an age-old question, isn’t it?” said Andreas Bartels, a professor of visual neuroscience at the University of Tübingen in Germany. But scientists do have a good understanding of which parts of the brain handle vision. They have even figured out where various vision-processing tasks are performed, like recognizing what is moving, identifying colors and adjusting to different lighting conditions. Amazingly, it is even possible to deduce what you’re seeing by looking at an M.R.I. scan showing which parts of your brain are lighting up. “That comes out of the world of science fiction, or one would think, right?” Dr. Bartels said. “It’s amazing that this is possible, but this always has happened in individual brains.” That is, researchers pulled off this sleight of science with individuals. They would first show a subject lying in the M.R.I. machine a series of images, mapping out how that person’s brain responded. After that initial training, the researchers could randomly show one of the images and, based on just the brain activity, make a good guess at what the image was. In new research, Dr. Bartels and Michael Bannert, a postdoctoral researcher in Dr. Bartels’ laboratory, used that technique to provide a partial answer to the question of whether most of us have a shared sense of colors. They put 15 people, all with standard color vision, in an M.R.I. machine. The volunteers viewed expanding concentric rings that were red, green or yellow. © 2025 The New York Times Company
Keyword: Vision; Consciousness
Link ID: 29925 - Posted: 09.10.2025
By Claudia López Lloreda The process of making a decision engages neurons across the entire brain, according to a new mouse dataset created by an international collaboration. “Many, many areas are recruited even for what are arguably rather simple decisions,” says Anne Churchland, professor of neurobiology at University of California, Los Angeles and one of the founding members of the collaboration, called the International Brain Laboratory (IBL). The canonical model suggests that the activity underlying vision-dependent decisions goes from the visual thalamus to the primary visual cortex and association areas, and then possibly to the frontal cortex, Churchland says. But the new findings suggest that “maybe there’s more parallel processing and less of a straightforward circuit than we thought.” Churchland and other scientists established the IBL in 2017 out of frustration with small-scale studies of decision-making that analyzed only one or two brain regions at a time. The IBL aimed to study how the brain integrates information and makes a decision at scale. “We came together as a large group with the realization that a large team effort could be transformative in these questions that had been kind of stymieing all of us,” Churchland says. After years of standardizing their methods and instrumentation across the 12 participating labs, the IBL team constructed a brain-wide map of neural activity in mice as they complete a decision-making task. That map, published today in Nature, reveals that the activity associated with choices and motor actions shows up widely across the brain. The same is true for the activity underlying decisions based on prior knowledge, according to a companion paper by the same team, also published today in Nature. © 2025 Simons Foundation
Keyword: Attention; Brain imaging
Link ID: 29918 - Posted: 09.06.2025
Ian Sample Science editor A three-minute brainwave test can detect memory problems linked to Alzheimer’s disease long before people are typically diagnosed, raising hopes that the approach could help identify those most likely to benefit from new drugs for the condition. In a small trial, the test flagged specific memory issues in people with mild cognitive impairment, highlighting who was at greater risk of developing Alzheimer’s. Trials in larger groups are under way. The Fastball test is a form of electroencephalogram (EEG) that uses small sensors on the scalp to record the brain’s electrical activity while people watch a stream of images on a screen. The test detects memory problems by analysing the brain’s automatic responses to images that the person was shown before the test. “This shows us that our new passive measure of memory, which we’ve built specifically for Alzheimer’s disease diagnosis, can be sensitive to those individuals at very high risk but who are not yet diagnosed,” said Dr George Stothart, a cognitive neuroscientist at the University of Bath, where the test was developed. The trial, run with the University of Bristol, involved 54 healthy adults and 52 patients with mild cognitive impairment (MCI). People with MCI have problems with memory, thinking or language, but these are not usually severe enough to prevent them doing their daily activities. Before the test, volunteers were shown eight images and told to name them, but not specifically to remember them or look out for them in the test. The researchers then recorded the participants’ brain activity as they watched hundreds of images flash up on a screen. Each image appeared for a third of a second and every fifth picture was one of the eight they had seen before. © 2025 Guardian News & Media Limited
Keyword: Alzheimers; Attention
Link ID: 29914 - Posted: 09.03.2025
Hannah Devlin Science correspondent Attention deficit hyperactivity disorder medication is linked to significantly lower risk of suicidal behaviours, substance misuse, transport accidents and criminality, according to a study of the wider outcomes of treatment. The research, based on the medical records of nearly 150,000 people in Sweden, suggested that the drugs could have meaningful benefits beyond helping with the core symptoms of ADHD. Although the study was not a randomised trial – and so cannot definitively prove that medication caused improved outcomes – it adds to evidence of the substantial value of treatment. “We found that ADHD medication was associated with significantly reduced rates of first occurrences of suicidal behaviours, substance misuse, transport accidents and criminality,” said Prof Samuele Cortese, a child and adolescent psychiatrist and researcher at the University of Southampton. “Our results should inform the debate on the effects and safety of ADHD medications.” After accounting for factors including age, sex, education level, psychiatric diagnoses and medical history, ADHD medication was associated with reduced rates of a first occurrence of four of the five outcomes investigated: a 17% reduction for suicidal behaviour, 15% for substance misuse, 12% for transport accidents and 13% for criminality. It is well established that ADHD, thought to affect about 5% of children and 2.5% of adults worldwide, is linked to higher rates of mental health problems including suicide, substance misuse and accidental injuries. People with ADHD are also disproportionately represented within the criminal justice system. © 2025 Guardian News & Media Limited
Keyword: ADHD; Depression
Link ID: 29888 - Posted: 08.16.2025
Mariana Lenharo In late 2005, five months after a car accident, a 23-year-old woman lay unresponsive in a hospital bed. She had a severe brain injury and showed no sign of awareness. But when researchers scanning her brain asked her to imagine playing tennis, something striking happened: brain areas linked to movement lit up on her scan1. The experiment, conceived by neuroscientist Adrian Owen and his colleagues, suggested that the woman understood the instructions and decided to cooperate — despite appearing to be unresponsive. Owen, now at Western University in London, Canada, and his colleagues had introduced a new way to test for consciousness. Whereas some previous tests relied on observing general brain activity, this strategy zeroed in on activity directly linked to a researcher’s verbal command. The strategy has since been applied to hundreds of unresponsive people, revealing that many maintain an inner life and are aware of the world around them, at least to some extent. A 2024 study found that one in four people who were physically unresponsive had brain activity that suggested they could understand and follow commands to imagine specific activities, such as playing tennis or walking through a familiar space2. The tests rely on advanced neuroimaging techniques, so are mostly limited to research settings because of their high costs and the needed expertise. But since 2018, medical guidelines have started to recommend using these tests in clinical practice3. Since these methods emerged, scientists have been developing ways to probe layers of consciousness that are even more hidden. The stakes are high. Tens of thousands of people worldwide are currently in a persistent unresponsive state. Assessing their consciousness can guide important treatment decisions, such as whether to keep them on life support. 
Studies also suggest that hospitalized, unresponsive people with hidden signs of awareness are more likely to recover than are those without such signs (see, for example, ref. 4). © 2025 Springer Nature Limited
Keyword: Consciousness
Link ID: 29875 - Posted: 08.02.2025
By Tim Bayne One of the key scientific questions about consciousness concerns its distribution. We know that adult humans have the capacity for consciousness, but what about human neonates, bees or artificial intelligence (AI) systems? Who else—other than ourselves—belongs in the “consciousness club,” and how might we figure this out? It is tempting to assume, as many do, that we need a theory of consciousness to answer the distribution question. In the words of neuroscientists Giulio Tononi and Christof Koch, “we need not only more data but also a theory of consciousness—one that says what experience is and what type of physical systems can have it.” This is what philosopher Jonathan Birch has labeled the “theory-heavy” approach to the distribution problem. But there are serious issues with the theory-heavy approach. One is that we don’t have a consensus theory of consciousness. In a highly selective review that Anil Seth and I published in 2022, we listed no fewer than 22 neurobiological theories of consciousness. This overabundance of theories could reasonably be ignored if most agreed on fundamental questions in the field, such as which systems have the capacity for consciousness or when consciousness first emerges in human development, but they don’t. A further problem with the theory-heavy approach is that in order to speak to the distribution problem, a theory cannot be restricted to consciousness as it occurs in adult humans, but must also apply to human infants, nonhuman animals, synthetic biological systems and AI. But because theories are largely based on data drawn from the study of adult humans, there will inevitably be a gap between the evidence base of a general theory and its scope. Why should we think that a theory developed in response to adult humans applies to different kinds of systems? © 2025 Simons Foundation
Keyword: Consciousness
Link ID: 29874 - Posted: 08.02.2025
Katie Kavanagh How does your brain wake up from sleep? A study of more than 1,000 arousals from slumber has revealed precisely how the brain bestirs itself during the transition to alertness1 — a finding that might help to manage sleep inertia, the grogginess that many people feel when hitting the snooze button. Recordings of people as they woke from the dream-laden phase of sleep showed that the first brain regions to rouse are those associated with executive function and decision-making, located at the front of the head. A wave of wakefulness then spreads to the back, ending with an area associated with vision. The findings could change how we think of waking up, says Rachel Rowe, a neuroscientist at the University of Colorado Boulder, who was not involved with the work. The results emphasize that “falling asleep and waking up aren’t simply reverse processes, but really waking up is this ordered wave of activation that moves from the front to the back of the brain”, whereas falling asleep seems to be less linear and more gradual. The study was published today in Current Biology1. The wide-awake brain shows a characteristic pattern of electrical activity, recorded by sensors on the scalp — it looks like a jagged line made up of small, tightly packed peaks and valleys. Although the pattern looks similar during rapid eye movement (REM) sleep, when vivid dreams occur, this stage features a lack of skeletal-muscle movement. The peaks are taller during most stages of non-REM sleep, which ranges from light to very deep slumber. Scientists already knew that the ‘awakened’ signature occurs at different times in different brain regions, but common imaging techniques did not allow these patterns to be explored on a precise timescale. © 2025 Springer Nature Limited
Keyword: Sleep; Attention
Link ID: 29864 - Posted: 07.19.2025
By Dan Falk I’ve been fascinated by time for as long as I can remember. In my undergraduate physics classes, time always lurked in the background—it was the “t” that the professors sprinkled into their equations—but it was never quite clear what time actually was. Years later, I wrote a book about time, but even with chapters on Newton and Einstein, and a solid dose of philosophy, something was missing. For starters, we know clocks and watches work, but how do we tell time? If you’re watching network TV and a commercial break begins, you know you have time to use the bathroom or perhaps make a sandwich—in fact, you can probably arrange to be back in front of the TV just as the ads are ending. What makes you so good at judging these intervals of time? I figured that Dean Buonomano, being a neuroscientist, might have some of the answers. Buonomano is known for developing the idea that the key mechanism is not a single clock-like structure in the brain but rather networks of neurons working together, known as “neural dynamics.” But as Buonomano sees it, the brain does much more than keep track of time; in fact, it might be said to create it. It’s thanks to our brains that we feel time’s “flow,” even though nothing in physics points to such a flow out there in the world. Perhaps even more crucially, the brain allows us to engage in “mental time travel”—the ability to recall past events and imagine future happenings. This capability, he argues, was essential in shaping humanity’s path from the African savannah to today’s globe-spanning civilization. © 2025 NautilusNext Inc.
Keyword: Attention
Link ID: 29851 - Posted: 07.12.2025