Chapter 14. Attention and Higher Cognition
By Humberto Basilio

Mindia Wichert has taken part in plenty of brain experiments as a cognitive-neuroscience graduate student at the Humboldt University of Berlin, but none was as challenging as one he faced in 2023. Inside a stark white room, he stared at a flickering screen that flashed a different image every 10 seconds. His task was to determine what familiar object appeared in each image. But, at least at first, the images looked like nothing more than a jumble of black and white patches. “I’m very competitive with myself,” says Wichert. “I felt really frustrated.”

Cognitive neuroscientist Maxi Becker, now at Duke University in Durham, North Carolina, chose the images in an attempt to spark a fleeting mental phenomenon that people often experience but can’t control or fully explain. Study participants puzzling out what is depicted in the images — known as Mooney images, after a researcher who published a set of them in the 1950s — can’t rely on analytical thinking. Instead, the answer must arrive all at once, like a flash of lightning in the dark.

Becker asked some of the participants to view the images while lying inside a functional magnetic resonance imaging (fMRI) scanner, so she could track tiny shifts in blood flow corresponding to brain activity. She hoped to determine which regions produce ‘aha!’ moments.

Over the past two decades, scientists studying such moments of insight — also known as eureka moments — have used the tools of neuroscience to reveal which regions of the brain are active and how they interact when discovery strikes. They’ve refined the puzzles they use to trigger insight and the measurements they take, in an attempt to turn a self-reported, subjective experience into something that can be documented and rigorously studied.
This foundational work has led to new questions, including why some people are more insightful than others, what mental states could encourage insight and how insight might boost memory. © 2025 Springer Nature Limited
Keyword: Attention; Learning & Memory
Link ID: 29844 - Posted: 06.28.2025
By Katrina Miller

Take a look at this video of a waiting room. Do you see anything strange? Perhaps you saw the rug disappear, or the couch pillows transform, or a few ceiling panels evaporate. Or maybe you didn’t. In fact, dozens of objects change in this video, which won second place in the Best Illusion of the Year Contest in 2021. Voting for the latest version of the contest opened on Monday.

Illusions “are the phenomena in which the physical reality is divorced from perception,” said Stephen Macknik, a neuroscientist at SUNY Downstate Health Sciences University in Brooklyn. He runs the contest with his colleague and spouse, Susana Martinez-Conde. By studying the disconnect between perception and reality, scientists can better understand which brain regions and processes help us interpret the world around us. The illusion above highlights change blindness, the brain’s failure to notice shifts in the environment, especially when they occur gradually.

To some extent, all sensory experience is illusory, Dr. Martinez-Conde asserts. “We are always constructing a simulation of reality,” she said. “We don’t have direct access to that reality. We live inside the simulation that we create.”

She and Dr. Macknik have run the illusion contest since 2005. What began as a public outreach event at an academic conference has since blossomed into an annual competition open to anyone in the world. They initially worried that people would run out of illusions to submit. “But that actually never happened,” Dr. Martinez-Conde said. “What ended up happening instead is that people started developing illusions, actually, with an eye to competing in the contest.” © 2025 The New York Times Company
Keyword: Vision; Attention
Link ID: 29843 - Posted: 06.28.2025
You’ve just gotten home from an exhausting day. All you want to do is put your feet up and zone out to whatever is on television. Though the inactivity may feel like a well-earned rest, your brain is not just chilling. In fact, it is using nearly as much energy as it did during your stressful activity, according to recent research.

Sharna Jamadar, a neuroscientist at Monash University in Australia, and her colleagues reviewed research from her lab and others around the world to estimate the metabolic cost of cognition — that is, how much energy it takes to power the human brain. Surprisingly, they concluded that effortful, goal-directed tasks use only 5% more energy than restful brain activity. In other words, we use our brain just a small fraction more when engaging in focused cognition than when the engine is idling.

It often feels as though we allocate our mental energy through strenuous attention and focus. But the new research builds on a growing understanding that the majority of the brain’s function goes to maintenance. While many neuroscientists have historically focused on active, outward cognition, such as attention, problem-solving, working memory and decision-making, it’s becoming clear that beneath the surface, our background processing is a hidden hive of activity. Our brains regulate our bodies’ key physiological systems, allocating resources where they’re needed as we consciously and subconsciously react to the demands of our ever-changing environments.

“There is this sentiment that the brain is for thinking,” said Jordan Theriault, a neuroscientist at Northeastern University who was not involved in the new analysis. “Where, metabolically, [the brain’s function is] mostly spent on managing your body, regulating and coordinating between organs, managing this expensive system which it’s attached to, and navigating a complicated external environment.” © 2025 Simons Foundation.
Keyword: Attention; Brain imaging
Link ID: 29825 - Posted: 06.07.2025
By Nicola Davis, science correspondent

Whether it is doing sums or working out what to text your new date, some tasks produce a furrowed brow. Now scientists say they have come up with a device to monitor such effort: an electronic tattoo, stuck to the forehead. The researchers say the device could prove valuable among pilots, healthcare workers and other professions where managing mental workload is crucial to preventing catastrophes.

“For this kind of high-demand and high-stake scenario, eventually we hope to have this real-time mental workload decoder that can give people some warning and alert so that they can self-adjust, or they can ask AI or a co-worker to offload some of their work,” said Dr Nanshu Lu, an author of the research from the University of Texas at Austin, adding that the device may not only help workers avoid serious mistakes but also protect their health.

Writing in the journal Device, Lu and colleagues describe how using questionnaires to investigate mental workload is problematic, not least because people are poor at objectively judging cognitive effort and the questionnaires are usually conducted after a task. Meanwhile, existing electroencephalography (EEG) and electrooculography (EOG) devices, which can be used to assess mental workload by measuring brain waves and eye movements respectively, are wired, bulky and prone to erroneous measurements arising from movement. By contrast, the “e-tattoo” is a lightweight, flexible, wireless device. © 2025 Guardian News & Media Limited
Keyword: Attention; Stress
Link ID: 29815 - Posted: 05.31.2025
By Laura Dattaro

One of Clay Holroyd’s most highly cited papers is a null result. In 2005, he tested a theory he had proposed about a brain response to unexpected rewards and disappointments, but the findings—now cited more than 600 times—didn’t match his expectations, he says. In the years since, other researchers have run similar tests, many of which contradicted Holroyd’s results.

But in 2021, EEGManyLabs announced that it would redo Holroyd’s original experiment across 13 labs. In their replication effort, the researchers increased the sample size from 17 to 370 people. The results—the first from EEGManyLabs, published in January in Cortex—failed to replicate the null result, effectively confirming Holroyd’s theory. “Fundamentally, I thought that maybe it was a power issue,” says Holroyd, a cognitive neuroscientist at Ghent University. “Now this replication paper quite nicely showed that it was a power issue.” The two-decade tale demonstrates why pursuing null findings and replications—the focus of this newsletter—is so important.

Holroyd’s 2002 theory proposed that previously observed changes in dopamine associated with unexpectedly positive or negative results cause neural responses that can be measured with EEG. The more surprising a result, he posited, the larger the response. To test the idea, Holroyd and his colleagues used a gambling-like task in which they told participants the odds of correctly identifying which of four choices would lead to a 10-cent reward. In reality, the reward was random. When participants received no reward, their neural reaction to the negative result was equally strong regardless of which odds they had been given, contradicting the theory. © 2025 Simons Foundation
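Holroyd’s hunch that the original null was a power issue is easy to illustrate numerically. The following Monte Carlo sketch is illustrative only: the effect size of 0.3 and the approximate two-sided z criterion are assumptions for demonstration, not figures from the study. It shows how much more often a modest true effect reaches significance with 370 participants than with 17:

```python
import random
import statistics

def power(n, effect=0.3, z_crit=1.96, trials=2000):
    """Monte Carlo power estimate: the fraction of simulated experiments
    (samples of size n from a population with true mean `effect`, SD 1)
    whose mean lies more than ~1.96 standard errors from zero."""
    hits = 0
    for _ in range(trials):
        sample = [random.gauss(effect, 1.0) for _ in range(n)]
        se = statistics.stdev(sample) / n ** 0.5
        if abs(statistics.fmean(sample) / se) > z_crit:
            hits += 1
    return hits / trials

random.seed(0)
print(power(17))   # small sample: the effect is usually missed
print(power(370))  # large sample: the effect is almost always detected
```

With 17 participants, most simulated runs fail to reach significance even though the effect is real; with 370, almost all do — which is exactly the pattern a power problem would produce.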
Keyword: Attention; Learning & Memory
Link ID: 29814 - Posted: 05.31.2025
By Chris Berdik

Yale psychiatrist Albert Powers didn’t know what to expect as he strolled among the tarot card readers, astrologers, and crystal vendors at the psychic fair held at the Best Western outside North Haven, Connecticut, on a cloudy November Saturday in 2014. At his clinic, Powers worked with young people, mostly teenagers, who had started hearing voices. His patients and their families were worried that the voices might be precursors of psychosis such as schizophrenia. Sometimes, they were. But Powers also knew that lots of people occasionally heard voices — between 7 and 15 percent of the population, according to studies — and about 75 percent of those people lived otherwise normal lives.

WHAT I LEFT OUT is a recurring feature in which book authors are invited to share anecdotes and narratives that, for whatever reason, did not make it into their final manuscripts. In this installment, journalist Chris Berdik shares a story that didn’t make it into his recent book “Clamor: How Noise Took Over the World and How We Can Take It Back” (Norton, 272 pages).

He wanted to study high-functioning voice hearers, and a gathering of psychics seemed like a good place to find them. If clinicians could better distinguish voice hearers who develop psychosis and lose touch with reality from those who don’t, he thought, then maybe he could help steer more patients down a healthier path.

Powers introduced himself to the fair’s organizer and explained the sort of person he hoped to find. The organizer directed him to a nearby table where he met a smiley, middle-aged medium. The woman had a day job as an emergency services dispatcher, but the voices made frequent appearances in her daily life, and her side hustle was communicating with the dead.
Keyword: Schizophrenia; Attention
Link ID: 29809 - Posted: 05.28.2025
By Dobromir Rahnev

Is it possible to upload the consciousness of your mind into a computer? – Amreen, age 15, New Delhi, India

The concept, cool yet maybe a little creepy, is known as mind uploading. Think of it as a way to create a copy of your brain, a transmission of your mind and consciousness into a computer. There you would live digitally, perhaps forever. You’d have an awareness of yourself, you’d retain your memories and still feel like you. But you wouldn’t have a body.

Within that simulated environment, you could do anything you do in real life – eating, driving a car, playing sports. You could also do things impossible in the real world, like walking through walls, flying like a bird or traveling to other planets. The only limit is what science can realistically simulate.

Doable? Theoretically, mind uploading should be possible. Still, you may wonder how it could happen. After all, researchers have barely begun to understand the brain. Yet science has a track record of turning theoretical possibilities into reality. Just because a concept seems terribly, unimaginably difficult doesn’t mean it’s impossible. Consider that science took humankind to the Moon, sequenced the human genome and eradicated smallpox. Those things too were once considered unlikely.

As a brain scientist who studies perception, I fully expect mind uploading to one day be a reality. But as of today, we’re nowhere close. © 2010–2025, The Conversation US, Inc.
Keyword: Consciousness; Robotics
Link ID: 29803 - Posted: 05.24.2025
By Sara Novak

Just a few weeks after they hatch, baby male zebra finches begin to babble, spending much of the day testing their vocal cords. Dad helps out, singing to his hatchlings during feedings, so that the babies can internalize his tune, the same mating refrain shared by all male zebra finches. Soon, these tiny Australian birds begin to rehearse the song itself, repeating it up to 10,000 times a day, without any clear reward other than their increasing perfection of the melody.

The baby birds’ painstaking devotion to mastering their song led Duke University neuroscientist Richard Mooney and his Duke colleague John Pearson to wonder whether the birds could help us better understand the nature of self-directed learning. In humans, language and musical expression are thought to be self-directed—spontaneous, adaptive and intrinsically reinforced.

In a study recently published in Nature, the scientists tracked the brain signals and levels of dopamine, a neurotransmitter involved in reward and movement, in the brains of five male baby zebra finches while they were singing. They also measured song quality for each rendition the birds sang, in terms of both pitch and vigor, as well as the quality of song performance relative to the bird’s age. What they found is that dopamine levels in the baby birds’ brains closely matched the birds’ performance of the song, suggesting it plays a central role in the learning process.

Scientists have long known that learning powered by external rewards, such as grades, praise or sugary treats, is driven by dopamine—which is thought to chart the differences between expected and experienced rewards. But while they have suspected that self-directed learning is likewise guided by dopamine, it had been difficult to test that hypothesis until now. © 2025 NautilusNext Inc.
Keyword: Attention; Sexual Behavior
Link ID: 29800 - Posted: 05.24.2025
By Gina Kolata

Dr. Geoffrey Manley, a neurosurgeon at the University of California, San Francisco, wants the medical establishment to change the way it deals with brain injuries. His work is motivated in part by what happened to a police officer he treated in 2002, just after completing his medical training.

The man arrived at the emergency room unconscious, in a coma. He had been in a terrible car crash while pursuing a criminal. Two days later, Dr. Manley’s mentor said it was time to tell the man’s family there was no hope. His life support should be withdrawn. He should be allowed to die. Dr. Manley resisted. The patient’s brain oxygen levels were encouraging. Seven days later the policeman was still in a coma. Dr. Manley’s mentor again pressed him to talk to the man’s family about withdrawing life support. Again, Dr. Manley resisted.

Ten days after the accident, the policeman began to come out of his coma. Three years later he was back at work and was named San Francisco Police Officer of the Month. In 2010, he was Police Officer of the Year. “That case, and another like it,” Dr. Manley said, “changed my practice.”

But little has changed in the world of traumatic brain injuries since Dr. Manley’s patient woke up. Assessments of who will recover and of how severely patients are injured are pretty much the same. As a result, patients are told they “just” have a concussion, then have trouble getting care for recurring symptoms like memory lapses or headaches. And some patients in the position of that policeman have their life support withdrawn when they might have recovered.

Now, though, Dr. Manley and 93 others from 14 countries are proposing a new way to evaluate patients. They published their classification system Tuesday in the journal Lancet Neurology. © 2025 The New York Times Company
Keyword: Brain Injury/Concussion; Consciousness
Link ID: 29798 - Posted: 05.21.2025
By Mac Shine

The brain is an endlessly dynamic machine; it can wake you from sleep, focus your attention, spark a memory or help you slam on the brakes while driving. But what makes this precision possible? How can the brain dial up just the right amount of alertness or inhibition, and only when it’s needed?

A new study, out today in Nature, may have found part of the answer in an unlikely place: a cluster of small, largely overlooked inhibitory neurons nestled next to one of the brain’s most powerful arousal hubs, the locus coeruleus (LC). Led by Michael R. Bruchas, a neuroscientist at the University of Washington, the study is a tour de force in neural sleuthing, employing methods ranging from viral tracing and electrophysiology to imaging and behavior to map an elusive cell population known as the pericoeruleus. In a world where we’re constantly being pinged, alerted, nudged and notified, the ability to not react—to gate our arousal and filter our responses—may be one of the brain’s most underappreciated superpowers.

Here I discuss the results with Bruchas—and what he and his team found is remarkable. Far from being a passive neighbor to the LC, the pericoeruleus appears to act as a kind of micromanager of arousal, selectively inhibiting different subgroups of LC neurons depending on the behavioral context. If the LC is like a floodlight that bathes the brain in noradrenaline—raising alertness, sharpening perception and mobilizing attention—then the pericoeruleus may be the finely tuned lens that directs where and when that light shines. It’s a subtle but powerful form of control, and one that challenges traditional views of how the LC operates. For decades, the LC has been thought of primarily as a global broadcaster: When it fires, it releases norepinephrine widely across the cortex, preparing the brain for action.
But this new work is the latest in a recent line of inquiry that has challenged this simplicity—suggesting that the system is more complex and nuanced than previously thought. “We’re beginning to see that the locus coeruleus doesn’t just flood the brain with arousal – it targets specific outputs, and the pericoeruleus plays a key role in gating that process,” said Li Li, one of the co-first authors of the paper and a former postdoctoral researcher in Bruchas’ lab, now assistant professor of anesthesiology at Seattle Children’s Hospital. © 2025 Simons Foundation
Keyword: Attention
Link ID: 29787 - Posted: 05.14.2025
By Asher Elbein

True friends, most people would agree, are there for each other. Sometimes that means offering emotional support. Sometimes it means helping each other move. And if you’re a superb starling — a flamboyant, chattering songbird native to the African savanna — it means stuffing bugs down the throats of your friends’ offspring, secure in the expectation that they’ll eventually do the same for yours.

Scientists have long known that social animals usually put blood relatives first. But for a study published Wednesday in the journal Nature, researchers crunched two decades of field data to show that unrelated members of a superb starling flock often help each other raise chicks, trading assistance over the years in a behavior that was not previously known. “We think that these reciprocal helping relationships are a way to build ties,” said Dustin Rubenstein, a professor of ecology at Columbia University and an author of the paper.

Superb starlings are distinctive among animals that breed cooperatively, said Alexis Earl, a biologist at Cornell University and an author of the paper. Their flocks mix family groups with immigrants from other groups. New parents rely on up to 16 helpers, which bring chicks extra food and help run off predators.

Dr. Rubenstein’s lab has maintained a 20-year field study of the species that included 40 breeding seasons. It has recorded thousands of interactions between hundreds of the chattering birds and collected DNA to examine their genetic relationships. When Dr. Earl, then a graduate student in the lab, began crunching the data, she and her colleagues weren’t shocked to see that birds largely helped relatives, the way an aunt or uncle may swoop in to babysit and give parents a break. © 2025 The New York Times Company
Keyword: Evolution; Emotions
Link ID: 29780 - Posted: 05.10.2025
By Carl Zimmer

Consciousness may be a mystery, but that doesn’t mean that neuroscientists don’t have any explanations for it. Far from it. “In the field of consciousness, there are already so many theories that we don’t need more theories,” said Oscar Ferrante, a neuroscientist at the University of Birmingham.

If you’re looking for a theory to explain how our brains give rise to subjective, inner experiences, you can check out Adaptive Resonance Theory. Or consider Dynamic Core Theory. Don’t forget First Order Representational Theory, not to mention Semantic Pointer Competition Theory. The list goes on: A 2021 survey identified 29 different theories of consciousness.

Dr. Ferrante belongs to a group of scientists who want to lower that number, perhaps even down to just one. But they face a steep challenge, thanks to how scientists often study consciousness: Devise a theory, run experiments to build evidence for it, and argue that it’s better than the others. “We are not incentivized to kill our own ideas,” said Lucia Melloni, a neuroscientist at the Max Planck Institute for Empirical Aesthetics in Frankfurt, Germany.

Seven years ago, Dr. Melloni and 41 other scientists embarked on a major study on consciousness that she hoped would break this pattern. Their plan was to bring together two rival groups to design an experiment to see how well both theories did at predicting what happens in our brains during a conscious experience. The team, called the Cogitate Consortium, published its results on Wednesday in the journal Nature. But along the way, the study became subject to the same sharp-elbowed conflicts they had hoped to avoid.

Dr. Melloni and a group of like-minded scientists began drawing up plans for their study in 2018. They wanted to try an approach known as adversarial collaboration, in which scientists with opposing theories join forces with neutral researchers. The team chose two theories to test. © 2025 The New York Times Company
Keyword: Consciousness
Link ID: 29773 - Posted: 05.03.2025
By Anil Seth

On stage in New York a couple of years ago, noted neuroscientist Christof Koch handed a very nice bottle of Madeira wine to philosopher David Chalmers. Chalmers had won a quarter-century-long bet about consciousness—or at least our understanding of it. The philosopher had challenged the neuroscientist in 1998—with a crate of fine wine on the line—that in 25 years, science would still not have located the seat of consciousness in the brain. The philosopher was right.

But not without an extraordinary—and revealing—effort on the part of consciousness researchers and theorists. Backing up that concession were the results of a long and thorough “adversarial collaboration” that compared two leading theories about consciousness, testing each with rigorous experimental data. Now we finally learn more about the details of this work in a new paper in the journal Nature.

Nicknamed COGITATE, the collaboration pitted “global neuronal workspace theory” (GNWT)—an idea advocated by cognitive neuroscientist Stanislas Dehaene, which associates consciousness with the broadcast of information throughout large swathes of the brain—against “integrated information theory” (IIT)—the idea from neuroscientist Giulio Tononi, which identifies consciousness with the intrinsic cause-and-effect power of brain networks.

The adversarial collaboration involved the architects of both theories sitting down together, along with other researchers who would lead and execute the project (hats off to them), to decide on experiments that could potentially distinguish between the theories—ideally supporting one and challenging the other. Deciding on the theory-based predictions, and on experiments good enough to test them, was never going to be easy.
In consciousness research, it is especially hard since—as philosopher Tim Bayne and I noted—theories often make different assumptions, and attempt to explain different things even if, on the face of it, they are all theories of “consciousness.” © 2025 NautilusNext Inc.,
Keyword: Consciousness
Link ID: 29772 - Posted: 05.03.2025
By Allison Parshall

Where in the brain does consciousness originate? Theories abound, but neuroscientists still haven’t coalesced around one explanation, largely because it’s such a hard question to probe with the scientific method. Unlike other phenomena studied by science, consciousness cannot be observed externally. “I observe your behavior. I observe your brain, if I do an intracranial EEG [electroencephalography] study. But I don’t ever observe your experience,” says Robert Chis-Ciure, a postdoctoral researcher studying consciousness at the University of Sussex in England.

Scientists have landed on two leading theories to explain how consciousness emerges: integrated information theory, or IIT, and global neuronal workspace theory, or GNWT. These frameworks couldn’t be more different—they rest on different assumptions, draw from different fields of science and may even define consciousness in different ways, explains Anil K. Seth, a consciousness researcher at the University of Sussex. To compare them directly, researchers organized a group of 12 laboratories called the Cogitate Consortium to test the theories’ predictions against each other in a large brain-imaging study. The result, published in full on Wednesday in Nature, was effectively a draw and raised far more questions than it answered.

The preliminary findings were posted to the preprint server bioRxiv in 2023. And only a few months later, a group of scholars publicly called IIT “pseudoscience” and attempted to excise it from the field. As the dust settles, leading consciousness researchers say that the Cogitate results point to a way forward for understanding how consciousness arises—no matter what theory eventually comes out on top. “We all are very good at constructing castles in the sky” with abstract ideas, says Chis-Ciure, who was not involved in the new study. “But with data, you make those more grounded.” © 2025 SCIENTIFIC AMERICAN
Keyword: Consciousness
Link ID: 29771 - Posted: 05.03.2025
By Yasemin Saplakoglu

In 1943, a pair of neuroscientists were trying to describe how the human nervous system works when they accidentally laid the foundation for artificial intelligence. In their mathematical framework for how systems of cells can encode and process information, Warren McCulloch and Walter Pitts argued that each brain cell, or neuron, could be thought of as a logic device: It either turns on or it doesn’t. A network of such “all-or-none” neurons, they wrote, can perform simple calculations through true or false statements. “They were actually, in a sense, describing the very first artificial neural network,” said Tomaso Poggio of the Massachusetts Institute of Technology, who is one of the founders of computational neuroscience.

McCulloch and Pitts’ framework laid the groundwork for many of the neural networks that underlie the most powerful AI systems. These algorithms, built to recognize patterns in data, have become so competent at complex tasks that their products can seem eerily human. ChatGPT’s text is so conversational and personal that some people are falling in love. Image generators can create pictures so realistic that it can be hard to tell when they’re fake. And deep learning algorithms are solving scientific problems that have stumped humans for decades. These systems’ abilities are part of the reason the AI vocabulary is so rich in language from human thought, such as intelligence, learning and hallucination.

But there is a problem: The initial McCulloch and Pitts framework is “complete rubbish,” said the science historian Matthew Cobb of the University of Manchester, who wrote the book The Idea of the Brain: The Past and Future of Neuroscience.
“Nervous systems aren’t wired up like that at all.”

When you poke at even the most general comparison between biological and artificial intelligence — that both learn by processing information across layers of networked nodes — their similarities quickly crumble. © 2025 Simons Foundation
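The “all-or-none” logic unit McCulloch and Pitts described is simple enough to sketch in a few lines. This is an illustrative reconstruction, not code from any cited work: the weights and thresholds are choices made here for demonstration. A unit fires only if its weighted inputs reach a threshold, and that alone suffices to build the true/false logic gates the article mentions:

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: outputs 1 ("fires") if and only if the
    weighted sum of its inputs reaches the threshold; otherwise 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Basic logic gates, each realized by a single all-or-none unit.
AND = lambda a, b: mp_neuron([a, b], [1, 1], 2)
OR = lambda a, b: mp_neuron([a, b], [1, 1], 1)
NOT = lambda a: mp_neuron([a], [-1], 0)

# Truth table for the two-input gates.
for a in (0, 1):
    for b in (0, 1):
        print(f"a={a} b={b}  AND={AND(a, b)}  OR={OR(a, b)}")
```

Chaining such units gives arbitrary Boolean circuits, which is why the 1943 framework could be read as the first artificial neural network even though, as Cobb notes, real nervous systems are not wired this way.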
Keyword: Consciousness; Robotics
Link ID: 29770 - Posted: 05.03.2025
By Matt Richtel

So sharp are partisan divisions these days that it can seem as if people are experiencing entirely different realities. Maybe they actually are, according to Leor Zmigrod, a neuroscientist and political psychologist at Cambridge University. In a new book, “The Ideological Brain: The Radical Science of Flexible Thinking,” Dr. Zmigrod explores the emerging evidence that brain physiology and biology help explain not just why people are prone to ideology but how they perceive and share information.

Q: What is ideology?

It’s a narrative about how the world works and how it should work. This potentially could be the social world or the natural world. But it’s not just a story: It has really rigid prescriptions for how we should think, how we should act, how we should interact with other people. An ideology condemns any deviation from its prescribed rules.

Q: You write that rigid thinking can be tempting. Why is that?

Ideologies satisfy the need to try to understand the world, to explain it. And they satisfy our need for connection, for community, for just a sense that we belong to something. There’s also a resource question. Exploring the world is really cognitively expensive, and just exploiting known patterns and rules can seem to be the most efficient strategy. Also, many people argue — and many ideologies will try to tell you — that adhering to rules is the only good way to live and to live morally. I actually come at it from a different perspective: Ideologies numb our direct experience of the world. They narrow our capacity to adapt to the world, to understand evidence, to distinguish between credible evidence and not credible evidence. Ideologies are rarely, if ever, good.

Q: In the book, you describe research showing that ideological thinkers can be less reliable narrators. Can you explain?

Remarkably, we can observe this effect in children.
In the 1940s, Else Frenkel-Brunswik, a psychologist at the University of California, Berkeley, interviewed hundreds of children and tested their levels of prejudice and authoritarianism, like whether they championed conformity and obedience or play and imagination. When children were told a story about new pupils at a fictional school and asked to recount the story later, there were significant differences in what the most prejudiced children remembered, as opposed to the most liberal children. © 2025 The New York Times Company
Keyword: Emotions; Attention
Link ID: 29737 - Posted: 04.09.2025
Alexandra Topping

The benefits of taking drugs for attention deficit hyperactivity disorder outweigh the impact of increases in blood pressure and heart rate, according to a new study. An international team of researchers led by scientists from the University of Southampton found the majority of children taking ADHD medication experienced small increases in blood pressure and pulse rates, but that the drugs had “overall small effects”. They said the study’s findings highlighted the need for “careful monitoring”.

Prof Samuele Cortese, the senior lead author of the study, from the University of Southampton, said the risks and benefits of taking any medication had to be assessed together, but that for ADHD drugs the risk-benefit ratio was “reassuring”. “We found an overall small increase in blood pressure and pulse for the majority of children taking ADHD medications,” he said. “Other studies show clear benefits in terms of reductions in mortality risk and improvement in academic functions, as well as a small increased risk of hypertension, but not other cardiovascular diseases. Overall, the risk-benefit ratio is reassuring for people taking ADHD medications.”

About 3 to 4% of adults and 5% of children in the UK are believed to have ADHD, a neurodevelopmental disorder with symptoms including impulsiveness, disorganisation and difficulty focusing, according to the National Institute for Health and Care Excellence (Nice). Doctors can prescribe stimulants, such as methylphenidate, of which the best-known brand is Ritalin. Other stimulant medications used to treat ADHD include lisdexamfetamine and dexamfetamine. Non-stimulant drugs include atomoxetine, an SNRI (selective norepinephrine reuptake inhibitor), and guanfacine.

© 2025 Guardian News & Media Limited
Keyword: ADHD; Drug Abuse
Link ID: 29734 - Posted: 04.09.2025
By Smriti Mallapaty

Neuroscientists have observed for the first time how structures deep in the brain are activated when the brain becomes aware of its own thoughts, known as conscious perception.

The brain is constantly bombarded with sights, sounds and other stimuli, but people are only ever aware of a sliver of the world around them — the taste of a piece of chocolate or the sound of someone’s voice, for example. Researchers have long known that the outer layer of the brain, called the cerebral cortex, plays a part in this experience of being aware of specific thoughts. The involvement of deeper brain structures has been much harder to elucidate, because they can be accessed only with invasive surgery, and designing experiments to test the concept in animals is also tricky. But studying these regions would allow researchers to broaden their theories of consciousness beyond the brain’s outer wrapping.

“The field of consciousness studies has evoked a lot of criticism and scepticism because this is a phenomenon that is so hard to study,” says Liad Mudrik, a neuroscientist at Tel Aviv University in Israel. But scientists have increasingly been using systematic and rigorous methods to investigate consciousness, she says.

Aware or not

In a study published in Science today, Mingsha Zhang, a neuroscientist at Beijing Normal University, focused on the thalamus. This region at the centre of the brain is involved in processing sensory information and working memory, and is thought to have a role in conscious perception. The study’s participants were already undergoing therapy for severe and persistent headaches, for which they had thin electrodes implanted deep in their brains. This allowed Zhang and his colleagues to study their brain signals and measure conscious awareness.

© 2025 Springer Nature Limited
Keyword: Consciousness
Link ID: 29731 - Posted: 04.05.2025
By Christina Caron

Health Secretary Robert F. Kennedy Jr. has often criticized prescription stimulants, such as Adderall, that are primarily used to treat attention deficit hyperactivity disorder. “We have damaged this entire generation,” he said last year during a podcast, referring to the number of children taking psychiatric medications. “We have poisoned them.” In February, the “Make America Healthy Again” commission, led by Mr. Kennedy, announced plans to evaluate the “threat” posed by drugs like prescription stimulants.

But are they a threat? And if so, to whom? Like many medications, prescription stimulants have potential side effects, and there are people who misuse them. Yet these drugs are also considered some of the most effective and well-researched treatments that psychiatry has to offer, said Dr. Jeffrey H. Newcorn, the director of the Division of A.D.H.D. and Learning Disorders at the Icahn School of Medicine at Mount Sinai in New York. Here are some answers to common questions and concerns about stimulants.

What are prescription stimulants?

Prescription stimulants are drugs that help change the way the brain works by increasing the communication among neurons. They are divided into two classes: methylphenidates (like Ritalin, Focalin and Concerta) and amphetamines (like Vyvanse and Adderall).

© 2025 The New York Times Company
Keyword: ADHD; Drug Abuse
Link ID: 29723 - Posted: 04.02.2025
By Christina Caron

On TikTok, misinformation about attention deficit hyperactivity disorder can be tricky to spot, according to a new study. The study, published on Wednesday in the journal PLOS One, found that fewer than 50 percent of the claims made in some of the most popular A.D.H.D. videos on TikTok offered information that matched diagnostic criteria or professional treatment recommendations for the disorder. And, the researchers found, even study participants who had already been diagnosed with A.D.H.D. had trouble discerning which information was most reliable.

About half of the TikTok creators included in the study were using the platform to sell products, such as fidget spinners, or services like coaching. None of them were licensed mental health professionals. The lack of nuance is concerning, said Vasileia Karasavva, a Ph.D. student in clinical psychology at the University of British Columbia in Vancouver and the lead author of the study. If TikTok creators talk about difficulty concentrating, she added, they don’t typically mention that the symptom is not specific to A.D.H.D., or that it could also be a manifestation of a different mental disorder, like depression or anxiety.

“The last thing we want to do is discourage people from expressing how they’re feeling, what they’re experiencing and finding community online,” Ms. Karasavva said. “At the same time, it might be that you self-diagnose with something that doesn’t apply to you, and then you don’t get the help that you actually need.”

Ms. Karasavva’s results echo those of a 2022 study that also analyzed 100 popular TikTok videos about A.D.H.D. and found that half of them were misleading. “The data are alarming,” said Stephen P. Hinshaw, a professor of psychology and an expert in A.D.H.D. at the University of California, Berkeley, who was not involved in either study. The themes of the videos might easily resonate with viewers, he added, but “accurate diagnosis takes access, time and money.”

© 2025 The New York Times Company
Keyword: ADHD
Link ID: 29714 - Posted: 03.22.2025