Links for Keyword: Attention

Links 1 - 20 of 725

You’ve just gotten home from an exhausting day. All you want to do is put your feet up and zone out to whatever is on television. Though the inactivity may feel like a well-earned rest, your brain is not just chilling. In fact, it is using nearly as much energy as it did during your stressful activity, according to recent research. Sharna Jamadar, a neuroscientist at Monash University in Australia, and her colleagues reviewed research from her lab and others around the world to estimate the metabolic cost of cognition — that is, how much energy it takes to power the human brain. Surprisingly, they concluded that effortful, goal-directed tasks use only 5% more energy than restful brain activity. In other words, we use our brain just a small fraction more when engaging in focused cognition than when the engine is idling. It often feels as though we allocate our mental energy through strenuous attention and focus. But the new research builds on a growing understanding that the majority of the brain’s function goes to maintenance. While many neuroscientists have historically focused on active, outward cognition, such as attention, problem-solving, working memory and decision-making, it’s becoming clear that beneath the surface, our background processing is a hidden hive of activity. Our brains regulate our bodies’ key physiological systems, allocating resources where they’re needed as we consciously and subconsciously react to the demands of our ever-changing environments. “There is this sentiment that the brain is for thinking,” said Jordan Theriault, a neuroscientist at Northeastern University who was not involved in the new analysis. “Where, metabolically, [the brain’s function is] mostly spent on managing your body, regulating and coordinating between organs, managing this expensive system which it’s attached to, and navigating a complicated external environment.” © 2025 Simons Foundation.
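As a sense of scale for that 5% figure, here is a quick back-of-the-envelope calculation. The roughly 20-watt resting power draw of the brain is a commonly cited textbook estimate brought in here for illustration; the article itself reports only the percentage.

```python
# Rough scale for the 5% figure. The ~20 W resting draw is a commonly cited
# textbook estimate (an assumption here; the article gives only the percentage).
resting_brain_watts = 20.0
task_overhead = 0.05          # effortful tasks use ~5% more energy, per the review
extra_watts = resting_brain_watts * task_overhead
print(f"focused cognition adds only ~{extra_watts:.0f} W on top of ~{resting_brain_watts:.0f} W")
```

On these assumptions, focused thought costs about one extra watt, less than a dim nightlight.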

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 29825 - Posted: 06.07.2025

Nicola Davis Science correspondent Whether it is doing sums or working out what to text your new date, some tasks produce a furrowed brow. Now scientists say they have come up with a device to monitor such effort: an electronic tattoo, stuck to the forehead. The researchers say the device could prove valuable for pilots, healthcare workers and members of other professions where managing mental workload is crucial to preventing catastrophes. “For this kind of high-demand and high-stake scenario, eventually we hope to have this real-time mental workload decoder that can give people some warning and alert so that they can self-adjust, or they can ask AI or a co-worker to offload some of their work,” said Dr Nanshu Lu, an author of the research from the University of Texas at Austin, adding that the device may not only help workers avoid serious mistakes but also protect their health. Writing in the journal Device, Lu and colleagues describe how using questionnaires to investigate mental workload is problematic, not least because people are poor at objectively judging cognitive effort and because questionnaires are usually administered after a task. Meanwhile, existing electroencephalography (EEG) and electrooculography (EOG) devices, which can be used to assess mental workload by measuring brain waves and eye movements respectively, are wired, bulky and prone to erroneous measurements arising from movement. By contrast, the “e-tattoo” is a lightweight, flexible, wireless device. © 2025 Guardian News & Media Limited
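The article does not say how the e-tattoo's workload decoder works. As a generic illustration of the kind of signal EEG offers, the sketch below computes one common workload proxy, the ratio of frontal theta (4–8 Hz) to alpha (8–12 Hz) band power, which tends to rise with cognitive load. This is not the authors' method; random noise stands in for a real recording, and the sampling rate and band edges are assumed values.

```python
# Generic EEG workload proxy (NOT the paper's decoder): theta/alpha band-power
# ratio, which tends to rise with mental workload. Random noise stands in for
# a real frontal-channel recording; fs and band edges are assumed values.
import numpy as np
from scipy.signal import welch

fs = 250                                     # sampling rate in Hz (assumed)
rng = np.random.default_rng(0)
eeg = rng.standard_normal(fs * 60)           # stand-in for 60 s of one channel

f, psd = welch(eeg, fs=fs, nperseg=fs * 2)   # power spectral density, 0.5 Hz bins

def band_power(freqs, power, lo, hi):
    """Integrate the PSD over [lo, hi) Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    return power[mask].sum() * (freqs[1] - freqs[0])

theta = band_power(f, psd, 4, 8)
alpha = band_power(f, psd, 8, 12)
print(f"theta/alpha workload index: {theta / alpha:.2f}")
```

In practice such an index would be computed in short sliding windows so that it can trigger the real-time alerts Lu describes.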

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 11: Emotions, Aggression, and Stress
Link ID: 29815 - Posted: 05.31.2025

By Laura Dattaro One of Clay Holroyd’s most highly cited papers is a null result. In 2005, he tested a theory he had proposed about a brain response to unexpected rewards and disappointments, but the findings—now cited more than 600 times—didn’t match his expectations, he says. In the years since, other researchers have run similar tests, many of which contradicted Holroyd’s results. But in 2021, the large-scale replication project EEGManyLabs announced that it would redo Holroyd’s original experiment across 13 labs. In their replication effort, the researchers increased the sample size from 17 to 370 people. The results—the first from EEGManyLabs, published in January in Cortex—failed to replicate the null result, effectively confirming Holroyd’s theory. “Fundamentally, I thought that maybe it was a power issue,” says Holroyd, a cognitive neuroscientist at Ghent University. “Now this replication paper quite nicely showed that it was a power issue.” The two-decade tale demonstrates why pursuing null findings and replications—the focus of this newsletter—is so important. Holroyd’s 2002 theory proposed that previously observed changes in dopamine associated with unexpectedly positive or negative results cause neural responses that can be measured with EEG. The more surprising a result, he posited, the larger the response. To test the idea, Holroyd and his colleagues used a gambling-like task in which they told participants the odds of correctly identifying which of four choices would lead to a 10-cent reward. In reality, the reward was random. When participants received no reward, their neural reaction to the negative result was equally strong regardless of which odds they had been given, contradicting the theory. © 2025 Simons Foundation
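The prediction being tested can be put in one line of arithmetic: under a reward-prediction-error (RPE) account, the error signal is the reward received minus the reward expected, so omitting the 10-cent reward should produce a larger signal the likelier a win seemed. The sketch below illustrates that logic; the 10-cent reward comes from the task description, but the stated odds are hypothetical examples, not the study's actual conditions.

```python
# Reward-prediction-error (RPE) logic behind the theory: delta = received - expected.
# The 10-cent reward comes from the task description; the stated odds below are
# hypothetical examples for illustration, not the experiment's actual conditions.
reward = 0.10  # dollars

for p_win in (0.25, 0.50, 0.75):
    expected = p_win * reward            # expected value given the stated odds
    delta = 0.0 - expected               # RPE when the reward is omitted
    print(f"stated odds {p_win:.0%}: RPE on omission = {delta:+.3f}")
```

In the 2005 data, the omission response did not scale with the stated odds this way, which is what made the original result a strike against the theory until the higher-powered replication.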

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 13: Memory and Learning; Chapter 14: Attention and Higher Cognition
Link ID: 29814 - Posted: 05.31.2025

By Sara Novak Just a few weeks after they hatch, baby male zebra finches begin to babble, spending much of the day testing their vocal cords. Dad helps out, singing to his hatchlings during feedings, so that the babies can internalize his tune, the same mating refrain shared by all male zebra finches. Soon, these tiny Australian birds begin to rehearse the song itself, repeating it up to 10,000 times a day, without any clear reward other than their increasing perfection of the melody. The baby birds’ painstaking devotion to mastering their song led Duke University neuroscientist Richard Mooney and his Duke colleague John Pearson to wonder whether the birds could help us better understand the nature of self-directed learning. In humans, language and musical expression are thought to be self-directed—spontaneous, adaptive and intrinsically reinforced. In a study recently published in Nature, the scientists tracked the brain signals and levels of dopamine, a neurotransmitter involved in reward and movement, in the brains of five male baby zebra finches while they were singing. They also measured song quality for each rendition the birds sang, in terms of both pitch and vigor, as well as the quality of song performance relative to the bird’s age. What they found is that dopamine levels in the baby birds’ brains closely matched the birds’ performance of the song, suggesting dopamine plays a central role in the learning process. Scientists have long known that learning powered by external rewards, such as grades, praise or sugary treats, is driven by dopamine—which is thought to chart the differences between expected and experienced rewards. But while they have suspected that self-directed learning is likewise guided by dopamine, it had been difficult to test that hypothesis until now. © 2025 NautilusNext Inc.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 19: Language and Lateralization
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 15: Language and Lateralization
Link ID: 29800 - Posted: 05.24.2025

By Mac Shine The brain is an endlessly dynamic machine; it can wake you from sleep, focus your attention, spark a memory or help you slam on the brakes while driving. But what makes this precision possible? How can the brain dial up just the right amount of alertness or inhibition, and only when it’s needed? A new study, out today in Nature, may have found part of the answer in an unlikely place: a cluster of small, largely overlooked inhibitory neurons nestled next to one of the brain’s most powerful arousal hubs, the locus coeruleus (LC). Led by Michael R. Bruchas, a neuroscientist at the University of Washington, the study is a tour de force in neural sleuthing, employing methods ranging from viral tracing and electrophysiology to imaging and behavior to map an elusive cell population known as the pericoeruleus. In a world where we’re constantly being pinged, alerted, nudged and notified, the ability to not react—to gate our arousal and filter our responses—may be one of the brain’s most underappreciated superpowers. Here I discuss the results with Bruchas—and what he and his team found is remarkable. Far from being a passive neighbor to the LC, the pericoeruleus appears to act as a kind of micromanager of arousal, selectively inhibiting different subgroups of LC neurons depending on the behavioral context. If the LC is like a floodlight that bathes the brain in noradrenaline—raising alertness, sharpening perception and mobilizing attention—then the pericoeruleus may be the finely tuned lens that directs where and when that light shines. It’s a subtle but powerful form of control, and one that challenges traditional views of how the LC operates. For decades, the LC has been thought of primarily as a global broadcaster: When it fires, it releases norepinephrine widely across the cortex, preparing the brain for action. But this new work is the latest in a recent line of inquiry that has challenged this simplicity—suggesting that the system is more complex and nuanced than previously thought. “We’re beginning to see that the locus coeruleus doesn’t just flood the brain with arousal – it targets specific outputs, and the pericoeruleus plays a key role in gating that process,” said Li Li, one of the co-first authors of the paper and a former postdoctoral researcher in Bruchas’ lab, now assistant professor of anesthesiology at Seattle Children’s Hospital. © 2025 Simons Foundation

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 29787 - Posted: 05.14.2025

By Claudia López Lloreda For a neuroscientist, the opportunity to record single neurons in people doesn’t knock every day. It is so rare, in fact, that after 14 years of waiting by the door, Florian Mormann says he has recruited just 110 participants—all with intractable epilepsy. All participants had electrodes temporarily implanted in their brains to monitor their seizures. But the slow work to build this cohort is starting to pay off for Mormann, a group leader at the University of Bonn, and for other researchers taking a similar approach, according to a flurry of studies published in the past year. For instance, certain neurons selectively respond not only to particular scents but also to the words and images associated with them, Mormann and his colleagues reported in October. Other neurons help to encode stimuli, form memories and construct representations of the world, recent work from other teams reveals. Cortical neurons encode specific information about the phonetics of speech, two independent teams reported last year. Hippocampal cells contribute to working memory and map out time in novel ways, two other teams discovered last year, and some cells in the region encode information related to a person’s changing knowledge about the world, a study published in August found. These studies offer the chance to answer questions about human brain function that remain challenging to answer using animal models, says Ziv Williams, associate professor of neurosurgery at Harvard Medical School, who led one of the teams that worked on speech phonetics. “Concept cells,” he notes by way of example, such as those Mormann identified, or the “Jennifer Aniston” neurons famously described in a 2005 study, have proved elusive in the monkey brain. © 2025 Simons Foundation

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 29709 - Posted: 03.19.2025

By Felicity Nelson A region in the brainstem, called the median raphe nucleus, contains neurons that control perseverance and exploration. Whether mice persist with a task, explore new options or give up comes down to the activity of three types of neuron in the brain. In experiments, researchers at University College London (UCL) were able to control the three behaviours by switching the neurons on and off in a part of the animals’ brainstem called the median raphe nucleus. The findings are reported in Nature today. “It’s quite remarkable that manipulation of specific neural subtypes in the median raphe nucleus mediates certain strategic behaviours,” says neuroscientist Roger Marek at the Queensland Brain Institute in Brisbane, Australia, who was not involved in the work. Whether these behaviours are controlled in the same way in humans needs to be confirmed, but if they are, this could be relevant to certain neuropsychiatric conditions that are associated with imbalances in the three behavioural strategies, says Sonja Hofer, a co-author of the paper and a systems neuroscientist at UCL. For instance, an overly high drive to persist with familiar actions and repetitive behaviours can be observed in people with obsessive–compulsive disorder and autism, she says. Conversely, pathological disengagement and lack of motivation are symptoms of major depressive disorder, and an excessive drive to explore and inability to persevere with a task is seen in attention deficit hyperactivity disorder. “It could be that changes in the firing rate of specific median raphe cell types could contribute to certain aspects of these conditions,” says Hofer. © 2025 Springer Nature Limited

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 29696 - Posted: 03.08.2025

Nell Greenfieldboyce People are constantly looking at the behavior of others and coming up with ideas about what might be going on in their heads. Now, a new study of bonobos adds to evidence that they might do the same thing. Specifically, some bonobos were more likely to point to the location of a treat when they knew that a human companion was not aware of where it had been hidden, according to a study that appears in the Proceedings of the National Academy of Sciences. The findings add to a long-running debate about whether humans have a unique ability to imagine and understand the mental states of others. Some researchers say this kind of "theory of mind" may be practiced more widely in the animal kingdom, and for the scientists involved, potentially watching it in action was quite the experience. "It's quite surreal. I mean, I've worked with primates for quite some years now and you never get used to it," says Luke Townrow, a PhD student at Johns Hopkins University. "We found evidence that they are tailoring their communication based on what I know." Hmmm, where is the grape? To see what bonobos might know about what humans around them know, Townrow worked with Chris Krupenye of Johns Hopkins University to devise a simple experiment. "It's always a challenge for us, that animals don't speak, so we can't just ask them what they're thinking. We have to come up with creative, experimental designs that allow them to express their knowledge," says Krupenye. © 2025 npr

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 29658 - Posted: 02.05.2025

By Kristel Tjandra Close your eyes and picture an apple—what do you see? Most people will conjure up a vivid image of the fruit, but for the roughly one in 100 individuals with aphantasia, nothing will appear in the mind’s eye at all. Now, scientists have discovered that in people with this inability to form mental images, visual processing areas of the brain still light up when they try to do so. The study, published today in Current Biology, suggests aphantasia is not caused by a complete deficit in visual processing, as researchers have previously proposed. Visual brain areas are still active when aphantasic people are asked to imagine—but that activity doesn’t translate into conscious experience. The work offers new clues about the neurological differences underlying this little-explored condition. The study authors “take a very strong, mechanistic approach,” says Sarah Shomstein, a vision scientist at George Washington University who was not involved in the study. “It was asking the right questions and using the right methods.” Some scientists suspect aphantasia may be caused by a malfunction in the primary visual cortex, the first area in the brain to process images. “Typically, primary cortex is thought to be the engine of visual perception,” says Joel Pearson, a neuroscientist at the University of New South Wales Sydney who co-led the study. “If you don’t have activity there, you’re not going to have perceptual consciousness.” To see what was going on in this region in aphantasics, the team used functional magnetic resonance imaging to measure the brain activity of 14 people with aphantasia and 18 neurotypical controls as they repeatedly saw two simple patterns, made up of either green vertical lines or red horizontal lines. They then repeated the experiment, this time asking participants to simply imagine the two images.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 29624 - Posted: 01.11.2025

By Christina Caron Barrie Miskin was newly pregnant when she noticed her appearance was changing. Dark patches bloomed on her skin like watercolor ink. A “thicket” of hairs sprouted on her upper lip and chin. The outside world was changing, too: In her neighborhood of Astoria, Queens, bright lights enveloped objects in a halo, blurring her vision. Co-workers and even her doctors started to seem like “alien proxies” of themselves, Ms. Miskin, 46, said. “I felt like I was viewing the world through a pane of dirty glass,” she added. Yet Ms. Miskin knew it was all an illusion, so she sought help. It took more than a year of consulting with mental health specialists before Ms. Miskin finally found an explanation for her symptoms: She was diagnosed with a dissociative condition called depersonalization/derealization disorder, or D.D.D. Before her pregnancy, Ms. Miskin had stopped taking antidepressants. Her new psychiatrist said the symptoms could have been triggered by months of untreated depression that followed. While Ms. Miskin felt alone in her mystery illness, she wasn’t. Tens of thousands of posts on social media reference depersonalization or derealization, with some likening the condition to “living in a movie or a dream” or “observing the world through a fog.” People who experience depersonalization can feel as though they are detached from their mind or body. Derealization, on the other hand, refers to feeling detached from the environment, as though the people and things in the world are unreal. Those who are living with D.D.D. are “painfully aware” that something is amiss, said Elena Bezzubova, a psychoanalyst who specializes in treating the condition. It’s akin to seeing an apple and feeling that it is so strange it doesn’t seem real, even though you know that it is, she added. The disorder is thought to occur in about 1 to 2 percent of the population, but it’s possible for anyone to experience fleeting symptoms. © 2025 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 16: Psychopathology: Biological Basis of Behavior Disorders
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 12: Psychopathology: The Biology of Behavioral Disorders
Link ID: 29623 - Posted: 01.11.2025

By Carl Zimmer In our digital age, few things are more irritating than a slow internet connection. Your web browser starts to lag. On video calls, the faces of your friends turn to frozen masks. When the flow of information dries up, it can feel as if we are cut off from the world. Engineers measure this flow in bits per second. Streaming a high-definition video takes about 25 million bps. The download rate in a typical American home is about 262 million bps. Now researchers have estimated the speed of information flow in the human brain: just 10 bps. They titled their study, published this month in the journal Neuron, “The unbearable slowness of being.” “It’s a bit of a counterweight to the endless hyperbole about how incredibly complex and powerful the human brain is,” said Markus Meister, a neuroscientist at the California Institute of Technology and an author of the study. “If you actually try to put numbers to it, we are incredibly slow.” Dr. Meister got the idea for the study while teaching an introductory neuroscience class. He wanted to give his students some basic numbers about the brain. But no one had pinned down the rate at which information flows through the nervous system. Dr. Meister realized that he could estimate that flow by looking at how quickly people carry out certain tasks. To type, for example, we look at a word, recognize each letter and then sort out the sequence of keys to press. As we type, information flows into our eyes, through our brains and into the muscles of our fingers. The higher the flow rate, the faster we can type. In 2018, a team of researchers in Finland analyzed 136 million keystrokes made by 168,000 volunteers. They found that, on average, people typed 51 words a minute. A small fraction typed 120 words a minute or more. Dr. Meister and his graduate student, Jieyu Zheng, used a branch of mathematics known as information theory to estimate the flow of information required to type. At 120 words a minute, the flow is only 10 bits a second. © 2024 The New York Times Company
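The typing figure can be sanity-checked with Shannon's classic estimate of roughly 1 bit of information per character of English text, an assumption brought in here for illustration; the study itself applied a fuller information-theoretic analysis to the keystroke data.

```python
# Back-of-the-envelope check on the article's typing estimate, assuming
# Shannon's ~1 bit per character of English and ~5 characters per word
# (both assumptions for illustration; the study's analysis is more careful).
words_per_minute = 120        # the fastest typists in the Finnish dataset
chars_per_word = 5            # typical English word length (assumed)
bits_per_char = 1.0           # Shannon's entropy estimate for English (assumed)

bits_per_second = words_per_minute * chars_per_word * bits_per_char / 60
print(f"~{bits_per_second:.0f} bits/s")   # ~10 bits/s, matching the study
```

Even this crude estimate lands on the same order of magnitude, which is the paper's point: conscious throughput is millions of times slower than a home internet connection.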

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 29611 - Posted: 12.28.2024

By Laura Sanders Growing up, Roberto S. Luciani had hints that his brain worked differently than most people’s. He didn’t relate when people complained about a movie character looking different than what they’d pictured from the book, for instance. But it wasn’t until he was a teenager that things finally clicked. His mother had just woken up and was telling him about a dream she had. “Movielike,” is how she described it. “Up until then, I assumed that cartoon depictions of imagination were exaggerated,” Luciani says. “I asked her what she meant and quickly realized my visual imagery was not functioning like hers.” That’s because Luciani has a condition called aphantasia — an inability to picture objects, people and scenes in his mind. When he was growing up, the term didn’t even exist. But now, Luciani, a cognitive scientist at the University of Glasgow in Scotland, and other scientists are getting a clearer picture of how some brains work, including those with a blind mind’s eye. In a recent study, Luciani and colleagues explored the connections between the senses, in this case, hearing and seeing. In most of our brains, these two senses collaborate. Auditory information influences activity in brain areas that handle vision. But in people with aphantasia, this connection isn’t as strong, researchers report November 4 in Current Biology. While in a brain scanner, blindfolded people listened to three sound scenes: a forest full of birds, a crowd of people and a street bustling with traffic. In 10 people without aphantasia, these auditory scenes created reliable neural hallmarks in parts of the visual cortex. But in 23 people with aphantasia, these hallmarks were weaker. © Society for Science & the Public 2000–2024.

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 29566 - Posted: 11.20.2024

By Yasemin Saplakoglu Around 2,500 years ago, Babylonian traders in Mesopotamia impressed two slanted wedges into clay tablets. The shapes represented a placeholder digit, squeezed between others, to distinguish numbers such as 50, 505 and 5,005. An elementary version of the concept of zero was born. Hundreds of years later, in seventh-century India, zero took on a new identity. No longer a placeholder, the digit acquired a value and found its place on the number line, before 1. Its invention went on to spark historic advances in science and technology. From zero sprang the laws of the universe, number theory and modern mathematics. “Zero is, by many mathematicians, definitely considered one of the greatest — or maybe the greatest — achievement of mankind,” said the neuroscientist Andreas Nieder, who studies animal and human intelligence at the University of Tübingen in Germany. “It took an eternity until mathematicians finally invented zero as a number.” Perhaps that’s no surprise given that the concept can be difficult for the brain to grasp. It takes children longer to understand and use zero than other numbers, and it takes adults longer to read it than other small numbers. That’s because to understand zero, our mind must create something out of nothing. It must recognize absence as a mathematical object. “It’s like an extra level of abstraction away from the world around you,” said Benjy Barnett, who is completing graduate work on consciousness at University College London. Nonzero numbers map onto countable objects in the environment: three chairs, each with four legs, at one table. With zero, he said, “we have to go one step further and say, ‘OK, there wasn’t anything there. Therefore, there must be zero of them.’” © 2024 Simons Foundation

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 29523 - Posted: 10.19.2024

By Miryam Naddaf Neurons in the hippocampus help to pick out patterns in the flood of information pouring through the brain. The human brain is constantly picking up patterns in everyday experiences — and can do so without conscious thought, finds a study of neuronal activity in people who had electrodes implanted in their brain tissue for medical reasons. The study shows that neurons in key brain regions combine information on what occurs and when, allowing the brain to pick out the patterns in events as they unfold over time. That helps the brain to predict coming events, the authors say. The work was published today in Nature. “The brain does a lot of things that we are not consciously aware of,” says Edvard Moser, a neuroscientist at the Norwegian University of Science and Technology in Trondheim. “This is no exception.” To make sense of the world around us, the brain must process an onslaught of information on what happens, where it happens and when it happens. The study’s authors wanted to explore how the brain organizes this information over time — a crucial step in learning and memory. The team studied 17 people who had epilepsy and had electrodes implanted in their brains in preparation for surgical treatment. These electrodes allowed the authors to directly capture the activity of individual neurons in multiple brain regions. Among those regions were the hippocampus and entorhinal cortex, which are involved in memory and navigation. These areas contain time and place cells that act as the body’s internal clock and GPS system, encoding time and locations. “All the external world coming into our brain has to be filtered through that system,” says study co-author Itzhak Fried, a neurosurgeon and neuroscientist at the University of California, Los Angeles. © 2024 Springer Nature Limited

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 29497 - Posted: 09.28.2024

By Angie Voyles Askham Nathaniel Daw has never touched a mouse. As professor of computational and theoretical neuroscience at Princeton University, he mainly works with other people’s data to construct models of the brain’s decision-making process. So when a collaborator came to him a few years ago with confusing data from mice that had performed a complex decision-making task in the lab, Daw says his best advice was just to fit the findings to the tried-and-true model of reward prediction error (RPE). That model relies on the idea that dopaminergic activity in the midbrain reflects discrepancies between expected and received rewards. Daw’s collaborator, Ben Engelhard, had measured the activity of dopamine neurons in the ventral tegmental area (VTA) of mice as they were deciding how to navigate a virtual environment. And although the virtual environment was more complex than what a mouse usually experiences in the real world, an RPE-based model should have held, Daw assumed. “It was obvious to me that there was this very simple story that was going to explain his data,” Daw says. But it didn’t. The neurons exhibited a wide range of responses, with some activated by visual cues and others by movement or cognitive tasks. The classic RPE model, it turned out, could not explain such heterogeneity. Daw, Engelhard and their colleagues published the findings in 2019. That was a wake-up call, Daw says, particularly after he watched videos of what the mice actually experienced in the maze. “It’s just so much more complicated, and high dimensional, and richer” than expected, he says. The idea that this richness could be reduced to such a simple model seems ludicrous now, he adds. “I was just so blinded.” © 2024 Simons Foundation
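For readers unfamiliar with the model being discussed: in its textbook form, an RPE signal drives learning by nudging a running value estimate toward each outcome. The sketch below shows only that generic scheme, a minimal illustration rather than the model Daw fit to the mouse data; the learning rate and reward sequence are made up.

```python
# Textbook RPE learning rule: the value estimate V moves toward each outcome r
# in proportion to the prediction error delta = r - V. A minimal illustration,
# not the model actually fit to the mouse data; alpha and the rewards are made up.
alpha = 0.1                       # learning rate (assumed)
V = 0.0                           # current reward expectation
for r in [1.0, 1.0, 0.0, 1.0]:    # illustrative reward sequence
    delta = r - V                 # reward prediction error
    V += alpha * delta            # update expectation
    print(f"reward={r:.0f}  RPE={delta:+.2f}  updated V={V:.2f}")
```

The 2019 finding was that VTA dopamine activity in the virtual maze was far more heterogeneous than this single scalar error signal predicts.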

Related chapters from BN: Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 4: Development of the Brain; Chapter 14: Attention and Higher Cognition
Link ID: 29479 - Posted: 09.14.2024

By Sneha Khedkar About 10 years ago, when Michael Yartsev set up the NeuroBat Lab, he built a new windowless cave of sorts: a fully automated bat flight room. Equipped with cameras and other recording devices, the remote-controlled space has enabled his team to study the neuronal basis of navigation, acoustic and social behavior in Egyptian fruit bats without having any direct interaction with the animals. “In our lab, there’s never a human involved in the experiments,” says Yartsev, associate professor of bioengineering at the University of California, Berkeley. The impetus to create the space was clear. The setup, paired with wireless electrodes inserted in the bats’ hippocampus, has helped the team demonstrate, for example, that place cells encode a flying bat’s current, past and future locations. Also, a mountain of evidence suggests that the identity, sex and stress levels of human experimenters can influence the behavior of and brain circuit activity in other lab animals, such as mice and rats. Now Yartsev and his team have proved that “experimenter effects” hold true for bats, too, according to a new study published last month in Nature Neuroscience. The presence of human experimenters changed hippocampal neuronal activity in bats both at rest and during flight—and exerted an even stronger influence than another fruit bat, the study shows. The team expected that humans would influence neural activity, Yartsev says, “but we did not expect it to be so profound.” © 2024 Simons Foundation

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 9: Hearing, Balance, Taste, and Smell
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 6: Hearing, Balance, Taste, and Smell
Link ID: 29430 - Posted: 08.13.2024

By Maya L. Kapoor Six years ago, while shopping at a supermarket, Sadie Dingfelder spied her husband selecting a store-branded peanut butter jar. “Since when do you buy generic?” she asked, grabbing the jar from the cart. To her surprise, the frightened man turned out to be a total stranger. As usual, Dingfelder quickly began rewriting the unsettling interaction in her mind as a funny story, but a stark thought struck her this time: “Other people do not make this kind of mistake.” Dingfelder, a freelance science journalist, has prosopagnosia, or face blindness. It’s extremely difficult for her to recognize faces: She has gotten into cars with the wrong people; she has made plans with friends and then been surprised by who came. She once had to ask filmmaker John Waters, who met her at a museum for an interview, to help identify the museum staffer who had just introduced them — she couldn’t pick her out from a crowd of his fans. In “Do I Know You? A Faceblind Reporter’s Journey Into the Science of Sight, Memory, and Imagination,” Dingfelder begins coming to terms with her neurodivergence, weaving together science reporting — including brain scans, computerized tests, and assessments by medical researchers — and personal memoir in order to understand herself better. Ultimately, “Do I Know You?” is a question Dingfelder seems to be asking herself. By the end of the book, the answer feels like a firm yes. The term prosopagnosia, a portmanteau of the Greek words for “face” and “not knowing,” was coined by Joachim Bodamer, a psychiatrist and neurologist in Nazi Germany. Bodamer had encountered German soldiers with head traumas who had lost the ability to recognize people, including one soldier who blithely passed by his own mother at a train station.

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 29423 - Posted: 08.11.2024

By Yasemin Saplakoglu Two years ago, Sarah Shomstein realized she didn’t have a mind’s eye. The vision scientist was sitting in a seminar room, listening to a scientific talk, when the presenter asked the audience to imagine an apple. Shomstein closed her eyes and did so. Then, the presenter asked the crowd to open their eyes and rate how vividly they saw the apple in their mind. Saw the apple? Shomstein was confused. She didn’t actually see an apple. She could think about an apple: its taste, its shape, its color, the way light might hit it. But she didn’t see it. Behind her eyes, “it was completely black,” Shomstein recalled. And yet, “I imagined an apple.” Most of her colleagues reacted differently. They reported actually seeing an apple, some vividly and some faintly, floating like a hologram in front of them. In that moment, Shomstein, who’s spent years researching perception at George Washington University, realized she experienced the world differently than others. She is part of a subset of people — thought to be about 1% to 4% of the general population — who lack mental imagery, a phenomenon known as aphantasia. Though it was described more than 140 years ago, the term “aphantasia” was coined only in 2015. It immediately drew the attention of anyone interested in how the imagination works. That included neuroscientists. So far, they’re finding that aphantasia is not a disorder — it’s a different way of experiencing the world. Early studies have suggested that differences in the connections between brain regions involved in vision, memory and decision-making could explain variations in people’s ability to form mental images. Because many people with aphantasia dream in images and can recognize objects and faces, it seems likely that their minds store visual information — they just can’t access it voluntarily or can’t use it to generate the experience of imagery. That’s just one explanation for aphantasia. In reality, people’s subjective experiences vary dramatically, and it’s possible that different subsets of aphantasics have their own neural explanations. Aphantasia and hyperphantasia, the opposite phenomenon in which people report mental imagery as vivid as reality, are in fact two ends of a spectrum, sandwiching an infinite range of internal experiences between them. © 2024 the Simons Foundation.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 29417 - Posted: 08.02.2024

By Olivia Gieger Three pioneers in face-perception research have won the 2024 Kavli Prize in Neuroscience. Nancy Kanwisher, professor of cognitive neuroscience at the Massachusetts Institute of Technology; Winrich Freiwald, professor of neurosciences and behavior at Rockefeller University; and Doris Tsao, professor of neurobiology at the University of California, Berkeley, will share the $1 million Kavli Prize for their discoveries of the regions—in both the human and monkey brains—responsible for identifying and recognizing faces. “This is work that’s very classic and very elegant, not only in face-processing and face-recognition work, but the impact it’s had on how we think about brain organization in general is huge,” says Alexander Cohen, assistant professor of neurology at Harvard Medical School, who studies face recognition in autistic people. The Norwegian Academy of Science and Letters awards the prize every two years. Kanwisher says she long suspected that something special happens in the brain when we look at faces, because people with prosopagnosia—the inability to recognize faces—maintain the ability to recognize nearly all other objects. What’s more, it is harder to recognize an upside-down face than most other inverted objects, studies have shown. To get to the root of face processing, Kanwisher spent hours as a young researcher lying still in an MRI machine as images of faces and objects flashed before her. A spot in the bottom right of the cerebral cortex lit up when she and others looked at faces, according to functional MRI (fMRI) scans, she and her colleagues reported in a seminal 1997 paper. They called the region the fusiform face area. © 2024 Simons Foundation

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 29356 - Posted: 06.13.2024

By Betsy Mason To help pay for his undergraduate education, Elias Garcia-Pelegrin had an unusual summer job: cruise ship magician. “I was that guy who comes out at dinnertime and does random magic for you,” he says. But his latest magic gig is even more unusual: performing for Eurasian jays at Cambridge University’s Comparative Cognition Lab. Birds can be harder to fool than tourists. And to do magic for the jays, he had to learn to do sleight-of-hand tricks with a live, wriggling waxworm instead of the customary coin or ball. But performing in an aviary does have at least one advantage over performing on a cruise ship: The birds aren’t expecting to be entertained. “You don’t have to worry about impressing anybody, or tell a joke,” Garcia-Pelegrin says. “So you just do the magic.” In just the last few years, researchers have become interested in what they can learn about animal minds by studying what does and doesn’t fool them. “Magic effects can reveal blind spots in seeing and roadblocks in thinking,” says Nicky Clayton, who heads the Cambridge lab and, with Garcia-Pelegrin and others, cowrote an overview of the science of magic in the Annual Review of Psychology. What we visually perceive about the world is a product of how our brains interpret what our eyes see. Humans and other animals have evolved to handle the immense amount of visual information we’re exposed to by prioritizing some types of information, filtering out things that are usually less relevant and filling in gaps with assumptions. Many magic effects exploit these cognitive shortcuts in humans, and comparing how well these same tricks work on other species may reveal something about how their minds operate. Clayton and her colleagues have used magic tricks with both jays and monkeys to reveal differences in how these animals experience the world. Now they are hoping to expand to more species and inspire other researchers to try magic to explore big questions about complex mental abilities and how they evolved.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 29345 - Posted: 06.06.2024