Chapter 14. Attention and Higher Cognition
By Claudia López Lloreda For a neuroscientist, the opportunity to record single neurons in people doesn’t knock every day. It is so rare, in fact, that after 14 years of waiting by the door, Florian Mormann says he has recruited just 110 participants—all with intractable epilepsy. All participants had electrodes temporarily implanted in their brains to monitor their seizures. But the slow work to build this cohort is starting to pay off for Mormann, a group leader at the University of Bonn, and for other researchers taking a similar approach, according to a flurry of studies published in the past year. For instance, certain neurons selectively respond not only to particular scents but also to the words and images associated with them, Mormann and his colleagues reported in October. Other neurons help to encode stimuli, form memories and construct representations of the world, recent work from other teams reveals. Cortical neurons encode specific information about the phonetics of speech, two independent teams reported last year. Hippocampal cells contribute to working memory and map out time in novel ways, two other teams discovered last year, and some cells in the region encode information related to a person’s changing knowledge about the world, a study published in August found. These studies offer the chance to answer questions about human brain function that remain challenging to address using animal models, says Ziv Williams, associate professor of neurosurgery at Harvard Medical School, who led one of the teams that worked on speech phonetics. “Concept cells,” he notes by way of example, such as those Mormann identified, or the “Jennifer Aniston” neurons famously described in a 2005 study, have proved elusive in the monkey brain. © 2025 Simons Foundation
Keyword: Attention; Learning & Memory
Link ID: 29709 - Posted: 03.19.2025
By Kelly Servick New York City—A recent meeting here on consciousness started from a relatively uncontroversial premise: A newly fertilized human egg isn’t conscious, and a preschooler is, so consciousness must emerge somewhere in between. But the gathering, sponsored by New York University (NYU), quickly veered into more unsettled territory. At the Infant Consciousness Conference from 28 February to 1 March, researchers explored when and how consciousness might arise, and how to find out. They also considered hints from recent brain imaging studies that the capacity for consciousness could emerge before birth, toward the end of gestation. “Fetal consciousness would have been a less central topic at a meeting like this a few years ago,” says Claudia Passos-Ferreira, a bioethicist at NYU who co-organized the gathering. The conversation has implications for how best to care for premature infants, she says, and intersects with thorny issues such as abortion. “Whatever you claim about this, there are some moral implications.” How to define consciousness is itself the subject of debate. “Each of us might have a slightly different definition,” neuroscientist Lorina Naci of Trinity College Dublin acknowledged at the meeting before describing how she views consciousness—as the capacity to have an experience or a subjective point of view. There’s also vigorous debate about where consciousness arises in the brain and what types of neural activity define it. That makes it hard to agree on specific markers of consciousness in beings—such as babies—that can’t talk about their experience. Further complicating the picture, the nature of consciousness could be different for infants than adults, researchers noted at the meeting. And it may emerge gradually versus all at once, on different timescales for different individuals.
Keyword: Consciousness; Development of the Brain
Link ID: 29703 - Posted: 03.12.2025
By Mark Humphries There are many ways neuroscience could end. Prosaically, society may just lose interest. Of all the ways we can use our finite resources, studying the brain has only recently become one; it may one day return to dust. Other things may take precedence, like feeding the planet or preventing an asteroid strike. Or neuroscience may end as an incidental byproduct, one of the consequences of war or of thoughtlessly disassembling a government or of being sideswiped by a chunk of space rock. We would prefer it to end on our own terms. We would like neuroscience to end when we understand the brain. Which raises the obvious question: Is this possible? For the answer to be yes, three things need to be true: that there is a finite amount of stuff to know, that stuff is physically accessible and that we understand all the stuff we obtain. But each of these we can reasonably doubt. The existence of a finite amount of knowledge is not a given. Some arguments suggest that an infinite amount of knowledge is not only possible but inevitable. Physicist David Deutsch proposes the seemingly innocuous idea that knowledge grows when we find a good explanation for a phenomenon, an explanation whose details are hard to vary without changing its predictions and hence breaking it as an explanation. Bad explanations are those whose details can be varied without consequence. Ancient peoples attributing the changing seasons to the gods is a bad explanation, for those gods and their actions can be endlessly varied without altering the existence of four seasons occurring in strict order. Our attributing the changing seasons to the Earth’s tilt in its orbit of the sun is a good explanation, for if we omit the tilt, we lose the four seasons and the opposite patterns of seasons in the Northern and Southern hemispheres. A good explanation means we have nailed down some property of the universe sufficiently well that something can be built upon it. © 2025 Simons Foundation
Keyword: Consciousness
Link ID: 29702 - Posted: 03.12.2025
By Felicity Nelson A region in the brainstem, called the median raphe nucleus, contains neurons that control perseverance and exploration. Credit: K H Fung/Science Photo Library Whether mice persist with a task, explore new options or give up comes down to the activity of three types of neuron in the brain. In experiments, researchers at University College London (UCL) were able to control the three behaviours by switching the neurons on and off in a part of the animals’ brainstem called the median raphe nucleus. The findings are reported in Nature today. “It’s quite remarkable that manipulation of specific neural subtypes in the median raphe nucleus mediates certain strategic behaviours,” says neuroscientist Roger Marek at the Queensland Brain Institute in Brisbane, Australia, who was not involved in the work. Whether these behaviours are controlled in the same way in humans needs to be confirmed, but if they are, this could be relevant to certain neuropsychiatric conditions that are associated with imbalances in the three behavioural strategies, says Sonja Hofer, a co-author of the paper and a systems neuroscientist at UCL. For instance, an overly high drive to persist with familiar actions and repetitive behaviours can be observed in people with obsessive–compulsive disorder and autism, she says. Conversely, pathological disengagement and lack of motivation are symptoms of major depressive disorder, and an excessive drive to explore and an inability to persevere with a task are seen in attention deficit hyperactivity disorder. “It could be that changes in the firing rate of specific median raphe cell types could contribute to certain aspects of these conditions,” says Hofer. © 2025 Springer Nature Limited
Keyword: Attention
Link ID: 29696 - Posted: 03.08.2025
By Ingrid Wickelgren After shuffling the cards in a standard 52-card deck, Alex Mullen, a three-time world memory champion, can memorize their order in under 20 seconds. As he flips through the cards, he takes a mental walk through a house. At each point in his journey — the mailbox, front door, staircase and so on — he attaches a card. To recall the cards, he relives the trip. This technique, called “method of loci” or “memory palace,” is effective because it mirrors the way the brain naturally constructs narrative memories: Mullen’s memory for the card order is built on the scaffold of a familiar journey. We all do something similar every day, as we use familiar sequences of events, such as the repeated steps that unfold during a meal at a restaurant or a trip through the airport, as a home for specific details — an exceptional appetizer or an object flagged at security. The general narrative makes the noteworthy features easier to recall later. “You are taking these details and connecting them to this prior knowledge,” said Christopher Baldassano, a cognitive neuroscientist at Columbia University. “We think this is how you create your autobiographical memories.” Psychologists empirically introduced this theory some 50 years ago, but proof of such scaffolds in the brain was missing. Then, in 2018, Baldassano found it: neural fingerprints of narrative experience, derived from brain scans, that replay sequentially during standard life events. He believes that the brain builds a rich library of scripts for expected scenarios — restaurant or airport, business deal or marriage proposal — over a person’s lifetime. These standardized scripts, and departures from them, influence how and how well we remember specific instances of these event types, his lab has found. And recently, in a paper published in Current Biology in fall 2024, they showed that individuals can select a dominant script for a complex, real-world event — for example, while watching a marriage proposal in a restaurant, we might opt, subconsciously, for either a proposal or a restaurant script — which determines what details we remember. © 2025 Simons Foundation
Keyword: Learning & Memory; Attention
Link ID: 29685 - Posted: 02.26.2025
By Bill Newsome What paper changed your life?: Activity of superior colliculus in behaving monkey. II. Effect of attention on neuronal responses. M.E. Goldberg and R.H. Wurtz Journal of Neurophysiology (1972) In 1972, Mickey Goldberg and Bob Wurtz published a quadrilogy of papers in the Journal of Neurophysiology—yes, you could do that in those days—on the physiological activity of single superior colliculus neurons in alert monkeys trained to perform simple eye fixation and eye movement tasks. The experiments revealed a rich variety of sensory and motor signals: Some neurons fired at the onset of a visual stimulus; others showed bursts of activity immediately prior to the eye movement. The researchers found that visually evoked activity differed depending on whether the monkey ultimately used the stimulus as a target for a saccadic eye movement. The neural response to the visual stimulus was stronger and continued until the time of the eye movement, forming a sort of temporal bridge between stimulus and evoked behavioral response. This bridge was alluring because it hinted at intermediate processes—perhaps the stuff of cognition—between sensory input and behavioral output. But it was also mysterious, in that no models existed for how such activity might be initiated and maintained until the behavioral response. These papers were revelatory to me because they pointed toward a mechanistic physiological understanding of such complex cognitive functions as attention. I was particularly fascinated by the second paper in the series of four, which dug into that mystery. Goldberg and Wurtz explicitly made a suggestive leap from physiology to psychology: “[Because] we can infer that the monkey attended to the stimulus when he made a saccade to it, the enhancement can be viewed as a neurophysiological event related to the psychological phenomenon of attention.” They also issued appropriate caveats, noting that “the unitary behavioral concept” of attention “may not have a single physiological mechanism.” © 2025 Simons Foundation
Keyword: Vision; Attention
Link ID: 29679 - Posted: 02.22.2025
By Michael S. Rosenwald In early February, Vishvaa Rajakumar, a 20-year-old Indian college student, won the Memory League World Championship, an online competition pitting people against one another with challenges like memorizing the order of 80 random numbers faster than most people can tie a shoelace. The renowned neuroscientist Eleanor Maguire, who died in January, studied mental athletes like Mr. Rajakumar and found that many of them used the ancient Roman “method of loci,” a memorization trick also known as the “memory palace.” The technique takes several forms, but it generally involves visualizing a large house and assigning memories to rooms. Mentally walking through the house fires up the hippocampus, the seahorse-shaped engine of memory deep in the brain that consumed Dr. Maguire’s career. We asked Mr. Rajakumar about his strategies of memorization. His answers, lightly edited and condensed for clarity, are below. Q. How do you prepare for the Memory League World Championship? Hydration is very important because it helps your brain. When you memorize things, you usually sub-vocalize, and it helps to have a clear throat. Let’s say you’re reading a book. You’re not reading it out loud, but you are vocalizing within yourself. If you don’t drink a lot of water, your speed will be a bit low. If you drink a lot of water, it will be more and more clear and you can read it faster. Q. What does your memory palace look like? Let’s say my first location is my room where I sleep. My second location is the kitchen. And the third location is my hall. The fourth location is my veranda. Another location is my bathroom. Let’s say I am memorizing a list of words. Let’s say 10 words. What I do is, I take a pair of words, make a story out of them and place them in a location. And I take the next two words. I make a story out of them. I place them in the second location. The memory palace will help you to remember the sequence. © 2025 The New York Times Company
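The pairing-and-placement routine Mr. Rajakumar describes can be sketched as an ordered mapping from locations to two-word chunks, with the fixed walk through the locations preserving the order of the list. The short Python sketch below is purely illustrative; the locations and words are invented placeholders, not taken from the interview.

# Toy sketch of the pairing-and-placement scheme described above.
# The locations and words are illustrative placeholders, not from the interview.
locations = ["bedroom", "kitchen", "hall", "veranda", "bathroom"]
words = ["apple", "violin", "rocket", "candle", "tiger",
         "mirror", "anchor", "piano", "cloud", "lantern"]

# Pair up the words and assign each pair to the next location on a fixed walk.
palace = {}
for loc, i in zip(locations, range(0, len(words), 2)):
    palace[loc] = (words[i], words[i + 1])  # imagine a short story linking the pair

# Recall: walk the locations in their fixed order to recover the original sequence.
recalled = [word for loc in locations for word in palace[loc]]
assert recalled == words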
Keyword: Learning & Memory; Attention
Link ID: 29673 - Posted: 02.15.2025
Nell Greenfieldboyce People are constantly looking at the behavior of others and coming up with ideas about what might be going on in their heads. Now, a new study of bonobos adds to evidence that they might do the same thing. Specifically, some bonobos were more likely to point to the location of a treat when they knew that a human companion was not aware of where it had been hidden, according to a study which appears in the Proceedings of the National Academy of Sciences. The findings add to a long-running debate about whether humans have a unique ability to imagine and understand the mental states of others. Some researchers say this kind of "theory of mind" may be practiced more widely in the animal kingdom. For the scientists involved, potentially watching it in action was quite the experience. "It's quite surreal. I mean, I've worked with primates for quite some years now and you never get used to it," says Luke Townrow, a PhD student at Johns Hopkins University. "We found evidence that they are tailoring their communication based on what I know." Hmmm, where is the grape? To see what bonobos might know about what humans around them know, Townrow worked with Chris Krupenye of Johns Hopkins University to devise a simple experiment. "It's always a challenge for us, that animals don't speak, so we can't just ask them what they're thinking. We have to come up with creative, experimental designs that allow them to express their knowledge," says Krupenye. © 2025 npr
Keyword: Attention; Consciousness
Link ID: 29658 - Posted: 02.05.2025
By Bethany Brookshire Self-awareness may be beyond primates in the wild. Chimps, orangutans and other species faced with a mirror react to a dot on their face in the lab, a widely used measure of self-awareness. But while baboons in Namibia exposed to mirrors find the reflective glass fascinating, they don’t respond to dots placed on their faces, researchers report in the January Proceedings of the Royal Society B: Biological Sciences. The result could indicate that lab responses to mirrors are a result of training — and that self-awareness might exist on a spectrum. “Psychological self-awareness is this idea that you as an individual can become an object of your own attention,” says Alecia Carter, an evolutionary anthropologist at University College London. It’s a hard concept to measure in other species, in part, she notes, because “it’s also difficult to imagine not having that kind of self-awareness.” One measure of self-awareness is the mark test. An animal sits in front of a mirror, and a mark is placed somewhere they normally cannot see, such as on the face. If the animal recognizes themselves in the mirror, and the mark as out of place, the animal will respond to the mark. Chimps, orangutans and bonobos have “passed” the mark test in the lab, while primates that are not great apes, such as rhesus macaques, have mastered it only after training. Other species, such as Asian elephants, dolphins and even a fish called the cleaner wrasse, have also responded to the mark test. © Society for Science & the Public 2000–2025.
Keyword: Chemical Senses (Smell & Taste); Evolution
Link ID: 29654 - Posted: 02.01.2025
By Ellen Barry A study of more than 30,000 British adults diagnosed with attention deficit hyperactivity disorder, or A.D.H.D., found that, on average, they were dying earlier than their counterparts in the general population — around seven years earlier for men, and around nine for women. The study, which was published Thursday in The British Journal of Psychiatry, is believed to be the first to use all-cause mortality data to estimate life expectancy in people with A.D.H.D. Previous studies have pointed to an array of risks associated with the condition, among them poverty, mental health disorders, smoking and substance abuse. The authors cautioned that A.D.H.D. is substantially underdiagnosed and that the people in their study — most of them diagnosed as young adults — might be among the more severely affected. Still, they described their findings as “extremely concerning,” highlighting unmet needs that “require urgent attention.” “It’s a big number, and it is worrying,” said Joshua Stott, a professor of aging and clinical psychology at University College London and an author of the study. “I see it as likely to be more about health inequality than anything else. But it’s quite a big health inequality.” The study did not identify causes of early death among people with A.D.H.D. but found that they were twice as likely as the general population to smoke or abuse alcohol and that they had far higher rates of autism, self-harming behaviors and personality disorders than the general population. In adulthood, Dr. Stott said, “they find it harder to manage impulses, and have more risky behaviors.” He said health care systems might need to adjust in order to better serve people with A.D.H.D., who may have sensory sensitivity or difficulty managing time or communicating with clinicians during brief appointments. He said he hoped treatments for substance abuse or depression could be adapted for patients with A.D.H.D. © 2025 The New York Times Company
Keyword: ADHD; Drug Abuse
Link ID: 29643 - Posted: 01.25.2025
By Yasemin Saplakoglu Imagine you’re on a first date, sipping a martini at a bar. You eat an olive and patiently listen to your date tell you about his job at a bank. Your brain is processing this scene, in part, by breaking it down into concepts. Bar. Date. Martini. Olive. Bank. Deep in your brain, neurons known as concept cells are firing. You might have concept cells that fire for martinis but not for olives. Or ones that fire for bars — perhaps even that specific bar, if you’ve been there before. The idea of a “bank” also has its own set of concept cells, maybe millions of them. And there, in that dimly lit bar, you’re starting to form concept cells for your date, whether you like him or not. Those cells will fire when something reminds you of him. Concept neurons fire for their concept no matter how it is presented: in real life or a photo, in text or speech, on television or in a podcast. “It’s more abstract, really different from what you’re seeing,” said Elizabeth Buffalo, a neuroscientist at the University of Washington. For decades, neuroscientists mocked the idea that the brain could have such intense selectivity, down to the level of an individual neuron: How could there be one or more neurons for each of the seemingly countless concepts we engage with over a lifetime? “It’s inefficient. It’s not economic,” people broadly agreed, according to the neurobiologist Florian Mormann at the University of Bonn. But when researchers identified concept cells in the early 2000s, the laughter started to fade. Over the past 20 years, they have established that concept cells not only exist but are critical to the way the brain abstracts and stores information. New studies, including one recently published in Nature Communications, have suggested that they may be central to how we form and retrieve memory. © 2025 Simons Foundation
Keyword: Learning & Memory; Attention
Link ID: 29639 - Posted: 01.22.2025
By Rachael Elward and Lauren Ford Severance, which imagines a world where a person’s work and personal lives are surgically separated, will soon return to Apple TV+ for a second season. While the concept of this gripping piece of science fiction is far-fetched, it touches on some interesting neuroscience. Can a person’s mind really be surgically split in two? Remarkably, “split-brain” patients have existed since the 1940s. To control epilepsy symptoms, these patients underwent a surgery to separate the left and right hemispheres. Similar surgeries still happen today. Later research on this type of surgery showed that the separated hemispheres of split-brain patients could process information independently. This raises the uncomfortable possibility that the procedure creates two separate minds living in one brain. In season one of Severance, Helly R (Britt Lower) experienced a conflict between her “innie” (the side of her mind that remembered her work life) and her “outie” (the side outside of work). Similarly, there is evidence of a conflict between the two hemispheres of real split-brain patients. When speaking with split-brain patients, you are usually communicating with the left hemisphere of the brain, which controls speech. However, some patients can communicate from their right hemisphere by writing, for example, or arranging Scrabble letters. A young patient was asked what job he would like in the future. His left hemisphere chose an office job making technical drawings. His right hemisphere, however, arranged letters to spell “automobile racer”. Split-brain patients have also reported “alien hand syndrome”, where one of their hands is perceived to be moving of its own volition. These observations suggest that two separate conscious “people” may coexist in one brain and may have conflicting goals. In Severance, however, both the innie and the outie have access to speech. This is one indicator that the fictional “severance procedure” must involve a more complex separation of the brain’s networks. © 2010–2025, The Conversation US, Inc.
Keyword: Learning & Memory; Consciousness
Link ID: 29635 - Posted: 01.18.2025
By Kristel Tjandra Close your eyes and picture an apple—what do you see? Most people will conjure up a vivid image of the fruit, but for the roughly one in 100 individuals with aphantasia, nothing will appear in the mind’s eye at all. Now, scientists have discovered that in people with this inability to form mental images, visual processing areas of the brain still light up when they try to do so. The study, published today in Current Biology, suggests aphantasia is not caused by a complete deficit in visual processing, as researchers have previously proposed. Visual brain areas are still active when aphantasic people are asked to imagine—but that activity doesn’t translate into conscious experience. The work offers new clues about the neurological differences underlying this little-explored condition. The study authors “take a very strong, mechanistic approach,” says Sarah Shomstein, a vision scientist at George Washington University who was not involved in the study. “It was asking the right questions and using the right methods.” Some scientists suspect aphantasia may be caused by a malfunction in the primary visual cortex, the first area in the brain to process images. “Typically, primary cortex is thought to be the engine of visual perception,” says Joel Pearson, a neuroscientist at the University of New South Wales Sydney who co-led the study. “If you don’t have activity there, you’re not going to have perceptual consciousness.” To see what was going on in this region in aphantasics, the team used functional magnetic resonance imaging to measure the brain activity of 14 people with aphantasia and 18 neurotypical controls as they repeatedly saw two simple patterns, made up of either green vertical lines or red horizontal lines. They then repeated the experiment, this time asking participants to simply imagine the two images.
Keyword: Attention; Vision
Link ID: 29624 - Posted: 01.11.2025
By Christina Caron Barrie Miskin was newly pregnant when she noticed her appearance was changing. Dark patches bloomed on her skin like watercolor ink. A “thicket” of hairs sprouted on her upper lip and chin. The outside world was changing, too: In her neighborhood of Astoria, Queens, bright lights enveloped objects in a halo, blurring her vision. Co-workers and even her doctors started to seem like “alien proxies” of themselves, Ms. Miskin, 46, said. “I felt like I was viewing the world through a pane of dirty glass,” she added. Yet Ms. Miskin knew it was all an illusion, so she sought help. It took more than a year of consulting with mental health specialists before Ms. Miskin finally found an explanation for her symptoms: She was diagnosed with a dissociative condition called depersonalization/derealization disorder, or D.D.D. Before her pregnancy, Ms. Miskin had stopped taking antidepressants. Her new psychiatrist said the symptoms could have been triggered by months of untreated depression that followed. While Ms. Miskin felt alone in her mystery illness, she wasn’t. Tens of thousands of posts on social media reference depersonalization or derealization, with some likening the condition to “living in a movie or a dream” or “observing the world through a fog.” People who experience depersonalization can feel as though they are detached from their mind or body. Derealization, on the other hand, refers to feeling detached from the environment, as though the people and things in the world are unreal. Those who are living with D.D.D. are “painfully aware” that something is amiss, said Elena Bezzubova, a psychoanalyst who specializes in treating the condition. It’s akin to seeing an apple and feeling that it is so strange it doesn’t seem real, even though you know that it is, she added. The disorder is thought to occur in about 1 to 2 percent of the population, but it’s possible for anyone to experience fleeting symptoms. © 2025 The New York Times Company
Keyword: Attention
Link ID: 29623 - Posted: 01.11.2025
Nicola Davis Science correspondent Standing patiently on a small fluffy rug, Calisto the flat-coated retriever is being fitted with some hi-tech headwear. But this is not a new craze in canine fashion: she is about to have her brainwaves recorded. Calisto is one of about 40 pet dogs – from newfoundlands to Tibetan terriers – taking part in a study to explore whether their brainwaves synchronise with those of their owners when the pair interact, a phenomenon previously seen when two humans engage with each other. The researchers behind the work say such synchronisation would suggest person and pet are paying attention to the same things, and in certain circumstances interpreting moments in a similar way. In other words, owner and dog really are on the same wavelength. Dr Valdas Noreika of Queen Mary, University of London said he got the idea for the study after working on similar experiments with mothers and their babies, where such synchronisation has also been seen. “Owners modulate their language in a similar way as parents modulate when they speak to children,” he said. “There are lots of similarities. That could be one of the reasons why we get so attached to dogs – because we already have these cognitive functions and capacities to attach with someone who is smaller or requires help or attention.” Hints of an emotional bond between humans and their dogs stretch into the distant past: researchers have previously discovered the 14,000-year-old remains of a puppy buried in Germany alongside a man and a woman; the analysis suggested the young dog had been nursed through several periods of illness, despite having no particular use. © 2025 Guardian News & Media Limited
Keyword: Brain imaging; Attention
Link ID: 29613 - Posted: 01.04.2025
By Carl Zimmer In our digital age, few things are more irritating than a slow internet connection. Your web browser starts to lag. On video calls, the faces of your friends turn to frozen masks. When the flow of information dries up, it can feel as if we are cut off from the world. Engineers measure this flow in bits per second. Streaming a high-definition video takes about 25 million bps. The download rate in a typical American home is about 262 million bps. Now researchers have estimated the speed of information flow in the human brain: just 10 bps. They titled their study, published this month in the journal Neuron, “The unbearable slowness of being.” “It’s a bit of a counterweight to the endless hyperbole about how incredibly complex and powerful the human brain is,” said Markus Meister, a neuroscientist at the California Institute of Technology and an author of the study. “If you actually try to put numbers to it, we are incredibly slow.” Dr. Meister got the idea for the study while teaching an introductory neuroscience class. He wanted to give his students some basic numbers about the brain. But no one had pinned down the rate at which information flows through the nervous system. Dr. Meister realized that he could estimate that flow by looking at how quickly people carry out certain tasks. To type, for example, we look at a word, recognize each letter and then sort out the sequence of keys to press. As we type, information flows into our eyes, through our brains and into the muscles of our fingers. The higher the flow rate, the faster we can type. In 2018, a team of researchers in Finland analyzed 136 million keystrokes made by 168,000 volunteers. They found that, on average, people typed 51 words a minute. A small fraction typed 120 words a minute or more. Dr. Meister and his graduate student, Jieyu Zheng, used a branch of mathematics known as information theory to estimate the flow of information required to type. At 120 words a minute, the flow is only 10 bits a second. © 2024 The New York Times Company
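The 10-bits-per-second figure for typing can be reproduced with a rough back-of-the-envelope calculation. The Python sketch below is an approximation under stated assumptions (about five characters per English word, and roughly one bit of information per character, in line with Shannon's classic entropy estimate for English text); it is not the authors' exact information-theoretic analysis.

# Back-of-the-envelope estimate of the information rate of typing.
# Assumptions (not taken from the paper itself): ~5 characters per word and
# ~1 bit of information per character of English text (Shannon's estimate).
CHARS_PER_WORD = 5
BITS_PER_CHAR = 1.0

def typing_info_rate(words_per_minute: float) -> float:
    """Approximate information rate in bits per second for a given typing speed."""
    chars_per_second = words_per_minute * CHARS_PER_WORD / 60
    return chars_per_second * BITS_PER_CHAR

print(typing_info_rate(51))   # average typist in the Finnish dataset: ~4 bits/s
print(typing_info_rate(120))  # fast typist: ~10 bits/s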
Keyword: Attention
Link ID: 29611 - Posted: 12.28.2024
By Alissa Wilkinson There’s a moment in “Theater of Thought” (in theaters) when Darío Gil, the director of research at IBM, is explaining quantum computing to Werner Herzog, the movie’s director. Standing before a whiteboard, Gil draws some points on spheres to illustrate how qubits work, then proceeds to define the Schrödinger equation. As he talks and writes, the audio grows quieter, and Herzog’s distinctive resonant German accent takes over. “I admit that I literally understand nothing of this, and I assume most of you don’t either,” he intones in voice-over. “But I found it fascinating that this mathematical formula explains the law that draws the subatomic world.” It’s a funny moment, a playful way to keep us from glazing over when presented with partial differential equations. Herzog may be a world-renowned filmmaker, but he’s hardly a scientist, and that makes him the perfect director for “Theater of Thought,” a documentary about, as he puts it, the “mysteries of our brain.” Emphasis on mysteries. Herzog interviews a dizzying array of scientists, researchers, and even a Nobel Prize winner or two. He asks them about everything: how the brain works, what consciousness means, what the tiniest organisms in the world are, whether parrots understand human speech, whether rogue governments can control thoughts, whether we’re living in an elaborate simulation, how telepathy and psychedelics work, and, at several points, what thinking even is. Near the end of the film he notes that not one of the scientists could explain what a thought is, or what consciousness is, but “they were all keenly alive to the ethical questions in neuroscience.” In other words, they’re immersed in both the mystery and what their field of study implies about the future of humanity. There’s a boring way to make this movie, with talking-head interviews that are arranged to form a coherent argument. Herzog goes another direction, starting off by narrating why he’s making it, then talking about his interviewees as we are introduced to them in their labs or in their favorite outdoor settings. (He also visits Philippe Petit, the Twin Towers tightrope walker, as he practices in his Catskills backyard.) Herzog’s constant verbal presence brings us into his own head space — his own brain, if you will — and gives us the sense that we’re following his patterns of thought. © 2024 The New York Times Company
Keyword: Consciousness
Link ID: 29598 - Posted: 12.14.2024
By Iris Berent Seeing the striking magenta of bougainvillea. Tasting a rich morning latte. Feeling the sharp pain of a needle prick going into your arm. These subjective experiences are the stuff of the mind. What is “doing the experiencing,” the 3-pound chunk of meat in our head, is a tangible object that works on electrochemical signals—physics, essentially. How do the two—our mental experiences and physical brains—interact? The puzzle of consciousness seems to be giving science a run for its money. The problem, to be clear, isn’t merely to pinpoint “where it all happens” in the brain (although this, too, is far from trivial). The real mystery is how to bridge the gap between the mental, first-person stuff of consciousness and the physical lump of matter inside the cranium. Some think the gap is unbreachable. The philosopher David Chalmers, for instance, has argued that consciousness is something special and distinct from the physical world. If so, it may never be possible to explain consciousness in terms of physical brain processes. No matter how deeply scientists understand the brain, for Chalmers, this would never explain how our neurons produce consciousness. Why should a hunk of flesh, teeming with chemical signals and electrical charges, experience a point of view? There seems to be no conceivable reason why meaty matter would have this light of subjectivity “on the inside.” Consciousness, then, is a “hard problem”—as Chalmers has labeled it—indeed. The possibility that consciousness itself isn’t anything physical raises burning questions about whether, for example, an AI can fall in love with its programmer. And since consciousness is a natural phenomenon, much like gravity or genes, these questions carry huge implications. Science explains the natural world by physical principles only. So if it turns out that one natural phenomenon transcends the laws of physics, then it is not only the science of consciousness that is in trouble—our entire understanding of the natural world would require serious revision. © 2024 NautilusNext Inc.,
Keyword: Consciousness
Link ID: 29581 - Posted: 11.30.2024
By Joanne Silberner To describe the destructive effects of intense health anxiety to his young doctors in training at Columbia University Irving Medical Center in New York City, psychiatrist Brian Fallon likes to quote 19th-century English psychiatrist Henry Maudsley: “The sorrow which has no vent in tears may make other organs weep.” That weeping from other parts of the body may come in the form of a headache that, in the mind of its sufferer, is flagging a brain tumor. It may be a rapid heartbeat a person wrongly interprets as a brewing heart attack. The fast beats may be driven by overwhelming, incapacitating anxiety. Hal Rosenbluth, a businessman in the Philadelphia area, says he used to seek medical care for the slightest symptom. In his recent book Hypochondria, he describes chest pains, breathing difficulties and vertigo that came on after he switched from a daily diabetes drug to a weekly one. He ended up going to the hospital by ambulance for blood tests, multiple electrocardiograms, a chest x-ray, a cardiac catheterization and an endoscopy, all of which were normal. Rosenbluth’s worries about glucose levels had led him to push for the new diabetes drug, and its side effects were responsible for many of his cardiac symptoms. His own extreme anxiety had induced doctors to order the extra care. Hypochondria can, in extreme cases, leave people unable to hold down a job or make it impossible for them to leave the house, cook meals, or care for themselves and their families. Recent medical research has shown that hypochondria is as much a real illness as depression and post-traumatic stress disorder. This work, scientists hope, will convince doctors who believed the disorder was some kind of character flaw that their patients are truly ill—and in danger. A study published just last year showed that people with hypochondria have higher death rates than similar but nonafflicted people, and the leading nonnatural cause of death was suicide. It was relatively rare, but the heightened risk was clear.
Keyword: Stress; Attention
Link ID: 29567 - Posted: 11.20.2024
By Laura Sanders Growing up, Roberto S. Luciani had hints that his brain worked differently from most people’s. He didn’t relate when people complained about a movie character looking different than what they’d pictured from the book, for instance. But it wasn’t until he was a teenager that things finally clicked. His mother had just woken up and was telling him about a dream she had. “Movielike,” is how she described it. “Up until then, I assumed that cartoon depictions of imagination were exaggerated,” Luciani says. “I asked her what she meant and quickly realized my visual imagery was not functioning like hers.” That’s because Luciani has a condition called aphantasia — an inability to picture objects, people and scenes in his mind. When he was growing up, the term didn’t even exist. But now, Luciani, a cognitive scientist at the University of Glasgow in Scotland, and other scientists are getting a clearer picture of how some brains work, including those with a blind mind’s eye. In a recent study, Luciani and colleagues explored the connections between the senses, in this case, hearing and seeing. In most of our brains, these two senses collaborate. Auditory information influences activity in brain areas that handle vision. But in people with aphantasia, this connection isn’t as strong, researchers report November 4 in Current Biology. While in a brain scanner, blindfolded people listened to three sound scenes: A forest full of birds, a crowd of people, and a street bustling with traffic. In 10 people without aphantasia, these auditory scenes created reliable neural hallmarks in parts of the visual cortex. But in 23 people with aphantasia, these hallmarks were weaker. © Society for Science & the Public 2000–2024.
Keyword: Attention
Link ID: 29566 - Posted: 11.20.2024