Links for Keyword: Attention



Links 1 - 20 of 668

By Ed Yong
On March 25, 2020, Hannah Davis was texting with two friends when she realized that she couldn’t understand one of their messages. In hindsight, that was the first sign that she had COVID-19. It was also her first experience with the phenomenon known as “brain fog,” and the moment when her old life contracted into her current one. She once worked in artificial intelligence and analyzed complex systems without hesitation, but now “runs into a mental wall” when faced with tasks as simple as filling out forms. Her memory, once vivid, feels frayed and fleeting. Former mundanities—buying food, making meals, cleaning up—can be agonizingly difficult. Her inner world—what she calls “the extras of thinking, like daydreaming, making plans, imagining”—is gone. The fog “is so encompassing,” she told me, “it affects every area of my life.” For more than 900 days, while other long-COVID symptoms have waxed and waned, her brain fog has never really lifted. Of long COVID’s many possible symptoms, brain fog “is by far one of the most disabling and destructive,” Emma Ladds, a primary-care specialist from the University of Oxford, told me. It’s also among the most misunderstood. It wasn’t even included in the list of possible COVID symptoms when the coronavirus pandemic first began. But 20 to 30 percent of patients report brain fog three months after their initial infection, as do 65 to 85 percent of the long-haulers who stay sick for much longer. It can afflict people who were never ill enough to need a ventilator—or any hospital care. And it can affect young people in the prime of their mental lives. Long-haulers with brain fog say that it’s like none of the things that people—including many medical professionals—jeeringly compare it to. It is more profound than the clouded thinking that accompanies hangovers, stress, or fatigue. For Davis, it has been distinct from and worse than her experience with ADHD. It is not psychosomatic, and involves real changes to the structure and chemistry of the brain. It is not a mood disorder: “If anyone is saying that this is due to depression and anxiety, they have no basis for that, and data suggest it might be the other direction,” Joanna Hellmuth, a neurologist at UC San Francisco, told me. (c) 2022 by The Atlantic Monthly Group. All Rights Reserved.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 28487 - Posted: 09.21.2022

By Tim Vernimmen
When psychologist Jonathan Smallwood set out to study mind-wandering about 25 years ago, few of his peers thought that was a very good idea. How could one hope to investigate these spontaneous and unpredictable thoughts that crop up when people stop paying attention to their surroundings and the task at hand? Thoughts that couldn’t be linked to any measurable outward behavior? But Smallwood, now at Queen’s University in Ontario, Canada, forged ahead. He used as his tool a downright tedious computer task that was intended to reproduce the kinds of lapses of attention that cause us to pour milk into someone’s cup when they asked for black coffee. And he started out by asking study participants a few basic questions to gain insight into when and why minds tend to wander, and what subjects they tend to wander toward. After a while, he began to scan participants’ brains as well, to catch a glimpse of what was going on in there during mind-wandering. Smallwood learned that unhappy minds tend to wander in the past, while happy minds often ponder the future. He also became convinced that wandering among our memories is crucial to help prepare us for what is yet to come. Though some kinds of mind-wandering — such as dwelling on problems that can’t be fixed — may be associated with depression, Smallwood now believes mind-wandering is rarely a waste of time. It is merely our brain trying to get a bit of work done when it is under the impression that there isn’t much else going on. Smallwood, who coauthored an influential 2015 overview of mind-wandering research in the Annual Review of Psychology, is the first to admit that many questions remain to be answered. © 2022 Annual Reviews

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 28461 - Posted: 09.03.2022

By Elizabeth Landau
Ken Ono gets excited when he talks about a particular formula for pi, the famous and enigmatic ratio of a circle’s circumference to its diameter. He shows me a clip from a National Geographic show where Neil deGrasse Tyson asked him how he would convey the beauty of math to the average person on the street. In reply, Ono showed Tyson, and later me, a so-called continued fraction for pi, which is a little bit like a mathematical fun house hallway of mirrors. Instead of a single number in the numerator and one in the denominator, the denominator of the fraction also contains a fraction, and the denominator of that fraction has a fraction in it, too, and so on and so forth, ad infinitum. Written out, the formula looks like a staircase that narrows as you descend its steps in pursuit of the elusive pi. The calculation—credited independently to British mathematician Leonard James Rogers and self-taught Indian mathematician Srinivasa Ramanujan—doesn’t involve anything more complicated than adding, dividing, and squaring numbers. “How could you not say that’s amazing?” Ono, chair of the mathematics department at the University of Virginia, asks me over Zoom. As a fellow pi enthusiast—I am well known among friends for hosting Pi Day pie parties—I had to agree with him that it’s a dazzling formula. But not everyone sees beauty in fractions, or in math generally. In fact, here in the United States, math often inspires more dread than awe. In the 1950s, some educators began to observe a phenomenon they called mathemaphobia,[1] though this was just one of a long list of academic phobias they saw in students. Today, nearly 1 in 5 U.S. adults suffers from high levels of math anxiety, according to some estimates,[2] and a 2016 study found that 11 percent of university students experienced “high enough levels of mathematics anxiety to be in need of counseling.”[3] Math anxiety seems generally correlated with worse math performance worldwide, according to one 2020 study from Stanford and the University of Chicago.[4] While many questions remain about the underlying reasons, high school math scores in the U.S. tend to rank significantly lower than those in many other countries. In 2018, for example, American students ranked 30th in the world in their math scores on the PISA exam, an international assessment given every three years. © 2022 NautilusThink Inc.
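
The excerpt describes the formula's shape without printing it. It may or may not be the exact identity Ono showed (that is an assumption here), but one classic continued fraction for pi with precisely this staircase structure, built from nothing more than adding, dividing, and squaring odd numbers, is:

\[
\pi \;=\; 3 + \cfrac{1^2}{6 + \cfrac{3^2}{6 + \cfrac{5^2}{6 + \cfrac{7^2}{6 + \cdots}}}}
\]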

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 28459 - Posted: 09.03.2022

Heidi Ledford
It’s not just in your head: a desire to curl up on the couch after a day spent toiling at the computer could be a physiological response to mentally demanding work, according to a study that links mental fatigue to changes in brain metabolism. The study, published on 11 August in Current Biology,[1] found that participants who spent more than six hours working on a tedious and mentally taxing assignment had higher levels of glutamate — an important signalling molecule in the brain. Too much glutamate can disrupt brain function, and a rest period could allow the brain to restore proper regulation of the molecule, the authors note. At the end of their work day, these study participants were also more likely than those who had performed easier tasks to opt for short-term, easily won financial rewards of lesser value than larger rewards that come after a longer wait or involve more effort. The study is important in its effort to link cognitive fatigue with neurometabolism, says behavioural neuroscientist Carmen Sandi at the Swiss Federal Institute of Technology in Lausanne. But more research — potentially in non-human animals — will be needed to establish a causal link between feelings of exhaustion and metabolic changes in the brain, she adds. “It’s very good to start looking into this aspect,” says Sandi. “But for now this is an observation, which is a correlation.”

Tired brain

Previous research has demonstrated effects of mental strain on physiological parameters such as heart-rate variability and blood flow, but these tend to be subtle, says Martin Hagger, a health psychologist at the University of California, Merced. “It’s not like when you’re exercising skeletal muscle,” he says. “But it is perceptible.” Cognitive neuroscientist Antonius Wiehler at the Paris Brain Institute and his colleagues thought that the effects of cognitive fatigue could be due to metabolic changes in the brain. The team enrolled 40 participants and assigned 24 of them to perform a challenging task: for example, watching letters appear on a computer screen every 1.6 seconds and documenting when one matched a letter that had appeared three letters ago. The other 16 participants were asked to perform a similar, but easier task. Both groups worked for just over six hours, with two ten-minute breaks. © 2022 Springer Nature Limited
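
The challenging assignment described here is the classic "n-back" working-memory task, with n = 3. A minimal sketch of its matching rule in Python (the letter set, stream length, and random seed are illustrative assumptions; the study's actual stimuli and pacing differ):

```python
import random
import string

def three_back_matches(stream):
    """Indices where a letter matches the one shown three letters earlier."""
    return [i for i in range(3, len(stream)) if stream[i] == stream[i - 3]]

# Illustrative stimulus stream; in the study, a new letter appeared every 1.6 seconds.
random.seed(0)
stream = [random.choice(string.ascii_uppercase[:6]) for _ in range(20)]

print("Stream:", " ".join(stream))
print("3-back matches at positions:", three_back_matches(stream))
```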

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 28430 - Posted: 08.11.2022

By Chantel Prat
I remember all too well that day early in the pandemic when we first received the “stay at home” order. My attitude quickly shifted from feeling like I got a “snow day” to feeling like a bird in a cage. Being a person who is both extraverted by nature and not one who enjoys being told what to do, the transition was pretty rough. But you know what? I got used to it. Though the pandemic undoubtedly affected some of your lives more than others, I know it touched every one of us in ways we will never forget. And now, after two years and counting, I am positive that every person reading this is fundamentally different from when the pandemic started. Because that’s how our brains work. They are molded by our experiences so that we can fit into all kinds of different situations—even the decidedly suboptimal ones.

[Image caption: MOTHER TONGUE: Neuroscientist and psychologist Chantel Prat says the languages we speak play a huge role in shaping our minds and brains. Photo by Shaya Bendix Lyon.]

This is actually one of the most human things about all of our brains. In fact, according to some contemporary views of human evolution, our ancestors underwent a “cognitive revolution” precisely because they were forced to adapt. Based on evidence suggesting that the size of our ancestors’ brains increased following periods of extreme weather instability, one popular explanation for our remarkable flexibility is that the hominids who were not able to adapt to environmental changes didn’t survive. In other words, the brains of modern humans were selected for their ability to learn and adapt to changing environments. But one of the major costs of this remarkable flexibility is that humans are born without any significant preconceived notions about how things work. If you’ve ever had a conversation with someone about an event you both participated in that left you feeling like one of you was delusional because your stories were so different, you might have a hint about how much your experiences have shaped the way you understand the world around you. This can be insanely frustrating because—let’s face it—our own brains are really convincing when they construct our personal version of reality. Remember the Dress? Though it can feel like gaslighting when someone has a different reality from yours, it’s also entirely possible that you both were reporting your version of the truth. At the end of the day, the way people remember a story reflects differences in the way they experienced the original event. The scientific explanation for this boils down to differences in perspective. © 2022 NautilusThink Inc.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 5: The Sensorimotor System
Link ID: 28427 - Posted: 08.11.2022

By S. Hussain Ather
You reach over a stove to pick up a pot. What you didn’t realize was that the burner was still on. Ouch! That painful accident probably taught you a lesson. It’s adaptive to learn from unexpected events so that we don’t repeat our mistakes. Our brain may be primed to pay extra attention when we are surprised. In a recent Nature study, researchers at the Massachusetts Institute of Technology found evidence that a hormone, noradrenaline, alters brain activity—and an animal’s subsequent behavior—in these startling moments. Noradrenaline is one of several chemicals that can flood the brain with powerful signals. Past research shows that noradrenaline is involved when we are feeling excited, anxious or alert and that it contributes to learning. But the new research shows it plays a strong role in responses to the unexpected. The M.I.T. team used a method called optogenetics to study noradrenaline in mice. The scientists added special light-sensitive proteins to neurons that work as an “off switch” for the cells when hit by pulses of laser light. They focused on modifying a brain area called the locus coeruleus, which holds cells responsible for releasing noradrenaline. With lasers, the researchers were able to stop these cells from producing the hormone in specific circumstances. They combined this method with photo tagging, a technique in which proteins flash with light, allowing the scientists to observe activity in the locus coeruleus cells and then determine how much noradrenaline was produced. Then the researchers designed a trial-and-error learning task for the rodents. The mice could push levers when they heard a sound. There were two sounds. After high-frequency tones of about 12 kilohertz, mice that pushed a lever were rewarded with water they could drink. For low-frequency tones, around four kilohertz, the mice that hit the lever got a slightly unpleasant surprise: a discomforting puff of air was blown at them. Over time, mice learned to push the lever only when they heard high-frequency tones because they got water when they did so. They avoided the lever when they heard low-frequency tones. © 2022 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 11: Emotions, Aggression, and Stress
Link ID: 28412 - Posted: 07.30.2022

Deepfakes – AI-generated videos and pictures of people – are becoming more and more realistic. This makes them the perfect weapon for disinformation and fraud. But while you might consciously be tricked by a deepfake, new evidence suggests that your brain knows better. Fake portraits evoke distinct signals in brain recordings, according to a paper published in Vision Research. While you consciously can’t spot the fake, your neurons are more reliable. “Your brain sees the difference between the two images. You just can’t see it yet,” says co-author Associate Professor Thomas Carlson, a researcher at the University of Sydney’s School of Psychology. The researchers asked volunteers to view a series of several hundred photos, some of which were real and some of which were fakes generated by a GAN (a Generative Adversarial Network, a common way of making deepfakes). One group of 200 participants was asked to guess which images were real, and which were fake, by pressing a button. A different group of 22 participants didn’t guess, but underwent electroencephalography (EEG) tests while they were viewing the images. The EEGs showed distinct signals when participants were viewing deepfakes, compared to real images. “The brain is responding different than when it sees a real image,” says Carlson. “It’s sort of difficult to figure out what exactly it’s picking up on, because all you can really see is that it is different – that’s something we’ll have to do more research to figure out.”

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 28402 - Posted: 07.16.2022

By Eiman Azim, Sliman Bensmaia, Lee E. Miller, Chris Versteeg
Imagine you are playing the guitar. You’re seated, supporting the instrument’s weight across your lap. One hand strums; the other presses strings against the guitar’s neck to play chords. Your vision tracks sheet music on a page, and your hearing lets you listen to the sound. In addition, two other senses make playing this instrument possible. One of them, touch, tells you about your interactions with the guitar. Another, proprioception, tells you about your arms’ and hands’ positions and movements as you play. Together, these two capacities combine into what scientists call somatosensation, or body perception. Our skin and muscles have millions of sensors that contribute to somatosensation. Yet our brain does not become overwhelmed by the barrage of these inputs—or from any of our other senses, for that matter. You’re not distracted by the pinch of your shoes or the tug of the guitar strap as you play; you focus only on the sensory inputs that matter. The brain expertly enhances some signals and filters out others so that we can ignore distractions and focus on the most important details. How does the brain accomplish these feats of focus? In recent research at Northwestern University, the University of Chicago and the Salk Institute for Biological Studies in La Jolla, Calif., we have illuminated a new answer to this question. Through several studies, we have discovered that a small, largely ignored structure at the very bottom of the brain stem plays a critical role in the brain’s selection of sensory signals. The area is called the cuneate nucleus, or CN. Our research on the CN not only changes the scientific understanding of sensory processing, but it might also lay the groundwork for medical interventions to restore sensation in patients with injury or disease. © 2022 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 8: General Principles of Sensory Processing, Touch, and Pain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 5: The Sensorimotor System
Link ID: 28330 - Posted: 05.18.2022

Imma Perfetto
Have you ever driven past an intersection and registered you should have turned right a street ago, or been in a conversation and, as soon as the words are out of your mouth, realised you really shouldn’t have said that thing you just did? It’s a phenomenon known as performance monitoring: an internal signal produced by the brain that lets you know when you’ve made a mistake. Performance monitoring is a kind of self-generated feedback that’s essential to managing our daily lives. Now, neuroscientists have discovered that signals from neurons in the brain’s medial frontal cortex are responsible for it. A new study published in Science reports that these signals are used to give humans the flexibility to learn new tasks and the focus to develop highly specific skills. “Part of the magic of the human brain is that it is so flexible,” says senior author Ueli Rutishauser, professor of Neurosurgery, Neurology, and Biomedical Sciences at Cedars-Sinai Medical Center, US. “We designed our study to decipher how the brain can generalise and specialise at the same time, both of which are critical for helping us pursue a goal.” They found that the performance monitoring signals help improve future attempts of a particular task by passing information to other areas of the brain. They also help the brain adjust its focus by signalling how much conflict or difficulty was encountered during the task. “An ‘Oops!’ moment might prompt someone to pay closer attention the next time they chat with a friend, or plan to stop at the store on the way home from work,” explains first author Zhongzheng Fu, researcher in the Rutishauser Laboratory at Cedars-Sinai.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 28322 - Posted: 05.11.2022

Minuscule involuntary eye movements, known as microsaccades, can occur even while one is carefully staring at a fixed point in space. When paying attention to something in the peripheral vision (called covert attention), these microsaccades sometimes align towards the object of interest. New research by National Eye Institute (NEI) investigators shows that while these microsaccades seem to boost or diminish the strength of the brain signals underlying attention, the eye movements are not drivers of those brain signals. The findings will help researchers interpret studies about covert attention and may open new areas for research into attention disorders and behavior. NEI is part of the National Institutes of Health. Scientists working on the neuroscience of attention have recently become concerned that, because both attention and eye movements like microsaccades involve the same groups of neurons in the brain, microsaccades might be required for shifting attention. “If microsaccades were driving attention, that would bring into question a lot of previous research in the field,” said Richard Krauzlis, Ph.D., chief of the NEI Section on Eye Movements and Visual Selection, and senior author of a study report on the research. “This work shows that while microsaccades and attention do share some mechanisms, covert attention is not driven by eye movements.” Krauzlis’ previous research has shown that covert attention causes a modulation of certain neuronal signals in an evolutionarily ancient area of the brain called the superior colliculus, which is involved in the detection of events. When attention is being paid to a particular area – for example, the right-hand side of one’s peripheral vision – signals in the superior colliculus relating to events that occur in that area will receive an extra boost, while signals relating to events occurring somewhere else, like on the left-hand side, will be depressed.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 28254 - Posted: 03.26.2022

By Conor Feehly
There's a paradox in our ability to pay attention. When we are hyper-focused on our surroundings, our senses become more acutely aware of the signals they pick up. But sometimes when we are paying attention, we miss things in our sensory field that are so glaringly obvious, on a second look we can’t help but question the legitimacy of our perception. Back in 1999, the psychologist Daniel Simons created a clever scenario that poignantly demonstrates this phenomenon. (Test it yourself in less than two minutes by watching Simons’ video here, which we recommend before the spoiler below.) In the scenario, there are two teams, each consisting of three players, with one team dressed in black and the other in white. The viewer is asked to count how many passes the team in white makes throughout the course of the video. Sure enough, as the video ends, most people are able to accurately guess the number of passes. Then the narrator asks: But did you see the gorilla? As it turns out, someone in a gorilla suit slowly walks into the scene, in plain sight. Most people who watch the video for the first time and focus on counting passes completely overlook the out-of-place primate. It seems strange, given the viewer’s intent observation of the small field of view where the scene unfolds. Predictive Processing Neuroscientist Anil Seth offers an interesting explanation of this phenomenon in his book Being You: A New Science of Consciousness. Seth’s description draws from one of neuroscience’s leading theories of cognition and perception. © 2022 Kalmbach Media Co.

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 28208 - Posted: 02.19.2022

By David J. Linden
When a routine echocardiogram revealed a large mass next to my heart, the radiologist thought it might be a hiatal hernia—a portion of my stomach poking up through my diaphragm to press against the sac containing my heart. “Chug this can of Diet Dr. Pepper and then hop up on the table for another echocardiogram before the soda bubbles in your stomach all pop.” So I did. However, the resulting images showed that the mass did not contain the telltale signature of bursting bubbles in my stomach that would support a hernia diagnosis. A few weeks later, an MRI scan, which has much better resolution, revealed that the mass was actually contained within the pericardial sac and was quite large—about the volume of that soda can. Even with this large invader pressing on my heart, I had no symptoms and could exercise at full capacity. I felt great. The doctors told me that the mass was most likely to be a teratoma, a clump of cells that is not typically malignant. Their outlook was sunny. Riffing on the musical South Pacific, my cardiologist said, “We’re gonna pop that orange right out of your chest and send you on your way.” While I was recovering from surgery, the pathology report came back and the news was bad—it wasn’t a benign teratoma after all, but rather a malignant cancer called synovial sarcoma. Because of its location, embedded in my heart wall, the surgeon could not remove all of the cancer cells. Doing so would have rendered my heart unable to pump blood. The oncologist told me to expect to live an additional six to 18 months. (c) 2022 by The Atlantic Monthly Group.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 11: Emotions, Aggression, and Stress
Link ID: 28138 - Posted: 01.05.2022

Iris Berent
How can a cellist play like an angel? Why am I engrossed in my book when others struggle with reading? And while we’re at it, can you tell me why my child won’t stop screaming? Now neuroscience offers the answers—or so say the news headlines. The brains of musicians “really do” differ from those of the rest of us. People with dyslexia have different neural connections than people without the condition. And your screaming toddler’s tantrums originate from her amygdala, a brain region linked to emotions. It’s all in the brain! Neuroscience is fascinating. But it is not just the love of science that kindles our interest in these stories. Few of us care for the technical details of how molecules and electrical charges in the brain give rise to our mental life. Furthermore, invoking the brain does not always improve our understanding. You hardly need a brain scan to tell that your toddler is enraged. Nor is it surprising that an amateur cellist’s brain works differently than Yo-Yo Ma’s—or that the brains of typical and dyslexic readers differ in some way. Where else would those differences reside? These sorts of science news stories speak to a bias: As numerous experiments have demonstrated, we have a blind spot for the brain. In classic work on the “seductive allure of neuroscience,” a team of researchers at Yale University presented participants with a psychological phenomenon (for instance, children learning new words), along with two explanations. One invoked a psychological mechanism, and the other was identical except it also dropped in a mention of a brain region. The brain details were entirely superfluous—they did nothing to improve the explanation, as judged by neuroscientists. Yet laypeople thought they did, so much so that once the brain was invoked, participants overlooked gross logical flaws in the accounts. © 2021 Scientific American.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 1: Introduction: Scope and Outlook
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 28105 - Posted: 12.11.2021

To eavesdrop on a brain, one of the best tools neuroscientists have is the fMRI scan, which helps map blood flow, and therefore the spikes in oxygen that occur whenever a particular brain region is being used. It reveals a noisy world. Blood oxygen levels vary from moment to moment, but those spikes never totally flatten out. “Your brain, even resting, is not going to be completely silent,” says Poortata Lalwani, a PhD student in cognitive neuroscience at the University of Michigan. She imagines the brain, even at its most tranquil, as kind of like a tennis player waiting to return a serve: “He’s not going to be standing still. He’s going to be pacing a little bit, getting ready to hit the backhand.” Many fMRI studies filter out that noise to find the particular spikes researchers want to scrutinize. But for Lalwani, that noise is the most telling signal of all. To her, it’s a signal of cognitive flexibility. Young, healthy brains tend to have signals with a lot of variability in blood oxygen levels from moment to moment. Older ones vary less, at least in certain regions of the brain. About a decade ago, scientists first showed the link between low neural signal variability and the kind of cognitive decline that accompanies healthy aging, rather than specific dementias. A brain’s noisiness is a solid proxy for details that are more abstract, Lalwani says: “How efficient information transfer is, how well-connected the neural networks are, in general how well-functioning the underlying neural network is.” But why that change happens with age has been a mystery. So has the question of whether it’s reversible. © 2021 Condé Nast.
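
In this line of research, a brain region's "noisiness" is typically quantified as the moment-to-moment standard deviation of its blood-oxygen signal. A minimal sketch with simulated numbers standing in for real fMRI data (the means, amplitudes, and series length are arbitrary assumptions, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated resting BOLD time series, one value per scan, same mean level;
# only the amplitude of moment-to-moment fluctuation differs.
younger = 100 + 3.0 * rng.standard_normal(200)  # more variable signal
older = 100 + 1.5 * rng.standard_normal(200)    # less variable signal

# Signal variability = standard deviation across time points.
print(f"younger SD: {younger.std(ddof=1):.2f}")
print(f"older   SD: {older.std(ddof=1):.2f}")
```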

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 28091 - Posted: 11.24.2021

Anil Ananthaswamy
How our brain, a three-pound mass of tissue encased within a bony skull, creates perceptions from sensations is a long-standing mystery. Abundant evidence and decades of sustained research suggest that the brain cannot simply be assembling sensory information, as though it were putting together a jigsaw puzzle, to perceive its surroundings. This is borne out by the fact that the brain can construct a scene based on the light entering our eyes, even when the incoming information is noisy and ambiguous. Consequently, many neuroscientists are pivoting to a view of the brain as a “prediction machine.” Through predictive processing, the brain uses its prior knowledge of the world to make inferences or generate hypotheses about the causes of incoming sensory information. Those hypotheses — and not the sensory inputs themselves — give rise to perceptions in our mind’s eye. The more ambiguous the input, the greater the reliance on prior knowledge. “The beauty of the predictive processing framework [is] that it has a really large — sometimes critics might say too large — capacity to explain a lot of different phenomena in many different systems,” said Floris de Lange, a neuroscientist at the Predictive Brain Lab of Radboud University in the Netherlands. However, the growing neuroscientific evidence for this idea has been mainly circumstantial and is open to alternative explanations. “If you look into cognitive neuroscience and neuro-imaging in humans, [there’s] a lot of evidence — but super-implicit, indirect evidence,” said Tim Kietzmann of Radboud University, whose research lies in the interdisciplinary area of machine learning and neuroscience. All Rights Reserved © 2021
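
The claim that "the more ambiguous the input, the greater the reliance on prior knowledge" has a standard quantitative reading, which a minimal Gaussian example (a textbook idealization, not taken from the article itself) makes concrete. If prior knowledge says the cause of a sensation is near \(\mu_p\) with uncertainty \(\sigma_p^2\), and the senses report a value \(x\) with noise \(\sigma_o^2\), Bayes' rule combines them into a precision-weighted estimate:

\[
\hat{\mu} \;=\; \frac{\sigma_o^2\,\mu_p + \sigma_p^2\,x}{\sigma_p^2 + \sigma_o^2}
\]

As sensory noise \(\sigma_o^2\) grows (more ambiguous input), \(\hat{\mu}\) slides toward the prior mean \(\mu_p\); as it shrinks, \(\hat{\mu}\) tracks the data \(x\).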

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 13: Memory and Learning; Chapter 7: Vision: From Eye to Brain
Link ID: 28080 - Posted: 11.17.2021

Catherine Offord
Earlier this year, Brian Butterworth decided to figure out how many numbers the average person encounters in a day. He picked a Saturday for his self-experiment—as a cognitive neuroscientist and professor emeritus at University College London, Butterworth works with numbers, so a typical weekday wouldn’t have been fair. He went about his day as usual, but kept track of how frequently he saw or heard a number, whether that was a symbol, such as 4 or 5, or a word such as “four” or “five.” He flicked through the newspaper, listened to the radio, popped out for a bit of shopping (taking special note of price tags and car license plates), and then, at last, sat down to calculate a grand total. “Would you like to take a guess?” he asks me when we speak over Zoom a couple of weeks later. I hazard that it’s well into the hundreds, but admit I’ve never thought about it before. He says: “I reckoned that I experienced about a thousand numbers an hour. A thousand numbers an hour is sixteen thousand numbers a day, is about five or six million a year. . . . That’s an awful lot of numbers.” Butterworth didn’t conduct his thought experiment just to satisfy his own curiosity. He’s including the calculation in an upcoming book, Can Fish Count?, slated for publication next year. In it, he argues that humans and other animals are constantly exposed to and make use of numbers—not just in the form of symbols and words, but as quantities of objects, of events, and of abstract concepts. Butterworth is one of several researchers who believe that the human brain can be thought of as having a “sense” for number, and that we, like our evolutionary ancestors, are neurologically hardwired to perceive all sorts of quantities in our environments, whether that serves for selecting the bush with more fruit on it, recognizing when a few predators on the horizon become too many, or telling from a show of hands when a consensus has been reached. © 1986–2021 The Scientist.
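
Butterworth's running totals are easy to verify, assuming roughly sixteen waking hours in a day (the waking-hours figure is implied by his numbers rather than stated in the excerpt):

\[
1000 \times 16 = 16{,}000 \ \text{numbers per day}, \qquad 16{,}000 \times 365 \approx 5.8 \ \text{million numbers per year},
\]

which matches his "five or six million a year."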

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 28051 - Posted: 10.27.2021

Annie Melchor
After finishing his PhD in neuroscience in 2016, Thomas Andrillon spent a year road-tripping around Africa and South America with his wife. One evening, on a particularly difficult road in Patagonia, his mind began to wander and he ended up accidentally flipping the car. Luckily, no one was hurt. As locals rushed in to help, they asked Andrillon what had happened. Was there an animal on the road? Had he fallen asleep at the wheel? “I had difficulty explaining that I was just thinking about something else,” he remembers. This experience made him think. What had happened? What was going on in his brain when his mind began to wander? In 2017, Andrillon started his postdoctoral research with neuroscientists Naotsugu Tsuchiya and Joel Pearson at Monash University in Melbourne. Shortly after, Tsuchiya and Andrillon teamed up with philosopher Jennifer Windt, also at Monash, to dive into the neural basis of mind wandering. Initially, Andrillon says, they wanted to know if they could detect mind wandering from facial expressions, recalling how teachers claim to be very good at knowing when their students are not paying attention. So they did a pilot experiment in which they filmed their test subjects performing a tedious, repetitive task. After reviewing the videos, one of Andrillon’s students came to him, concerned. “I think we have a problem,” said the student. “[The subjects] look exhausted.” Sure enough, even though all the study participants were awake, they were obviously struggling to not fall asleep, says Andrillon. It was this observation that gave them the idea to broaden their focus, and start looking at the connection between wavering attention and sleep. © 1986–2021 The Scientist.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 14: Biological Rhythms, Sleep, and Dreaming
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 10: Biological Rhythms and Sleep
Link ID: 28016 - Posted: 10.02.2021

By Carl Zimmer
Dr. Adam Zeman didn’t give much thought to the mind’s eye until he met someone who didn’t have one. In 2005, the British neurologist saw a patient who said that a minor surgical procedure had taken away his ability to conjure images. Over the 16 years since that first patient, Dr. Zeman and his colleagues have heard from more than 12,000 people who say they don’t have any such mental camera. The scientists estimate that tens of millions of people share the condition, which they’ve named aphantasia, and millions more experience extraordinarily strong mental imagery, called hyperphantasia. In their latest research, Dr. Zeman and his colleagues are gathering clues about how these two conditions arise through changes in the wiring of the brain that join the visual centers to other regions. And they’re beginning to explore how some of that circuitry may conjure other senses, such as sound, in the mind. Eventually, that research might even make it possible to strengthen the mind’s eye — or ear — with magnetic pulses. “This is not a disorder as far as I can see,” said Dr. Zeman, a cognitive scientist at the University of Exeter in Britain. “It’s an intriguing variation in human experience.” The patient who first made Dr. Zeman aware of aphantasia was a retired building surveyor who lost his mind’s eye after minor heart surgery. To protect the patient’s privacy, Dr. Zeman refers to him as M.X. When M.X. thought of people or objects, he did not see them. And yet his visual memories were intact. M.X. could answer factual questions such as whether former Prime Minister Tony Blair has light-colored eyes. (He does.) M.X. could even solve problems that required mentally rotating shapes, even though he could not see them. I came across M.X.’s case study in 2010 and wrote a column about it for Discover magazine. Afterward, I got emails from readers who had the same experience but who differed from M.X. in a remarkable way: They had never had a mind’s eye to begin with. © 2021 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 27851 - Posted: 06.11.2021

By Veronique Greenwood
The coin is in the illusionist’s left hand, now it’s in the right — or is it? Sleight of hand tricks are old standbys for magicians, street performers and people who’ve had a little too much to drink at parties. On humans, the deceptions work pretty well. But it turns out that birds don’t always fall for the same illusions. Researchers in a small study published on Monday in the Proceedings of the National Academy of Sciences reported on Eurasian jays, birds whose intelligence has long been studied by comparative psychologists. The jays were not fooled, at least by tricks that rely on the viewer having certain expectations about how human hands work. However, they were fooled by another kind of trick, perhaps because of how their visual system is built. Magic tricks often play on viewers’ expectations, said Elias Garcia-Pelegrin, a graduate student at the University of Cambridge who is an author of the study. That magic can reveal the viewers’ assumptions suggests that tricks can be a way into understanding how other creatures see the world, he and his colleagues reasoned. Eurasian jays are not newcomers to subterfuge: To thwart thieves while they’re storing food, jays will perform something very like sleight of hand — sleight of beak, if you will — if another jay is watching. They’ll pretend to drop the food in a number of places, so its real location is concealed. © 2021 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27843 - Posted: 06.02.2021

By Veronique Greenwood
Last spring, robins living on an Illinois tree farm sat on some unusual eggs. Alongside the customary brilliant blue ovoids they had laid were some unusually shaped objects. Although they had the same color, some were long and thin, stretched into pills. Others were decidedly pointy — so angular, in fact, that they bore little resemblance to eggs at all. If robins played Dungeons and Dragons, they might have thought, “Why do I have an eight-sided die in my nest?” The answer: Evolutionary biologists were gauging how birds decide what belongs in their nests, and what is an invasive piece of detritus that they need to throw out. Thanks to the results of this study, published Wednesday in Royal Society Open Science, we now know what the robins thought of the eggs, which were made of plastic and had been 3-D printed by the lab of Mark Hauber, a professor of animal behavior at the University of Illinois, Urbana-Champaign and a fellow at Hanse-Wissenschaftskolleg in Delmenhorst, Germany. He and his colleagues reported that the thinner the fake eggs got, the more likely the birds were to remove them from the nest. But curiously, the robins were more cautious about throwing out the pointy objects like that eight-sided die, which were closer in width to their own eggs. Birds, the results suggest, are using rules of thumb that are not intuitive to humans when they decide what is detritus and what is precious cargo. It’s not as uncommon as you’d think for robins to find foreign objects in their nests. They play host to cowbirds, a parasitic species that lays eggs in other birds’ nests, where they hatch and compete with the robins’ own offspring for nourishment. Confronted with a cowbird egg, which is beige and squatter than its blue ovals, parent robins will often push the parasite’s eggs out. That makes the species a good candidate for testing exactly what matters when it comes to telling their own eggs apart from other objects, Dr. Hauber said. © 2021 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27669 - Posted: 01.30.2021