Chapter 18. Attention and Higher Cognition




Catherine Offord Earlier this year, Brian Butterworth decided to figure out how many numbers the average person encounters in a day. He picked a Saturday for his self-experiment—as a cognitive neuroscientist and professor emeritus at University College London, Butterworth works with numbers, so a typical weekday wouldn’t have been fair. He went about his day as usual, but kept track of how frequently he saw or heard a number, whether that was a symbol, such as 4 or 5, or a word such as “four” or “five.” He flicked through the newspaper, listened to the radio, popped out for a bit of shopping (taking special note of price tags and car license plates), and then, at last, sat down to calculate a grand total. “Would you like to take a guess?” he asks me when we speak over Zoom a couple of weeks later. I hazard that it’s well into the hundreds, but admit I’ve never thought about it before. He says: “I reckoned that I experienced about a thousand numbers an hour. A thousand numbers an hour is sixteen thousand numbers a day, is about five or six million a year. . . . That’s an awful lot of numbers.” Butterworth didn’t conduct his thought experiment just to satisfy his own curiosity. He’s including the calculation in an upcoming book, Can Fish Count?, slated for publication next year. In it, he argues that humans and other animals are constantly exposed to and make use of numbers—not just in the form of symbols and words, but as quantities of objects, of events, and of abstract concepts. Butterworth is one of several researchers who believe that the human brain can be thought of as having a “sense” for number, and that we, like our evolutionary ancestors, are neurologically hardwired to perceive all sorts of quantities in our environments, whether that serves for selecting the bush with more fruit on it, recognizing when a few predators on the horizon become too many, or telling from a show of hands when a consensus has been reached. © 1986–2021 The Scientist.
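Butterworth's running total is simple arithmetic, and it is easy to verify. A minimal sketch (assuming, as his daily figure implies, roughly 16 waking hours a day):

```python
# Back-of-the-envelope check of Butterworth's estimate.
# Assumption: about 16 waking hours per day, which his daily total implies.
numbers_per_hour = 1_000
waking_hours_per_day = 16

per_day = numbers_per_hour * waking_hours_per_day  # 16,000
per_year = per_day * 365                           # 5,840,000

print(f"{per_day:,} numbers a day; {per_year:,} numbers a year")
# -> 16,000 numbers a day; 5,840,000 numbers a year ("five or six million")
```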

Keyword: Attention
Link ID: 28051 - Posted: 10.27.2021

By Kate Conger, Kellen Browning and Erin Woo A 27-year-old YouTube star, prodded by her millions of followers with concerns about her health. A 19-year-old TikTok creator who features posts about being skinny. Teen communities throughout the internet, cleverly naming and culling their discussions to avoid detection. They present a nearly intractable problem for social media companies under pressure to do something about material on their services that many people believe is causing harm, particularly to teenagers. Those concerns came into sharp focus in recent weeks in a pair of Senate subcommittee hearings: the first featuring a Facebook executive defending her company, and the second featuring a former Facebook employee turned whistle-blower who bluntly argued that her former employer’s products drove some young people toward eating disorders. The hearings were prompted in part by a Wall Street Journal article that detailed how internal Facebook research showed Instagram, which is owned by Facebook, can make body image issues worse for some young people. On Tuesday, executives from YouTube, TikTok and Snapchat are scheduled to testify before a Senate subcommittee about the effects of their products on children. They are expected to face questions about how they moderate content that might encourage disordered eating, and how their algorithms might promote such content. “Big Tech’s exploiting these powerful algorithms and design features is reckless and heedless, and needs to change,” Senator Richard Blumenthal, a Democrat of Connecticut and the chair of the subcommittee, said in a statement. “They seize on the insecurities of children, including eating disorders, simply to make more money.” But what exactly can be done about that content — and why people create it in the first place — may defy easy answers. If creators say they don’t intend to glamorize eating disorders, should their claims be taken at face value? Or should the companies listen to users complaining about them? © 2021 The New York Times Company

Keyword: Anorexia & Bulimia; Attention
Link ID: 28049 - Posted: 10.23.2021

By Jamie Friedlander Serrano My dad was planning a trip to Cannon Beach, a small coastal town in Oregon that I love. Yet when I sat down to email him some recommendations, I drew a blank. I couldn’t remember the name of the state park we visited or the breakfast spot we adored. Even the name of the hotel we stayed at eluded me. Since giving birth to my year-old daughter, I’ve had countless moments like this. I have trouble recalling words, forget to respond to text messages, and even missed an appointment. What I’m experiencing is often called “mommy brain” — the forgetful, foggy and scatterbrained feeling many pregnant women and new mothers experience. But is mommy brain real? Anecdotally, yes. Ask any new mom if she has felt the above, and she'll likely say she has — as many as 80 percent of new moms report feelings of mommy brain. Scientifically, it also appears the answer is yes: A growing body of research supports the argument that moms' brains change during pregnancy and after giving birth. A clear explanation for the phenomenon remains elusive, however. There are countless variables that experts say contribute to mommy brain, such as fluctuating hormones postpartum, sleep deprivation in dealing with a new baby, anxiety over new parenthood, elevated stress levels, and a general upheaval of their lives that having a baby forces. Put together, it’s only natural that changes in mental processing would occur, says Moriah Thomason, Barakett associate professor of child and adolescent psychiatry at New York University School of Medicine. When our brain needs to make space for a new priority — keeping a baby alive — remembering a grocery list takes a back seat. “Does it mean that you literally cannot do those things that you used to do as well? Probably not,” she says. “It’s just not the most important thing for you to be accessing.” © 1996-2021 The Washington Post

Keyword: Hormones & Behavior; Sexual Behavior
Link ID: 28033 - Posted: 10.13.2021

Annie Melchor After finishing his PhD in neuroscience in 2016, Thomas Andrillon spent a year road-tripping around Africa and South America with his wife. One evening, on a particularly difficult road in Patagonia, his mind began to wander and he ended up accidentally flipping the car. Luckily, no one was hurt. As locals rushed in to help, they asked Andrillon what had happened. Was there an animal on the road? Had he fallen asleep at the wheel? “I had difficulty explaining that I was just thinking about something else,” he remembers. This experience made him think. What had happened? What was going on in his brain when his mind began to wander? In 2017, Andrillon started his postdoctoral research with neuroscientists Naotsugu Tsuchiya and Joel Pearson at Monash University in Melbourne. Shortly after, Tsuchiya and Andrillon teamed up with philosopher Jennifer Windt, also at Monash, to dive into the neural basis of mind wandering. Initially, Andrillon says, they wanted to know if they could detect mind wandering from facial expressions, recalling how teachers claim to be very good at knowing when their students are not paying attention. So they did a pilot experiment in which they filmed their test subjects performing a tedious, repetitive task. After reviewing the videos, one of Andrillon’s students came to him, concerned. “I think we have a problem,” said the student. “[The subjects] look exhausted.” Sure enough, even though all the study participants were awake, they were obviously struggling to not fall asleep, says Andrillon. It was this observation that gave them the idea to broaden their focus, and start looking at the connection between wavering attention and sleep. © 1986–2021 The Scientist.

Keyword: Attention; Sleep
Link ID: 28016 - Posted: 10.02.2021

Jordana Cepelewicz Neuroscientists are the cartographers of the brain’s diverse domains and territories — the features and activities that define them, the roads and highways that connect them, and the boundaries that delineate them. Toward the front of the brain, just behind the forehead, is the prefrontal cortex, celebrated as the seat of judgment. Behind it lies the motor cortex, responsible for planning and coordinating movement. To the sides: the temporal lobes, crucial for memory and the processing of emotion. Above them, the somatosensory cortex; behind them, the visual cortex. Not only do researchers often depict the brain and its functions much as mapmakers might draw nations on continents, but they do so “the way old-fashioned mapmakers” did, according to Lisa Feldman Barrett, a psychologist at Northeastern University. “They parse the brain in terms of what they’re interested in psychologically or mentally or behaviorally,” and then they assign the functions to different networks of neurons “as if they’re Lego blocks, as if there are firm boundaries there.” But a brain map with neat borders is not just oversimplified — it’s misleading. “Scientists for over 100 years have searched fruitlessly for brain boundaries between thinking, feeling, deciding, remembering, moving and other everyday experiences,” Barrett said. A host of recent neurological studies further confirm that these mental categories “are poor guides for understanding how brains are structured or how they work.” Neuroscientists generally agree about how the physical tissue of the brain is organized: into particular regions, networks, cell types. But when it comes to relating those to the task the brain might be performing — perception, memory, attention, emotion or action — “things get a lot more dodgy,” said David Poeppel, a neuroscientist at New York University. All Rights Reserved © 2021

Keyword: Brain imaging; Attention
Link ID: 27963 - Posted: 08.25.2021

Tim Adams For centuries, philosophers have theorised about the mind-body question, debating the relationship between the physical matter of the brain and the conscious mental activity it somehow creates. Even with advances in neuroscience and brain imaging techniques, large parts of that fundamental relationship remain stubbornly mysterious. It was with good reason that, in 1995, the cognitive scientist David Chalmers coined the term “the hard problem” to describe the question of exactly how our brains conjure subjective conscious experience. Some philosophers continue to insist that mind is inherently distinct from matter. Advances in understanding how the brain functions undermine those ideas of dualism, however. Anil Seth, professor of cognitive and computational neuroscience at the University of Sussex, is at the leading edge of that latter research. His TED talk on consciousness has been viewed more than 11m times. His new book, Being You, proposes an idea of the human mind as a “highly evolved prediction machine”, rooted in the functions of the body and “constantly hallucinating the world and the self” to create reality. One of the things that I liked about your approach in the book was the way that many of the phenomena you investigate arise out of your experience. For example, the feeling of returning to consciousness after anaesthesia or how your mother, experiencing delirium, was no longer recognisably herself. Do you think it’s always important to keep that real-world framework in mind? The reason I’m interested in consciousness is intrinsically personal. I want to understand myself and, by extension, others. But I’m also super-interested for example in developing statistical models and mathematical methods for characterising things such as emergence [behaviour of the mind as a whole that exceeds the capability of its individual parts] and there is no personal component in that. © 2021 Guardian News & Media Limited

Keyword: Consciousness; Attention
Link ID: 27962 - Posted: 08.25.2021

By Katherine Ellison Jessica McCabe crashed and burned at 30, when she got divorced, dropped out of community college and moved in with her mother. Eric Tivers had 21 jobs before age 21. Both have been diagnosed with attention-deficit/hyperactivity disorder, and both today are entrepreneurs who wear their diagnoses — and rare resilience — on their sleeves. With YouTube videos, podcasts and tweets, they’ve built online communities aimed at ending the shame that so often makes having ADHD so much harder. Now they’re going even further, asking: Why not demand more than mere compassion? Why not seek deeper changes to create a more ADHD-friendly world? “I’ve spent the last five or six years trying to understand how my brain works so that I could conform, but now I’m starting to evolve,” says McCabe, 38, whose chipper, NASCAR-speed delivery has garnered 742,000 subscribers — and counting — to her YouTube channel, “How to ADHD.” “I think we no longer have to accept that we live in a world that is not built for our brains.” With Tivers, she is planning a virtual summit on the topic for next May. As a first step, with the help of Canadian cognitive scientist Deirdre Kelly, she says she’ll soon release new guidelines to assess products and services for their ADHD friendliness. Computer programs that help restless users meditate and a chair that accommodates a variety of seated positions are high on the list to promote, while error-prone apps or devices will be flagged. Kelly also envisions redesigning refrigerator vegetable drawers, so that the most nutritious food will no longer be out of sight and mind. In the past two decades, the world has become much kinder to the estimated 6.1 million children and approximately 10 million adults with ADHD, whose hallmark symptoms are distraction, forgetfulness and impulsivity. Social media has made all the difference.

Keyword: ADHD
Link ID: 27960 - Posted: 08.25.2021

By Christiane Gelitz, Maddie Bender | To a chef, the sounds of lip smacking, slurping and swallowing are the highest form of flattery. But to someone with a certain type of misophonia, these same sounds can be torturous. Brain scans are now helping scientists start to understand why. People with misophonia experience strong discomfort, annoyance or disgust when they hear particular triggers. These can include chewing, swallowing, slurping, throat clearing, coughing and even audible breathing. Researchers previously thought this reaction might be caused by the brain overactively processing certain sounds. Now, however, a new study published in the Journal of Neuroscience has linked some forms of misophonia to heightened “mirroring” behavior in the brain: those affected feel distress while their brains act as if they are mimicking the triggering mouth movements. “This is the first breakthrough in misophonia research in 25 years,” says psychologist Jennifer J. Brout, who directs the International Misophonia Research Network and was not involved in the new study. The research team, led by Newcastle University neuroscientist Sukhbinder Kumar, analyzed brain activity in people with and without misophonia when they were at rest and while they listened to sounds. These included misophonia triggers (such as chewing), generally unpleasant sounds (like a crying baby), and neutral sounds. The brain's auditory cortex, which processes sound, reacted similarly in subjects with and without misophonia. But in both the resting state and listening trials, people with misophonia showed stronger connections between the auditory cortex and brain regions that control movements of the face, mouth and throat. Kumar found this connection became most active in participants with misophonia when they heard triggers specific to the condition. © 2021 Scientific American
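The “stronger connections” reported here are functional-connectivity measures. As a rough sketch of the idea only (this is not the study's actual analysis pipeline, and the signals below are invented), connectivity between two regions is commonly quantified as the correlation between their activity time series:

```python
import numpy as np

def functional_connectivity(ts_a: np.ndarray, ts_b: np.ndarray) -> float:
    """Pearson correlation between two regional activity time series."""
    return float(np.corrcoef(ts_a, ts_b)[0, 1])

# Invented demo signals standing in for auditory-cortex and orofacial
# motor-region activity; a shared component produces the kind of elevated
# coupling the study reports in misophonia.
rng = np.random.default_rng(0)
shared = rng.standard_normal(200)
auditory_ts = shared + 0.5 * rng.standard_normal(200)
motor_ts = shared + 0.5 * rng.standard_normal(200)

r = functional_connectivity(auditory_ts, motor_ts)
print(f"auditory-motor connectivity: r = {r:.2f}")
```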

Keyword: Hearing; Attention
Link ID: 27955 - Posted: 08.21.2021

By John Horgan In my 20s, I had a friend who was brilliant, charming, Ivy-educated and rich, heir to a family fortune. I’ll call him Gallagher. He could do anything he wanted. He experimented, dabbling in neuroscience, law, philosophy and other fields. But he was so critical, so picky, that he never settled on a career. Nothing was good enough for him. He never found love for the same reason. He also disparaged his friends’ choices, so much so that he alienated us. He ended up bitter and alone. At least that’s my guess. I haven’t spoken to Gallagher in decades. There is such a thing as being too picky, especially when it comes to things like work, love and nourishment (even the pickiest eater has to eat something). That’s the lesson I gleaned from Gallagher. But when it comes to answers to big mysteries, most of us aren’t picky enough. We settle on answers for bad reasons, for example, because our parents, priests or professors believe it. We think we need to believe something, but actually we don’t. We can, and should, decide that no answers are good enough. We should be agnostics. Some people confuse agnosticism (not knowing) with apathy (not caring). Take Francis Collins, a geneticist who directs the National Institutes of Health. He is a devout Christian, who believes that Jesus performed miracles, died for our sins and rose from the dead. In his 2006 bestseller The Language of God, Collins calls agnosticism a “cop-out.” When I interviewed him, I told him I am an agnostic and objected to “cop-out.” © 2021 Scientific American

Keyword: Consciousness
Link ID: 27952 - Posted: 08.18.2021

By Katherine Ellison ADHD — the most common psychiatric disorder of childhood — lasts longer for more people than has been widely assumed, according to new research. “Only 10 percent of people really appear to grow out of ADHD,” says the lead author, psychologist Margaret Sibley, associate professor of psychiatry and behavioral sciences at the University of Washington School of Medicine. “Ninety percent still struggle with at least mild symptoms as adults — even if they have periods when they are symptom free.” The study challenges a widely persistent perception of a time-limited condition occurring mostly in childhood. Indeed, one of the earliest names for attention deficit/hyperactivity disorder was “a hyperkinetic disease of infancy,” while its most common poster child has long been a young, White, disruptive male. Previous research has suggested the condition essentially vanishes in about half of those who receive diagnoses. But in recent years, increasing numbers of women, people of color and especially adults have been seeking help in managing the hallmark symptoms of distraction, forgetfulness and impulsivity. By the most recent estimates, 9.6 percent of children ages 3 to 17 have been diagnosed with ADHD. Yet researchers report that only 4.4 percent of young adults ages 18 to 44 have the disorder, suggesting that if the new estimates are valid, there may be some catching up to do. Sibley’s paper paints a picture of an on-again, off-again condition, with symptoms fluctuating depending on life circumstances. © 1996-2021 The Washington Post

Keyword: ADHD
Link ID: 27946 - Posted: 08.14.2021

By Christina Caron Q: How common is adult A.D.H.D.? What are the symptoms and is it possible for someone who was not diagnosed with it as a child to be diagnosed as an adult? A: Attention deficit hyperactivity disorder, or A.D.H.D., is a neurodevelopmental disorder often characterized by inattention, disorganization, hyperactivity and impulsivity. It is one of the most common mental health disorders. According to the World Federation of A.D.H.D., it is thought to occur in nearly 6 percent of children and 2.5 percent of adults. In the United States, 5.4 million children, or about 8 percent of all U.S. children ages 3 to 17, were estimated to have A.D.H.D. in 2016, the Centers for Disease Control and Prevention reported. For decades, experts believed that A.D.H.D. occurred only among children and ended after adolescence. But a number of studies in the ’90s showed that A.D.H.D. can continue into adulthood. Experts now say that at least 60 percent of children with A.D.H.D. will also have symptoms as adults. It’s not surprising that so many people are now wondering whether they might have the disorder, especially if their symptoms were exacerbated by the pandemic. The Attention Deficit Disorder Association, an organization founded in 1990 for adults with A.D.H.D, saw its membership nearly double between 2019 and 2021. In addition, Children and Adults With Attention-Deficit/Hyperactivity Disorder, or CHADD, reported that the highest proportion of people who call their A.D.H.D. help line are adults seeking guidance and resources for themselves. © 2021 The New York Times Company

Keyword: ADHD
Link ID: 27933 - Posted: 08.07.2021

By Christof Koch Consider the following experiences:
• You're headed toward a storm that's a couple of miles away, and you've got to get across a hill. You ask yourself: “How am I going to get over that, through that?”
• You see little white dots on a black background, as if looking up at the stars at night.
• You look down at yourself lying in bed from above but see only your legs and lower trunk.
These may seem like idiosyncratic events drawn from the vast universe of perceptions, sensations, memories, thoughts and dreams that make up our daily stream of consciousness. In fact, each one was evoked by directly stimulating the brain with an electrode. As American poet Walt Whitman intuited in his poem “I Sing the Body Electric,” these anecdotes illustrate the intimate relationship between the body and its animating soul. The brain and the conscious mind are as inexorably linked as the two sides of a coin. Recent clinical studies have uncovered some of the laws and regularities of conscious activity, findings that have occasionally proved to be paradoxical. They show that brain areas involved in conscious perception have little to do with thinking, planning and other higher cognitive functions. Neuroengineers are now working to turn these insights into technologies to replace lost cognitive function and, in the more distant future, to enhance sensory, cognitive or memory capacities. For example, a recent brain-machine interface provides completely blind people with limited abilities to perceive light. These tools, however, also reveal the difficulties of fully restoring sight or hearing. They underline even more the snags that stand in the way of sci-fi-like enhancements that would enable access to the brain as if it were a computer storage drive. © 2021 Scientific American

Keyword: Consciousness
Link ID: 27865 - Posted: 06.19.2021

Christopher M. Filley One of the most enduring themes in human neuroscience is the association of higher brain functions with gray matter. In particular, the cerebral cortex—the gray matter of the brain's surface—has been the primary focus of decades of work aiming to understand the neurobiological basis of cognition and emotion. Yet, the cerebral cortex is only a few millimeters thick, so the relative neglect of the rest of the brain below the cortex has prompted the term “corticocentric myopia” (1). Other regions relevant to behavior include the deep gray matter of the basal ganglia and thalamus, the brainstem and cerebellum, and the white matter that interconnects all of these structures. On page 1304 of this issue, Zhao et al. (2) present compelling evidence for the importance of white matter by demonstrating genetic influences on structural connectivity that invoke a host of provocative clinical implications. Insight into the importance of white matter in human behavior begins with its anatomy (3–5) (see the figure). White matter occupies about half of the adult human brain, and some 135,000 km of myelinated axons course through a wide array of tracts to link gray matter regions into distributed neural networks that serve cognitive and emotional functions (3). The human brain is particularly well interconnected because white matter has expanded more in evolution than gray matter, which has endowed the brain of Homo sapiens with extensive structural connectivity (6). The myelin sheath, white matter's characteristic feature, appeared late in vertebrate evolution and greatly increased axonal conduction velocity. This development enhanced the efficiency of distributed neural networks, expanding the transfer of information throughout the brain (5). Information transfer serves to complement the information processing of gray matter, where neuronal cell bodies, synapses, and a variety of neurotransmitters are located (5). The result is a brain with prodigious numbers of both neurons and myelinated axons, which have evolved to subserve the domains of attention, memory, emotion, language, perception, visuospatial processing, executive function (5), and social cognition (7). © 2021 American Association for the Advancement of Science.

Keyword: Development of the Brain; Attention
Link ID: 27862 - Posted: 06.19.2021

Laura Sanders A new view of the human brain shows its cellular residents in all their wild and weird glory. The map, drawn from a tiny piece of a woman’s brain, charts the varied shapes of 50,000 cells and 130 million connections between them. This intricate map, named H01 for “human sample 1,” represents a milestone in scientists’ quest to provide ever more detailed descriptions of a brain (SN: 2/7/14). “It’s absolutely beautiful,” says neuroscientist Clay Reid at the Allen Institute for Brain Science in Seattle. “In the best possible way, it’s the beginning of something very exciting.” Scientists at Harvard University, Google and elsewhere prepared and analyzed the brain tissue sample. Smaller than a sesame seed, the bit of brain was about a millionth of an entire brain’s volume. It came from the cortex — the brain’s outer layer responsible for complex thought — of a 45-year-old woman undergoing surgery for epilepsy. After it was removed, the brain sample was quickly preserved and stained with heavy metals that revealed cellular structures. The sample was then sliced into more than 5,000 wafer-thin pieces and imaged with powerful electron microscopes. Computational programs stitched the resulting images back together and artificial intelligence programs helped scientists analyze them. A short description of the resulting view was published as a preprint May 30 to bioRxiv.org. The full dataset is freely available online.
[Image: two mirror-symmetrical neurons; it is unclear why these cells take these shapes. Credit: Lichtman Lab/Harvard University; Connectomics Team/Google]
For now, researchers are just beginning to see what’s there. “We have really just dipped our toe into this dataset,” says study coauthor Jeff Lichtman, a developmental neurobiologist at Harvard University. Lichtman compares the brain map to Google Earth: “There are gems in there to find, but no one can say they’ve looked at the whole thing.” © Society for Science & the Public 2000–2021.

Keyword: Brain imaging
Link ID: 27858 - Posted: 06.16.2021

By Carl Zimmer Dr. Adam Zeman didn’t give much thought to the mind’s eye until he met someone who didn’t have one. In 2005, the British neurologist saw a patient who said that a minor surgical procedure had taken away his ability to conjure images. Over the 16 years since that first patient, Dr. Zeman and his colleagues have heard from more than 12,000 people who say they don’t have any such mental camera. The scientists estimate that tens of millions of people share the condition, which they’ve named aphantasia, and millions more experience extraordinarily strong mental imagery, called hyperphantasia. In their latest research, Dr. Zeman and his colleagues are gathering clues about how these two conditions arise through changes in the wiring of the brain that join the visual centers to other regions. And they’re beginning to explore how some of that circuitry may conjure other senses, such as sound, in the mind. Eventually, that research might even make it possible to strengthen the mind’s eye — or ear — with magnetic pulses. “This is not a disorder as far as I can see,” said Dr. Zeman, a cognitive scientist at the University of Exeter in Britain. “It’s an intriguing variation in human experience.” The patient who first made Dr. Zeman aware of aphantasia was a retired building surveyor who lost his mind’s eye after minor heart surgery. To protect the patient’s privacy, Dr. Zeman refers to him as M.X. When M.X. thought of people or objects, he did not see them. And yet his visual memories were intact. M.X. could answer factual questions such as whether former Prime Minister Tony Blair has light-colored eyes. (He does.) M.X. could even solve problems that required mentally rotating shapes, even though he could not see them. I came across M.X.’s case study in 2010 and wrote a column about it for Discover magazine. Afterward, I got emails from readers who had the same experience but who differed from M.X. in a remarkable way: They had never had a mind’s eye to begin with. © 2021 The New York Times Company

Keyword: Attention; Vision
Link ID: 27851 - Posted: 06.11.2021

By Ben Guarino and Frances Stead Sellers In the coronavirus pandemic’s early weeks, in neuropathology departments around the world, scientists wrestled with a question: Should they cut open the skulls of patients who died of covid-19 and extract their brains? Autopsy staff at Columbia University in New York were hesitant. Sawing into bone creates dust, and the Centers for Disease Control and Prevention had issued a warning about the bodies of covid patients — airborne debris from autopsies could be an infectious hazard. But as more patients were admitted and more began to die, researchers decided to “make all the efforts we could to start collecting the brain tissue,” Columbia neuropathologist Peter D. Canoll said. In March 2020, in an isolation room, the Columbia team extracted a brain from a patient who had died of severe covid-19, the illness caused by the coronavirus. During the next months, they would examine dozens more. Saw met skull elsewhere, too. In Germany, scientists autopsied brains — even though medical authorities recommended against doing that. Researchers were searching the brain for damage — and for the virus itself. At the pandemic’s start, understanding how the virus affected the nervous system was largely a mystery. S. Andrew Josephson, chair of neurology at the University of California at San Francisco and editor in chief of the academic journal JAMA Neurology, said, “We had hundreds of submissions of ‘I saw one case of X.’” It was difficult to understand whether single cases had any relationship to covid at all. Patients reported visual and auditory disturbances, vertigo and tingling sensations, among other perplexing symptoms. Some lost their sense of smell, or their vision became distorted. Weeks or months after the initial onset of symptoms, some who had experienced even a mild bout of the coronavirus remained convinced they had persistent “brain fog.”

Keyword: Learning & Memory; Attention
Link ID: 27845 - Posted: 06.08.2021

By Jason S. Tsukahara, Alexander P. Burgoyne, Randall W. Engle It has been said that “the eyes are the window to the soul,” but new research suggests that they may be a window to the brain as well. Our pupils respond to more than just the light. They indicate arousal, interest or mental exhaustion. Pupil dilation is even used by the FBI to detect deception. Now work conducted in our laboratory at the Georgia Institute of Technology suggests that baseline pupil size is closely related to individual differences in intelligence. The larger the pupils, the higher the intelligence, as measured by tests of reasoning, attention and memory. In fact, across three studies, we found that the difference in baseline pupil size between people who scored the highest on the cognitive tests and those who scored the lowest was large enough to be detected by the unaided eye. We first uncovered this surprising relationship while studying differences in the amount of mental effort people used to complete memory tasks. We used pupil dilations as an indicator of effort, a technique psychologist Daniel Kahneman popularized in the 1960s and 1970s. When we discovered a relationship between baseline pupil size and intelligence, we weren’t sure if it was real or what it meant. Intrigued, we conducted several large-scale studies in which we recruited more than 500 people aged 18 to 35 from the Atlanta community. We measured participants’ pupil size using an eye tracker, a device that captures the reflection of light off the pupil and cornea using a high-powered camera and computer. We measured participants’ pupils at rest while they stared at a blank computer screen for up to four minutes. All the while, the eye tracker was recording. Using the tracker, we then calculated each participant’s average pupil size. © 2021 Scientific American
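A minimal sketch of the kind of analysis the authors describe (the data and numbers below are invented, not the study's): average each participant's resting pupil samples into a baseline, then correlate the baselines with cognitive test scores:

```python
import numpy as np

def baseline_pupil(samples_mm: np.ndarray) -> float:
    """Mean pupil diameter over a resting recording; NaN samples mark blinks."""
    return float(np.nanmean(samples_mm))

# Invented cohort: one resting recording and one cognitive score per person.
rng = np.random.default_rng(1)
n_participants = 100
baselines = np.array(
    [baseline_pupil(rng.normal(4.0, 0.5, 240)) for _ in range(n_participants)]
)  # pupil diameters around 4 mm, 240 samples per recording
scores = 10 * baselines + rng.normal(0, 3, n_participants)  # positive link built in

r = np.corrcoef(baselines, scores)[0, 1]
print(f"baseline pupil size vs. cognitive score: r = {r:.2f}")
```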

Keyword: Learning & Memory; Vision
Link ID: 27844 - Posted: 06.08.2021

By Veronique Greenwood The coin is in the illusionist’s left hand, now it’s in the right — or is it? Sleight of hand tricks are old standbys for magicians, street performers and people who’ve had a little too much to drink at parties. On humans, the deceptions work pretty well. But it turns out that birds don’t always fall for the same illusions. Researchers in a small study published on Monday in the Proceedings of the National Academy of Sciences reported on Eurasian jays, birds whose intelligence has long been studied by comparative psychologists. The jays were not fooled, at least by tricks that rely on the viewer having certain expectations about how human hands work. However, they were fooled by another kind of trick, perhaps because of how their visual system is built. Magic tricks often play on viewers’ expectations, said Elias Garcia-Pelegrin, a graduate student at the University of Cambridge who is an author of the study. That magic can reveal the viewers’ assumptions suggests that tricks can be a way into understanding how other creatures see the world, he and his colleagues reasoned. Eurasian jays are not newcomers to subterfuge: To thwart thieves while they’re storing food, jays will perform something very like sleight of hand — sleight of beak, if you will — if another jay is watching. They’ll pretend to drop the food in a number of places, so its real location is concealed. © 2021 The New York Times Company

Keyword: Attention; Learning & Memory
Link ID: 27843 - Posted: 06.02.2021

By Nayef Al-Rodhan | In Chile, the National Commission for Scientific and Technological Research has begun to debate a “neurorights” bill to be written into the country’s constitution. The world, and most importantly the OECD, UNESCO and the United Nations, should be watching closely. The Chilean bill sets out to protect the right to personal identity, free will, mental privacy, equitable access to technologies that augment human capacities, and the right to protection against bias and discrimination. The landmark bill would be the first of its kind to pioneer a regulatory framework which protects human rights from the manipulation of brain activity. The relatively nascent concept of neurorights follows a number of recent medical innovations, most notably brain-computer interface technology (BCI), which has the potential to revolutionize the field of neuroscience. BCI-based therapy may be useful for poststroke motor rehabilitation and may be a potential method for the accurate detection and treatment of neurological diseases such as Alzheimer’s. Advocates claim there is therefore a moral imperative to use the technology, given the benefits it could bring; others worry about its ethical, moral and societal consequences. Many (mistakenly) see this process as being potentially undermined by premature governance restrictions, or dismiss any mention of brake mechanisms as an exaggerated reaction to an unlikely science-fiction scenario. © 2021 Scientific American

Keyword: Robotics; Consciousness
Link ID: 27841 - Posted: 06.02.2021

By Jackie Rocheleau It’s an attractive idea: By playing online problem-solving, matching and other games for a few minutes a day, people can improve such mental abilities as reasoning, verbal skills and memory. But whether these games deliver on those promises is up for debate. “For every study that finds some evidence, there’s an equal number of papers that find no evidence,” says Bobby Stojanoski, a cognitive neuroscientist at Western University in Ontario (SN: 3/8/17; SN: 5/9/17). Now, in perhaps the biggest real-world test of these programs, Stojanoski and colleagues pitted more than 1,000 people who regularly use brain trainers against around 7,500 people who don’t do the mini brain workouts. There was little difference between how both groups performed on a series of tests of their thinking abilities, suggesting that brain training doesn’t live up to its name, the scientists report in the April Journal of Experimental Psychology: General. “They put brain training to the test,” says Elizabeth Stine-Morrow, a cognitive aging scientist at the University of Illinois at Urbana-Champaign. While the study doesn’t show why brain trainers aren’t seeing benefits, it does show there is no link “between the amount of time spent with the brain training programs and cognition,” Stine-Morrow says. “That was pretty cool.” © Society for Science & the Public 2000–2021
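As an illustration of the group comparison described (invented data, not the study's), drawing both groups' scores from the same distribution reproduces the near-zero effect the researchers report:

```python
import numpy as np

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Standardized mean difference between two independent groups."""
    n1, n2 = len(a), len(b)
    pooled_var = ((n1 - 1) * a.var(ddof=1) + (n2 - 1) * b.var(ddof=1)) / (n1 + n2 - 2)
    return float((a.mean() - b.mean()) / np.sqrt(pooled_var))

# Invented cognitive scores: ~1,000 brain-trainer users vs. ~7,500 non-users,
# both drawn from the same distribution ("little difference" between groups).
rng = np.random.default_rng(42)
trainers = rng.normal(100.0, 15.0, 1_000)
non_trainers = rng.normal(100.0, 15.0, 7_500)

print(f"Cohen's d = {cohens_d(trainers, non_trainers):.3f}")  # near zero
```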

Keyword: Learning & Memory
Link ID: 27830 - Posted: 05.27.2021