Links for Keyword: Attention

Links 41 - 60 of 699

Minuscule involuntary eye movements, known as microsaccades, can occur even while one is carefully staring at a fixed point in space. When paying attention to something in the peripheral vision (called covert attention), these microsaccades sometimes align towards the object of interest. New research by National Eye Institute (NEI) investigators shows that while these microsaccades seem to boost or diminish the strength of the brain signals underlying attention, the eye movements are not drivers of those brain signals. The findings will help researchers interpret studies about covert attention and may open new areas for research into attention disorders and behavior. NEI is part of the National Institutes of Health. Scientists working on the neuroscience of attention have recently become concerned that, because both attention and eye movements, like microsaccades, involve the same groups of neurons in the brain, microsaccades might be required for shifting attention. “If microsaccades were driving attention, that would bring into question a lot of previous research in the field,” said Richard Krauzlis, Ph.D., chief of the NEI Section on Eye Movements and Visual Selection, and senior author of a report on the research. “This work shows that while microsaccades and attention do share some mechanisms, covert attention is not driven by eye movements.” Krauzlis’ previous research has shown that covert attention modulates certain neuronal signals in an evolutionarily ancient area of the brain called the superior colliculus, which is involved in the detection of events. When attention is being paid to a particular area – for example, the right-hand side of one’s peripheral vision – signals in the superior colliculus relating to events that occur in that area will receive an extra boost, while signals relating to events occurring somewhere else, like on the left-hand side, will be depressed.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 28254 - Posted: 03.26.2022

By Conor Feehly There's a paradox in our ability to pay attention. When we are hyper-focused on our surroundings, our senses become more acutely aware of the signals they pick up. But sometimes when we are paying attention, we miss things in our sensory field that are so glaringly obvious that, on a second look, we can’t help but question the legitimacy of our perception. Back in 1999, the psychologist Daniel Simons created a clever scenario that vividly demonstrates this phenomenon. (Test it yourself in less than two minutes by watching Simons’ video here, which we recommend before the spoiler below.) In the scenario, there are two teams, each consisting of three players, with one team dressed in black and the other in white. The viewer is asked to count how many passes the team in white makes throughout the course of the video. Sure enough, as the video ends, most people are able to accurately guess the number of passes. Then the narrator asks: But did you see the gorilla? As it turns out, someone in a gorilla suit slowly walks into the scene, in plain sight. Most people who watch the video for the first time and focus on counting passes completely overlook the out-of-place primate. It seems strange, given the viewer’s intent observation of the small field of view where the scene unfolds. Neuroscientist Anil Seth offers an interesting explanation of this phenomenon in his book Being You: A New Science of Consciousness. Seth’s description draws from predictive processing, one of neuroscience’s leading theories of cognition and perception. © 2022 Kalmbach Media Co.

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 28208 - Posted: 02.19.2022

By David J. Linden When a routine echocardiogram revealed a large mass next to my heart, the radiologist thought it might be a hiatal hernia—a portion of my stomach poking up through my diaphragm to press against the sac containing my heart. “Chug this can of Diet Dr. Pepper and then hop up on the table for another echocardiogram before the soda bubbles in your stomach all pop.” So I did. However, the resulting images showed that the mass did not contain the telltale signature of bursting bubbles in my stomach that would support a hernia diagnosis. A few weeks later, an MRI scan, which has much better resolution, revealed that the mass was actually contained within the pericardial sac and was quite large—about the volume of that soda can. Even with this large invader pressing on my heart, I had no symptoms and could exercise at full capacity. I felt great. The doctors told me that the mass was most likely to be a teratoma, a clump of cells that is not typically malignant. Their outlook was sunny. Riffing on the musical South Pacific, my cardiologist said, “We’re gonna pop that orange right out of your chest and send you on your way.” While I was recovering from surgery, the pathology report came back and the news was bad—it wasn’t a benign teratoma after all, but rather a malignant cancer called synovial sarcoma. Because of its location, embedded in my heart wall, the surgeon could not remove all of the cancer cells. Doing so would have rendered my heart unable to pump blood. The oncologist told me to expect to live an additional six to 18 months. (c) 2022 by The Atlantic Monthly Group.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 11: Emotions, Aggression, and Stress
Link ID: 28138 - Posted: 01.05.2022

Iris Berent How can a cellist play like an angel? Why am I engrossed in my book when others struggle with reading? And while we’re at it, can you tell me why my child won’t stop screaming? Now neuroscience offers the answers—or so say the news headlines. The brains of musicians “really do” differ from those of the rest of us. People with dyslexia have different neural connections than people without the condition. And your screaming toddler’s tantrums originate from her amygdala, a brain region linked to emotions. It’s all in the brain! Neuroscience is fascinating. But it is not just the love of science that kindles our interest in these stories. Few of us care for the technical details of how molecules and electrical charges in the brain give rise to our mental life. Furthermore, invoking the brain does not always improve our understanding. You hardly need a brain scan to tell that your toddler is enraged. Nor is it surprising that an amateur cellist’s brain works differently than Yo-Yo Ma’s—or that the brains of typical and dyslexic readers differ in some way. Where else would those differences reside? These sorts of science news stories speak to a bias: As numerous experiments have demonstrated, we have a blind spot for the brain. In classic work on the “seductive allure of neuroscience,” a team of researchers at Yale University presented participants with a psychological phenomenon (for instance, children learning new words), along with two explanations. One invoked a psychological mechanism, and the other was identical except it also dropped in a mention of a brain region. The brain details were entirely superfluous—they did nothing to improve the explanation, as judged by neuroscientists. Yet laypeople thought they did, so much so that once the brain was invoked, participants overlooked gross logical flaws in the accounts. © 2021 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 1: Introduction: Scope and Outlook
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 1: Cells and Structures: The Anatomy of the Nervous System
Link ID: 28105 - Posted: 12.11.2021

To eavesdrop on a brain, one of the best tools neuroscientists have is the fMRI scan, which helps map blood flow, and therefore the spikes in oxygen that occur whenever a particular brain region is being used. It reveals a noisy world. Blood oxygen levels vary from moment to moment, but those spikes never totally flatten out. “Your brain, even resting, is not going to be completely silent,” says Poortata Lalwani, a PhD student in cognitive neuroscience at the University of Michigan. She imagines the brain, even at its most tranquil, as kind of like a tennis player waiting to return a serve: “He’s not going to be standing still. He’s going to be pacing a little bit, getting ready to hit the backhand.” Many fMRI studies filter out that noise to find the particular spikes researchers want to scrutinize. But for Lalwani, that noise is the most telling signal of all. To her, it’s a signal of cognitive flexibility. Young, healthy brains tend to have signals with a lot of variability in blood oxygen levels from moment to moment. Older ones vary less, at least in certain regions of the brain. About a decade ago, scientists first showed the link between low neural signal variability and the kind of cognitive decline that accompanies healthy aging, rather than specific dementias. A brain’s noisiness is a solid proxy for details that are more abstract, Lalwani says: “How efficient information transfer is, how well-connected the neural networks are, in general how well-functioning the underlying neural network is.” But why that change happens with age has been a mystery. So has the question of whether it’s reversible. © 2021 Condé Nast.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 28091 - Posted: 11.24.2021

Anil Ananthaswamy How our brain, a three-pound mass of tissue encased within a bony skull, creates perceptions from sensations is a long-standing mystery. Abundant evidence and decades of sustained research suggest that the brain cannot simply be assembling sensory information, as though it were putting together a jigsaw puzzle, to perceive its surroundings. This is borne out by the fact that the brain can construct a scene based on the light entering our eyes, even when the incoming information is noisy and ambiguous. Consequently, many neuroscientists are pivoting to a view of the brain as a “prediction machine.” Through predictive processing, the brain uses its prior knowledge of the world to make inferences or generate hypotheses about the causes of incoming sensory information. Those hypotheses — and not the sensory inputs themselves — give rise to perceptions in our mind’s eye. The more ambiguous the input, the greater the reliance on prior knowledge. “The beauty of the predictive processing framework [is] that it has a really large — sometimes critics might say too large — capacity to explain a lot of different phenomena in many different systems,” said Floris de Lange, a neuroscientist at the Predictive Brain Lab of Radboud University in the Netherlands. However, the growing neuroscientific evidence for this idea has been mainly circumstantial and is open to alternative explanations. “If you look into cognitive neuroscience and neuro-imaging in humans, [there’s] a lot of evidence — but super-implicit, indirect evidence,” said Tim Kietzmann of Radboud University, whose research lies in the interdisciplinary area of machine learning and neuroscience. All Rights Reserved © 2021

Related chapters from BN: Chapter 17: Learning and Memory; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 13: Memory and Learning; Chapter 7: Vision: From Eye to Brain
Link ID: 28080 - Posted: 11.17.2021

Catherine Offord Earlier this year, Brian Butterworth decided to figure out how many numbers the average person encounters in a day. He picked a Saturday for his self-experiment—as a cognitive neuroscientist and professor emeritus at University College London, Butterworth works with numbers, so a typical weekday wouldn’t have been fair. He went about his day as usual, but kept track of how frequently he saw or heard a number, whether that was a symbol, such as 4 or 5, or a word such as “four” or “five.” He flicked through the newspaper, listened to the radio, popped out for a bit of shopping (taking special note of price tags and car license plates), and then, at last, sat down to calculate a grand total. “Would you like to take a guess?” he asks me when we speak over Zoom a couple of weeks later. I hazard that it’s well into the hundreds, but admit I’ve never thought about it before. He says: “I reckoned that I experienced about a thousand numbers an hour. A thousand numbers an hour is sixteen thousand numbers a day, is about five or six million a year. . . . That’s an awful lot of numbers.” Butterworth didn’t conduct his thought experiment just to satisfy his own curiosity. He’s including the calculation in an upcoming book, Can Fish Count?, slated for publication next year. In it, he argues that humans and other animals are constantly exposed to and make use of numbers—not just in the form of symbols and words, but as quantities of objects, of events, and of abstract concepts. Butterworth is one of several researchers who believe that the human brain can be thought of as having a “sense” for number, and that we, like our evolutionary ancestors, are neurologically hardwired to perceive all sorts of quantities in our environments, whether that serves for selecting the bush with more fruit on it, recognizing when a few predators on the horizon become too many, or telling from a show of hands when a consensus has been reached. © 1986–2021 The Scientist.
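A quick back-of-envelope check of the arithmetic behind Butterworth's estimate; the figure of 16 waking hours per day is an assumption implied by his own numbers rather than something stated in the article, and the snippet below is purely illustrative.

```python
# Hypothetical sanity check of the quoted estimate (not part of the article or study).
numbers_per_hour = 1_000
waking_hours_per_day = 16                                   # assumption implied by "sixteen thousand numbers a day"
numbers_per_day = numbers_per_hour * waking_hours_per_day   # 16,000
numbers_per_year = numbers_per_day * 365                    # 5,840,000 -- "about five or six million"
print(numbers_per_day, numbers_per_year)
```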

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 28051 - Posted: 10.27.2021

Annie Melchor After finishing his PhD in neuroscience in 2016, Thomas Andrillon spent a year road-tripping around Africa and South America with his wife. One evening, on a particularly difficult road in Patagonia, his mind began to wander and he ended up accidentally flipping the car. Luckily, no one was hurt. As locals rushed in to help, they asked Andrillon what had happened. Was there an animal on the road? Had he fallen asleep at the wheel? “I had difficulty explaining that I was just thinking about something else,” he remembers. This experience made him think. What had happened? What was going on in his brain when his mind began to wander? In 2017, Andrillon started his postdoctoral research with neuroscientists Naotsugu Tsuchiya and Joel Pearson at Monash University in Melbourne. Shortly after, Tsuchiya and Andrillon teamed up with philosopher Jennifer Windt, also at Monash, to dive into the neural basis of mind wandering. Initially, Andrillon says, they wanted to know if they could detect mind wandering from facial expressions, recalling how teachers claim to be very good at knowing when their students are not paying attention. So they did a pilot experiment in which they filmed their test subjects performing a tedious, repetitive task. After reviewing the videos, one of Andrillon’s students came to him, concerned. “I think we have a problem,” said the student. “[The subjects] look exhausted.” Sure enough, even though all the study participants were awake, they were obviously struggling to not fall asleep, says Andrillon. It was this observation that gave them the idea to broaden their focus, and start looking at the connection between wavering attention and sleep. © 1986–2021 The Scientist.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 14: Biological Rhythms, Sleep, and Dreaming
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 10: Biological Rhythms and Sleep
Link ID: 28016 - Posted: 10.02.2021

By Carl Zimmer Dr. Adam Zeman didn’t give much thought to the mind’s eye until he met someone who didn’t have one. In 2005, the British neurologist saw a patient who said that a minor surgical procedure had taken away his ability to conjure images. Over the 16 years since that first patient, Dr. Zeman and his colleagues have heard from more than 12,000 people who say they don’t have any such mental camera. The scientists estimate that tens of millions of people share the condition, which they’ve named aphantasia, and millions more experience extraordinarily strong mental imagery, called hyperphantasia. In their latest research, Dr. Zeman and his colleagues are gathering clues about how these two conditions arise through changes in the wiring of the brain that join the visual centers to other regions. And they’re beginning to explore how some of that circuitry may conjure other senses, such as sound, in the mind. Eventually, that research might even make it possible to strengthen the mind’s eye — or ear — with magnetic pulses. “This is not a disorder as far as I can see,” said Dr. Zeman, a cognitive scientist at the University of Exeter in Britain. “It’s an intriguing variation in human experience.” The patient who first made Dr. Zeman aware of aphantasia was a retired building surveyor who lost his mind’s eye after minor heart surgery. To protect the patient’s privacy, Dr. Zeman refers to him as M.X. When M.X. thought of people or objects, he did not see them. And yet his visual memories were intact. M.X. could answer factual questions such as whether former Prime Minister Tony Blair has light-colored eyes. (He does.) M.X. could even solve problems that required mentally rotating shapes, even though he could not see them. I came across M.X.’s case study in 2010 and wrote a column about it for Discover magazine. Afterward, I got emails from readers who had the same experience but who differed from M.X. in a remarkable way: They had never had a mind’s eye to begin with. © 2021 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 10: Vision: From Eye to Brain
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 7: Vision: From Eye to Brain
Link ID: 27851 - Posted: 06.11.2021

By Veronique Greenwood The coin is in the illusionist’s left hand, now it’s in the right — or is it? Sleight of hand tricks are old standbys for magicians, street performers and people who’ve had a little too much to drink at parties. On humans, the deceptions work pretty well. But it turns out that birds don’t always fall for the same illusions. Researchers in a small study published on Monday in the Proceedings of the National Academy of Sciences reported on Eurasian jays, birds whose intelligence has long been studied by comparative psychologists. The jays were not fooled, at least by tricks that rely on the viewer having certain expectations about how human hands work. However, they were fooled by another kind of trick, perhaps because of how their visual system is built. Magic tricks often play on viewers’ expectations, said Elias Garcia-Pelegrin, a graduate student at the University of Cambridge who is an author of the study. That magic can reveal the viewers’ assumptions suggests that tricks can be a way into understanding how other creatures see the world, he and his colleagues reasoned. Eurasian jays are not newcomers to subterfuge: To thwart thieves while they’re storing food, jays will perform something very like sleight of hand — sleight of beak, if you will — if another jay is watching. They’ll pretend to drop the food in a number of places, so its real location is concealed. © 2021 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27843 - Posted: 06.02.2021

By Veronique Greenwood Last spring, robins living on an Illinois tree farm sat on some unusual eggs. Alongside the customary brilliant blue ovoids they had laid were some unusually shaped objects. Although they had the same color, some were long and thin, stretched into pills. Others were decidedly pointy — so angular, in fact, that they bore little resemblance to eggs at all. If robins played Dungeons and Dragons, they might have thought, “Why do I have an eight-sided die in my nest?” The answer: Evolutionary biologists were gauging how birds decide what belongs in their nests, and what is an invasive piece of detritus that they need to throw out. Thanks to the results of this study, published Wednesday in Royal Society Open Science, we now know what the robins thought of the eggs, which were made of plastic and had been 3-D printed by the lab of Mark Hauber, a professor of animal behavior at the University of Illinois, Urbana-Champaign and a fellow at Hanse-Wissenschaftskolleg in Delmenhorst, Germany. He and his colleagues reported that the thinner the fake eggs got, the more likely the birds were to remove them from the nest. But curiously, the robins were more cautious about throwing out the pointy objects like that eight-sided die, which were closer in width to their own eggs. Birds, the results suggest, are using rules of thumb that are not intuitive to humans when they decide what is detritus and what is precious cargo. It’s not as uncommon as you’d think for robins to find foreign objects in their nests. They play host to cowbirds, a parasitic species that lays eggs in other birds’ nests, where they hatch and compete with the robins’ own offspring for nourishment. Confronted with a cowbird egg, which is beige and squatter than its blue ovals, parent robins will often push the parasite’s eggs out. That makes the species a good candidate for testing exactly what matters when it comes to telling their own eggs apart from other objects, Dr. Hauber said. © 2021 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27669 - Posted: 01.30.2021

By Cathleen O’Grady Golden paper wasps have demanding social lives. To keep track of who’s who in a complex pecking order, they have to recognize and remember many individual faces. Now, an experiment suggests the brains of these wasps process faces all at once—similar to how human facial recognition works. It’s the first evidence of insects identifying one another using “holistic” processing, and a clue to why social animals have evolved such abilities. The finding suggests holistic processing might not require big, complex brains, says Rockefeller University neuroscientist Winrich Freiwald, who wasn’t involved with the research. “It must be so hard to train these animals, so I find it fascinating how one can get such clear results,” he says. Most people recognize faces not from specific features, such as a unique beauty spot or the shape of a nose, but by processing them as a whole, taking in how all the features hang together. Experiments find that people are good at discriminating between facial features—like noses—when they see them in the context of a face but find it much harder when the features are seen in isolation. Other primates, including chimpanzees and rhesus macaques, use such holistic processing. And studies have even found that honey bees and wasps, trained to recognize human faces, have more difficulty with partial faces than whole ones, suggesting holistic processing. But biologists didn’t know whether insects actually use holistic processing naturally with each other. © 2021 American Association for the Advancement of Science.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27655 - Posted: 01.20.2021

By Amy Barrett Amy Barrett: So, let’s start at the very beginning. What’s involved in forming a thought? David Badre: Forming a thought is sort of the core problem, that’s a big mystery in human psychology and neuroscience. This book is kind of asking the next question: how do we go from a thought that we have, that we form. Some idea about what we want to do, some task we want to take, some goal that we have. How do we translate that into the actions we need to do to actually achieve that? And that’s something that we kind of take for granted. We do it at lots of times during the course of our day. And these can be big goals. You know, you want to go to university or you want to start a business or something. But it can also be just simple everyday goals like going and getting a cup of coffee, which is the example I use in the book. All of that requires making a link between this idea you have, a goal you have, and the actual actions. It turns out that’s not a trivial thing. The brain requires a special class of mechanisms to do that. And those are called cognitive control mechanisms by scientists. And that’s really what the book is about, because it affects so many aspects of our lives. How we do that translation between our thoughts and how we behave. © BBC

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27654 - Posted: 01.20.2021

By Veronique Greenwood Zipping through water like shimmering arrowheads, cuttlefish are swift, sure hunters — death on eight limbs and two waving tentacles for small creatures in their vicinity. They morph to match the landscape, shifting between a variety of hues and even textures, using tiny structures that expand and contract beneath their skin. They even seem to have depth perception, researchers using tiny 3-D vision glasses found, setting them apart from octopuses and squids. And their accuracy at striking prey is remarkable. But for cuttlefish, these physical feats in pursuit of food are not the whole story. A new study published this month in the journal Royal Society Open Science shows that there is even more to cuttlefish cognition than scientists may have known. The sea creatures appear to be capable of performing calculations that are more complicated than simply “more food is better.” Presented with a choice between one shrimp and two, they will actually choose the single shrimp when they have learned through experience that they are rewarded for this choice. While the braininess of their octopus cousins gets a lot of attention, researchers who study animal cognition have uncovered surprising talents in cuttlefish over the years. For instance, the cephalopods will hunt fewer crabs during the day if they learn that shrimp, their preferred food, is predictably available during the night. That shows that they can think ahead. Chuan-Chin Chiao, a biologist at National Tsing Hua University in Taiwan, and an author of the current paper alongside his colleague Tzu-Hsin Kuo, has found in the past that cuttlefish that are hungry will choose a bigger, harder-to-catch shrimp to attack, and those that are not will choose smaller, easier-to-catch ones. But researchers have also found that animals do not always make decisions that seem logical at first glance. Like humans, whose behavior rarely fits economists’ visions of what an ideal, rational creature would do, animals respond to their environments using learned experiences. © 2020 The New York Times Company

Related chapters from BN: Chapter 6: Evolution of the Brain and Behavior; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27637 - Posted: 12.31.2020

Sarah Sloat Patience, you might have heard, is a virtue. That’s why so many Puritans named their daughters “Patience” in the 1600s. It is the ability to wait calmly in the face of frustration or adversity. Like Penelope weaving while waiting for Odysseus, patient people wait for their partners to finish a Netflix show they’re binging. Impatient people do not. But despite the societal framing of patience as a measure of character, in its purest sense, patience is a chemically induced output of the brain. However, exactly what goes on in the brain that leads to patience isn’t well understood. A new study involving mice takes a step toward understanding patience by pointing to the role of serotonin, and how it interacts with different brain structures. Serotonin is a chemical and a neurotransmitter, meaning it sends messages throughout the brain. It influences many behaviors, including mood and sleep. In a paper recently released in the journal Science Advances, scientists argue that serotonin influences specific areas of the brain to promote patient behavior. But critically, this process only occurs if there’s already “high expectation or confidence” that being patient will lead to future rewards. First author Katsuhiko Miyazaki is a scientist at the Okinawa Institute of Science and Technology in Japan who researches the relationship between serotonergic neural activity and animal behavior. He tells me this study originated from an interest in revealing how projections of serotonin promote waiting for future rewards.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 13: Memory and Learning
Link ID: 27615 - Posted: 12.09.2020

By Kashmir Hill and Jeremy White There are now businesses that sell fake people. On the website Generated.Photos, you can buy a “unique, worry-free” fake person for $2.99, or 1,000 people for $1,000. If you just need a couple of fake people — for characters in a video game, or to make your company website appear more diverse — you can get their photos for free on ThisPersonDoesNotExist.com. Adjust their likeness as needed; make them old or young or the ethnicity of your choosing. If you want your fake person animated, a company called Rosebud.AI can do that and can even make them talk. These simulated people are starting to show up around the internet, used as masks by real people with nefarious intent: spies who don an attractive face in an effort to infiltrate the intelligence community; right-wing propagandists who hide behind fake profiles, photo and all; online harassers who troll their targets with a friendly visage. The A.I. system sees each face as a complex mathematical figure, a range of values that can be shifted. Choosing different values — like those that determine the size and shape of eyes — can alter the whole image. For other qualities, our system used a different approach. Instead of shifting values that determine specific parts of the image, the system first generated two images to establish starting and end points for all of the values, and then created images in between. The creation of these types of fake images only became possible in recent years thanks to a new type of artificial intelligence called a generative adversarial network. In essence, you feed a computer program a bunch of photos of real people. It studies them and tries to come up with its own photos of people, while another part of the system tries to detect which of those photos are fake. The back-and-forth makes the end product ever more indistinguishable from the real thing. The portraits in this story were created by The Times using GAN software that was made publicly available by the computer graphics company Nvidia. © 2020 The New York Times Company
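A minimal sketch of the generator-versus-discriminator idea the article describes, written in PyTorch on toy 2-D data rather than face photos. The network sizes, learning rates, and training loop are illustrative assumptions; this is not the Nvidia GAN software mentioned in the piece, only an outline of how the two competing networks are trained.

```python
# Minimal GAN sketch (assumptions: toy 2-D "data", small fully connected nets).
import torch
import torch.nn as nn

latent_dim = 8   # size of the random values the generator starts from
data_dim = 2     # stand-in for an image; real face models use far more dimensions

generator = nn.Sequential(
    nn.Linear(latent_dim, 32), nn.ReLU(),
    nn.Linear(32, data_dim),
)
discriminator = nn.Sequential(
    nn.Linear(data_dim, 32), nn.ReLU(),
    nn.Linear(32, 1),        # single logit: "real" vs. "fake"
)

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

def real_batch(n=64):
    # Stand-in for "photos of real people": samples from a fixed Gaussian.
    return torch.randn(n, data_dim) * 0.5 + torch.tensor([2.0, -1.0])

for step in range(2000):
    real = real_batch()
    fake = generator(torch.randn(real.size(0), latent_dim))

    # 1) Discriminator learns to score real samples high and generated ones low.
    d_loss = bce(discriminator(real), torch.ones(real.size(0), 1)) + \
             bce(discriminator(fake.detach()), torch.zeros(real.size(0), 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # 2) Generator learns to produce samples the discriminator scores as real.
    g_loss = bce(discriminator(fake), torch.ones(real.size(0), 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# After training, new "fakes" are made by feeding fresh random values through the generator.
samples = generator(torch.randn(5, latent_dim))
print(samples)
```

The back-and-forth the article mentions is steps 1 and 2 repeated many times: as the discriminator gets better at spotting fakes, the generator is pushed to produce samples ever closer to the real data.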

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27589 - Posted: 11.21.2020

Diana Kwon It all began with a cough. Three years ago Tracey McNiven, a Scottish woman in her mid-30s, caught a bad chest infection that left her with a persistent cough that refused to subside, even after medication. A few months later strange symptoms started to appear. McNiven noticed numbness spreading through her legs and began to feel that their movement was out of her control. When she walked, she felt like a marionette, with someone else pulling the strings. Over the course of two weeks the odd loss of sensation progressively worsened. Then, one evening at home, McNiven's legs collapsed beneath her. “I was lying there, and I felt like I couldn't breathe,” she recalls. “I couldn't feel below my waist.” McNiven's mother rushed her to the hospital where she remained for more than half a year. During her first few weeks in the hospital, McNiven endured a barrage of tests as doctors tried to uncover the cause of her symptoms. It could be a progressive neurodegenerative condition such as motor neuron disease, they thought. Or maybe it was multiple sclerosis, a disease in which the body's own immune cells attack the nervous system. Bafflingly, however, the brain scans, blood tests, spinal taps and everything else came back normal. McNiven's predicament is not uncommon. According to one of the most comprehensive assessments of neurology clinics to date, roughly a third of patients have neurological symptoms that are deemed to be either partially or entirely unexplained. These may include tremor, seizures, blindness, deafness, pain, paralysis and coma and can parallel those of almost any neurological disease. In some patients, such complications can persist for years or even decades; some people require wheelchairs or cannot get out of bed. Although women are more often diagnosed than men, such seemingly inexplicable illness can be found in anyone and across the life span. © 2020 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 11: Emotions, Aggression, and Stress
Link ID: 27586 - Posted: 11.18.2020

By Benedict Carey Merriam-Webster’s defines a time warp as a “discontinuity, suspension or anomaly” in the otherwise normal passage of time; this year all three terms could apply. It seems like March happened 10 years ago; every day may as well be Wednesday, and still, somehow, here come the holidays — fast, just like every year. Some bard or novelist may yet come forth to help explain the paradoxes of pandemic time, both its Groundhog Days and the blurs of stress and fear for those on the front lines, or who had infectious people in their household. But brain science also has something to say about the relationship between perceived time and the Greenwich Mean variety, and why the two may slip out of sync. In a new study, a research team based in Dallas reported the first strong evidence to date of so-called “time cells” in the human brain. The finding, posted by the journal PNAS, was not unexpected: In recent years, several research groups have isolated neurons in rodents that track time intervals. It’s where the scientists look for these cells, and how they identified them, that provide some insight into the subjective experiences of time. “The first thing to say is that, strictly speaking, there is no such thing as ‘time cells’ in the brain,” said Gyorgy Buzsaki, a neuroscientist at New York University who was not involved in the new research. “There is no neural clock. What happens in the brain is neurons change in response to other neurons.” He added, “Having said that, it’s a useful concept to talk about how this neural substrate represents the passage of what we call time.” In the new study, a team led by Dr. Bradley Lega, a neurosurgeon at UT Southwestern Medical Center, analyzed the firing of cells in the medial temporal area, a region deep in the brain that is essential for memory formation and retrieval. It’s a natural place to look: Memories must be somehow “time-stamped” to retain some semblance of sequence, or chronological order. © 2020 The New York Times Company

Related chapters from BN: Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27576 - Posted: 11.10.2020

By Scott Barry Kaufman Do you get excited and energized by the possibility of learning something new and complex? Do you get turned on by nuance? Do you get really stimulated by new ideas and imaginative scenarios? If so, you may have an influx of dopamine in your synapses, but not where we traditionally think of this neurotransmitter flowing. In general, the potential for growth from disorder has been encoded deeply into our DNA. We didn’t only evolve the capacity to regulate our defensive and destructive impulses, but we also evolved the capacity to make sense of the unknown. Engaging in exploration allows us to integrate novel or unexpected events with existing knowledge and experiences, a process necessary for growth. Dopamine production is essential for growth. But there are so many misconceptions about the role of dopamine in cognition and behavior. Dopamine is often labeled the “feel-good molecule,” but this is a gross mischaracterization of this neurotransmitter. As personality neuroscientist Colin DeYoung (a close colleague of mine) notes, dopamine is actually the “neuromodulator of exploration.” Dopamine’s primary role is to make us want things, not necessarily like things. We get the biggest rush of dopamine coursing through our brains at the possibility of reward, but this rush is no guarantee that we’ll actually like or even enjoy the thing once we get it. Dopamine is a huge energizing force in our lives, driving our motivation to explore and facilitating the cognitive and behavioral processes that allow us to extract the most delights from the unknown. If dopamine is not all about feeling good, then why does the feel-good myth persist in the public imagination? I think it’s because so much research on dopamine has been conducted with regard to its role in motivating exploration toward our more primal “appetitive” rewards, such as chocolate, social attention, social status, sexual partners, gambling or drugs like cocaine. © 2020 Scientific American

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 4: The Chemistry of Behavior: Neurotransmitters and Neuropharmacology
Related chapters from MM: Chapter 14: Attention and Higher Cognition
Link ID: 27549 - Posted: 10.26.2020

Shawna Williams In Greek mythology, Orpheus descends to the underworld and persuades Hades to allow him to take his dead wife, Eurydice, back to the realm of the living. Hades agrees, but tells Orpheus that he must not look back until he has exited the underworld. Despite the warning, Orpheus glances behind him on his way out to check whether Eurydice is indeed following him—and loses her forever. The story hints at a dark side to curiosity, a drive to seek certain kinds of knowledge even when doing so is risky—and even if the information serves no practical purpose at the time. In fact, the way people pursue information they’re curious about can resemble the drive to attain more tangible rewards such as food—a parallel that hasn’t been lost on scientists. To investigate the apparent similarity between curiosity and hunger, researchers led by Kou Murayama of the University of Reading in the UK recently devised an experiment to compare how the brain processes desires for food and knowledge, and the risks people are willing to take to satisfy those desires. Beginning in 2016, the team recruited 32 volunteers and instructed them not to eat for at least two hours before coming into the lab. After they arrived, the volunteers’ fingers were hooked up to electrodes that could deliver a weak current, and researchers calibrated the level of electricity to what each participant reported was uncomfortable, but not painful. Then, still hooked up to the electrodes, the volunteers were asked to gamble: they viewed either a photo of a food item or a video of a magician performing a trick, followed by a visual depiction of their odds of “winning” that round (which ranged from 1:6 to 5:6). © 1986–2020 The Scientist.

Related chapters from BN: Chapter 18: Attention and Higher Cognition; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 14: Attention and Higher Cognition; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 27535 - Posted: 10.21.2020