Most Recent Links



Links 1 - 20 of 29035

By Yasemin Saplakoglu Imagine you’re on a first date, sipping a martini at a bar. You eat an olive and patiently listen to your date tell you about his job at a bank. Your brain is processing this scene, in part, by breaking it down into concepts. Bar. Date. Martini. Olive. Bank. Deep in your brain, neurons known as concept cells are firing. You might have concept cells that fire for martinis but not for olives. Or ones that fire for bars — perhaps even that specific bar, if you’ve been there before. The idea of a “bank” also has its own set of concept cells, maybe millions of them. And there, in that dimly lit bar, you’re starting to form concept cells for your date, whether you like him or not. Those cells will fire when something reminds you of him. Concept neurons fire for their concept no matter how it is presented: in real life or a photo, in text or speech, on television or in a podcast. “It’s more abstract, really different from what you’re seeing,” said Elizabeth Buffalo, a neuroscientist at the University of Washington. For decades, neuroscientists mocked the idea that the brain could have such intense selectivity, down to the level of an individual neuron: How could there be one or more neurons for each of the seemingly countless concepts we engage with over a lifetime? “It’s inefficient. It’s not economic,” people broadly agreed, according to the neurobiologist Florian Mormann at the University of Bonn. But when researchers identified concept cells in the early 2000s, the laughter started to fade. Over the past 20 years, they have established that concept cells not only exist but are critical to the way the brain abstracts and stores information. New studies, including one recently published in Nature Communications, have suggested that they may be central to how we form and retrieve memory. © 2025 Simons Foundation

Keyword: Learning & Memory; Attention
Link ID: 29639 - Posted: 01.22.2025

By Holly Barker Previously unrecognized genetic changes on the X chromosome of autistic people could explain the higher prevalence of the condition among men and boys than among women and girls, according to two new studies. About 60 variants are more common in people with autism than in those without the condition, an analysis of roughly 15,000 X chromosomes revealed. Several of the variants are in Xp22.11, a region of the X chromosome linked to autism in boys and men. In the second study, the team pinpointed 27 autism-linked variants in DDX53, one of the genes in the vulnerable region that had not been tied to the condition in past research. Those findings could help explain why autism is diagnosed three to four times more often in boys than girls, according to the study investigators, led by Stephen Scherer, chief of research at SickKids Research Institute. Although that disparity is likely influenced by social factors—male-only studies could lead to autism being less recognizable in women and girls, and girls may be conditioned to mask their autism traits—there is also a clear biological component. The X chromosome plays an outsized role in brain development, and many genes on the chromosome are strongly linked to autism, previous studies have found. Still, the sex chromosomes have been mostly ignored in genetic searches of autism variants, says Aaron Besterman, associate clinical professor of psychiatry at the University of California, San Diego, who was not involved in the work. “It’s been a dirty little secret that for a long time the X chromosome has not been well interrogated from a genetics perspective,” he says. Sex chromosomes are often sidelined because of difficulties interpreting data, given that men possess half as many copies of X-linked genes as women. What’s more, random inactivation of X chromosomes makes it hard to tell how a single variant is expressed in female tissues. And the existence of pseudoautosomal regions—stretches of DNA that behave like regular chromosomes and escape inactivation—complicates matters further. © 2025 Simons Foundation

Keyword: Autism; Sexual Behavior
Link ID: 29638 - Posted: 01.22.2025

Hannah Devlin Science correspondent A groundbreaking NHS trial will attempt to boost patients’ mood using a brain-computer interface that directly alters brain activity with ultrasound. The device, which is designed to be implanted beneath the skull but outside the brain, maps activity and delivers targeted pulses of ultrasound to “switch on” clusters of neurons. Its safety and tolerability will be tested on about 30 patients in the £6.5m trial, funded by the UK’s Advanced Research and Invention Agency (Aria). In future, doctors hope the technology could revolutionise the treatment of conditions such as depression, addiction, OCD and epilepsy by rebalancing disrupted patterns of brain activity. Jacques Carolan, Aria’s programme director, said: “Neurotechnologies can help a much broader range of people than we thought. Helping with treatment resistant depression, epilepsy, addiction, eating disorders, that is the huge opportunity here. We are at a turning point in both the conditions we hope we can treat and the new types of technologies emerging to do that.” The trial follows rapid advances in brain-computer interface (BCI) technology, with Elon Musk’s company Neuralink launching a clinical trial in paralysis patients last year and another study restoring communication to stroke patients by translating their thoughts directly into speech. However, the technologies raise significant ethical issues around the ownership and privacy of data, the possibility of enhancement and the risk of neuro-discrimination, whereby brain data might be used to judge a person’s suitability for employment or medical insurance. © 2025 Guardian News & Media Limited

Keyword: Depression; Brain imaging
Link ID: 29637 - Posted: 01.22.2025

By Angie Voyles Askham More than 150 years after the first known description of Huntington’s disease and 32 years after the causative gene, HTT, was identified, new evidence has emerged to explain how variants linked to the disease devastate the brain: The toxicity comes not from the initial variant itself but rather from its dynamic expansion past a set threshold in specific cells, according to a study published today in Cell. The results help explain why most people with Huntington’s disease don’t start to show symptoms—including muscle rigidity, irregular movements and severe psychological issues—until age 30 to 50, with the gradual loss of striatal projection neurons, also called medium spiny neurons, says co-lead researcher Steven McCarroll, professor of biomedical science and genetics at Harvard Medical School. “We hadn’t been thinking about mutations as dynamic things” that become toxic only later in life, he says. The HTT variants associated with Huntington’s disease all have extra repeats of the DNA triplet CAG. Typical people carry about 15 to 30 of these repeats, and those with the disease tend to have 40 or more. The disease-linked expansions, which are known to grow even larger over time, result in a gangly version of the Huntington’s protein that is thought to cause neurons to malfunction and degenerate. But the expansion does not appear to affect a cell’s biology until it exceeds 150 CAG copies, according to the new study. And the repeats accumulate quietly over the course of years, and at different rates for different cells. Striatal projection neurons with more than 150 repeats have severely dysregulated transcriptomes, McCarroll and his colleagues found by analyzing gene expression in postmortem tissue from people with Huntington’s disease. But other cell types in the striatum, including oligodendrocytes and interneurons, do not end up with as many repeats, nor do they undergo similar transcriptomic changes, the work shows. © 2025 Simons Foundation

Keyword: Huntingtons; Genes & Behavior
Link ID: 29636 - Posted: 01.22.2025

Rachael Elward and Lauren Ford Severance, which imagines a world where a person’s work and personal lives are surgically separated, will soon return to Apple TV+ for a second season. While the concept of this gripping piece of science fiction is far-fetched, it touches on some interesting neuroscience. Can a person’s mind really be surgically split in two? Remarkably, “split-brain” patients have existed since the 1940s. To control epilepsy symptoms, these patients underwent surgery to separate the left and right hemispheres. Similar surgeries still happen today. Later research on this type of surgery showed that the separated hemispheres of split-brain patients could process information independently. This raises the uncomfortable possibility that the procedure creates two separate minds living in one brain. In season one of Severance, Helly R (Britt Lower) experienced a conflict between her “innie” (the side of her mind that remembered her work life) and her “outie” (the side outside of work). Similarly, there is evidence of a conflict between the two hemispheres of real split-brain patients. When speaking with split-brain patients, you are usually communicating with the left hemisphere of the brain, which controls speech. However, some patients can communicate from their right hemisphere by writing, for example, or arranging Scrabble letters. A young patient was asked what job he would like in the future. His left hemisphere chose an office job making technical drawings. His right hemisphere, however, arranged letters to spell “automobile racer”. Split-brain patients have also reported “alien hand syndrome”, where one of their hands is perceived to be moving of its own volition. These observations suggest that two separate conscious “people” may coexist in one brain and may have conflicting goals. In Severance, however, both the innie and the outie have access to speech. This is one indicator that the fictional “severance procedure” must involve a more complex separation of the brain’s networks. © 2010–2025, The Conversation US, Inc.

Keyword: Learning & Memory; Consciousness
Link ID: 29635 - Posted: 01.18.2025

By Phie Jacobs For more than 30 years, scientists have known the genetic culprit behind Huntington disease, a devastating neurodegenerative disorder that causes cells deep in the brain to sicken and die. But they couldn’t account for why people who inherit the faulty gene variant take so long to develop symptoms, or why disease progression varies so widely from person to person. A study published today in Cell helps explain: In the brain cells that die off in Huntington, a repetitive stretch of a gene’s DNA gets longer and longer over a person’s life, and this accelerating expansion turns deadly to the cell—and ultimately to the person. The findings represent “a really remarkable insight,” says Leslie Thompson, a neuroscientist at the University of California, Irvine who wasn’t involved in the new research. “This study and some others are changing the way that we’re thinking about the disease.” People who develop Huntington inherit a flawed version of the HTT gene, which produces a protein called huntingtin. This gene contains an unusual stretch of DNA, where a sequence of three of its nucleotide bases—cytosine, adenine, and guanine, or CAG in genetic parlance—are repeated multiple times in a row. And although most people inherit versions of HTT with about 15 to 30 consecutive CAG repeats and never develop Huntington, those with 40 or more in the gene almost always have symptoms later in life, including psychological and cognitive problems and uncontrolled, jerking movements known as chorea. The genetic stutter produces an abnormally large, unstable version of the huntingtin protein, which forms clumps inside brain cells. The condition usually leads to early death, often from issues related to difficulty swallowing, injuries from falls, or suicide. The longer a person’s stretch of repeats, the earlier the disorder rears its head. Scientists originally thought the number of CAG repeats only increased as the HTT gene was passed down through generations; a child of a parent with Huntington might themselves develop the condition at an earlier age. But it turns out the length of this genetic “stutter” can change over a person’s life in at least some of their cells. A 2003 study analyzed brain samples donated by people who had died of Huntington and found shockingly large CAG expansions in a part of the brain known as the striatum.

Keyword: Huntingtons; Genes & Behavior
Link ID: 29634 - Posted: 01.18.2025

By Anna Victoria Molofsky Twenty years ago, a remarkable discovery upended our understanding of the range of elements that can shape neuronal function: A team in Europe demonstrated that enzymatic digestion of the extracellular matrix (ECM)—a latticework of proteins that surrounds all brain cells—could restore plasticity to the visual cortex even after the region’s “critical period” had ended. Other studies followed, showing that ECM digestion could also alter learning in the hippocampus and other brain circuits. These observations established that proteins outside neurons can control synaptic plasticity. We now know that up to 20 percent of the brain is extracellular space, filled with hundreds of ECM proteins—a “matrisome” that plays multiple roles, including modulating synaptic function and myelin formation. ECM genes in the human brain are different from those in other species, suggesting that the proteins they encode could be part of what makes our brains unique and keeps them healthy. For example, in a large population study that examined blood protein biomarkers of organ aging, posted as a preprint on bioRxiv last year, the presence of ECM proteins was most highly correlated with a youthful brain. Matrisome proteins are also dysregulated in astrocytes from people at high risk for Alzheimer’s disease, another study showed. Despite the influence of these proteins and the ongoing work of a few dedicated researchers, however, the ECM field has not caught on. I would challenge a room full of neuroscientists to name one protein in the extracellular matrix. To this day, the only ECM components most neuroscientists have heard of are “perineuronal nets”—structures that play an important role in stabilizing synapses but make up just a tiny fraction of the matrisome. A respectable scientific journal, covering its own paper that identified a critical impact of ECM, called it “brain goop.” © 2025 Simons Foundation

Keyword: Learning & Memory; Glia
Link ID: 29633 - Posted: 01.18.2025

By Aimee Cunningham If cigarettes contained very little of the chemical that keeps people smoking, it could help smokers move away from these deadly products. That’s the rationale behind a new rule proposed on January 15 by the U.S. Food and Drug Administration, which seeks to limit the amount of the addictive chemical nicotine in cigarettes. The reduced-nicotine cigarettes would have less than 5 percent of the amount of nicotine that’s generally found in regular cigarettes. The rule would also cap the nicotine in certain other products in which the tobacco leaves are burned. The FDA rule is just one step toward reduced-nicotine cigarettes and other combusted tobacco products becoming the standard. This process would probably take many years, depending on the priorities of future administrations and whether the tobacco industry challenges the rule in court, as it has the FDA’s rule placing graphic warning labels on its products. The 2009 Family Smoking Prevention and Tobacco Control Act gave the FDA the authority to require graphic warning labels and to reduce nicotine in tobacco products. The idea for a nicotine limit has been around for decades. And the evidence supporting drastically lowering the amount of nicotine in combusted tobacco products has grown during that time. Randomized controlled trials of reduced-nicotine cigarettes report that people using them end up smoking fewer cigarettes per day. That’s also the case for studies that focused on groups at higher risk for smoking, including people who are socioeconomically disadvantaged and people with mental health conditions. © Society for Science & the Public 2000–2025.
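To put that “less than 5 percent” cap in rough numbers, here is a back-of-the-envelope illustration; the baseline of roughly 15 milligrams of nicotine per gram of cigarette tobacco is an assumed typical value, not a figure given in the article.

\[
0.05 \times 15\ \tfrac{\text{mg nicotine}}{\text{g tobacco}} \approx 0.75\ \tfrac{\text{mg nicotine}}{\text{g tobacco}}
\]

Under that assumption, the cap would sit below about three-quarters of a milligram of nicotine per gram of tobacco, in keeping with the rule’s rationale that cigarettes containing very little of the addictive chemical would help smokers move away from them.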

Keyword: Drug Abuse
Link ID: 29632 - Posted: 01.18.2025

Hannah Devlin Science correspondent A powerful psychedelic that is used in healing ceremonies by Indigenous groups in the Amazon is being trialled as a pioneering approach to reduce problematic alcohol consumption. Dimethyltryptamine (DMT) is the active ingredient in ayahuasca, a hallucinogenic brew that has been used for thousands of years by shamans in South America. Scientists based at University College London are testing whether a one-off dose of the drug could help hazardous drinkers who want to reduce their alcohol intake. Alcohol addiction is notoriously difficult to overcome and there are few effective therapies available. “The current treatments really don’t work for a large proportion of people. For alcohol addiction, 50% of people relapse within three months and around 60-70% within three years,” said Prof Ravi Das, who is co-leading the trial at University College London with Prof Jeremy Skipper. “Treatment itself hasn’t changed fundamentally in 70 years, so there’s a desperate need for new drugs and treatment approaches. To the extent that DMT might provide a more effective treatment approach, it is worth exploring.” In its pure form, DMT is one of the most powerful psychoactive substances found in nature. “The dose we chose reliably produces strong effects,” said Dr Greg Cooper, a research fellow at UCL, adding that this included total out-of-body experiences, fully immersive hallucinations and entering colourful geometric landscapes. © 2025 Guardian News & Media Limited

Keyword: Drug Abuse
Link ID: 29631 - Posted: 01.18.2025

By Shaena Montanari Just as romantic partners exhibit more similar brain waves than do strangers when, say, drawing on an Etch A Sketch toy together, animal pairs also show neural synchrony during social interactions and cooperation tasks. “Neural synchrony is something that happens in these minute-to-minute engagements that you have with another individual,” says Zoe Donaldson, associate professor of behavioral neuroscience at the University of Colorado Boulder. But over time, too, pairs in a relationship learn to infer what their partner is going to do, she adds. In prairie voles, at least, that learning process may unfold at the molecular level in the form of “transcriptional synchrony,” according to a preprint Donaldson and her colleagues posted on bioRxiv in November. Prairie voles are socially monogamous, and after two of them bond, gene-expression patterns in their nucleus accumbens—a forebrain region linked to reward and social interaction—start to align. It remains unclear whether this transcriptional synchrony causes pair bonding or only correlates with it, she adds, but in the meantime, it offers researchers a new place to hunt for the basis of these strong social ties. This new study “pushes the limits of what’s possible” technically, says Robert Froemke, professor in New York University’s Neuroscience Institute and otolaryngology department, who was not involved in the study. Though the existence of neural synchrony logically suggests that there may also be shared patterns of gene expression, “it’s still remarkable to actually have it documented,” he says. The new preprint offers the first evidence of transcriptional synchrony in prairie voles, Donaldson says, but a 2020 study revealed that fighting pairs of Betta splendens fish show a strong correlation of gene expression after 60 minutes of fighting, and only a weak correlation after 20 minutes. © 2025 Simons Foundation

Keyword: Sexual Behavior
Link ID: 29630 - Posted: 01.15.2025

By Giorgia Guglielmi Amid the rising buzz around Ozempic and similar weight-loss drugs, a group of 58 researchers is challenging the way obesity is defined and diagnosed, arguing that current methods fail to capture the complexity of the condition. They offer a more nuanced approach. The group’s revised definition, published in The Lancet Diabetes & Endocrinology on 14 January, focuses on how excess body fat, a measure called adiposity, affects the body, rather than relying only on body mass index (BMI), which links a person’s weight to their height. They propose two categories: preclinical obesity, when a person has extra body fat but their organs work normally, and clinical obesity, when excess fat harms the body’s organs and tissues. This shift could improve clinical care, public-health policies and societal attitudes toward obesity, says Elisabeth van Rossum, an endocrinologist at the Erasmus University Medical Center Rotterdam in the Netherlands. “Now the idea is, eat less, move more, and you’ll lose weight,” says van Rossum, who wasn’t involved in the work. Although a healthy lifestyle is important, she adds, “if it would be so simple, we wouldn’t have an epidemic, and this paper is an excellent contribution to the discussion about the complexity of obesity”. More than 1 billion people worldwide live with obesity, and the condition is linked to about 5 million deaths every year from disorders such as diabetes and cardiovascular disease. Because it is easy to measure and compare, BMI has long been used as a tool to diagnose obesity. But it doesn’t offer a full picture of a person’s health, because it doesn’t account for differences in body composition, such as muscle versus fat. © 2025 Springer Nature Limited
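For readers unfamiliar with the index, BMI is simply weight divided by height squared; the formula is standard, and the numbers below are illustrative only, not taken from the study.

\[
\mathrm{BMI} = \frac{\text{weight (kg)}}{\text{height (m)}^2}, \qquad \text{e.g. } \frac{70\ \text{kg}}{(1.75\ \text{m})^2} \approx 22.9\ \text{kg/m}^2
\]

Because the same BMI can describe very different mixes of muscle and fat, it says little about adiposity or organ function, which is the limitation the revised definition is meant to address.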

Keyword: Obesity
Link ID: 29629 - Posted: 01.15.2025

By Jennifer Kahn Here’s a strange story: One day two summers ago, I woke up because my arms — both of them — hurt. Not the way they do when you’ve slept in a funny position, but as if the tendons in my forearms and hands were moving through mud. What felt like sharp electric shocks kept sparking in my fingers and sometimes up the inside of my biceps and across my chest. Holding anything was excruciating: a cup, a toothbrush, my phone. Even doing nothing was miserable. It hurt when I sat with my hands in my lap, when I stood, when I lay flat on the bed or on my side. The slightest pressure — a bedsheet, a watch band, a bra strap — was intolerable. It was August, and every doctor seemed to be away on vacation. The ones I did manage to see were politely stumped. It wasn’t carpal tunnel, tennis elbow or any other injury they could identify. I did nothing unusual the day before: an hour of work on my laptop, followed by a visit with a friend. We sat in her backyard and talked. For the first few weeks, I could barely sleep. Over the following months, I lost weight — almost a pound a week. I couldn’t drive, or cook, or use my laptop for work, or even hold a book or a pen. I would have been bored, except the pain was so tiring that I could barely function. I spent the days shuffling around the house listening to audiobooks and doing voice-to-text searches for “nerve pain arms” with my phone flat on the table, then carefully, painfully, scrolling through the results. I think we’re past the point where I have to explain that chronic pain is not the result of imbalanced humors or a wandering uterus or possession by demons. But for more modern skeptics, this is where I should add that chronic pain also isn’t just “all in your head” or “not really that bad” — or any of the other ways in which people who suffer from it are still regularly gaslit and dismissed. Personally, I never had to contend with not being believed, almost certainly because I’m an otherwise healthy, reasonably well-off white woman with a clean medical history and no significant record of anxiety or depression. Instead, I was taken seriously. A whole gamut of tests was run. My wrists were X-rayed. I had an M.R.I. on my cervical spine. Each new doctor ordered new blood tests: some for vitamin deficiencies, others for autoimmune diseases like rheumatoid arthritis. © 2025 The New York Times Company

Keyword: Pain & Touch
Link ID: 29628 - Posted: 01.15.2025

By Meghan Rosen Baby Boomers may drive a bigger-than-expected boom in dementia cases. By 2060, 1 million U.S. adults per year will develop dementia, scientists predict January 13 in Nature Medicine. Dementia is a broad term encompassing many symptoms, including memory, reasoning and language difficulties that interfere with people’s daily lives. Researchers estimate that it currently affects more than 6 million people in the United States. “This is a huge problem,” says Josef Coresh, an epidemiologist at New York University’s Grossman School of Medicine. A rise in the projected number of dementia cases is not surprising, given the aging U.S. population — but the extent of the rise stands out, he says. His team predicts that 42 percent of people in the United States who are over 55 years old will develop dementia sometime during their lifetime. That’s about double the percentage estimated by previous researchers. Coresh’s new estimate is based on a study population that’s larger — more than 15,000 people — and more diverse than earlier work. His team followed participants for years, in some cases decades, using several methods to identify dementia cases. They pored over hospital and death records, evaluated participants in person and screened them by phone. For the last decade, the researchers have been calling participants twice a year, Coresh says. That gave the team a window into people’s lives, revealing dementia cases that might otherwise have gone unreported. Though the team focused on dementia in people over age 55, risk doesn’t typically start ticking up for decades. And some populations were at greater risk than others, including women, Black people and those with a particular gene variant linked to Alzheimer’s disease. © Society for Science & the Public 2000–2025.

Keyword: Alzheimers
Link ID: 29627 - Posted: 01.15.2025

By Mitch Leslie Scientists think sleep is the brain’s rinse cycle, when fluid percolating through the organ flushes out chemical waste that accumulated while we were awake. But what propels this circulation has been uncertain. A study of mice, reported today in Cell, suggests regular contractions of blood vessels in the brain, stimulated by the periodic release of a chemical cousin of adrenaline, push the fluid along. “This is excellent science,” says neuroscientist Suzana Herculano-Houzel of Vanderbilt University, who wasn’t connected to the study. “They put a number of pieces of evidence together that tell a pretty compelling story.” The scientists also found that the sleep drug zolpidem, better known as Ambien, impedes the blood vessel oscillations and the fluid flow they promote, implying it could hamper cleansing. The finding could help researchers create new sleep aids that preserve this brain-scrubbing function. The brain lacks the lymphatic vessels that collect and move fluid in other parts of the body. But in 2012, neuroscientist Maiken Nedergaard of the University of Rochester Medical Center and colleagues identified an alternative drainage system in which cerebrospinal fluid, the liquid bathing the brain, seeps through the organ via tiny passages alongside blood vessels, sweeping away metabolic refuse and other unwanted molecules. Fluid flow through this so-called glymphatic system ramps up during sleep, they also reported. Studies from Nedergaard’s group and others suggest vigorous glymphatic clearance is beneficial: Circulation falters in Alzheimer’s disease and other neurodegenerative illnesses. Some researchers have challenged parts of this picture, however; a 2024 study, for example, suggested waste clearance is actually faster during waking than during sleep. In the new research, Nedergaard and her team wanted to pin down what keeps cerebrospinal fluid moving through the brain. But studying the mouse glymphatic system often involves anesthetizing the rodents, she says, which is very different from natural sleep. To avoid this problem, the scientists surgically implanted mice with electrodes and fiber optic filaments. Although the rodents are tethered to a set of cables, they can fall asleep normally while researchers track blood volume, electrical activity, and chemical levels and use light transmitted through the fiber optic lines to activate certain groups of neurons.

Keyword: Sleep
Link ID: 29626 - Posted: 01.11.2025

By Roni Caryn Rabin Water fluoridation is widely seen as one of the great public health achievements of the 20th century, credited with substantially reducing tooth decay. But there has been growing controversy among scientists about whether fluoride may be linked to lower I.Q. scores in children. A comprehensive federal analysis of scores of previous studies, published this week in JAMA Pediatrics, has added to those concerns. It found a significant inverse relationship between exposure levels and cognitive function in children. Higher fluoride exposures were linked to lower I.Q. scores, concluded researchers working for the National Institute of Environmental Health Sciences. None of the studies included in the analysis were conducted in the United States, where recommended fluoridation levels in drinking water are very low. At those amounts, evidence was too limited to draw definitive conclusions. Observational studies cannot prove a cause-and-effect relationship. Yet in countries with much higher levels of fluoridation, the analysis also found evidence of what scientists call a dose-response relationship, with I.Q. scores falling in lock step with increasing fluoride exposure. Children are exposed to fluoride through many sources other than drinking water: toothpaste, dental treatments and some mouthwashes, as well as black tea, coffee and certain foods, such as shrimp and raisins. Some drugs and industrial emissions also contain fluoride. For every one part per million increase in fluoride in urinary samples, which reflect total exposures from water and other sources, I.Q. points in children decreased by 1.63, the analysis found. “There is concern that pregnant women and children are getting fluoride from many sources,” said Kyla Taylor, an epidemiologist at the institute and the report’s lead author, “and that their total fluoride exposure is too high and may affect fetal, infant and child neurodevelopment.” © 2025 The New York Times Company
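Read literally, the reported association is a linear dose-response relationship. The expression below simply restates the figure quoted above, treating one part per million of urinary fluoride as roughly one milligram per liter; it is not an additional result from the analysis.

\[
\Delta \mathrm{IQ} \approx -1.63 \times \Delta F_{\text{urinary}} \qquad (F_{\text{urinary}} \text{ in mg/L})
\]

For example, a 0.5 mg/L difference in children’s urinary fluoride would correspond, on average, to roughly 0.8 I.Q. points; whether any such effect exists at the low exposures typical of U.S. water fluoridation is precisely what the analysis says the evidence is too limited to determine.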

Keyword: Intelligence; Development of the Brain
Link ID: 29625 - Posted: 01.11.2025

By Kristel Tjandra Close your eyes and picture an apple—what do you see? Most people will conjure up a vivid image of the fruit, but for the roughly one in 100 individuals with aphantasia, nothing will appear in the mind’s eye at all. Now, scientists have discovered that in people with this inability to form mental images, visual processing areas of the brain still light up when they try to do so. The study, published today in Current Biology, suggests aphantasia is not caused by a complete deficit in visual processing, as researchers have previously proposed. Visual brain areas are still active when aphantasic people are asked to imagine—but that activity doesn’t translate into conscious experience. The work offers new clues about the neurological differences underlying this little-explored condition. The study authors “take a very strong, mechanistic approach,” says Sarah Shomstein, a vision scientist at George Washington University who was not involved in the study. “It was asking the right questions and using the right methods.” Some scientists suspect aphantasia may be caused by a malfunction in the primary visual cortex, the first area in the brain to process images. “Typically, primary cortex is thought to be the engine of visual perception,” says Joel Pearson, a neuroscientist at the University of New South Wales Sydney who co-led the study. “If you don’t have activity there, you’re not going to have perceptual consciousness.” To see what was going on in this region in aphantasics, the team used functional magnetic resonance imaging to measure the brain activity of 14 people with aphantasia and 18 neurotypical controls as they repeatedly saw two simple patterns, made up of either green vertical lines or red horizontal lines. They then repeated the experiment, this time asking participants to simply imagine the two images.

Keyword: Attention; Vision
Link ID: 29624 - Posted: 01.11.2025

By Christina Caron Barrie Miskin was newly pregnant when she noticed her appearance was changing. Dark patches bloomed on her skin like watercolor ink. A “thicket” of hairs sprouted on her upper lip and chin. The outside world was changing, too: In her neighborhood of Astoria, Queens, bright lights enveloped objects in a halo, blurring her vision. Co-workers and even her doctors started to seem like “alien proxies” of themselves, Ms. Miskin, 46, said. “I felt like I was viewing the world through a pane of dirty glass,” she added. Yet Ms. Miskin knew it was all an illusion, so she sought help. It took more than a year of consulting with mental health specialists before Ms. Miskin finally found an explanation for her symptoms: She was diagnosed with a dissociative condition called depersonalization/derealization disorder, or D.D.D. Before her pregnancy, Ms. Miskin had stopped taking antidepressants. Her new psychiatrist said the symptoms could have been triggered by months of untreated depression that followed. While Ms. Miskin felt alone in her mystery illness, she wasn’t. Tens of thousands of posts on social media reference depersonalization or derealization, with some likening the condition to “living in a movie or a dream” or “observing the world through a fog.” People who experience depersonalization can feel as though they are detached from their mind or body. Derealization, on the other hand, refers to feeling detached from the environment, as though the people and things in the world are unreal. Those who are living with D.D.D. are “painfully aware” that something is amiss, said Elena Bezzubova, a psychoanalyst who specializes in treating the condition. It’s akin to seeing an apple and feeling that it is so strange it doesn’t seem real, even though you know that it is, she added. The disorder is thought to occur in about 1 to 2 percent of the population, but it’s possible for anyone to experience fleeting symptoms. © 2025 The New York Times Company

Keyword: Attention
Link ID: 29623 - Posted: 01.11.2025

By Laura Sanders Recovery from PTSD comes with key changes in the brain’s memory system, a new study finds. These differences were found in the brains of 19 people who developed post-traumatic stress disorder after the 2015 terrorist attacks in Paris — and then recovered over the following years. The results, published January 8 in Science Advances, point to the complexity of PTSD, but also to ways that brains can reshape themselves as they recover. With memory tasks and brain scans, the study provides a cohesive look at the recovering brain, says cognitive neuroscientist Vishnu Murty of the University of Oregon in Eugene. “It’s pulled together a lot of pieces that were floating around in the field.” On the night of November 13, 2015, terrorists attacked a crowded stadium, a theater and restaurants in Paris. In the years after, PTSD researchers were able to study some of the people who endured that trauma. Just over half the 100 people who volunteered for the study had PTSD initially. Of those, 34 still had the disorder two to three years later; 19 had recovered by two to three years. People who developed PTSD showed differences in how their brains handled intrusive memories, laboratory-based tests of memory revealed. Participants learned pairs of random words and pictures — a box of tissues with the word “work,” for example. PTSD involves pairs of associated stimuli too, though in much more complicated ways. A certain smell or sound, for instance, can be linked with the memory of trauma. © Society for Science & the Public 2000–2025.

Keyword: Learning & Memory; Stress
Link ID: 29622 - Posted: 01.11.2025

By Apoorva Mandavilli The snake struck 11-year-old Beatrice Ndanu Munyoki as she sat on a small stone, which lay atop a larger one, watching the family’s eight goats. She was idly running her fingers through the dirt when she saw a red head dart from between the stones and felt a sharp sting on her right index finger. Never a crier, she ran to her father, David Mutunga, who was building a fence. He cut the cloth belt on her dress into strips with a machete, tied her arm in three places and rushed her to a hospital 30 minutes away on a motorcycle taxi. As the day stretched on, her finger grew darker, but the hospital in Mwingi, a small town in Kenya, had no antidote for that kind of venom. Finally that evening in November 2023, she was taken by ambulance to another hospital and injected with antivenom. When the finger blistered, swelled and turned black despite a second dose the next day, “I understood that they will now remove that part,” Mr. Mutunga said with tears in his eyes. Beatrice’s finger was amputated. In Kenya, India, Brazil and dozens of other countries, snakes vie for the same land, water and sometimes food as people, with devastating consequences. Deforestation, human sprawl and climate change are exacerbating the problem. According to official estimates, about five million people are bitten by snakes each year. About 120,000 die, and some 400,000 lose limbs to amputation. The real toll is almost certainly much higher. Estimates are generally based on hospital records, but most snakebites occur in rural areas, far from dispensaries that stock antivenom and among people too poor to afford treatment. “We don’t actually know the burden of snakebite for most countries of the world,” said Nicholas Casewell, a snake researcher at the Liverpool School of Tropical Medicine. © 2025 The New York Times Company

Keyword: Neurotoxins
Link ID: 29621 - Posted: 01.08.2025

By Angie Voyles Askham Old age is the best predictor of Alzheimer’s disease, Parkinson’s disease and many other neurodegenerative conditions. And yet, as deeply studied as those conditions are, the process of healthy brain aging is not well understood. Without that knowledge, “how can we possibly fix something that goes wrong because of it?” asks Courtney Glavis-Bloom, senior staff scientist at the Salk Institute for Biological Studies. “We don’t have the basics. It’s like running before we walk.” That said, mounting evidence suggests that aging takes a particular toll on non-neuronal and white-matter cells in mice. For example, white-matter cells display more differentially expressed genes in aged mice than in younger ones, according to a 2023 single-cell analysis of the frontal cortex and striatum. And glia present in white matter show accelerated aging when compared with cells in the cortex across 15 different brain regions, another 2023 mouse study revealed. “Different brain regions show totally different trajectories regarding aging,” says Andreas Keller, head of the Department of Clinical Bioinformatics at the Helmholtz Institute for Pharmaceutical Research Saarland, who worked on the latter study. Some of the cell types with the most extensive aging-related changes in gene expression occur in a small region of the hypothalamus, according to a new single-cell mouse atlas, the largest and broadest to date. Rare neuronal and non-neuronal cell populations within this “hot spot” are particularly vulnerable to the aging process, says Hongkui Zeng, executive vice president and director of the Allen Institute for Brain Science, who led the work. “This demonstrates the power of using the cell-type-specific approach that will identify highly susceptible, rare populations of interest in the brain,” she says. © 2025 Simons Foundation

Keyword: Alzheimers
Link ID: 29620 - Posted: 01.08.2025