Most Recent Links
By Amber Dance

Real estate agents will tell you that a home’s most important feature is “location, location, location.” It’s similar in neuroscience: “Location is everything in the brain,” said Bosiljka Tasic, a self-described “biological cartographer.” Brain injury in one spot could knock out memory; damage in another could interfere with personality. Neuroscientists and doctors are lost without a good map.

Researchers have been mapping the brain for more than a century. By tracing cellular patterns that are visible under a microscope, they’ve created colorful charts and models that delineate regions and have been able to associate them with functions. In recent years, they’ve added vastly greater detail: They can now go cell by cell and define each one by its internal genetic activity.

But no matter how carefully they slice and how deeply they analyze, their maps of the brain seem incomplete, muddled, inconsistent. For example, some large brain regions have been linked to many different tasks; scientists suspect that they should be subdivided into smaller regions, each with its own job. So far, mapping these cellular neighborhoods from enormous genetic datasets has been both a challenge and a chore.

Recently, Tasic, a neuroscientist and genomicist at the Allen Institute for Brain Science, and her collaborators recruited artificial intelligence for the sorting and mapmaking effort. They fed genetic data from five mouse brains — 10.4 million individual cells with hundreds of genes per cell — into a custom machine learning algorithm. The program delivered maps that are a neuro-realtor’s dream, with known and novel subdivisions within larger brain regions. Humans couldn’t delineate such borders in several lifetimes, but the algorithm did it in hours. The authors published their methods in Nature Communications in October.

© 2026 Simons Foundation
Keyword: Brain imaging; Development of the Brain
Link ID: 30117 - Posted: 02.11.2026
Jon Hamilton

Parkinson's disease does more than cause tremor and trouble walking. It can also affect sleep, smell, digestion and even thinking. That may be because the disease disrupts communication in a brain network that links the body and mind, a team reports in the journal Nature.

"It almost feels like a tunnel is jammed, so no traffic can go normally," says Hesheng Liu, a brain scientist at Changping Laboratory and Peking University in Beijing and an author of the study.

The finding fits nicely with growing evidence that Parkinson's is a network disorder, rather than one limited to brain areas that control specific movements, says Peter Strick, a professor and chair of neurobiology at the University of Pittsburgh who was not involved in the study.

Other degenerative brain diseases affect other brain networks in different ways. Alzheimer's, for example, tends to reduce connectivity in the default mode network, which supports memory and sense of self. ALS (amyotrophic lateral sclerosis) primarily damages the motor system network, which controls movement.

Understanding the network affected by Parkinson's, which affects about 1 million people in the United States, could change the way doctors treat the disease.

A mystery solved?

People with Parkinson's often have symptoms that vary in ways that are hard to explain. For example, someone who usually is unable to stand may suddenly leap when faced with an emergency. And Parkinson's patients who can still walk may freeze if they try to carry on a conversation.

© 2026 npr
Keyword: Parkinsons
Link ID: 30116 - Posted: 02.11.2026
By Corinna da Fonseca-Wollheim

The placid chords of a Debussy prelude splashed through a darkened auditorium during a recital by the pianist Nicolas Namoradze at the University of California, San Francisco, on a November evening. A translucent image of Namoradze’s brain appeared above him on a screen: Electrical currents of different wavelengths, associated with varying levels of alertness, registered as colorful activity coursing through the model like storm fronts on a weather map. With each chord, clouds of green and blue bloomed, then faded as the sound receded.

As the recital progressed with works by Bach, Beethoven and Scriabin, the image of the gently rotating brain showed a complex choreography of signals that sometimes ping-ponged between different areas or flickered simultaneously across the organ’s hemispheres. As a visual spectacle accompanying Namoradze’s pellucid playing, it was mesmerizing: an X-ray, seemingly, of virtuosity at work.

But to the scientists in the audience, attendees at a conference on the neuroscience of music and dance, it was more than entertainment. It was evidence of a breakthrough in experiment design — one that opens up possibilities in an area that has long eluded scientific study: how music activates the brain, not in listeners, but in performers. It was also a reminder of the value artists can bring to scientific inquiry as active participants shaping studies of their craft.

The neuroscientist Theodore Zanto, a member of the Neuroscape lab at U.C.S.F. that created the “Glass Brain” animations, said in an interview the next day that he was surprised — and moved — by the result. “It’s probably the cleanest real-time representation of what’s happening inside the brain during a piano performance,” he said.

© 2026 The New York Times Company
Keyword: Hearing; Brain imaging
Link ID: 30115 - Posted: 02.11.2026
By Holly Barker

Synaptic proteins degrade more slowly in aged mice than in younger mice, a new study finds. Microglia appear to unburden the neurons of the excess proteins, but that accumulation may turn toxic, the findings suggest.

To function properly, cells need to clear out old and damaged proteins periodically, but that process stalls with age: Protein turnover is about 20 percent slower in the brains of older rodents than in youthful ones, according to an analysis of whole-brain samples. The new study is the first to probe protein clearance specifically in neurons in living animals.

“Neurons face unique challenges to protein turnover,” says study investigator Ian Guldner, a postdoctoral fellow in Tony Wyss-Coray’s lab at Stanford University. For instance, their longevity prevents them from distributing old proteins among daughter cells. And unlike other proteins on the path to degradation, neuronal components must first navigate the axon—sometimes traveling as far as 1 meter, Guldner says.

In the new study, Guldner and his colleagues engineered mice to express a modified version of aminoacyl-tRNA synthetase—a component of the protein synthesis machinery—in excitatory neurons. Every day for one week, mice of different ages received injections of chemically altered amino acids compatible only with that mutant enzyme. Neurons used the labeled amino acids to replenish proteins, enabling the group to track how quickly those proteins degraded over the subsequent two weeks.

“The achievement lies in the technical advance, namely by being able to look at protein degradation and aggregation specifically in neuronal cells,” says F. Ulrich Hartl, director of the Max Planck Institute of Biochemistry, who was not involved in the study.

© 2026 Simons Foundation
Keyword: Development of the Brain; Glia
Link ID: 30114 - Posted: 02.11.2026
Ian Sample, Science editor

People who have a couple of teas or coffees a day have a lower risk of dementia and marginally better cognitive performance than those who avoid the drinks, researchers say.

Health records for more than 130,000 people showed that over 40 years, those who routinely drank two to three cups of caffeinated coffee or one to two cups of caffeinated tea daily had a 15-20% lower risk of dementia than those who went without. The caffeinated coffee drinkers also reported slightly less cognitive decline than those who opted for decaf and performed better on some objective tests of brain function, according to a report published in the Journal of the American Medical Association.

The findings suggest habitual tea and coffee drinking is good for the brain, but the research cannot prove it, as caffeine drinkers may be less prone to dementia for other reasons. A similar link would arise if poor sleepers, who appear to have a greater risk of cognitive decline, steered clear of caffeine to get a better night’s rest.

“Our study alone can’t prove causality, but to our knowledge, it is the best evidence to date looking at coffee and tea intake and cognitive health, and it is consistent with plausible biology,” said the lead author, Yu Zhang, who studies nutritional epidemiology at Harvard University.

Coffee and tea contain caffeine and polyphenols that may protect against brain ageing by improving vascular health and reducing inflammation and oxidative stress, where harmful atoms and molecules called free radicals damage cells and tissues. Substances in the drinks could also work by improving metabolic health. Caffeine, for example, is linked to lower rates of type 2 diabetes, a known risk factor for dementia.

© 2026 Guardian News & Media Limited
Keyword: Drug Abuse; Alzheimers
Link ID: 30113 - Posted: 02.11.2026
By Alexa Robles-Gil

Having an imaginary friend, playing house or daydreaming about the future were long considered uniquely human abilities. Now, scientists have conducted the first study indicating that apes have the ability to play pretend as well.

The findings, published Thursday in the journal Science, suggest that imagination is within the cognitive potential of an ape and can possibly be traced back to our common evolutionary ancestors.

“This is one of those things that we assume is distinct about our species,” said Christopher Krupenye, a cognitive scientist at Johns Hopkins University and an author of the study. “This kind of finding really shows us that there’s much more richness to these animals’ minds than people give them credit for,” he said.

Researchers knew that apes were capable of certain kinds of imagination. If an ape watches someone hide food in a cup, it can imagine that the food is there despite not seeing it. Because that perception is the reality — the food is actually there — it requires the ape to sustain only one view of the world, the one that it knows to be true.

“This kind of work goes beyond it,” Dr. Krupenye said. “Because it suggests that they can, at the same time, consider multiple views of the world and really distinguish what’s real from what’s imaginary.”

Bonobos, an endangered species found only in the Democratic Republic of Congo, are difficult to study in the wild. For this research, Dr. Krupenye and Amalia Bastos, a cognitive scientist at the University of St. Andrews, relied on an organization known as the Ape Initiative to study Kanzi, a male bonobo famous for demonstrating some understanding of spoken English. (Kanzi was an enculturated ape born in captivity; he died last year at age 44.)

© 2026 The New York Times Company
Keyword: Consciousness; Evolution
Link ID: 30112 - Posted: 02.07.2026
By Nora Bradford

For more than a century, psychologists thought that the infant experience was, as the psychologist and philosopher William James famously put it, a “blooming, buzzing confusion.” But new research suggests babies are born with a surprisingly sophisticated neurological toolkit that can organize the visual world into categories and pick out the beat in a song.

In the first of two new studies, neuroscientists managed a rare feat: performing functional MRI (fMRI) scans on more than 100 awake 2-month-old infants to see how their brains categorize visual objects. fMRI requires near-stillness, which makes scanning babies notoriously difficult. While the infants lay in the machines, images of animals, food, household objects and other familiar items appeared above their heads like “an IMAX for babies,” says Cliona O’Doherty, a developmental neuroscientist at Stanford University who conducted the work at Trinity College Dublin.

“MRI is difficult even under ‘ideal’ circumstances when research participants can follow instructions to hold still,” says Scott Johnson, a developmental psychologist at UCLA who was not involved in the study. “Babies can’t take instruction, so these researchers must have the patience of saints.”

The imaging showed that a brain region called the ventral visual cortex, responsible for recognizing what we see, already responded similarly to that of adults, O’Doherty and colleagues report February 2 in Nature Neuroscience. In both adults and 2-month-olds, the ventral visual cortex’s activity is distinct for different categories of objects, pushing back against the traditional view that the brain gradually learns to distinguish between categories throughout development.

© Society for Science & the Public 2000–2026
Keyword: Hearing; Development of the Brain
Link ID: 30111 - Posted: 02.07.2026
By Natalia Mesa

A region of the cerebellum shows language specificity akin to that of cortical language regions, indicating that it might be part of the broader language network, according to a new brain-imaging study.

“This is the first time we see an area outside of the core left-hemisphere language areas that behaves so similarly to those core areas,” says study investigator Ev Fedorenko, associate professor of brain and cognitive sciences at the Massachusetts Institute of Technology.

Initially thought to coordinate only movement, the cerebellum also contributes to cognitive processes, such as social reward, abstract reasoning and working memory, according to studies from the past decade. But despite the fact that people with cerebellar lesions have subtle language struggles, the region’s contributions to that skill have been ignored until recently, Fedorenko says. With this new work, “I think it becomes harder to dismiss language responses as somehow artifactual.”

Fedorenko and her team analyzed nearly 1,700 whole-brain functional MRI experiments conducted over the course of 15 years. They originally collected and analyzed those scans to identify language-selective regions of the neocortex, but they reanalyzed many of them to determine the cerebellum’s role in linguistic processing.

Four cerebellar regions activated robustly when participants performed language-related tasks, such as reading passages of text or listening to someone else reading the passages aloud, in line with previous work. But only one region responded exclusively to these language-related tasks; it did not activate during a variety of nonlinguistic tasks—including movement, arithmetic tasks and a spatial working memory task—or when participants listened to music or watched videos of faces and bodies. The findings were published last month in Neuron.

© 2026 Simons Foundation
Keyword: Language
Link ID: 30110 - Posted: 02.07.2026
By Molly Glick

Not long after upending federal diet guidelines in order to prioritize “real food” on our plates, United States Health and Human Services Secretary Robert F. Kennedy Jr. has offered a new piece of questionable advice. During a tour to promote these dietary recommendations, Kennedy recently claimed that a keto diet can cure schizophrenia—an assertion that experts have quickly thrown cold water on.

The ketogenic diet promotes fat-rich meals and low amounts of carbohydrates. While keto eating has skyrocketed in popularity in recent years—it ranked as the most Googled diet in the U.S. in 2020—it was initially designed in the early 20th century for patients with epilepsy. More recent studies have confirmed that the diet is effective for certain types of epilepsy because it can control seizures. Meanwhile, we have much less evidence for its impacts on symptoms of schizophrenia. So far, small studies have offered some early evidence that ketogenic diets may help people with the condition.

“There is currently no credible evidence that ketogenic diets cure schizophrenia,” Mark Olfson, a psychiatrist at Columbia University, told The New York Times.

Kennedy also proclaimed that the diet can essentially cure bipolar disorder, according to studies he recently read. But as with schizophrenia, keto’s impacts on bipolar disorder have only been examined in limited numbers of patients so far. Preliminary findings have also hinted that a keto diet could ease symptoms of depression. It may offer “small antidepressant benefits” for people who don’t respond to medication, according to a recently published JAMA Psychiatry paper. But this work is in the early stages as well and remains far from conclusive.

© 2026 NautilusNext Inc.
Keyword: Schizophrenia; Depression
Link ID: 30109 - Posted: 02.07.2026
Peter Lukacs

Popular wisdom holds we can ‘rewire’ our brains: after a stroke, after trauma, after learning a new skill, even with 10 minutes a day on the right app. The phrase is everywhere, offering something most of us want to believe: that when the brain suffers an assault, it can be restored with mechanical precision.

But ‘rewiring’ is a risky metaphor. It borrows its confidence from engineering, where a faulty system can be repaired by swapping out the right component; it also smuggles that confidence into biology, where change is slower, messier and often incomplete. The phrase has become a cultural mantra that is easier to comprehend than the scientific term, neuroplasticity – the brain’s ability to change and form new neural connections throughout life.

But what does it really mean to ‘rewire’ the brain? Is it a helpful shorthand for describing the remarkable plasticity of our nervous system or has it become a misleading oversimplification that distorts our grasp of science? After all, ‘rewiring your brain’ sounds like more than metaphor. It implies an engineering project: a system whose parts can be removed, replaced and optimised. The promise is both alluring and oddly mechanical.

The metaphor actually did come from engineering. To an engineer, rewiring means replacing old and faulty circuits with new ones. As the vocabulary of technology crept into everyday life, it brought with it a new way of thinking about the human mind. Medical roots of the phrase trace back to 1912, when the British surgeon W Deane Butcher compared the body’s neural system to a house’s electrical wiring, describing how nerves connect to muscles much like wires connect appliances to a power source. By the 1920s, the Harvard psychologist Leonard Troland was referring to the visual system as ‘an extremely intricate telegraphic system’, reinforcing the comparison between brain function and electrical networks.

© Aeon Media Group Ltd. 2012-2026.
Keyword: Learning & Memory; Development of the Brain
Link ID: 30108 - Posted: 02.04.2026
Elizabeth Quill

Think about your breakfast this morning. Can you imagine the pattern on your coffee mug? The sheen of the jam on your half-eaten toast? Most of us can call up such pictures in our minds. We can visualize the past and summon images of the future. But for an estimated 4% of people, this mental imagery is weak or absent. When researchers ask them to imagine something familiar, they might have a concept of what it is, and words and associations might come to mind, but they describe their mind’s eye as dark or even blank.

Systems neuroscientist Mac Shine at the University of Sydney, Australia, first realized that his mental experience differed in this way in 2013. He and his colleagues were trying to understand how certain types of hallucination come about, and were discussing the vividness of mental imagery. “When I close my eyes, there’s absolutely nothing there,” Shine recalls telling his colleagues. They immediately asked him what he was talking about. “Whoa. What’s going on?” Shine thought. Neither he nor his colleagues had realized how much variation there is in the experiences people have when they close their eyes.

This moment of revelation is common to many people who don’t form mental images. They report that they might never have thought about this aspect of their inner life if not for a chance conversation, a high-school psychology class or an article they stumbled across (see ‘How do you imagine?’). Although scientists have known for more than a century that mental imagery varies between people, the topic received a surge of attention when, a decade ago, an influential paper coined the term aphantasia to describe the experience of people with no mental imagery.

© 2026 Springer Nature Limited
Keyword: Attention; Consciousness
Link ID: 30107 - Posted: 02.04.2026
By Ellen Barry

A new analysis of birth cohorts in the Canadian province of Ontario has found a striking rise in the incidence of psychotic disorders among young people, a finding that its authors said could reflect teens’ increasing use of substances like cannabis, stimulants and hallucinogens.

The study, published on Monday in The Canadian Medical Association Journal, found that the rate of new diagnoses of psychotic disorders among people ages 14 to 20 increased by 60 percent between 1997 and 2023, while new diagnoses at older ages plateaued or declined. Compared with people born in the late 1970s, those born in the early 2000s were about twice as likely to have been diagnosed with a psychotic disorder by age 20. The researchers included 12 million people born in Ontario between 1960 and 2009, of whom 0.9 percent were diagnosed with a psychotic disorder during the study period.

The study was epidemiological and did not try to identify a cause for the rising prevalence. There are a number of possible explanations, among them older paternal age, the stress of migration, neonatal health problems and early intervention programs that now regularly identify the disorders at younger ages, the authors note.

But Dr. Daniel Myran, one of the study’s authors, said he undertook the study, in part, to follow up on concerns that the legalization of cannabis might increase population-level rates of schizophrenia and other psychotic disorders. “I was expecting to see some increases in these younger folks, but I was quite surprised by the scale,” said Dr. Myran, a family physician and research chair at North York General Hospital. He said the results suggested a need for more research into the impact of expanding cannabis use by young people.

© 2026 The New York Times Company
Keyword: Schizophrenia; Drug Abuse
Link ID: 30106 - Posted: 02.04.2026
By Marla Vacek Broadfoot

Nearly 1 in 8 dementia cases — about half a million nationwide — may be linked to insomnia. The new findings, reported December 27 in the Journals of Gerontology: Series A, add weight to growing evidence that sleep is a modifiable risk factor for dementia, akin to hearing loss and hypertension.

The study does not establish a direct cause-and-effect relationship between insomnia and dementia for individuals, says Yuqian Lin, a data analyst at Massachusetts General Hospital in Boston. Rather, she says, it looks at the overall extent to which insomnia may contribute to dementia across the population.

Lin and her colleagues analyzed data from the National Health and Aging Trends Study, or NHATS, a long-running survey of 5,900 U.S. adults ages 65 and older. Participants reported whether they had difficulty falling asleep, staying asleep or both. Dementia was identified using standard research tools that rely on cognitive testing and reports from family members or caregivers.

To estimate the impact of insomnia on the population, Lin and her team calculated the proportion of dementia cases that could theoretically be prevented if insomnia-related sleep disturbances were eliminated. The calculation combined the prevalence of insomnia and dementia in the NHATS population with relative risk estimates drawn from recent large meta-analyses linking insomnia to dementia later in life.

© Society for Science & the Public 2000–2026.
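The kind of estimate described above is conventionally computed with Levin's formula for the population attributable fraction, which combines a risk factor's prevalence with its relative risk. The study's exact inputs aren't given here, so the sketch below uses illustrative placeholder numbers, not the researchers' figures:

```python
def population_attributable_fraction(prevalence: float, relative_risk: float) -> float:
    """Levin's formula: PAF = p(RR - 1) / (1 + p(RR - 1)).

    `prevalence` is the fraction of the population exposed to the risk
    factor (here, insomnia); `relative_risk` is the risk of the outcome
    (dementia) in the exposed group relative to the unexposed group.
    """
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Illustrative only: if ~30% of older adults had insomnia symptoms and
# insomnia carried a relative risk of ~1.5 for dementia, then:
paf = population_attributable_fraction(0.30, 1.5)
print(f"{paf:.1%} of dementia cases attributable")  # roughly 13%, i.e. about 1 in 8
```

The formula shows why a modest relative risk can still account for a large share of cases when the exposure is common in the population.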
Keyword: Sleep; Alzheimers
Link ID: 30105 - Posted: 02.04.2026
By Jake Buehler

Though fearsome predators, snakes can go weeks or even months without eating. Now, scientists think they may know how they do it. Snakes have lost the genes to produce ghrelin, a key hormone that regulates appetite, digestion, and fat storage, researchers report today in Royal Society Open Biology. Chameleons and a group of desert lizards called toadhead agamas, which also have long gaps between meals, have lost the same genes, hinting that cutting off ghrelin is a key way to excel at fasting, possibly by suppressing appetite and holding onto fat stores.

“I give [the researchers] a lot of credit for looking more deeply into the data that was staring us all in the face—myself included,” says Todd Castoe, a genomicist at the University of Texas at Arlington not involved with the study. The hormone is ubiquitous across vertebrates, from fish to mammals. So finding that reptiles have repeatedly ditched it is “pretty remarkable,” he says.

When scientists first discovered ghrelin nearly 30 years ago, they thought this “hunger hormone” could be key to fighting obesity in humans. But it hasn’t been that simple. Since then, researchers have found that ghrelin has a complicated role within a network of hormones constantly tweaking hunger and energy stores. And even though ghrelin is commonly found in vertebrates, it’s been unclear how it has evolved across various groups of vertebrates.

So in the new study, Rui Resende Pinto, an evolutionary biologist at the University of Porto, and his colleagues focused on reptiles, many of which can go long periods without food. The researchers scanned the genomes of 112 species. In snakes, chameleons, and toadhead agamas, ghrelin genes were either missing or so warped by mutations they could no longer encode the hormone, the team found. The degree of the genes’ erosion also varied considerably between snake families: Some snakes such as boas and pythons had malformed ghrelin genes, but others, such as vipers, cobras, and their relatives, barely had anything left.
Keyword: Obesity; Evolution
Link ID: 30104 - Posted: 02.04.2026
By Ingrid Wickelgren

The human brain is a vast network of billions of neurons. By exchanging signals to depress or excite each other, they generate patterns that ripple across the brain up to 1,000 times per second. For more than a century, that dizzyingly complex neuronal code was thought to be the sole arbiter of perception, thought, emotion, and behavior, as well as related health conditions. If you wanted to understand the brain, you turned to the study of neurons: neuroscience.

But a recent body of work from several labs, published as a trio of papers in Science in 2025, provides the strongest evidence yet that a narrow focus on neurons is woefully insufficient for understanding how the brain works. The experiments, in mice, zebra fish, and fruit flies, reveal that the large brain cells called astrocytes serve as supervisors. Once viewed as mere support cells for neurons, astrocytes are now thought to help tune brain circuits and thereby control overall brain state or mood — say, our level of alertness, anxiousness, or apathy.

Astrocytes, which outnumber neurons in many brain regions, have complex and varied shapes, and sometimes tendrils, that can envelop hundreds of thousands or millions of synapses, the junctions where neurons exchange molecular signals. This anatomical arrangement perfectly positions astrocytes to affect information flow, though whether or how they alter activity at synapses has long been controversial, in part because the mechanisms of potential interactions weren’t fully understood. In revealing how astrocytes temper synaptic conversations, the new studies make astrocytes’ influence impossible to ignore.

“We live in the age of connectomics, where everyone loves to say [that] if you understand the connections [between neurons], we can understand how the brain works. That’s not true,” said Marc Freeman, the director of the Vollum Institute, an independent neuroscience research center at Oregon Health and Science University, who led one of the new studies. “You can get dramatic changes in firing patterns of neurons with zero changes in [neuronal] connectivity.”

© 2026 Simons Foundation
Keyword: Glia; Learning & Memory
Link ID: 30103 - Posted: 01.31.2026
By Amy X. Wang

Alice, fumbling through Wonderland, comes across a mushroom. One bite of it shrinks her down in size. Chowing on the other side makes her swell up, huge, taller than the treetops. Urgently, Alice sets to work “nibbling first at one and then at the other, and growing sometimes taller and sometimes shorter,” until finally she succeeds in “bringing herself down to her usual height” — whereupon everything feels “quite strange.”

Is this Lewis Carroll’s 1865 fantasy tale or … the average body-conscious, improvement-obsessed 2026 Whole Foods shopper? Mushrooms, long venerated in literature as dark transformative forces, have become Goopified. Nowadays, you can chug “adaptogenic mushroom coffee,” slurp “functional mushroom cocoa,” doze off with “mushroom sleep drops” or ingest/imbibe any number of other tinctures in the billion-dollar fungal supplements market that promise to fine-tune, or even totally recalibrate, the self.

The latest and hottest items in this booming new retail category are mushroom gummies, gushed over by wellness influencers, spilling out from supermarket shelves right there next to your standard cough drops and protein bars. Fungi have aided medical advances like antibiotics and statins, it’s true, and certain species have shown promising results in fighting Parkinson’s or cancer — but what these pastel gumdrops proffer is a broader, more elliptical “cellular well-being.” The mystique feels intentional on product-makers’ part: Like Carroll’s baffled heroine, maybe you’re meant to be in a bit of thrall to the mysterious, almighty mushroom — lurching through Wonderland, charmed and confused by design.

After all, you wonder, what are these ancient, alien creatures, growing in the secret dark? Hippocrates was supposedly using them to cauterize wounds around the 5th century B.C.E. In the Super Mario video games, mushrooms might give you extra lives; in HBO’s “The Last of Us,” they bring about the ruin of human civilization.
© 2026 The New York Times Company
Keyword: Attention; Drug Abuse
Link ID: 30102 - Posted: 01.31.2026
By Calli McMurray

In 2010, Ardem Patapoutian unmasked a piece of cellular machinery that had long evaded identification: PIEZO channels, pores wrenched open by changes in a cell’s membrane tension to allow ions to flow through, thereby converting mechanical force into electrical activity. The discovery marked a turning point for the field of mechanosensation—a process that can be unwieldy to study, says Arthur Beyder, associate professor of physiology and medicine at the Mayo Clinic, because “it reaches its fingers into everything.” The field needed “something to grab onto,” he says, to untangle these processes from other sensory ones—and PIEZO channels provided the first handhold.

The PIEZO discovery garnered much attention, and since then, a flurry of studies have outlined how the channels contribute to touch, itch and proprioception. In 2021, Patapoutian shared the Nobel Prize in Physiology or Medicine for his contributions to this work. Now, a growing cadre of researchers is using these receptors as a tool to explore interoception, or the brain’s sense of what the internal organs are doing. “We’re seeing a resurgence and an expansion of research in this area,” says Miriam Goodman, professor of molecular and cellular physiology at Stanford University. The field, she adds, is in the middle of a “PIEZO-driven renaissance.”

Even a body at rest is in constant motion: The heart pumps blood, the lungs expand and contract, the gut squeezes food, and the bladder stretches with urine. Biologists had intuited that mechanical force was a key part of these processes—and also part of how organs communicate with the brain—but for decades they did not have a way to dive into the molecular mechanisms behind them.

© 2026 Simons Foundation
Keyword: Pain & Touch
Link ID: 30101 - Posted: 01.31.2026
By Azeen Ghorayshi Health Secretary Robert F. Kennedy Jr. has overhauled a panel that helps the federal government set priorities for autism research and social services, installing several members who have said that vaccines can cause autism despite decades of research that has failed to establish such a link. The panel, the Interagency Autism Coordinating Committee, was established in 2000 and has historically included autistic people, parents, scientists and clinicians, as well as federal employees, who hold public meetings to debate how federal funds should best be allocated to support people with autism. The 21 new public members selected by Mr. Kennedy include many outspoken activists, among them a former employee of a super PAC that supported Mr. Kennedy’s presidential campaign, a doctor who has been sued over dangerous heavy metal treatments for a young child with autism, a political economist who has testified against vaccines before a congressional committee, and parents who have spoken publicly about their belief that their children’s autism was caused by vaccines. The group, which also includes 21 government members across many federal agencies, will advise the federal government on how to prioritize the $2 billion allocated by Congress toward autism research and services over the next five years. Though it’s not yet clear what the committee will do — or what it can do, given that it serves only an advisory function — many longtime autism advocates and researchers said they were alarmed by the fact that the committee seemed stacked to advance Mr. Kennedy’s priorities on vaccines. “The new committee does not represent the autism community,” said Alison Singer, who served on the committee from 2007 to 2019. Ms. Singer, whose 28-year-old daughter has profound autism, is the head of the Autism Science Foundation. 
“It disproportionately, excruciatingly so, represents an extremely small subset of families who believe vaccines cause autism.” © 2026 The New York Times Company
Keyword: Autism
Link ID: 30100 - Posted: 01.31.2026
By Simon Makin Positive thinking may boost the body’s defenses against disease. Increasing activity in a brain region that controls motivation and expectation, specifically the brain’s reward system, is linked with making more antibodies after receiving a vaccine. The finding suggests these boosts were related to the placebo effect, researchers report January 19 in Nature Medicine. “Placebo is a self-help mechanism, and here we actually harness it,” says Talma Hendler, a neuroscientist at Tel Aviv University. “This suggests we could use the brain to help the body fight illness.” The work is important because it “is first-in-human evidence of a relationship between brain reward systems and immune function,” says Tor Wager, a neuroscientist at Dartmouth College in Hanover, N.H., who was not involved in the study. The study was not designed to test vaccine effectiveness. Larger studies, including more complete immune assessments, will be required to test this association as a medical intervention. Scientists have found many links between the brain and bodily health. Both negative and positive mental states can affect the immune system, and studies in rodents have suggested that the brain’s reward network is involved in these effects. To find out if the same circuitry was at play in humans, Hendler and colleagues trained healthy volunteers to regulate their brain activity using neurofeedback, a technique that uses brain imaging to show users the activity of the area they are trying to boost. The team randomly assigned 85 participants to receive training aimed at increasing activity in either their reward network or a different network, or to receive no training. © Society for Science & the Public 2000–2026.
Keyword: Neuroimmunology
Link ID: 30099 - Posted: 01.31.2026
By Alessio Cozzolino After a heart attack, the heart “talks” to the brain. And that conversation may make recovery worse. Shutting down nerve cells that send messages from injured heart cells to the brain boosted the heart’s ability to pump and decreased scarring, experiments in mice show. Targeting inflammation in a part of the nervous system where those “damage” messages wind up also improved heart function and tissue repair, scientists report January 27 in Cell. “This research is another great example highlighting that we cannot look at one organ and its disease in isolation,” says Wolfram Poller, an interventional cardiologist at Massachusetts General Hospital and Harvard Medical School who was not involved in the study. “And it opens the door to new therapeutic strategies and targets that go beyond the heart.” Someone in the United States has a heart attack about every 40 seconds, according to the U.S. Centers for Disease Control and Prevention. That adds up to about 805,000 people each year. A heart attack is a mechanical problem caused by the obstruction of a coronary artery, usually by a blood clot. If the blockage lasts long enough, the affected cells may start to die. Heart attacks can have long-term effects such as a weakened heart, a reduced ability to pump blood, irregular heart rhythms, and a higher risk of heart failure or another heart attack. Although experts knew from previous research that the nervous and immune systems could amplify inflammation and slow healing, the key players and pathways involved were unknown, says Vineet Augustine, a neurobiologist at the University of California, San Diego. © Society for Science & the Public 2000–2026
Keyword: Neuroimmunology
Link ID: 30098 - Posted: 01.28.2026