Most Recent Links




By Claire L. Evans In 1983, the octogenarian geneticist Barbara McClintock stood at the lectern of the Karolinska Institute in Stockholm. She was famously publicity averse — nearly a hermit — but it’s customary for people to speak when they’re awarded a Nobel Prize, so she delivered a halting account of the experiments that had led to her discovery, in the early 1950s, of how DNA sequences can relocate across the genome. Near the end of the speech, blinking through wire-framed glasses, she changed the subject, asking: “What does a cell know of itself?” McClintock had a reputation for eccentricity. Still, her question seemed more likely to come from a philosopher than a plant geneticist. She went on to describe lab experiments in which she had seen plant cells respond in a “thoughtful manner.” Faced with unexpected stress, they seemed to adjust in ways that were “beyond our present ability to fathom.” What does a cell know of itself? It would be the work of future biologists, she said, to find out. Forty years later, McClintock’s question hasn’t lost its potency. Some of those future biologists are now hard at work unpacking what “knowing” might mean for a single cell, as they hunt for signs of basic cognitive phenomena — like the ability to remember and learn — in unicellular creatures and nonneural human cells alike. Science has long taken the view that a multicellular nervous system is a prerequisite for such abilities, but new research is revealing that single cells, too, keep a record of their experiences for what appear to be adaptive purposes. In a provocative study published in Nature Communications late last year, the neuroscientist Nikolay Kukushkin and his mentor Thomas J. Carew at New York University showed that human kidney cells growing in a dish can “remember” patterns of chemical signals when they’re presented at regularly spaced intervals — a memory phenomenon common to all animals, but unseen outside the nervous system until now. Kukushkin is part of a small but enthusiastic cohort of researchers studying “aneural,” or brainless, forms of memory. What does a cell know of itself? So far, their research suggests that the answer to McClintock’s question might be: much more than you think. © 2025 Simons Foundation

Keyword: Learning & Memory
Link ID: 29872 - Posted: 08.02.2025

Emily Kwong A grayscale ballerina who appears to be moving. A human who can fit in a doll box. A black-and-white prism that appears to change shape when viewed from three different directions. Those are the top winners of the 2024 Best Illusion of the Year Contest, open to illusion makers around the world. The contest was co-created by neuroscientist and science writer Susana Martinez-Conde. After 20 years, Martinez-Conde is still amazed that novel illusions keep coming in — submitted by artists, magicians, vision scientists and illusion makers all over the world. "Illusions are fundamental to the way that we perceive the world — the way that, frankly, we exist as human beings. Illusions are a feature and not a bug," she said. All illusions are perceptual experiences that do not match physical reality. Aristotle was one of the first to document an illusion in nature, the so-called "waterfall illusion," or motion aftereffect. After someone watches a moving stimulus, such as a river, a nearby stationary object, like a rock, may appear to move. Other famous illusions include "Rotating Snakes," which Martinez-Conde has studied as part of her research into peripheral drift. As a scientist, Martinez-Conde sees illusions as an opportunity to study how the human brain constructs perceptions of the world. "We can analyze the neurons and the brain circuits that support neural activity that matches perception, and those could be part of the neural basis of consciousness." Voting for the 2025 Best Illusion of the Year will take place next year. The online contest is run by the non-profit Neural Correlate Society. © 2025 npr

Keyword: Vision
Link ID: 29871 - Posted: 08.02.2025

By Pam Belluck A combination of healthy activities including exercise, nutritious diet, computer brain games and socializing can improve cognitive performance in people at risk for dementia, according to a large new study. The study, conducted in five locations across the United States over two years, is the biggest randomized trial to examine whether healthy behaviors protect brain health. “It confirms that paying attention to things like physical activity and vascular risk factors and diet are all really important ways to maintain brain health,” said Dr. Kristine Yaffe, an expert in cognitive aging at the University of California, San Francisco, who was not involved in the study. The results were presented on Monday at the Alzheimer’s Association International Conference in Toronto and published in the journal JAMA. The study involved 2,111 people, ages 60 to 79, from diverse racial and ethnic backgrounds. None were cognitively impaired. All had sedentary lifestyles, suboptimal diets and two other dementia risk factors, such as a family history of cognitive decline and high blood pressure. Half of the participants followed a structured program. They were prescribed a healthy diet, socially engaging activities and a weekly regimen of eight exercise sessions and three sessions of computerized cognitive training. They attended 38 meetings with facilitators and fellow participants. The other participants followed a self-guided program. They were given educational materials and resources, and were regularly encouraged to engage in healthy behaviors. They attended six team meetings during the study. © 2025 The New York Times Company

Keyword: Alzheimers
Link ID: 29870 - Posted: 08.02.2025

By Siddhant Pusdekar The food we eat, the air we breathe, and our daily activities all shape how our minds work. Yet most brain research focuses on a narrow slice of humanity: people in high-income countries in the Northern Hemisphere. That leaves a vast gap in our understanding of how neural activity varies across cultures, environments, and lifestyles. A team of researchers from Tanzania and India has taken a step toward closing that gap. In a study published this week in eNeuro, they describe a strategy for collecting data from the brains of diverse groups—from hunter-gatherers to urban dwellers—using electroencephalography (EEG). The technology relies on portable headsets, widely used in clinical settings, that record the brain’s electrical activity through electrodes placed on the scalp. The researchers trained trusted community members as “surveyors,” who visited participants where they live and work to gather EEG data and conduct surveys about their lifestyles and experiences. The initial effort, which involved nearly 8000 volunteers across Tanzania and India, shows that this kind of data collection in low- and middle-income countries is feasible and affordable, the researchers say. The work cost them $50 for each person studied, a fraction of the cost of equivalent large-scale studies conducted in research labs. Interviewed about the work, one of the researchers said: “I think mental health is one of the defining health issues in India. When we survey 18- to 24-year-olds, 50% tell us that almost every other day of the month they don’t feel like going to work or college. India is a young country and is increasingly relying on its youth to grow its economy. If they can’t function in their daily activities, you can’t expect them to be productive and contribute to the economy.”

Keyword: Brain imaging
Link ID: 29869 - Posted: 07.26.2025

By Diana Kwon A new sensor makes it possible for the first time to simultaneously track dopamine and up to two additional molecules in the brains of living animals. The sensor, dubbed HaloDA1.0, uses a novel dopamine-tagging system that emits light at the far-red end of the color spectrum, according to the team behind the work. “There’s a real need to monitor multiple relevant molecules, as they’re doing here,” says Nicolas Tritsch, assistant professor of neuroscience at McGill University, who was not involved in the study. Because dopamine is involved in a range of key brain functions, when studying its effects on a cell it’s important to consider other neuromodulators that are released at the same time, as well as the signaling cascades these molecules may trigger, Tritsch says. Most dopamine-tracking strategies genetically encode a naturally occurring fluorescent protein into dopamine receptors; when dopamine attaches to the modified receptors, the fluorescent protein changes shape and emits light. But naturally occurring fluorescent proteins have a limited color palette, which has made it difficult to develop sensors that can go beyond two-color imaging, says study investigator Yulong Li, professor of life sciences at Peking University. Instead of genetically encoding a fluorescent protein, HaloDA1.0 attaches a protein tag called HaloTag to dopamine receptors. This tag binds tightly to previously developed artificial dyes that change shape and fluoresce in the far-red spectrum when dopamine binds to its receptors. Because the dyes fluoresce at the far end of the red spectrum, they leave room for other sensors to glow at different wavelengths. © 2025 Simons Foundation

Keyword: Brain imaging
Link ID: 29868 - Posted: 07.26.2025

Maria Godoy Back in the 1800s, obesity was almost nonexistent in the United States. Over the last century, it's become common here and in other industrialized nations, though it remains rare among people who live more traditional lifestyles, such as the Hadza hunter-gatherers of Tanzania. So what's changed? One common explanation is that as societies have developed, they've also become more sedentary, and people have gotten less active. The assumption is that as a result, we burn fewer calories each day, contributing to an energy imbalance that leads to weight gain over time, says Herman Pontzer, a professor of evolutionary biology and global health at Duke University who studies how human metabolism has evolved. But in a major new study published in the journal PNAS, Pontzer and an international team of collaborators found that's not the case. They compared the daily total calorie burn for people from 34 different countries and cultures around the world. The people involved ran the spectrum from hunter-gatherers and farming populations with low obesity rates, to people in more sedentary jobs in places like Europe and the U.S., where obesity is widespread. "Surprisingly, what we find is that actually, the total calories burned per day is really similar across these populations, even though the lifestyle and the activity levels are really different," says Pontzer. And that finding offers strong evidence that diet — not a lack of physical activity — is the major driver of weight gain and obesity in our modern world. © 2025 npr

Keyword: Obesity
Link ID: 29867 - Posted: 07.26.2025

By Sofia Caetano Avritzer The original paleo diet might have included fewer succulent steaks and more juicy maggots. Neandertals are often depicted at the top of the food chain for their time, consuming as much meat as lions or hyenas. But maggots growing on rotting meat might have been the real signature dish of the Neandertal diet, researchers report July 25 in Science Advances. The idea that Neandertals were extreme carnivores comes partly from the high levels of a specific type of nitrogen called N-15 in their bones. Nitrogen has two stable forms. N-14 is lighter and a lot more common in nature, while N-15 is heavier and much rarer. When an animal eats a plant with both types of nitrogen, it will keep more N-15 than N-14 in its body after digestion. If that animal gets eaten, its predator will have an even higher proportion of N-15. That makes this molecule more prominent in animals that eat a lot of meat, says Melanie Beasley, a biological anthropologist at Purdue University in West Lafayette, Ind. The proportion of N-15 to N-14 found in Neandertal bones is similar to that found in animals like hyenas, which eat almost exclusively meat, Beasley says. But humans can’t consume as much meat as specialized carnivores, says Karen Hardy, a prehistoric archeologist at the University of Glasgow in Scotland. Without a balanced diet, the human body transforms protein into energy instead of using it to develop muscle, hormones and more. This creates toxic waste products that can cause nausea, diarrhea and even death. So, if Neandertals probably couldn’t eat as much meat as lions or hyenas, where does all the N-15 come from? Rotting meat. © Society for Science & the Public 2000–2025.

Keyword: Evolution; Obesity
Link ID: 29866 - Posted: 07.26.2025

By Michael S. Rosenwald Sarah Morlok Cotton, the last surviving member of a set of identical quadruplets who charmed Depression-era America with song-and-dance performances, and then took part in a landmark psychological study after being diagnosed with schizophrenia, died on July 7 in Belleville, Mich. She was 95. Her death, at an adult foster home, was confirmed by her son David Cotton. The Morlok Quads, as they came to be known, were a medical marvel and attracted crowds of people to Edward W. Sparrow Hospital in Lansing, Mich., shortly after they were born there on May 19, 1930. Newspapers held naming contests, and the winning entry suggested names that derived from the first letters of the hospital: Edna, Wilma, Sarah and Helen. The quadruplets’ middle names were simply initials denoting their birth order. (Sarah, the third born, was C.) Donations poured in almost immediately. The city of Lansing provided the family with a rent-free home. The Massachusetts Carriage Company sent a custom-made baby carriage with four seats. Businessmen opened bank accounts for each child. “Lansing’s Morlok quadruplets,” The Associated Press wrote, “are the most famous group of babies on the American continent.” The Morloks charged visitors 25 cents to visit their home and see the babies. Carl Morlok, who ran for constable of Lansing in 1931, used photos of his daughters on his campaign ads with the slogan, “We will appreciate your support.” He won in a landslide. Amid the commotion, Sadie Morlok tried to provide her daughters with a sense of normalcy. “Our mother used to dress us in pretty little identical crocheted sweaters and bonnets in spring and summer, or snow pant outfits in winter,” Mrs. Cotton wrote in her autobiography, “The Morlok Quadruplets: The Alphabet Sisters” (2015). “Then, she would carefully seat two of us facing the other two in the carriage and go for a nice stroll around the block to give us sunshine and a breath.” © 2025 The New York Times Company

Keyword: Schizophrenia; Genes & Behavior
Link ID: 29865 - Posted: 07.26.2025

Katie Kavanagh How does your brain wake up from sleep? A study of more than 1,000 arousals from slumber has revealed precisely how the brain bestirs itself during the transition to alertness1 — a finding that might help to manage sleep inertia, the grogginess that many people feel when hitting the snooze button. Recordings of people as they woke from the dream-laden phase of sleep showed that the first brain regions to rouse are those associated with executive function and decision-making, located at the front of the head. A wave of wakefulness then spreads to the back, ending with an area associated with vision. The findings could change how we think of waking up, says Rachel Rowe, a neuroscientist at the University of Colorado Boulder, who was not involved with the work. The results emphasize that “falling asleep and waking up aren’t simply reverse processes, but really waking up is this ordered wave of activation that moves from the front to the back of the brain”, whereas falling asleep seems to be less linear and more gradual. The study was published today in Current Biology1. The wide-awake brain shows a characteristic pattern of electrical activity, recorded by sensors on the scalp — it looks like a jagged line made up of small, tightly packed peaks and valleys. Although the pattern looks similar during rapid eye movement (REM) sleep, when vivid dreams occur, this stage features a lack of skeletal-muscle movement. The peaks are taller during most stages of non-REM sleep, which ranges from light to very deep slumber. Scientists already knew that the ‘awakened’ signature occurs at different times in different brain regions, but common imaging techniques did not allow these patterns to be explored on a precise timescale. © 2025 Springer Nature Limited

Keyword: Sleep; Attention
Link ID: 29864 - Posted: 07.19.2025

Jon Hamilton After about age 40, our brains begin to lose a step or two. Each year, our reaction time slows by a few thousandths of a second. We're also less able to recall items on a shopping list. Those changes can be signs of a disease, like Alzheimer's. But usually, they're not. "Both of those things, memory and processing speed, change with age in a normal group of people," says Matt Huentelman, a professor at TGen, the Translational Genomics Research Institute, in Phoenix. Huentelman should know. He helps run MindCrowd, a free online cognitive test that has been taken by more than 700,000 adults. About a thousand of those people had test scores indicating that their brain was "exceptional," meaning they performed like a person 30 years younger on tests of memory and processing speed. Genetics played a role, of course. But Huentelman and a team of researchers have been focusing on other differences. "We want to study these exceptional performers because we think they can tell us what the rest of us should be doing," he says. Early results suggest that sleep and maintaining cardiovascular health are a good start. Other measures include avoiding smoking, limiting alcohol and getting plenty of exercise. Huentelman was one of several dozen researchers who met in Miami this summer to discuss healthy brain aging. The event was hosted by the McKnight Brain Research Foundation, which funds studies on age-related cognitive decline and memory loss. To preserve cognitive function in later life, "we're going to have to understand [brain] aging at a mechanistic level," says Alice Luo Clayton, a neuroscientist who is the group's chief executive officer. © 2025 npr

Keyword: Development of the Brain
Link ID: 29863 - Posted: 07.19.2025

By Katarina Zimmer Using a tiny, spherical glass lens sandwiched between two brass plates, the 17th-century Dutch microscopist Antonie van Leeuwenhoek was the first to officially describe red blood cells and sperm cells in human tissues, and observe “animalcules” — bacteria and protists — in the water of a lake. Increasingly powerful light microscopes followed, revealing cell organelles like the nucleus and energy-producing mitochondria. But by 1873, scientists realized there was a limit to the level of detail. When light passes through a lens, the light gets spread out through diffraction. This means that two objects can’t be distinguished if they’re less than roughly 250 nanometers (250 billionths of a meter) apart — instead, they’ll appear as a blur. That put the inner workings of cell structures off limits. Electron microscopy, which uses electron beams instead of light, offers higher resolution. But the resulting black-and-white images make it hard to tell proteins apart, and the method only works on dead cells. Now, however, optics engineers and physicists have developed sophisticated tricks to overcome the diffraction limit of light microscopes, opening up a new world of detail. These “super-resolution” light microscopy techniques can distinguish objects down to 100 nanometers and sometimes even less than 10 nanometers. Scientists attach tiny, colored fluorescent tags to individual proteins or bits of DNA, often in living cells where they can watch them in action. As a result, they are now filling in key knowledge gaps about how cells work and what goes wrong in neurological diseases and cancers, or during viral infections. “We can really see new biology — things that we were hoping to see but hadn’t seen before,” says molecular cell biologist Lothar Schermelleh, who directs an imaging center at the University of Oxford in the United Kingdom. Here’s some of what scientists are learning in this new age of light microscopy.

Keyword: Brain imaging
Link ID: 29862 - Posted: 07.19.2025

By Tom Zeller Jr. During the week between two experimental infusions at the Danish Headache Center, where I had agreed to be a test subject, I rented a small flat in central Copenhagen, near Assistens Cemetery. This is where many notable Danes have been laid to rest, and I took some time that September to visit the monuments, which were shrouded in manicured stands of mature poplars and willows. The grave of Niels Bohr, one of the 20th century’s leading figures in theoretical physics, is marked by a gray stone pillar with an owl perched on top. Hans Christian Andersen, the author who gave us “The Little Mermaid” and “The Ugly Duckling,” among other treasured stories, resides here too. But it felt most appropriate to my mission that Danish philosopher Søren Kierkegaard, who thought suffering was where life’s meaning is forged, occupied his own leafy corner of the park. In the Kierkegaardian tradition, suffering is redemptive — the feedstock of enlightenment — and rather than wallow in its insults and pains, the sufferer should embrace its power to transform. “Even the heaviest suffering cannot be heavier than a mountain,” he once wrote. “And thus, if the sufferer believes that his suffering is beneficial to him — yes, then he moves mountains. In order to move a mountain, you must get under it.” I was thinking of Kierkegaard when I first presented my arm to Lanfranco Pellesi, then a researcher at the Danish Headache Center, for my initial infusion. Pellesi had an early interest in studying near-death experiences, before turning his attention to pain, and then from pain to headaches. It struck me as such an obvious trajectory — one that followed an almost inevitable path — and I asked him how he made sense of that progression. “I think probably it links to the problem of conscience — where it is, where it’s not.” The accompanying article is adapted from “The Headache: The Science of a Most Confounding Affliction — and a Search for Relief,” by Tom Zeller Jr. (Mariner Books, 310 pages). Copyright © 2025. Reprinted by permission.

Keyword: Pain & Touch
Link ID: 29861 - Posted: 07.19.2025

By Tina Hesman Saey A large-scale study of proteins in blood and cerebrospinal fluid could pave the way for improved blood tests to diagnose multiple brain diseases — and potential early warning signs of disease risk — researchers report July 15 in several papers in Nature Medicine and Nature Aging. Proteins do much of the work to keep cells and bodies working. Trouble with these building blocks can spell disease; protein misfolding, for instance, links many brain diseases. The results, drawn from samples from 18,645 people, reveal biochemical fingerprints of neurodegenerative disorders such as Alzheimer’s, Parkinson’s, frontotemporal dementia and amyotrophic lateral sclerosis, or ALS. These tests could also help identify disease subtypes and track progression before symptoms emerge. Such well-validated and robust results are “more likely to ultimately translate into something that’s medically actionable,” says Andrew Saykin, director of the Indiana Alzheimer’s Disease Research Center in Indianapolis, which contributed samples to the effort. In one key finding, researchers discovered that individuals carrying a form of the APOE gene called APOE4 — the biggest genetic risk factor for developing Alzheimer’s — share a blood signature regardless of diagnosis. That signature appeared not only in people with Alzheimer’s but also in those with other brain diseases or no neurodegeneration at all, neuroscientist Caitlin Finney and colleagues report in Nature Medicine. The APOE4 protein signature involves proteins that respond to infection and inflammation, hinting at how the variant predisposes carriers to brain diseases. It also suggests that the APOE4 protein may be involved in the early stages of multiple diseases. © Society for Science & the Public 2000–2025.

Keyword: Development of the Brain; Alzheimers
Link ID: 29860 - Posted: 07.16.2025

By Celina Zhao You could be 45 on paper but 60 in your kidneys. Turns out, your organs have birthdays of their own — and how well they’re faring may set the pace for your health, researchers report July 9 in Nature Medicine. Using data from nearly 45,000 people, scientists developed a blood-based test to estimate the biological age of 11 organs, providing a measure of how healthy or worn down each organ is. When a person has an organ substantially “older” than their actual age, disease risks tied to that organ surge. Conversely, extremely youthful brains and immune systems are linked to living longer, the results suggest. “The fact that [the researchers] can create an organ age using proteins — and use it to predict diseases that you would expect to be predicted from that organ — is quite amazing,” says Sarah Harris, a molecular biologist at the University of Edinburgh who was not involved in the study. Aging is far from a uniform process; each organ follows its own clock of decline. One way to track this hidden timeline, previously discovered by Stanford neurology researchers Hamilton Oh and Tony Wyss-Coray, is through the thousands of proteins coursing through our blood. Some unmistakably originate in the liver, while others can be traced to the lungs. Analyzing these proteins can reveal clues about how each organ is holding up. In the new study, the team zeroed in on thousands of patients from the UK Biobank, a long-term database tracking the health of individuals ages 40 to 70 for up to 17 years. By assessing proteins in the blood, the team determined the average protein signature for, say, a 40-year-old liver or 70-year-old arteries. © Society for Science & the Public 2000–2025.

Keyword: Development of the Brain
Link ID: 29859 - Posted: 07.16.2025

Sally Adee Keith Krehbiel lived with Parkinson’s disease for nearly 25 years before agreeing to try a brain implant that might alleviate his symptoms. He had long been reluctant to submit to the surgery. “It was a big move,” he says. But by 2020, his symptoms had become so severe that he grudgingly agreed to go ahead. Deep-brain stimulation involves inserting thin wires through two small holes in the skull into a region of the brain associated with movement. The hope is that by delivering electrical pulses to the region, the implant can normalize aberrant brain activity and reduce symptoms. Since the devices were first approved almost three decades ago, some 200,000 people have had them fitted to help calm the tremors and rigidity caused by Parkinson’s disease. But about 40,000 of those who received devices made after 2020 got them with a special feature that has largely not yet been turned on. The devices can read brain waves and then adapt and tailor the rhythm of their output, in much the same way as a pacemaker monitors and corrects the heart’s electrical rhythms, says Helen Bronte-Stewart, a neurologist at Stanford University in California. Bronte-Stewart received approval to start a clinical trial of this new technology, known as adaptive deep-brain stimulation (aDBS), the same week that Krehbiel was preparing for surgery. He recalls the phone call in which she asked him if he wanted to be her first participant: “I said, ‘Boy, do I!’” Five years on, the results of this 68-person trial, called ADAPT-PD, are under review for publication. Although the exact details are still under wraps, they were convincing enough to earn approval for the technology earlier this year from both US and European regulators. © 2025 Springer Nature Limited

Keyword: Parkinsons
Link ID: 29858 - Posted: 07.16.2025

By Celina Ribeiro Some say it was John Sattler’s own fault. The lead-up to the 1970 rugby league grand final had been tense; the team he led, the South Sydney Rabbitohs, had lost the 1969 final. Here was an opportunity for redemption. The Rabbitohs were not about to let glory slip through their fingers again. Soon after the starting whistle, Sattler went in for a tackle. As he untangled – in a move not uncommon in the sport at the time – he gave the Manly Sea Eagles’ John Bucknall a clip on the ear. Seconds later – just three minutes into the game – the towering second rower returned the favour with force: Bucknall’s mighty right arm bore down on Sattler, breaking his jaw in three places and tearing his skin; he would later need eight stitches. When his teammate Bob McCarthy turned to check on him, he saw his captain spurting blood, his jaw hanging low. Forty years later Sattler would recall that moment. One thought raged in his shattered head: “I have never felt pain like this in my life.” But he played on. Tackling heaving muscular players as they advanced. Being tackled in turn, around the head, as he pushed forward. All the while he could feel his jaw in pieces. At half-time the Rabbitohs were leading. In the locker room, Sattler warned his teammates, “Don’t play me out of this grand final.” McCarthy told him, “Mate, you’ve got to go off.” He refused. “I’m staying.” Sattler played the whole game. The remaining 77 minutes. At the end, he gave a speech and ran a lap of honour. The Rabbitohs had won. The back page of the next day’s Sunday Mirror screamed “BROKEN JAW HERO”. © 2025 Guardian News & Media Limited

Keyword: Pain & Touch
Link ID: 29857 - Posted: 07.16.2025

By Shaena Montanari Leafcutter ants’ roles can be reprogrammed by manipulating two neuropeptides, according to a new study. These ants are known for their rigorous division of labor in a caste system, with groups performing roles ranging from cutting leaves to nest defense to tending the fungus that is their food source. Despite physical differences among the ants—the heads of the nest defender ants can be five times the size of the fungal carers’ heads, for instance—it’s still possible to “pharmacologically reprogram them to assume some of the roles that typically other castes assume,” indicating behavioral flexibility, says Daniel Kronauer, professor at Rockefeller University, who was not involved in the work. The researchers induced the behavioral changes by first using RNA sequencing to uncover target neuropeptides and then manipulating neuropeptide levels in the ants. The study was published in June in Cell. The work illustrates the close relationship between neuropeptides and behavior, says Shelley Berger, professor of cell and developmental biology at the University of Pennsylvania and principal investigator of the study. Defender ants are “so big and awkward and clumsy,” she says, but after a certain neuropeptide level is lowered, the ant becomes a “nurse tending to the brood.” The study shows the “importance of neuropeptides as these molecular controllers of incredibly complex” behavioral traits, says Zoe Donaldson, professor of behavioral neuroscience at the University of Colorado Boulder, who was not involved in the study. “I think it’s a really elegant demonstration of just how powerful they are.” Almost all species of ants live in colonies, but leafcutter ants (Atta cephalotes) have a particularly intricate labor division, says study investigator Karl Glastad, assistant professor of biology at the University of Rochester. He and Berger previously explored hormonal controls of social behavior in Florida carpenter ants, which have two worker subtypes, but leafcutter ants are a “really elaborated version” of that species, Glastad says. © 2025 Simons Foundation

Keyword: Hormones & Behavior; Evolution
Link ID: 29856 - Posted: 07.16.2025

Smriti Mallapaty A gene variant known to increase the risk of Alzheimer’s disease also makes people vulnerable to a host of other age-related brain disorders, from Parkinson’s disease to motor neuron disease. The gene variant, a version of apolipoprotein E called APOE ε4, produces a distinct set of proteins that contribute to chronic inflammation, finds an analysis1 using the largest proteomics database for neurodegenerative disease. Neurodegenerative diseases affect more than 57 million people worldwide. Researchers know that people who carry the APOE ε4 variant have an increased risk of developing late-onset Alzheimer’s disease, but studies are beginning to implicate this version, or allele, of APOE in other neurodegenerative diseases. Caitlin Finney and Artur Shvetcov, who study neurodegenerative diseases at the Westmead Institute for Medical Research in Sydney, Australia, and their colleagues wanted to better understand how this genetic risk factor contributes to disease. They took advantage of a newly established proteomics database that allowed them to look beyond individual diseases, says Finney. The Global Neurodegeneration Proteomics Consortium (GNPC) data set2 includes samples from more than 18,600 individuals, mainly of European ancestry, including many with Alzheimer’s, Parkinson’s, a form of motor neuron disease called amyotrophic lateral sclerosis (ALS) and types of dementia, as well as individuals without neurological disorders. The consortium collected around 250 million measurements of proteins found in the blood and cerebrospinal fluid, which surrounds the brain and spinal cord, taken at some two dozen clinics across the United States and Europe. “It’s one of the most powerful databases that we have available for proteomics right now,” says Maryam Shoai, a bioinformatician at University College London. © 2025 Springer Nature Limited

Keyword: Alzheimers; Parkinsons
Link ID: 29855 - Posted: 07.16.2025

By Jan Hoffman Jamie Mains showed up for her checkup so high that there was no point in pretending otherwise. At least she wasn’t shooting fentanyl again; medication was suppressing those cravings. Now it was methamphetamine that manacled her, keeping her from eating, sleeping, thinking straight. Still, she could not stop injecting. “Give me something that’s going to help me with this,” she begged her doctor. “There is nothing,” the doctor replied. Overcoming meth addiction has become one of the biggest challenges of the national drug crisis. Fentanyl deaths have been dropping, in part because of medications that can reverse overdoses and curb the urge to use opioids. But no such prescriptions exist for meth, which works differently on the brain. In recent years, meth, a highly addictive stimulant, has been spreading aggressively across the country, rattling communities and increasingly involved in overdoses. Lacking a medical treatment, a growing number of clinics are trying a startlingly different strategy: To induce patients to stop using meth, they pay them. The approach has been around for decades, but most clinics were uneasy about adopting it because of its bluntly transactional nature. Patients typically come in twice a week for a urine drug screen. If they test negative, they are immediately handed a small reward: a modest store voucher, a prize or debit card cash. The longer they abstain from use, the greater the rewards, with a typical cumulative value of nearly $600. The programs, which usually last three to six months, operate on the principle of positive reinforcement, with incentives intended to encourage repetition of desired behavior — somewhat like a parent who permits a child to stay up late as a reward for good grades. Research shows that the approach, known in addiction treatment as “contingency management,” or CM, produces better outcomes for stimulant addiction than counseling or cognitive behavioral therapy. Follow-up studies of patients a year after they successfully completed programs show that about half remained stimulant-free. © 2025 The New York Times Company

Keyword: Drug Abuse
Link ID: 29854 - Posted: 07.16.2025

Mariana Lenharo A speedy imaging method can map the nerves running from a mouse’s brain and spinal cord to the rest of its body at micrometre-scale resolution, revealing details such as individual fibres travelling from a key nerve to distant organs1. Previous efforts have mapped the network of connections between nerve cells, known as the connectome, in the mouse brain. But tracing the complex paths of nerves through the rest of the body has been challenging. To do so, the creators of the new map used a custom-built microscope to scan exposed tissue, completing the process in just 40 hours. [Figure caption: Nerves look blue in the reconstructed view of a genetically engineered mouse (left) whose neurons produce a fluorescent marker. In a separate animal (right), antibodies detail the sympathetic nerves (purple). Credit: M.-Y. Shi et al./Cell (CC-BY-4.0)] The method, described today in Cell, is an important technical achievement, says Ann-Shyn Chiang, a neuroscientist at the National Tsing Hua University in Hsinchu, Taiwan, who was not involved with the research. “This work is a major step forward in expanding connectomics beyond the brain,” he says. To prepare a mouse’s body for the scan, researchers treat it with chemicals that make its tissues transparent by removing fat, calcium and other components that block light. This provides a clear view of the nerves, which have been labelled with fluorescent marker proteins. The see-through body is then placed into a device that combines a slicing tool and a microscope that takes 3D images. A piston gradually pushes the mouse towards the slicing blade, 400 micrometres at a time. After each slice, a microscope images the newly exposed surface of the mouse, capturing details up to 600 micrometres deep — roughly the thickness of six sheets of paper — below the surface. The body then advances for the next cut. The cycle repeats around 200 times without pause, to cover the entire body. The images are then combined. © 2025 Springer Nature Limited

Keyword: Brain imaging; Development of the Brain
Link ID: 29853 - Posted: 07.12.2025