Most Recent Links
Liam Drew Three hearts; blue blood; no skeleton; arms like tongues. These are just some of the alien features of octopuses, squid and cuttlefish — members of the cephalopod class. The outlandish list continues. Cephalopod skin can taste chemicals, sense light and change colour and texture rapidly. In many species, the sucker-covered arms can even regenerate. These invertebrates have evolved independently from the vertebrate lineage for more than 600 million years. Their last common ancestor was probably a worm-like creature with a rudimentary nervous system and eye-like patches of light-sensitive cells. Despite this evolutionary gulf, vertebrates and these highly specialized molluscs share strange similarities. Their eyes, for example. “It’s eerie how similar they ended up,” says Cristopher Niell, a neuroscientist at the University of Oregon in Eugene. “The convergent evolution of the eye still blows my mind.” Now, one similarity is spurring a boom in cephalopod neuroscience. Around 400 million years ago, cuttlefish, squid and octopuses diverged from the only other living cephalopods — the nautiluses. They then lost their protective shells and evolved brains that are uniquely large among invertebrates. These brains bestow the soft-bodied cephalopods with high intelligence. Cuttlefish, squid and octopuses have excellent memories, use tools and are adept problem-solvers; they have a concept of time and are capable of delayed gratification. Cephalopods are the only non-vertebrate animals that have big, smart brains, says Cliff Ragsdale, a comparative neuroscientist at the University of Chicago in Illinois. And that presents a unique opportunity. Neuroscientists have gained a wealth of knowledge about how vertebrate brains work, but are increasingly looking to cephalopods for insights into ways to build large, high-functioning nervous systems. © 2026 Springer Nature Limited
Keyword: Evolution; Intelligence
Link ID: 30231 - Posted: 05.02.2026
By Meghan Rosen For the first time, doctors have used stem cells to try to repair the spinal cords of human fetuses in the womb. The new technique attempts to heal nerve damage caused by spina bifida, a disabling birth defect. In this condition, the bony tissue of a fetus’s spine doesn’t knit together properly around the spinal cord. That can cause a kaleidoscope of medical issues, including lifelong paralysis and bladder and bowel problems. Traditional fetal surgery to patch up the spine can limit the scope of these problems — but it does not repair nerve damage that has already occurred. Adding living stem cells to the procedure might. At least, that’s the goal of fetal surgeon Diana Farmer’s team. So far, the approach appears to be safe, the researchers reported earlier this year in The Lancet. In six fetal patients with severe spina bifida, applying a stem cell–loaded patch to their exposed spinal cords did not cause infection or tumor growth, or interfere with healing. That’s important because “no one knew what stem cells would do inside a fetus,” says Farmer, of the University of California, Davis. For now, the vital question — whether the technique mends fetal spinal cords — remains unanswered. That’s because researchers are still performing follow-up assessments of the patients, who are now toddlers. At this stage, it’s too early to say how well the surgery worked, and Farmer is careful not to speculate. “If we could get every kid to not be in a wheelchair,” she says, “that would be fantastic.” But the team won’t know for a few years. Until then, Farmer says, she doesn’t want to give people false hope. In some ways, this study represents “a seismic shift” in the field, says Ramen Chmait, director of Los Angeles Fetal Surgery at the University of Southern California, who was not involved with the work. If the technique pans out, he says, it “could be a huge, important step in modern-day medicine.” © Society for Science & the Public 2000–2026.
Keyword: Development of the Brain; Stem Cells
Link ID: 30230 - Posted: 05.02.2026
By Kate Golembiewski By watching their peers, dolphins learn to capture fish in empty conch shells, then ferry the shells up to the water’s surface in order to eat. Octopuses can master experimental tasks by watching their tankmates in the laboratory. Crows follow the cues of others in their flock to attack specific humans who have harassed fellow crows in the past. Scientists call it “social learning,” and it essentially means monkey see, monkey do, an adage that turns out to apply to many animals beyond just primates. Now, a study of Australia’s sulfur-crested cockatoos shows that the birds employ social learning to understand whether unfamiliar foods are safe to eat. In more forested areas of the cockatoos’ native range in Australia, New Guinea, and Indonesia, these mohawked parrots eat plant roots, seeds, fruits and insect larvae. But the birds have learned to thrive in urban environments. “They’re everywhere in Sydney,” said Julia Penndorf, a behavioral ecologist and lead author of the study in PLOS Biology, who encountered the birds as a postdoctoral researcher at the Australian National University in Canberra. In urban areas, the birds have expanded their diets to include nonnative plants and nuts, including almonds and sunflower seeds people offer to them, and they can be seen prying the lids off garbage bins in order to forage. “The big issue with urban birds is, they kind of eat everything,” Dr. Penndorf, who now works at the University of Exeter, said. This expanded diet is high-risk, high-reward: the birds have more options for food, but there’s always a chance that strange new snacks might be poisonous. © 2026 The New York Times Company
Keyword: Learning & Memory; Evolution
Link ID: 30229 - Posted: 05.02.2026
By Ellen Barry As Health Secretary Robert F. Kennedy Jr. sets out to rein in the use of psychiatric medications, a group of prominent psychiatrists is developing guidance for helping patients to stop taking them, noting that providers sometimes “park” patients on medications that are no longer necessary or effective. The experts, whose first recommendations appeared in JAMA Network Open and the British Journal of Psychiatry, identify structural problems that may lead to overprescribing: There are few clinical trials showing when it is advisable to stop a medication; many providers do not regularly review whether a prescription is still needed; and psychiatry residents receive more training in starting drug prescriptions than stopping them. “We have not really taught our trainees to think about, what is the logical endpoint?” said Dr. Joseph F. Goldberg, a past president of the American Society of Clinical Psychopharmacology, which convened a group of 45 psychiatrists to agree on basic principles for “deprescribing,” as supervised drug tapering is sometimes called. “You’ll see a patient in consultation who has been parked on a medication which seems to be ineffective for years, and you’ll ask, ‘Why are you still on this medicine?’” he said. “We’ve got a bugaboo going about passive re-prescribing, and I hope we’ll see much less of that.” The new recommendations come amid rising pressure from Mr. Kennedy and his allies in the Make America Healthy Again movement, who have long made the case that Americans overuse psychiatric medications. The Department of Health and Human Services will convene expert panels on deprescribing the main class of medication used to treat depression — selective serotonin reuptake inhibitors, or S.S.R.I.s — this summer, with an eye toward developing official guidance. © 2026 The New York Times Company
Keyword: Depression
Link ID: 30228 - Posted: 05.02.2026
By Jennie Erin Smith Seizures are the most dramatic symptom of epilepsy, but they’re not the only type of abnormal brain activity in people with the condition. Brief electrical bursts called interictal epileptiform discharges, or interictal spikes, can occur hundreds or thousands of times a day, usually without a person noticing. Though not as dangerous as seizures, they can cause temporary confusion and contribute to long-term cognitive problems even in those whose seizures are controlled. A new study of highly detailed recordings from human brains reveals these spikes occur in a choreographed sequence of events that is consistent and predictable. The research, published today in Nature Neuroscience, also shows spikes recruit some of the same neurons involved in speech perception, pulling them briefly off their jobs. The new findings are “impactful,” says Jennifer Gelinas, a neurologist and epilepsy researcher at the University of California (UC), Irvine who was not involved with the study. The work, she says, opens the door to a new generation of brain stimulation technologies that might anticipate and abort spikes before they can cause harm. Named for the distinct peaks they form on electroencephalography readings, spikes were once dismissed by clinicians as benign. But they’re increasingly recognized as far from it. In 2023, a team led by neurologist Jonathan Kleen of UC San Francisco (UCSF) reported that people with temporal lobe epilepsy could not remember or repeat back a word spoken to them during a spike: They went blank. “Imagine this happening when you’re in class, or giving a presentation,” Kleen says. One 2025 study went so far as to conclude that spikes occurring during sleep are the major culprit in long-term memory issues among people with temporal lobe epilepsy. Implantable brain stimulation devices used to suppress seizures can detect spikes and react to them, but they can’t predict them. 
And this type of treatment, known as closed-loop responsive neurostimulation, can take years to calm epileptic activity. Some antiseizure drugs can also reduce spikes, but treating them “is not as easy as it sounds,” says epileptologist Dániel Fabó of the University of Szeged, who was not involved in the study. Antiseizure drugs are tested for their effect on seizures, not spikes, he notes, and using too much of them can impair cognitive function.
Keyword: Epilepsy; Brain imaging
Link ID: 30227 - Posted: 05.02.2026
By Kristen French What is a cat, and how do we know when we’ve encountered one? This question may be harder to answer than it seems. Neuroscientists Lisa Feldman Barrett and Earl Miller say people typically think about categories such as cat and apple backward—bottom-up instead of top-down. In reality, you don’t hear a meow, and see whiskers and paws and then conclude, “Cat!” Before any of this happens, your brain has sent signals about a “cat hypothesis”—and a plan for how to respond to a cat—to your body, based on past experience, Barrett and Miller say. This cat hypothesis, in turn, actively orchestrates what signals your body processes and how. In other words, the brain constructs classifications on the fly, and we’re not even conscious this is happening until after the fact. Barrett, a renowned Harvard neuroscientist and psychologist who has written for Nautilus and is best known for her theory of constructed emotion, teamed up with Miller to review “converging” evidence from a wide range of disciplines: neuroanatomy, electrophysiology, brain imaging, and cognitive science. The pair published their results recently in Nature Reviews Neuroscience. Their new theory of categories has a lot in common with Barrett’s theory of how emotions work. She argues that emotions aren’t hardwired universal reactions, but are instead predictions constructed rapidly and in the moment from internal bodily sensations, past experiences, and cultural context. While her work on emotions has been highly influential, it remains an active subject of debate in the field of psychology. I spoke with Barrett and Miller about what they call “folk psychology,” and how their theory of categorization relates to so-called beginner’s mind, human bias, objectivity and mental illness. We also talked about Nobel Laureate Daniel Kahneman’s modes of thinking fast and slow. © Copyright 2026
Keyword: Attention; Emotions
Link ID: 30226 - Posted: 05.02.2026
By Rachel E. Gross The first question Sophie Davies had was: Will it affect my memory? In the three weeks since giving birth, Ms. Davies had been in a downward spiral. She checked herself into the mother-and-baby unit of her hospital in East Anglia, England, where doctors ratcheted up the dose of Prozac she took to manage her obsessive-compulsive disorder. But every morning she woke up in tears, and every time she looked at her baby boy, she felt hollow with guilt. “I’m never going to be able to be a mom,” she recalled thinking, “or if I am, I’m not going to be able to be a good one.” A month in, a hospital worker suggested she try a headset that used an electric current to treat depression. The word “electric” gave Ms. Davies, then 34, pause. It sounded like electroconvulsive therapy, or ECT, the scary-sounding treatment that triggers seizures and can result in memory loss. This therapy was different. Transcranial direct-current stimulation, or tDCS, uses a weak electric current to stimulate the brain and does not produce seizures. “This is as far from ECT as a jet engine is from my bicycle,” said Dr. Mark George of the Medical University of South Carolina, where he is a leading expert in neuromodulation, a term that encompasses all therapies that use electricity to modify brain function. Ms. Davies did an internet search and confirmed that the side effects of tDCS — ringing in the ears, headaches and mild burns or irritation where the electrode pads touched the forehead — were generally transient and didn’t include amnesia. She decided to give it a try. In England, the brain stimulation device has been approved for treating depression since 2019. It can be prescribed by a doctor or purchased over the counter, where it sells for around $530. © 2026 The New York Times Company
Keyword: Depression; Brain imaging
Link ID: 30225 - Posted: 04.29.2026
Chris Simms Olfactory receptors in the mouse nose have been mapped out in unprecedented detail — overturning researchers’ understanding of how noses build a sense of smell. The research, published today in Cell, shows how around 1,100 olfactory receptors expressed on sensory neurons are organized in tightly regulated spatial locations in the epithelial tissue that lines the nasal cavity. A second study provides a complementary atlas of olfactory receptor expression in the olfactory epithelium and their neural connections to the olfactory bulb in the brain. “For 30 years, we’ve taught students that the mouse olfactory epithelium is divided into a handful of broad zones, within which receptor choice is essentially random,” says Johan Lundström, a psychologist and experimental neuroscientist at the Karolinska Institute in Stockholm. In the study, researchers examined about five million neurons from hundreds of individual mice. They first used single-cell sequencing to identify which smell receptors were expressed by neurons in the nose, and then used spatial transcriptomics to map out where key genes were being expressed. This allowed them to pinpoint where the receptors are and show that they are always arranged in horizontal stripes running from the top of the nose to the bottom. “Each receptor adopts a particular position in the nose. Since there are a thousand positions in the nose, each receptor is expressed basically in a stripe that overlaps with other receptor stripes, in a thousand overlapping stripes,” says study co-author Sandeep Robert Datta, a neurobiologist at Harvard Medical School in Boston, Massachusetts. Datta and his colleagues propose that this spatial mapping is organized during development and is controlled by sets of genes. The authors found that a molecule called retinoic acid had a key role in this process. They discovered a gradient in the amount of retinoic acid present at different points in the nose. 
By tweaking how much this molecule was expressed, they showed that it helps to control gene activity, guiding each neuron to express the correct type of smell receptor for its location. © 2026 Springer Nature Limited
Keyword: Chemical Senses (Smell & Taste)
Link ID: 30224 - Posted: 04.29.2026
By Siddhant Pusdekar Transcriptional changes are essential for converting new experiences into memories but may not be required to make memories last, a new study suggests. The findings, published in eNeuro in March, conflict with a model proposing that positive feedback loops of transcription can help maintain long-term memories, says study investigator Irina Calin-Jageman, professor of biological sciences at Dominican University. But they open up a set of hypotheses about how transcription maintains long-term memories and indicate that the handful of genes whose regulation persists for up to two weeks could be “really key,” she adds. The results, obtained in the sea slug Aplysia californica, are “one small step on our way to understanding this very important question of: What is the role of transcription in forming long-term memories?” says Wayne Sossin, distinguished James McGill professor of neurology and neurosurgery at McGill University, who is listed as a reviewer for the paper. Disproving models doesn’t “get the attention it deserves, I think, from the scientific community,” he says, but science is built on overturning theory. Irina Calin-Jageman and her colleagues focused on the transcriptional traces of a partially faded memory in the sea slug. When the animal feels threatened, it retracts a breathing apparatus on its back called a siphon. After traumatic experiences—such as induced shocks—the slug retracts its siphon for longer than usual, previous work showed. Also, sensory neurons in the pleural ganglia change their gene expression patterns and remain more excitable for up to 24 hours, and synaptic changes can last for several days to weeks, depending on the training. © 2026 Simons Foundation
Keyword: Learning & Memory
Link ID: 30223 - Posted: 04.29.2026
By Emma Yasinski “Relapse is a part of recovery”: That’s a common refrain among professionals who treat substance use disorders. Many people who have completed treatment programs return to substance use and reenter treatment multiple times, after days, weeks or even years of sobriety. Marina Wolf, a behavioral neuroscientist at the Oregon Health & Science University, studies how cells in the brain respond to drug exposure in ways that can lead people to develop powerful cravings even months after they stop using drugs such as cocaine, opioids or alcohol. Specifically, she has focused on an aspect of this problem called cue-induced craving, in which people’s brains come to associate a cue — such as seeing a certain location where they previously used drugs — with the desire to use that drug. These learned associations, as she described in the 2025 Annual Review of Pharmacology and Toxicology, are caused by structural changes to the brain — neuroplasticity — as a result of drug use, including the strengthening of connections, called synapses, between specific nerve cells. These changes don’t disappear as soon as a person, or animal, stops using a drug. Cravings, in fact, can strengthen after abstinence, leaving a person vulnerable to resume using. How did you become interested in neuroplasticity and addiction? I never had any formal training in synaptic plasticity or addiction. As a graduate student and then a postdoctoral fellow, I worked on how neurons are regulated by the neurotransmitter dopamine, but we studied dopamine’s role in antipsychotic drug effects, not addiction. But when I was setting up my own lab in the early 1990s, I had a friend from graduate school who was involved in groundbreaking studies to work out synaptic plasticity mechanisms in the brain’s hippocampus, a region of the brain responsible for encoding memories. 
This was fascinating work that helped demonstrate a critical role for a neurotransmitter called glutamate in synaptic plasticity, so I followed it closely. © 2026 Annual Reviews
Keyword: Drug Abuse
Link ID: 30222 - Posted: 04.29.2026
By Gina Kolata Before the new obesity drugs came on the market, almost no one used the term food noise. Researchers studying and developing drugs like Ozempic, Wegovy, Mounjaro and Zepbound analyzed doses, side effects, weight loss and improvements in conditions such as diabetes, heart disease and sleep apnea. Incessant thoughts about food and internal dialogues about what to eat, what not to eat, when to eat, how to resist eating — these were not on the research agenda. But if the obesity-drug researchers weren’t talking about food noise, people taking GLP-1s had a lot to say about it. For as long as they could remember, users of the drugs said, they had been plagued by food noise. But they thought it was just a normal part of life. They thought everyone had it. Until they took one of the new drugs. Suddenly, food noise was silenced. And that effect is leading to new questions about the drugs. If researchers can clarify the source of this inner buzz and what makes it go away, that could lead to a clearer understanding of what causes obesity in the first place. ‘You Don’t Want the Salad’ People who struggle with their weight describe relentless thoughts of food. Lena Smith Parker, 53, of Hamden, Conn., spent decades dieting and regaining weight. All the while, she said, she was plagued by internal voices urging her to eat and shaming her for eating. © 2026 The New York Times Company
Keyword: Obesity
Link ID: 30221 - Posted: 04.29.2026
Nicola Davis Science correspondent It has long been known that dogs have less between their ears than wolves, but now research has suggested their brains started to get smaller at least 5,000 years ago. Experts say the results offer fresh insights into the domestication of our canine companions. However, the findings are unlikely to explain why your spaniel will only drink from a muddy puddle: the researchers say a reduction in brain size does not mean dogs are dafter than their wolf-like ancestors. “The way our dogs live nowadays doesn’t give them the opportunity to always express most of their intelligence,” said Dr Thomas Cucchi, first author of the study from the French National Centre for Scientific Research. “But they are extremely clever and domestication didn’t make them stupid, but made them really capable of reading us and communicating with us.” The relationship between humans and canines is ancient, with research revealing the oldest direct genetic evidence for domestic dogs dates back more than 15,000 years. But while a reduction in brain size is typically considered a hallmark of domestication, there has long been debate over exactly when dogs ended up with smaller brains than wolves, with some experts suggesting this may have occurred early in the dog-human relationship. However, others argue smaller brain size is not a hallmark of domestication but instead reflects the emergence of pedigree breeds in the last 200 years. Writing in the journal Royal Society Open Science, Cucchi and colleagues studied CT scans of the skulls of 22 prehistoric wolves and dogs, dating from 35,000 to 5,000 years ago, as well as CT scans from the skulls of 59 modern wolves and 104 modern dogs. The latter included different modern breeds as well as stray or “village” dogs, and dingoes. © 2026 Guardian News & Media Limited
Keyword: Evolution
Link ID: 30220 - Posted: 04.29.2026
By Yasemin Saplakoglu Every experience we have changes our brain, the way a ceramicist reshapes a slab of clay. Every corner we turn, every conversation we have, every shudder we feel causes cascading effects: Chemicals are released, electricity surges, the connections between brain cells tighten, and our mental models update. The brain is “incredibly plastic, and it stays that way throughout the lifespan of a human,” said Christine Grienberger, a neuroscientist at Brandeis University. This plasticity, the quality of being easily reshaped, makes the brain really good at learning — a quintessential process that allows us to remember the plotline of a novel, navigate a new city, pick up a new language, and avoid touching a hot stove. But neuroscientists are still uncovering fundamental rules that describe how neuroplasticity reshapes brain connections. Recently, neuroscientists described a new form of neuroplasticity that might be helping the brain learn across a timescale of several seconds — long enough to capture the behavioral process of learning from a single experience. In two recent reviews, published in The Journal of Neuroscience and Nature Neuroscience, they describe “behavioral timescale synaptic plasticity,” or BTSP. This type of learning in the hippocampus, the brain’s memory hub, is caused by an electrical change that affects multiple neurons at once and unfolds across several seconds. Researchers suspect that it may help the brain learn in a single attempt. “It’s pretty clear that [BTSP is] a strong, powerful mechanism that can lead to immediate memory formation,” said Daniel Dombeck, a neuroscientist at Northwestern University who was not involved with the theory’s development. “It’s something that has been missing in the field for a long time.” © 2026 Simons Foundation
Keyword: Learning & Memory
Link ID: 30219 - Posted: 04.26.2026
Katherine Bourzac Scientists have discovered that the unsung brain cells called astrocytes form extensive networks in the mouse brain — networks similar in some respects to the brain circuits formed by the more celebrated brain cells called neurons. The researchers compiled a whole-brain, 3D map of astrocyte networks, which the authors say is the first of its kind. It shows that webs of the cells connect far-flung regions of the brain, allowing the cells to exchange molecules with each other over long distances. “It’s a secret subway system we didn’t know was there,” says Shane Liddelow, a neuroscientist at NYU Grossman School of Medicine in New York City and a co-author of a paper published today in Nature describing the work. “This opens up a whole new avenue of investigation.” Astrocyte networks can bridge the brain’s hemispheres, and they display plasticity, reshaping their connections in response to sensory deprivation, the team found. The work is “a fundamentally important advance in our understanding of nervous system structure”, says David Lyons, a neurobiologist at the University of Edinburgh, UK, who was not involved with the research. He adds that so far, this new evidence of complex astrocyte networks raises more questions than it answers. “Clearly we are some way from understanding what the functional relevance and role of such [networks] is, but there are a myriad of possibilities.” © 2026 Springer Nature Limited
Keyword: Glia; Learning & Memory
Link ID: 30218 - Posted: 04.26.2026
By Calli McMurray Last Saturday, President Donald Trump issued an executive order outlining regulatory tweaks intended to “accelerate” U.S. research on and increase access to psychedelic drugs for mental health treatments. The measures target clinical research, not basic studies on how the drugs work. “This may not be the breakthrough the basic research community has been looking for,” says Shawn Lockery, professor of neuroscience at the University of Oregon. The order directs the U.S. Food and Drug Administration (FDA) to speed up review of psychedelic drugs and allots “at least $50 million” from the Department of Health and Human Services for state governments’ own psychedelics research programs. One section of the order, however, could eventually make it easier for basic researchers to access psychedelics for their work. The U.S. Drug Enforcement Administration (DEA) classifies most psychedelics—including psilocybin, MDMA and LSD—as Schedule I, meaning they have “no currently accepted medical use and a high potential for abuse.” Trump’s order calls for the U.S. attorney general to review “any product containing a Schedule I substance that has successfully completed Phase 3 clinical trials for a serious mental health disorder” and consider it for rescheduling to the less restrictive Schedule III. To study a Schedule I drug, researchers must apply for a license and, if approved, follow strict storage and security requirements. Approval can take up to a year, says Alex Kwan, professor of biomedical engineering at Cornell University, who studies psilocybin’s mechanism of action in the brain. “It’s a decent bar to get it. It’s not easy.” © 2026 Simons Foundation
Keyword: Drug Abuse; Depression
Link ID: 30217 - Posted: 04.26.2026
By Gina Kolata The Food and Drug Administration on Thursday approved a gene therapy that can cure a rare, inherited form of deafness. The treatment is the first to restore normal hearing in children who were born deaf. The maker of the therapy, Regeneron, plans to provide it free to any child who needs it. “We wanted to make a statement,” Dr. George Yancopoulos, Regeneron’s chief scientific officer, said on Thursday morning. He explained that the company wants to be sure its treatment “would be able to reach its full potential and help as many people as possible.” Some gene therapies for other diseases, priced in the millions of dollars, have had dismal sales. The therapy, called Otarmeni, is intended for children with otoferlin deafness, a rare form of hearing loss caused by a mutation in a single gene. The mutation destroys a protein in the inner ear that is needed to transmit sound to the brain. Although otoferlin deafness accounts for just 2 percent to 8 percent of congenital hearing loss, the new treatment “is groundbreaking,” Dr. Dylan Chan, a pediatric otolaryngologist at the University of California, San Francisco, said. He added, “This is the first time in history that there has been a medical therapy that has enabled deaf children to hear.” Dr. Chan has been a paid adviser to Regeneron and to Eli Lilly, which is also developing a gene therapy for otoferlin deafness. He is also a principal investigator for Lilly’s clinical trial of the treatment. © 2026 The New York Times Company
Keyword: Hearing; Genes & Behavior
Link ID: 30216 - Posted: 04.26.2026
By Jake Currie Chimpanzees and humans share 98 percent of their genomes, so what’s in that 2 percent that makes us uniquely human? According to a new study published in Science Advances, a tiny portion of that genetic difference plays an outsized role in our language skills—and Neanderthals had the same sequences. These segments of the human genome, known as Human Ancestor Quickly Evolved Regions (HAQERs), are non-coding sequences that showed accelerated evolution after humans split from the ancestor they shared with apes. Even though they represent only 0.1 percent of our genome, they’re responsible for the neural “hardware” for language. “What we’re seeing is how a very small part of the genome can have an outsized influence, not just on who we were as a species, but on who we are as individuals,” study author Jacob Michaelson of the University of Iowa said in a statement. “These aren’t genes we’re talking about. They’re regulatory regions that act like the volume knob on genes.” The HAQERs also interact with a vital speech gene: FOXP2. Identified in 1998, FOXP2 is a transcription factor active in the development of the neural circuitry of language use, and mutations in the gene can cause speech problems. “So, if the HAQERs are like volume knobs that can be turned, FOXP2 is one of the hands that is turning these volume knobs,” Michaelson said. © Nautilus 2026
Keyword: Language; Evolution
Link ID: 30215 - Posted: 04.26.2026
By Nora Bradford If you were to imagine a waterfall, a misty cascade into an azure pool surrounded by towering trees might come to mind. That mental vision might also be accompanied by the imagined roar of water splashing down. But when it comes to our brains, does imagining a waterfall activate different areas compared with seeing or hearing one in real life? For both sounds and sights, the overlap between imagination and perception appears not in brain areas linked to a single sense, but in high-level areas that accept multiple types of sensory inputs, researchers report March 31 in Neuron. For years, cognitive neuroscientist Rodrigo Braga has been working to determine whether the human brain is processing mental imagery through hearing and other senses or whether something else is at play. “When I was a teenager, I remember the first time realizing that there’s like a voice I can hear in my head and thinking, ‘Oh, that’s really strange’,” says Braga, of Northwestern University Feinberg School of Medicine in Chicago. In this study, he and his colleagues prompted eight participants to imagine scenes, faces, someone else speaking, internal monologues and sounds while in an MRI scanner. The small number of individuals allowed the researchers to collect hours of MRI data to create individualized brain maps rather than averaging across individuals. This technique allowed the team to reliably find individual variation in brain activity during imagination. © Society for Science & the Public 2000–2026.
Keyword: Consciousness; Attention
Link ID: 30214 - Posted: 04.26.2026
Hannah Critchlow About 2 billion years ago, evolution performed an improbable experiment. A larger ancestral cell engulfed a smaller bacterium. It should have been a meal. Instead, it became a merger. The bacterium survived inside its host, and together they forged one of the most consequential partnerships in the history of life. The host offered shelter and access to oxygen. The bacterium supplied something revolutionary: a vastly more efficient way to generate energy. From this intimate alliance emerged the eukaryotic cell – and with it, the possibility of complex life. Every plant, animal and thinking being traces its lineage back to that ancient symbiosis. Our capacity for reflection, imagination and doubt rests upon what was once a free-living microbe. We call these descendants mitochondria. They persist in nearly every cell of our bodies, hundreds to thousands at a time. In total, we carry an estimated 10 million billion of them – collectively accounting for roughly a 10th of our body mass. Red blood cells are the exception: they lack mitochondria, which maximises oxygen transport. Almost every other cell depends on them absolutely. Neurons are especially demanding hosts. Each contains thousands of mitochondria, occupying up to 40 per cent of its volume. These rod-shaped structures are often described as the cell’s powerhouses. Through aerobic metabolism, they generate most of the chemical energy that keeps cells alive and functioning – the molecular fuel that sustains every biological process. Although the brain represents just 2 per cent of body weight, it consumes about 20 per cent of our energy at rest. Every perception, memory, emotion and idea is metabolically expensive. Thought itself is an energy-hungry act. Weight for weight, our brains are more mitochondrial than neural. This is more than a biological curiosity. It suggests that cognition is inseparable from metabolism – that the mind is not only shaped by networks of neurons but by networks of energy. © Aeon Media Group Ltd. 2012-2026.
Keyword: Biomechanics; Evolution
Link ID: 30213 - Posted: 04.22.2026
Ian Sample Science editor A married couple who met over a dissected brain and went on to create the first approved gene therapy for blindness have been awarded one of the most lucrative prizes in science. Molecular biologist Jean Bennett and ophthalmologist Albert Maguire share the $3m (£2.2m) Breakthrough prize for life sciences with physician Katherine High for the 25-year-long project, during which the couple adopted a pair of dogs they had treated for blindness. The therapy, named Luxturna, was approved in the US in 2017 and has transformed the lives of people born with Leber congenital amaurosis (LCA), a genetic disorder that typically causes total blindness by early adulthood. Proof that the therapy worked came in a clinical trial in which one patient described seeing their child’s face for the first time, the fine grain in wooden furniture and branches waving in the wind. Other patients reported similar profound improvements. “I was overwhelmed,” said Bennett, who is now retired from the University of Pennsylvania. “It was one of the most miraculous eureka moments you can imagine.” Bennett said it was a “tremendously exciting time” for scientific and medical research, but warned that the US administration’s attacks on science could “cause damage for generations to come”, leading her to fear a brain drain that the country would struggle to recover from. “Agendas have become politicised, government agencies that support basic and applied research have been undermined, knowledgeable advisers and experts have been dismissed or have fled and revised guidelines contradict decades of rigorous research,” she said. © 2026 Guardian News & Media Limited
Keyword: Vision
Link ID: 30212 - Posted: 04.22.2026