Most Recent Links




By Andrea Tamayo Kidney cells can make memories too. At least, in a metaphorical sense. Neurons have historically been the cells most associated with memory. But far outside the brain, kidney cells can also store information and recognize patterns in a similar way to neurons, researchers report November 7 in Nature Communications. “We’re not saying that this kind of memory helps you learn trigonometry or remember how to ride a bike or stores your childhood memories,” says Nikolay Kukushkin, a neuroscientist at New York University. “This research adds to the idea of memory; it doesn’t challenge the existing conceptions of memory in the brain.” In experiments, the kidney cells showed signs of what’s called a “massed-spaced effect.” This well-known feature of how memory works in the brain facilitates storing information in small chunks over time, rather than a big chunk at once. Outside the brain, cells of all types need to keep track of stuff. One way they do that is through a protein central to memory processing, called CREB. It, and other molecular components of memory, are found in neurons and nonneuronal cells. While the cells have similar parts, the researchers weren’t sure if the parts worked the same way. In neurons, when a chemical signal passes through, the cell starts producing CREB. The protein then turns on more genes that further change the cell, kick-starting the molecular memory machine (SN: 2/3/04). Kukushkin and colleagues set out to determine whether CREB in nonneuronal cells responds to incoming signals the same way. © Society for Science & the Public 2000–2024.

Keyword: Learning & Memory
Link ID: 29576 - Posted: 11.27.2024

Heather Margonari The opioid crisis remains a significant public health challenge in the United States. In 2022, over 2.5 million American adults had an opioid use disorder, and opioids accounted for nearly 76% of overdose deaths. Some patients are fearful of using opioids after surgery due to concerns about dependence and potential side effects, even when appropriately prescribed by a doctor to manage pain. Surgery is often the first time patients receive an opioid prescription, and their widespread use raises concerns about patients becoming long-term users. Leftover pills from a patient’s prescriptions may also be misused. Researchers like us are working to develop a personalized and comprehensive surgical experience that doesn’t use opioids. Our approach to opioid-free surgery addresses both physical and emotional well-being through effective anesthesia and complementary pain-management techniques. What is opioid-free anesthesia? Clinicians have used morphine and other opioids to manage pain for thousands of years. These drugs remain integral to anesthesia. Most surgical procedures use a strategy called balanced anesthesia, which combines drugs that induce sleep and relax muscles with opioids to control pain. However, using opioids in anesthesia can lead to unwanted side effects, such as serious cardiac and respiratory problems, nausea and vomiting, and digestive issues. Concerns over these adverse effects and the opioid crisis have fueled the development of opioid-free anesthesia. This approach uses non-opioid drugs to relieve pain before, during and after surgery while minimizing the risk of side effects and dependency. Studies have shown that opioid-free anesthesia can provide similar levels of pain relief to traditional methods using opioids. Copyright © 2010–2024, The Conversation US, Inc.

Keyword: Pain & Touch
Link ID: 29575 - Posted: 11.27.2024

By Diana Kwon Since a schizophrenia drug, the first in decades with an innovative mechanism of action, gained US regulatory approval in September, some researchers have proclaimed a new era for psychiatric medicine. About half a dozen similar drugs — for schizophrenia, Alzheimer’s disease and other conditions involving the brain — are in various stages of development, most in early-stage clinical trials. But the success of these medicines is not a given. Last week, a trial of a highly anticipated schizophrenia drug reported disappointing results. For decades, schizophrenia drugs worked in essentially the same way. They blunted the activity of dopamine, a chemical involved in the disorder’s hallmark symptoms, such as hallucinations and delusions. The new kid on the block is KarXT, sold as Cobenfy. It targets muscarinic receptors and leads to antipsychotic and cognitive benefits. “I don’t think I’ve ever seen this much buzz and excitement over a new approach in psychiatry in my career,” says Jeffrey Conn, a pharmacologist at Vanderbilt University in Nashville, Tennessee, who was one of the company’s scientific co-founders. KarXT’s success in winning US regulatory approval has revived interest in muscarinic drugs. “Drug discovery is coming back to psychiatry,” says Arthur Christopoulos, a molecular pharmacologist at Monash University in Melbourne, Australia, who was involved in the development of KarXT. But developing new medicines is a hard, long road. On 11 November, AbbVie, a pharmaceutical company in North Chicago, Illinois, announced that its muscarinic drug for schizophrenia, called emraclidine, had failed to outperform a placebo. What this means for other muscarinic drugs in development remains to be seen, Christopoulos says. “It is still early days.” © 2024 Springer Nature Limited

Keyword: Schizophrenia
Link ID: 29574 - Posted: 11.23.2024

By Sofia Quaglia It’s amazing what chimpanzees will do for a snack. In Congolese rainforests, the apes have been known to poke a hole into the ground with a stout stick, then grab a long stem and strip it through their teeth, making a brush-like end. Into the hole that lure goes, helping the chimps fish out a meal of termites. How did the chimps figure out this sophisticated foraging technique and others? “It’s difficult to imagine that it can just have appeared out of the blue,” said Andrew Whiten, a cultural evolution expert from the University of St. Andrews in Scotland who has studied tool use and foraging in chimpanzees. Now Dr. Whiten’s team has set out to demonstrate that advanced uses of tools are an example of humanlike cultural transmission that has accumulated over time. Where bands of apes in Central and East Africa exhibit such complex behaviors, they say, there are also signs of genes flowing between groups. They describe this as evidence that such foraging techniques have been passed from generation to generation, and innovated over time across different interconnected communities. In a study published on Thursday in the journal Science, Dr. Whiten and colleagues go as far as arguing that chimpanzees have a “tiny degree of cumulative culture,” a capability long thought unique to humans. From mammals to birds to reptiles and even insects, many animals exhibit some evidence of culture, when individuals can socially learn something from a nearby individual and then start doing it. But culture becomes cumulative over time when individuals learn from others, each building on the technique so much that a single animal wouldn’t have been able to learn all of it on its own. For instance, some researchers interpret using rocks as a hammer and anvil to open a nut as something chimpanzees would not do spontaneously without learning it socially. 
Humans excel at this, with individual doctors practicing medicine each day, but medicine is no single person’s endeavor. Instead, it is an accumulation of knowledge over time. Most chimpanzee populations do not use a complex set of tools, in a specific sequence, to extract food. © 2024 The New York Times Company

Keyword: Evolution; Learning & Memory
Link ID: 29573 - Posted: 11.23.2024

By Janna Levin It’s fair to say that enjoyment of a podcast would be severely limited without the human capacity to create and understand speech. That capacity has often been cited as a defining characteristic of our species, and one that sets us apart in the long history of life on Earth. Yet we know that other species communicate in complex ways. Studies of the neurological foundations of language suggest that birdsong, or communication among bats or elephants, originates with brain structures similar to our own. So why do some species vocalize while others don’t? In this episode, Erich Jarvis, who studies behavior and neurogenetics at the Rockefeller University, chats with Janna Levin about the surprising connections between human speech, birdsong and dance. JANNA LEVIN: All animals exhibit some form of communication, from the primitive hiss of a lizard to the complex gestures natural to chimps, or the songs shared by whales. But human language does seem exceptional, a vast and discrete cognitive leap. Yet recent research is finding surprising neurological connections between our expressive speech and the types of communication innate to other animals, giving us new ideas about the biological and developmental origins of language. Erich is a professor at the Rockefeller University and a Howard Hughes Medical Institute investigator. At Rockefeller, he directs the Field Research Center of Ethology and Ecology. He also directs the Neurogenetics Lab of Language and codirects the Vertebrate Genome Lab, where he studies song-learning birds and other species to gain insight into the mechanisms underlying language and vocal learning. ERICH JARVIS: So, the first part: Language is built-in genetically in us humans. We’re born with the capacity to learn how to produce and how to understand language, and pass it on culturally from one generation to the next. The actual detail is learned, but the actual plan in the brain is there. 
Second part of your question: Is it, you know, special or unique to humans? It is specialized in humans, but certainly many components of what gives rise to language is not unique to humans. There’s a spectrum of abilities out there in other species that we share some aspects of with other species. © 2024 Simons Foundation

Keyword: Language; Evolution
Link ID: 29572 - Posted: 11.23.2024

By Claudia López Lloreda For decades, researchers have considered the brain “immune privileged”—protected from the vagaries of the body’s immune system. But building evidence suggests that the brain may be more immunologically active than previously thought, well beyond its own limited immune response. The choroid plexus in particular—the network of blood vessels and cerebrospinal-fluid (CSF)-producing epithelial cells that line the organ’s ventricles—actively recruits immune cells from both the periphery and the CSF, according to a new study in mice. The epithelial layer of the choroid plexus shields the rest of the brain from toxic substances, pathogens and other molecules that circulate in the blood. Dysfunction and neuroinflammation in the choroid plexus are associated with aging and many neurological conditions, such as amyotrophic lateral sclerosis and Alzheimer’s disease. Even in the absence of inflammation, the choroid plexus harbors immune cells, some of which reside in the space between the vessels and the epithelial layer, and some on the epithelial surface. During an immune response, it also contains recruited cells, such as macrophages and other leukocytes, and pro-inflammatory signals, previous research has shown. But those findings offered only a snapshot of the cells’ locations, says Maria Lehtinen, professor of pathology at Harvard Medical School, who led the new work. “Just because [the cell] is in the tissue doesn’t mean it’s necessarily crossing or has gone in the direction that you anticipate that it would be going in.” How the choroid plexus gatekeeps immune cells remains a big question in the field, says Michal Schwartz, a neuroimmunologist at the Weizmann Institute of Science, who was not involved with the new work. © 2024 Simons Foundation

Keyword: Neuroimmunology
Link ID: 29571 - Posted: 11.23.2024

By Roni Caryn Rabin The number of deaths caused by alcohol-related diseases more than doubled among Americans between 1999 and 2020, according to new research. Alcohol was involved in nearly 50,000 deaths among adults ages 25 to 85 in 2020, up from just under 20,000 in 1999. The increases were in all age groups. The biggest spike was observed among adults ages 25 to 34, whose fatality rate increased nearly fourfold between 1999 and 2020. Women are still far less likely than men to die of an illness caused by alcohol, but they also experienced a steep surge, with rates rising 2.5-fold over 20 years. The new study, published in The American Journal of Medicine, drew on data from the Centers for Disease Control and Prevention. Deaths related to alcohol included those caused by certain forms of heart disease, liver disease, nerve damage, muscle damage, pancreatitis and alcohol poisoning, as well as related mental and behavioral disorders. The study did not include other deaths influenced by alcohol, such as accidents. “The totality of the evidence indicates that people who consume moderate to large amounts of alcohol have a markedly increased incidence of premature deaths and disability,” said Dr. Charles Hennekens, a professor of medicine at Charles E. Schmidt College of Medicine at Florida Atlantic University and one of the study’s authors. The increase at the onset of the pandemic appears to have persisted. Adults reported more heavy drinking and binge drinking in 2022, another recent study found. Some 48,870 alcohol-related deaths were reported in 2020, up from 19,356 in 1999, the new study found. The mortality rate rose to 21.6 deaths per 100,000 in 2020, an increase from 10.7 deaths per 100,000 in 1999. © 2024 The New York Times Company
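The counts and rates reported above can be cross-checked against each other. A small sketch, using only the article's figures; the population denominators below are inferred from the reported rates (rate = deaths / population × 100,000), not given in the article:

```python
# Alcohol-related deaths among US adults ages 25-85, as reported above.
deaths_1999, rate_1999 = 19_356, 10.7   # deaths; rate per 100,000
deaths_2020, rate_2020 = 48_870, 21.6

# Implied population denominators (in millions), inferred from the rates.
pop_1999_m = deaths_1999 / rate_1999 * 100_000 / 1e6   # roughly 181 million
pop_2020_m = deaths_2020 / rate_2020 * 100_000 / 1e6   # roughly 226 million

deaths_fold = deaths_2020 / deaths_1999   # raw deaths rose about 2.5-fold
rate_fold = rate_2020 / rate_1999         # the rate roughly doubled
```

The gap between the 2.5-fold rise in deaths and the twofold rise in the rate reflects growth in the adult population over the period.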

Keyword: Drug Abuse
Link ID: 29570 - Posted: 11.23.2024

By Tomas Weber Trinian Taylor, a 52-year-old car dealer, pushed his cart through the aisles of a supermarket as I pretended not to follow him. It was a bright August day in Northern California, and I had come to the store to meet Emily Auerbach, a relationship manager at Mattson, a food-innovation firm that creates products for the country’s largest food and beverage companies: McDonald’s and White Castle, PepsiCo and Hostess. Auerbach was trying to understand the shopping behavior of Ozempic users, and Taylor was one of her case studies. She instructed me to stay as close as I could without influencing his route around the store. In her experience of shop-alongs, too much space, or taking photos, would be a red flag for the supermarket higher-ups, who might figure out we were not here to shop. “They’d be like, ‘You need to exit,’” she said. Auerbach watched in silence as Taylor, who was earning $150 in exchange for being tailed, propelled his cart through snack aisles scattered with products from Mattson’s clients. He took us straight past the Doritos and the Hostess HoHos, without a side glance at the Oreos or the Cheetos. We rushed past the Pop-Tarts and the Hershey’s Kisses, the Lucky Charms and the Lay’s — they all barely registered. Clumsily, close on his heels, Auerbach and I stumbled right into what has become, under the influence of the revolutionary new diet drug, Taylor’s happy place: the produce section. He inspected the goods. “I’m on all of these,” he told us. “I eat a lot of pineapple. A lot of pineapple, cucumber, ginger. Oh, a lot of ginger.” Taylor, who lives in Hayward, Calif., used to nurse a sugar addiction, he said, but he can no longer stomach Hostess treats. A few days earlier, his daughter fed him some candy. “I just couldn’t,” he said. “It was so sweet it choked me.” His midnight snack used to be cereal, but now he stirs at night with strange urges. Salads. Chicken. 
He has sworn off canned sodas and fruit juices and infuses his water with lemon and cucumber. He dropped a heavy bag of lemons into the cart and sauntered over to the leafy vegetables. “I love Swiss chard,” he said. “I eat a lot of kale.” For decades, Big Food has been marketing products to people who can’t stop eating, and now, suddenly, they can. The active ingredient in Ozempic, as in Wegovy, Zepbound and several other similar new drugs, mimics a natural hormone, called glucagon-like peptide-1 (GLP-1), that slows digestion and signals fullness to the brain. Around seven million Americans now take a GLP-1 drug, and Morgan Stanley estimates that by 2035 the number of U.S. users could expand to 24 million. © 2024 The New York Times Company

Keyword: Obesity
Link ID: 29569 - Posted: 11.20.2024

Ian Sample Science editor Losing weight can be a frustrating game: after months of successful slimming, the kilos may soon pile on again, leaving people back where they started. No one factor drives the yo-yo effect, but new research points to fatty tissue as a leading culprit. Fat “remembers” past obesity and resists attempts to lose weight, scientists found. Researchers identified the biological memory after examining fat tissue from people with obesity before and after they lost weight after bariatric surgery. The tissues were further compared with fat from healthy individuals who had never been obese. The analysis showed that fat cells were affected by obesity in a way that altered how they responded to food, potentially for years. In tests, the cells grew faster than others by absorbing nutrients more swiftly. Prof Ferdinand von Meyenn, a senior author on the study at the Federal Institute of Technology in Zurich, said: “Our study indicates that one reason maintaining body weight after initial weight loss is difficult is that the fat cells remember their prior obese state and likely aim to return to this state. “The memory seems to prepare cells to respond quicker, and maybe also in unhealthy ways, to sugars or fatty acids.” Further work on mouse cells traced the biological memory to chemical modifications on DNA or the proteins DNA is wrapped around. These epigenetic changes alter gene activity and metabolism. Writing in Nature, the scientists describe how formerly obese mice gained weight faster than others when put on a high-fat diet, suggesting a shift in metabolism that made it easier for them to gain weight. The memory of obesity in fat cells was not solely to blame, however. The scientists suspect a similar memory exists in brain cells that affects how much food animals consume and how much energy they expend. © 2024 Guardian News & Media Limited

Keyword: Obesity
Link ID: 29568 - Posted: 11.20.2024

By Joanne Silberner To describe the destructive effects of intense health anxiety to his young doctors in training at Columbia University Irving Medical Center in New York City, psychiatrist Brian Fallon likes to quote 19th-century English psychiatrist Henry Maudsley: “The sorrow which has no vent in tears may make other organs weep.” That weeping from other parts of the body may come in the form of a headache that, in the mind of its sufferer, is flagging a brain tumor. It may be a rapid heartbeat a person wrongly interprets as a brewing heart attack. The fast beats may be driven by overwhelming, incapacitating anxiety. Hal Rosenbluth, a businessman in the Philadelphia area, says he used to seek medical care for the slightest symptom. In his recent book Hypochondria, he describes chest pains, breathing difficulties and vertigo that came on after he switched from a daily diabetes drug to a weekly one. He ended up going to the hospital by ambulance for blood tests, multiple electrocardiograms, a chest x-ray, a cardiac catheterization and an endoscopy, all of which were normal. Rosenbluth’s worries about glucose levels had led him to push for the new diabetes drug, and its side effects were responsible for many of his cardiac symptoms. His own extreme anxiety had induced doctors to order the extra care. Hypochondria can, in extreme cases, leave people unable to hold down a job or make it impossible for them to leave the house, cook meals, or care for themselves and their families. Recent medical research has shown that hypochondria is as much a real illness as depression and post-traumatic stress disorder. This work, scientists hope, will convince doctors who believed the disorder was some kind of character flaw that their patients are truly ill—and in danger. A study published just last year showed that people with hypochondria have higher death rates than similar but nonafflicted people, and the leading nonnatural cause of death was suicide. 
It was relatively rare, but the heightened risk was clear.

Keyword: Stress; Attention
Link ID: 29567 - Posted: 11.20.2024

By Laura Sanders Growing up, Roberto S. Luciani had hints that his brain worked differently than most people. He didn’t relate when people complained about a movie character looking different than what they’d pictured from the book, for instance. But it wasn’t until he was a teenager that things finally clicked. His mother had just woken up and was telling him about a dream she had. “Movielike,” is how she described it. “Up until then, I assumed that cartoon depictions of imagination were exaggerated,” Luciani says. “I asked her what she meant and quickly realized my visual imagery was not functioning like hers.” That’s because Luciani has a condition called aphantasia — an inability to picture objects, people and scenes in his mind. When he was growing up, the term didn’t even exist. But now, Luciani, a cognitive scientist at the University of Glasgow in Scotland, and other scientists are getting a clearer picture of how some brains work, including those with a blind mind’s eye. In a recent study, Luciani and colleagues explored the connections between the senses, in this case, hearing and seeing. In most of our brains, these two senses collaborate. Auditory information influences activity in brain areas that handle vision. But in people with aphantasia, this connection isn’t as strong, researchers report November 4 in Current Biology. While in a brain scanner, blindfolded people listened to three sound scenes: A forest full of birds, a crowd of people, and a street bustling with traffic. In 10 people without aphantasia, these auditory scenes created reliable neural hallmarks in parts of the visual cortex. But in 23 people with aphantasia, these hallmarks were weaker. © Society for Science & the Public 2000–2024.

Keyword: Attention
Link ID: 29566 - Posted: 11.20.2024

By Miryam Naddaf Humans have evolved disproportionately large brains compared with our primate relatives — but this neurological upgrade came at a cost. Scientists exploring the trade-off have discovered unique genetic features that show how human brain cells handle the stress of keeping a big brain working. The work could inspire new lines of research to understand conditions such as Parkinson’s disease and schizophrenia. The study, which was posted to the bioRxiv preprint server on 15 November, focuses on neurons that produce the neurotransmitter dopamine, which is crucial for movement, learning and emotional processing. By comparing thousands of laboratory-grown dopamine neurons from humans, chimpanzees, macaques and orangutans, researchers found that human dopamine neurons express more genes that boost the activity of damage-reducing antioxidants than do those of the other primates. The findings, which are yet to be peer-reviewed, are a step towards “understanding human brain evolution and all the potentially negative and positive things that come with it”, says Andre Sousa, a neuroscientist at the University of Wisconsin–Madison. “It's interesting and important to really try to understand what's specific about the human brain, with the potential of developing new therapies or even avoiding disease altogether in the future.” Just as walking upright has led to knee and back problems, and changes in jaw structure and diet resulted in dental issues, the rapid expansion of the human brain over evolutionary time has created challenges for its cells, says study co-author Alex Pollen, a neuroscientist at the University of California, San Francisco. “We hypothesized that the same process may be occurring, and these dopamine neurons may represent vulnerable joints.” © 2024 Springer Nature Limited

Keyword: Development of the Brain; Stress
Link ID: 29565 - Posted: 11.20.2024

By Sara Manning Peskin Seven Deadly Sins: The Biology of Being Human Guy Leschziner William Collins (2024) There is no food in sight in Alex’s house. Even the rubbish bin is fastened closed. The kitchen is like a bank vault, hidden behind a locked door from which staff members bring out portioned meals for Alex and her six housemates, all of whom have a genetic disorder called Prader–Willi syndrome. Although Alex was born underweight, by early adulthood she could eat three servings in a sitting, had gorged on cat food and carried 110 kilograms on her small frame. Her ‘gluttony’, writes neurologist Guy Leschziner in Seven Deadly Sins, is the result of a condition that instils such a voracious appetite that some people have eaten to the point of bursting their stomachs. Whereas marketers of diet programmes have conventionally coupled obesity to a lack of willpower, Leschziner uses Alex’s case to argue that body size is driven less by moralistic factors and more by genetics, hormones and gut microorganisms. Similar themes run throughout the book, as the author examines lust, envy and other supposed infractions, gathering examples of people who exhibit these traits because of neurological disorders. Like his earlier books about sleep and the senses, Seven Deadly Sins educates as much as it entertains, turning complex neuroscientific topics into fodder for cocktail-party conversations. The biology of behaviour Exploring wrath, Leschziner introduces two men with epilepsy. One lurches into rages in the wake of his seizures and finds himself surrounded by shards of broken dishes afterwards. Another, a “gentle giant”, has anger outbursts because of a medication prescribed to control his disease. © 2024 Springer Nature Limited

Keyword: Emotions
Link ID: 29564 - Posted: 11.20.2024

By Claudia López Lloreda Fear memories serve a purpose: A mouse in the wild learns to fear the sound of footsteps, which helps it avoid predators. But in certain situations, those fear memories can also tinge neutral memories with fear, resulting in maladaptive behavior. A mouse or person, for instance, may learn to fear stimuli that should presumably be safe. This shift can occur when an existing fear memory broadens—either by recruiting inappropriate neurons into the cell ensemble that contains it or by linking up to a previously neutral memory, according to two new studies in mice, one published today and another last week. Memories are embodied in the brain through sparse ensembles of neurons, called engrams, that activate when an animal forms a new memory or recalls it later. These ensembles were thought to be “stable and permanent,” says Denise Cai, associate professor of neuroscience at the Icahn School of Medicine at Mount Sinai, who led one of the studies. But the new findings reveal how, during times of fear and stress, memories can become malleable, either as they are brought back online or as the neurons that encode them expand. There is “this really powerful ability of stress to look back and change memories for neutral experiences that have come before by pulling them into the same neural representation or by exciting them more during offline periods,” says Elizabeth Goldfarb, assistant professor of psychiatry at the Yale School of Medicine, who was not involved in the studies. That challenges the previous dogma, Cai says. “We’ve learned that these memory ensembles are actually quite dynamic.” © 2024 Simons Foundation

Keyword: Learning & Memory; Stress
Link ID: 29563 - Posted: 11.16.2024

By Ann Gibbons As the parent of any teenager knows, humans need a long time to grow up: We take about twice as long as chimpanzees to reach adulthood. Anthropologists theorize that our long childhood and adolescence allow us to build comparatively bigger brains or learn skills that help us survive and reproduce. Now, a study of an ancient youth’s teeth suggests a slow pattern of growth appeared at least 1.8 million years ago, half a million years earlier than any previous evidence for delayed dental development. Researchers used state-of-the-art x-ray imaging methods to count growth lines in the molars of a member of our genus, Homo, who lived 1.77 million years ago in what today is Dmanisi, Georgia. Although the youth developed much faster than children today, its molars grew as slowly as a modern human’s during the first 5 years of life, the researchers report today in Nature. The finding, in a group whose brains are hardly larger than chimpanzees’, could provide clues to why humans evolved such long childhoods. “One of the main questions in paleoanthropology is to understand when this pattern of slow development evolves in [our genus] Homo,” says Alessia Nava, a bioarchaeologist at the Sapienza University of Rome who is not part of the study. “Now, we have an important hint.” Others caution that although the teeth of this youngster grew slowly, other individuals, including our direct ancestors, might have developed faster. Researchers have known since the 1930s that humans stay immature longer than other apes. Some posit our ancestors evolved slow growth to allow more time and energy to build bigger brains, or to learn how to adapt to complex social interactions and environments before they had children. To pin down when this slow pattern of growth arose, researchers often turn to teeth, especially permanent molars, because they persist in the fossil record and contain growth lines like tree rings. 
What’s more, the dental growth rate in humans and other primates correlates with the development of the brain and body.

Keyword: Evolution; Sexual Behavior
Link ID: 29562 - Posted: 11.16.2024

Ari Daniel The birds of today descended from the dinosaurs of yore. Researchers have known relatively little, however, about how the bird's brain took shape over tens of millions of years. "Birds are one of the most intelligent groups of living vertebrate animals," says Daniel Field, a vertebrate biologist at the University of Cambridge. "They really rival mammals in terms of their relative brain size and the complexity of their behaviors, social interactions, breeding displays." Now, a newly discovered fossil provides the most complete glimpse to date of the brains of the ancestral birds that once flew above the dinosaurs. The species was named Navaornis hestiae, and it's described in the journal Nature. Piecing together how bird brains evolved has been a challenge. First, most of the fossil evidence dates back to tens of millions of years before the end of the Cretaceous period when dinosaurs went extinct and birds diversified. In addition, the fossils of feathered dinosaurs that have turned up often have a key problem. "They're beautiful, but they're all like roadkill," says Luis Chiappe, a paleontologist and curator at the Natural History Museum of Los Angeles County. "They're all flattened and there are aspects that you're never going to be able to recover from those fossils." The shape and three-dimensional structure of the brain are among those missing aspects. But in 2016, Brazilian paleontologist William Nava discovered a remarkably well-preserved fossil in São Paulo state. It came from a prehistoric bird that fills in a crucial gap in understanding of how modern bird brains evolved. © 2024 npr

Keyword: Evolution; Development of the Brain
Link ID: 29561 - Posted: 11.16.2024

By Elena Renken Small may be mightier than we think when it comes to brains. This is what neuroscientist Marcella Noorman is learning from her neuroscientific research into tiny animals like fruit flies, whose brains hold around 140,000 neurons each, compared to the roughly 86 billion in the human brain. In work published earlier this month in Nature Neuroscience, Noorman and colleagues showed that a small network of cells in the fruit fly brain was capable of completing a highly complex task with impressive accuracy: maintaining a consistent sense of direction. Smaller networks were thought to be capable of only discrete internal mental representations, not continuous ones. These networks can “perform more complex computations than we previously thought,” says Noorman, an associate at the Howard Hughes Medical Institute. The scientists monitored the brains of fruit flies as they walked on tiny rotating foam balls in the dark, and recorded the activity of a network of cells responsible for keeping track of head direction. This kind of brain network is called a ring attractor network, and it is present in both insects and in humans. Ring attractor networks maintain variables like orientation or angular velocity—the rate at which an object rotates—over time as we navigate, integrating new information from the senses and making sure we don’t lose track of the original signal, even when there are no updates. You know which way you’re facing even if you close your eyes and stand still, for example. After finding that this small circuit in fruit fly brains—which contains only about 50 neurons in the core of the network—could accurately represent head direction, Noorman and her colleagues built models to identify the minimum size of a network that could still theoretically perform this task. Smaller networks, they found, required more precise signaling between neurons. 
But hundreds or thousands of cells weren't necessary for this basic task. As few as four cells could form a ring attractor, they found. © 2024 NautilusNext Inc.
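The ring-attractor idea described above can be sketched in a toy rate model: a bump of activity on a ring of units persists with no input, and the bump's position encodes head direction. Everything here is an illustrative assumption (16 units, cosine connectivity, the gain, the normalization), not the model the researchers actually built.

```python
import math

# Toy rate-model ring attractor (illustrative sketch, not the study's model).
# N units sit evenly on a ring; recurrent cosine connectivity sustains a
# localized "bump" of activity whose angular position encodes head direction.
N = 16  # hypothetical unit count; the fly circuit has ~50 core neurons
angles = [2 * math.pi * i / N for i in range(N)]

def step(rates, gain=4.0):
    """One recurrent update: cosine-weighted input, ReLU, renormalize."""
    new = []
    for i in range(N):
        drive = sum(math.cos(angles[i] - angles[j]) * rates[j]
                    for j in range(N)) / N
        new.append(max(0.0, gain * drive))  # ReLU nonlinearity
    peak = max(new) or 1.0
    return [r / peak for r in new]  # keep the bump from dying or blowing up

def decode(rates):
    """Population-vector readout of the bump's position (head direction)."""
    x = sum(r * math.cos(a) for r, a in zip(rates, angles))
    y = sum(r * math.sin(a) for r, a in zip(rates, angles))
    return math.atan2(y, x)

# Start a bump at pi/2 and let the network run with no sensory input:
# the stored direction persists, like knowing which way you face with
# your eyes closed.
rates = [max(0.0, math.cos(a - math.pi / 2)) for a in angles]
for _ in range(50):
    rates = step(rates)
print(round(decode(rates), 3))  # prints 1.571 (bump still at ~pi/2)
```

In a fuller model, an asymmetric input proportional to angular velocity would push the bump around the ring, integrating turns the way the fly circuit does.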

Keyword: Development of the Brain; Vision
Link ID: 29560 - Posted: 11.16.2024

By Tim Vernimmen Few people are fond of earwigs, with their menacing abdominal pincers — whether they’re skittering across your floor, getting comfy in the folds of your camping tent or minding their own business. Scientists, too, have given them short shrift, compared with the seemingly endless attention they have lavished on social insects like ants and bees. Yet there are a handful of exceptions. Some researchers have made conscious career decisions to dig into the hidden, underground world where earwigs reside, and have found the creatures to be surprisingly interesting and social, if still not exactly endearing. Work in the 1990s and early 2000s focused on earwig courtship. These often-intricate performances of attraction and repulsion — in which pincers and antennae play prominent roles — can last hours, and the mating itself as long as 20 hours, at least in one Papua New Guinea species, Tagalina papua. The females usually decide when they’ve had enough, though males of some species use their pincers to restrain the object of their desire. Males of the bone-house earwig Marava arachidis (often found in bone meal plants and slaughterhouses) are particularly coercive, says entomologist Yoshitaka Kamimura of Keio University in Japan, who has studied earwig mating for 25 years. “They bite the female’s antennae and use a little hook on their genitalia to lock them inside her reproductive tract.”

Keyword: Sexual Behavior; Evolution
Link ID: 29559 - Posted: 11.16.2024

By Angie Voyles Askham Engrams, the physical circuits of individual memories, consist of more than just neurons, according to a new study published today in Nature. Astrocytes, too, shape how some memories are stored and retrieved, the work shows. The results represent “a fundamental change” in how the neuroscience field should think about indexing memories, says lead researcher Benjamin Deneen, professor of neurosurgery at Baylor College of Medicine. “We need to reconsider the cellular, physical basis of how we store memories.” When mice form a new memory, a specific set of neurons becomes active and expresses the immediate early gene c-FOS, past work has found. Reactivating that ensemble of neurons, the engram, causes the mice to recall that memory. Interactions between neurons and astrocytes are critical for the formation of long-term memory, according to a spatial transcriptomics study from February, and both astrocytes and oligodendrocytes are involved in memory formation, other work has shown. Yet engram studies have largely ignored the activity of non-neuronal cells, says Sheena Josselyn, senior scientist at the Hospital for Sick Children, who was not involved in the new study. But astrocytes are also active alongside neurons as memories are formed and recalled, and disrupting the star-shaped cells’ function interferes with these processes, the new work reveals. The study does not dethrone neurons as the lead engram stars, according to Josselyn. “It really shows that, yes, neurons are important. But there are also other players that we’re just beginning to understand the importance of,” she says. “It’ll help broaden our focus.” © 2024 Simons Foundation

Keyword: Learning & Memory; Glia
Link ID: 29558 - Posted: 11.13.2024

By Fred Schwaller Scott Imbrie still remembers the first time that physicians switched on the electrodes sitting on the surface of his brain. He felt a tingling, poking sensation in his hand, like “reaching into an evergreen bush”, he says. “It was like I was decorating a Christmas tree.” Back in 1985, a car crash shattered three of Imbrie’s vertebrae and severed 70% of his spinal cord, leaving him with very limited sensation or mobility in parts of his body. Now, thanks to an implanted brain–computer interface (BCI), Imbrie can operate a robotic arm, and receive sensory information related to what that arm is doing. Imbrie spends four days a week, three hours at a time, testing, refining and tuning the device with a team of researchers at the University of Chicago in Illinois. Scientists have been trying to restore mobility for people with missing or paralysed limbs for decades. The aim, historically, was to give people the ability to control prosthetics with commands from the nervous system. But this motor-first approach produced bionic limbs that were much less helpful than hoped: devices were cumbersome and provided only rudimentary control of a hand or leg. What’s more, they just didn’t feel like they were part of the body and required too much concentration to use. Scientists gradually began to realize that restoring full mobility meant restoring the ability to sense touch and temperature, says Robert Gaunt, a bioengineer at the University of Pittsburgh in Pennsylvania. Gaunt says that this realization has led to a revolution in the field. A landmark study came in 2016, when a team led by Gaunt restored tactile sensations in a person with upper-limb paralysis using a computer chip implanted in a region of the brain that controls the hand. Gaunt then teamed up with his Pittsburgh colleague, bioengineer Jennifer Collinger, to integrate a robotic arm with the BCI, allowing the individual to feel and manipulate objects. © 2024 Springer Nature Limited

Keyword: Robotics; Pain & Touch
Link ID: 29557 - Posted: 11.13.2024