Most Recent Links



Links 161 - 180 of 28707

By Christine Dell'Amore Thunderclouds rolled across Kenya’s Masai Mara savanna as the spotted hyena cubs played, tumbling over each other in the wet grass. The cubs’ mother lounged nearby, rising occasionally to discourage a bigger one-year-old from joining the little play group. When the older animal approached again, one of the pluckier cubs took a cue from its high-ranking mom and stood tall, trying its best to look intimidating. The action seemed comical, but both animals knew their place. The larger, lower-ranking hyena stopped short, then bowed its head and slunk off. Photographer Jen Guyton recorded this scene with an infrared camera, allowing an intimate look into hyenas’ nocturnal behaviors. In doing so, she provided a small window into the intriguing structure of hyena society, where all members inherit their place in the pecking order from their mother. Females are in charge, and rank means everything—a matrilineal system that has fueled the spotted hyena’s rise as the most abundant large carnivore in Africa. These and other insights into hyena behavior wouldn’t be possible were it not for 35 years of on-the-ground research by Kay Holekamp, founder of the Mara Hyena Project. Her efforts have helped reveal a creature noted for its advanced society, cognition, and ability to adjust to new surroundings. Holekamp, a biologist at Michigan State University, has been studying the African species in the Masai Mara since 1988—one of the longest-running investigations of any mammal ever. “I thought I’d be there for two years,” she says, “but I got hooked.” Hooked on hyenas? Mention their name, and most people grimace. Aristotle described them as “exceedingly fond of putrefied flesh.” Theodore Roosevelt called them a “singular mixture of abject cowardice and the utmost ferocity.” Across Africa, hyenas are seen as evil and greedy, and associated with witchcraft and sexual deviance. Even the 1994 movie The Lion King portrayed them as cunning and malicious.
© 1996-2015 National Geographic Society

Keyword: Sexual Behavior; Evolution
Link ID: 29149 - Posted: 02.13.2024

By Kevin Mitchell It is often said that “the mind is what the brain does.” Modern neuroscience has indeed shown us that mental goings-on rely on and are in some sense entailed by neural goings-on. But the truth is that we have a poor handle on the nature of that relationship. One way to bridge that divide is to try to define the relationship between neural and mental representations. The basic premise of neuroscience is that patterns of neural activity carry some information — they are about something. But not all such patterns need be thought of as representations; many of them are just signals. Simple circuits such as the muscle stretch reflex or the eye-blink reflex, for example, are configured to respond to stimuli such as the lengthening of a muscle or a sudden bright light. But they don’t need to internally represent this information — or make that information available to other parts of the nervous system. They just need to respond to it. More complex information processing, by contrast, such as in our image-forming visual system, requires internal neural representation. By integrating signals from multiple photoreceptors, retinal ganglion cells carry information about patterns of light in the visual stimulus — particularly edges where the illumination changes from light to dark. This information is then made available to the thalamus and the cortical hierarchy, where additional processing goes on to extract higher- and higher-order features of the entire visual scene. Scientists have elucidated the logic of these hierarchical systems by studying the types of stimuli to which neurons are most sensitively tuned, known as “receptive fields.” If some neuron in an early cortical area responds selectively to, say, a vertical line in a certain part of the visual field, the inference is that when such a neuron is active, that is the information that it is representing. 
In this case, it is making that information available to the next level of the visual system — itself just a subsystem of the brain. © 2024 Simons Foundation
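The edge-extraction step described here can be reduced to a toy sketch (my own illustration, not anything from the article): a model retinal ganglion cell integrates signals as its center pixel minus the average of its neighbors, so uniform illumination cancels out and only light-dark boundaries produce a response.

```python
# Toy 1-D "retina": each model ganglion cell computes a center-surround
# response (excitatory center minus the mean of its inhibitory surround).
# Uniform patches cancel to zero; only edges carry a strong signal.

def ganglion_response(intensities, i):
    """Center-surround response at pixel i: center minus mean surround."""
    center = intensities[i]
    surround = (intensities[i - 1] + intensities[i + 1]) / 2
    return center - surround

# A step edge: dark (0.0) on the left, bright (1.0) on the right.
scene = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]

responses = [ganglion_response(scene, i) for i in range(1, len(scene) - 1)]
print(responses)  # [0.0, -0.5, 0.5, 0.0] -- only the boundary cells fire
```

The uniform regions yield zero, which is the sense in which such a cell "represents" an edge rather than raw brightness: the information it passes up the hierarchy is already about where illumination changes.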

Keyword: Consciousness; Vision
Link ID: 29148 - Posted: 02.13.2024

Rob Stein Benjamin Franklin famously wrote: "In this world nothing can be said to be certain, except death and taxes." While that may still be true, there's a controversy simmering today about one of the ways doctors declare people to be dead. The debate is focused on the Uniform Determination of Death Act, a law that was adopted by most states in the 1980s. The law says that death can be declared if someone has experienced "irreversible cessation of all functions of the entire brain." But some parts of the brain can continue to function in people who have been declared brain dead, prompting calls to revise the statute. Many experts say the discrepancy needs to be resolved to protect patients and their families, maintain public trust and reconcile what some see as a troubling disconnect between the law and medical practice. The debate became so contentious, however, that the Uniform Law Commission, the group charged with rewriting model laws for states, paused its process last summer because participants couldn't reach a consensus. "I'm worried," says Thaddeus Pope, a bioethicist and lawyer at Mitchell Hamline School of Law in St. Paul, Minnesota. "There's a lot of conflict at the bedside over this at hospitals across the United States. Let's get in front of it and fix it before it becomes a crisis. It's such an important question that everyone needs to be on the same page." The law recognizes two ways to determine death, the first being the irreversible cessation of circulatory and respiratory functions. The second method, brain death, can be declared for people who have sustained catastrophic brain injury causing the permanent cessation of all brain function, such as from a massive traumatic brain injury or massive stroke, but whose hearts are still pumping through the use of ventilators or other artificial forms of life support. © 2024 npr

Keyword: Brain Injury/Concussion; Brain imaging
Link ID: 29147 - Posted: 02.13.2024

By Miryam Naddaf An analysis of around 1,500 blood proteins has identified biomarkers that can be used to predict the risk of developing dementia up to 15 years before diagnosis. The findings, reported today in Nature Aging1, are a step towards a tool that scientists have been in search of for decades: blood tests that can detect Alzheimer’s disease and other forms of dementia at a very early, pre-symptomatic stage. Researchers screened blood samples from more than 50,000 healthy adults in the UK Biobank, 1,417 of whom developed dementia in a 14-year period. They found that high blood levels of four proteins — GFAP, NEFL, GDF15 and LTBP2 — were strongly associated with dementia. “Studies such as this are required if we are to intervene with disease-modifying therapies at the very earliest stage of dementia,” said Amanda Heslegrave, a neuroscientist at University College London, in a statement to the Science Media Centre in London. According to the World Health Organization, more than 55 million people worldwide currently live with dementia. People are often diagnosed only when they notice memory problems or other symptoms. At that point, the disease might have been progressing for years. “Once we diagnose it, it’s almost too late,” says study co-author Jian-Feng Feng, a computational biologist at Fudan University in Shanghai, China. “And it’s impossible to reverse it.” By screening 1,463 proteins in blood samples from 52,645 people, the authors found that increased levels of GFAP, NEFL, GDF15 and LTBP2 were associated with dementia and Alzheimer’s disease. For some participants who developed dementia, blood levels of these proteins were outside normal ranges more than ten years before symptom onset. © 2024 Springer Nature Limited

Keyword: Alzheimers
Link ID: 29146 - Posted: 02.13.2024

By Claudia López Lloreda By squirting cells from a 3D printer, researchers have created tissue that looks—and acts—like a chunk of brain. In recent years, scientists have learned how to load up 3D printers with cells and other scaffolding ingredients to create living tissues, but making realistic brainlike constructs has been a challenge. Now, one team has shown that, by modifying its printing techniques, it can print and combine multiple subtypes of cells that better mimic signaling in the human brain. “It’s remarkable that [the researchers] can replicate” how brain cells work, says Riccardo Levato, a regenerative medicine researcher at Utrecht University who was not involved with the study. “It’s the first demonstration that, with some simple organization [of cells], you can start getting some interesting functional [responses].” The new technology, described last week in Cell Stem Cell, could offer advantages over existing techniques that neuroscientists use to create 3D brain tissues in the lab. One common approach involves using stem cells to grow miniature brainlike blobs called organoids. But researchers can’t control the types of cells or their precise location in these constructs. Each organoid “is unique,” making it difficult to reproduce research results, says neuroscientist Su-Chun Zhang of the University of Wisconsin–Madison, an author of the new study. With the right kind of 3D printing, however, “you can control where different cell types are placed,” says developmental biologist Francis Szele of the University of Oxford. Past studies have used 3D printers to construct brain tissues that allowed researchers to study how the cells matured and made connections, and even integrate printed tissue into mouse brains. But those constructs had limited functionality. And efforts that produced more functional printed tissue used rat cells, not human cells. © 2024 American Association for the Advancement of Science.

Keyword: Development of the Brain; Robotics
Link ID: 29145 - Posted: 02.10.2024

By Simon Makin A new device makes it possible for a person with an amputation to sense temperature with a prosthetic hand. The technology is a step toward prosthetic limbs that restore a full range of senses, improving both their usefulness and acceptance by those who wear them. A team of researchers in Italy and Switzerland attached the device, called "MiniTouch," to the prosthetic hand of a 57-year-old man named Fabrizio, who has an above-the-wrist amputation. In tests, the man could identify cold, cool and hot bottles of liquid with perfect accuracy; tell the difference between plastic, glass and copper significantly better than chance; and sort steel blocks by temperature with around 75 percent accuracy, researchers report February 9 in Med. “It’s important to incorporate these technologies in a way that prosthesis users can actually use to perform functional tasks,” says neuroengineer Luke Osborn of Johns Hopkins University Applied Physics Laboratory in Laurel, Md., who was not involved in the study. “Introducing new sensory feedback modalities could help give users more functionality they weren’t able to achieve before.” The device also improved Fabrizio’s ability to tell whether he was touching an artificial or human arm. His accuracy was 80 percent with the device turned on, compared with 60 percent with it off. “It’s not quite as good as with the intact hand, probably because we’re not giving [information about] skin textures,” says neuroengineer Solaiman Shokur of EPFL, the Swiss Federal Institute of Technology in Lausanne. © Society for Science & the Public 2000–2024.

Keyword: Pain & Touch
Link ID: 29144 - Posted: 02.10.2024

By Kristin Kiesel and Richard J. Sexton Many public health advocates and scholars see sugar-sweetened-beverage taxes (often simply called soda taxes) as key to reducing obesity and its adverse health effects. But a careful look at the data challenges this view. We reviewed close to 100 studies that have analyzed current taxes in more than 50 countries and conducted our own research on the effectiveness of soda taxes in the US. There is no conclusive evidence that soda taxes have reduced how much sugar or calories people consume in any meaningful way. Soda taxes alone simply cannot nudge consumers toward healthier food choices. The World Health Organization estimates that more than 17 million people die prematurely each year from chronic noncommunicable diseases. Being overweight or obese is a major risk factor for many of these conditions, including type 2 diabetes, cardiovascular diseases, asthma and several types of cancer. A widely publicized 2019 Lancet Commission report pegged annual obesity-related health-care costs and economic productivity losses at $2 trillion, about 3 percent of the global gross domestic product. Consuming large amounts of added sugars is a key part of this problem. A single 12-ounce can of soda can have more than 10 teaspoons of sugar; drinking just one exceeds the American Heart Association’s recommended daily limits on added sugars. It is easy to see why reducing soda consumption has been a popular target in the war against obesity. One would think that taxing sodas would raise their prices and discourage consumers from purchasing them. With this idea in mind, a wave of taxes has been slapped on sugar-sweetened beverages across the world. For example, cities in California’s Bay Area have imposed a tax of 1 cent per ounce on sugary beverages (a seemingly large price increase given soda’s cost of about 5 cents per ounce in the western US).

Keyword: Obesity
Link ID: 29143 - Posted: 02.10.2024

By Benjamin Breen When I began researching Tripping on Utopia in 2018, I was aware that many midcentury scientists and psychiatrists had shown a keen interest in the promise of psychedelics. But what I didn’t realize was how remarkably broad-based this interest was. As I dug deeper into the archival record, I was struck by the public enthusiasm for the use of substances like LSD and mescaline in therapy—as manifested not just in scientific studies, but in newspaper articles and even television specials. (My favorite is this remarkable 1957 broadcast which shows a woman taking LSD on camera, then uttering memorable lines like “I’ve never seen such infinite beauty in my life” and “I wish I could talk in Technicolor.”) Above all, I was surprised by the public response to the Hollywood actor Cary Grant’s reveal that he was regularly using LSD in psychedelic therapy sessions. In a series of interviews starting in 1959—the same year he starred in North by Northwest—Grant went public as an unlikely advocate for psychedelic therapy. It was the surprisingly positive reaction to Grant’s endorsement that most struck me. As recounted in my book, the journalist who broke the story was overwhelmed by phone calls and letters. “Psychiatrists called, complaining that their patients were now begging them for LSD,” he remembered. “Every actor in town under analysis wanted it.” Nor was this first wave of legal psychedelic therapy restricted to Hollywood. Two other very prominent advocates of psychedelic therapy in the late 1950s were former Congresswoman Clare Boothe Luce and her husband Henry Luce, the founder of Time and Life magazines. It is not an exaggeration to say that this married couple dominated the media landscape of the 20th century. Nor is it an exaggeration to say that psychedelics profoundly influenced Clare Boothe Luce’s life in the late 1950s. 
She credited LSD with transformative insights that helped her to overcome lasting trauma associated with her abusive childhood and the death of her only daughter in a car accident. © 2024 NautilusNext Inc.,

Keyword: Drug Abuse; Consciousness
Link ID: 29142 - Posted: 02.10.2024

Rhitu Chatterjee In recent years, there's been growing interest in psilocybin, the psychoactive ingredient in "magic mushrooms" or "shrooms" as a potentially beneficial therapy for mental health conditions. At the same time, drug busts of mushrooms went way up between 2017 and 2022, and the amount of the psychedelic substance seized by law enforcement more than tripled, according to a new study. "What I think the results indicate is that shroom availability has likely been increasing," says Joseph Palamar, an epidemiologist at NYU Langone Health and the main author of the new study published in the journal Drug and Alcohol Dependence. The findings come at a time when there's a "psychedelic renaissance" happening in the country, says Dr. Joshua Siegel of Washington University in St. Louis, who wasn't involved in the new study. There's growing public and scientific interest in psychedelics' potential therapeutic effects on various mental and behavioral health issues, says Siegel, who also studies how psychedelics affect the human brain. At the same time, a small number of states have already decriminalized psychedelic drugs, and many more are looking into doing the same. The new study is "an important part of the bigger picture of where we are headed as a nation" with psychedelics, says Siegel. "It's important to understand what's happening in terms of the health care side of things. It's important to understand what's happening recreationally and legally." The new study found that the total amount of mushrooms seized by law enforcement across the country went from nearly 500 pounds in 2017 to more than 1,800 pounds in 2022. The largest amount (42.6% of total) seized was in the West, followed closely by the Midwest (41.8%). © 2024 npr

Keyword: Drug Abuse; Depression
Link ID: 29141 - Posted: 02.08.2024

By Matt Richtel For decades, eating disorders were thought to afflict mostly, if not exclusively, women and girls. In fact, until 2013, the loss of menstruation had long been considered an official symptom of anorexia nervosa. Over the last decade, however, health experts have increasingly recognized that boys and men also suffer from eating disorders, and they have gained a better understanding of how differently the illness presents in that group. A small but growing body of scientists and physicians have dedicated themselves to identifying the problem, assessing its scope and developing treatments. Recently, two of these experts spoke to The New York Times about how the disease is affecting adolescent boys, what symptoms and behaviors parents should look for, and which treatments to consider. Dr. Jason Nagata is a pediatrician at the University of California, San Francisco, who specializes in eating disorders; he is senior editor of the Journal of Eating Disorders and editor of the book “Eating Disorders in Boys and Men.” Dr. Sarah Smith is a child and adolescent psychiatrist at the University of Toronto who specializes in eating disorders; she was the lead author on a study published in JAMA Open Network in December that showed sharp increases in the rates of hospitalizations for boys with eating disorders. The medical and scientific understanding of eating disorders is changing and expanding. What happened? Dr. Smith: Historically, eating disorders have been conceptualized mostly as anorexia, which has been portrayed as an illness of adolescent females who want to lose weight for aesthetic reasons. Dr. Nagata: There’s increasing recognition, particularly in the last decade or so, that some people with body image dissatisfaction are not trying to lose weight at all. Some men and boys are trying to become large and muscular. In fact, one-third of teenage boys across the United States report that they’re trying to bulk up and get more muscular. 
And a subset of those may develop eating disorders or muscle dysmorphia that can lead to significant psychological distress and physical health complications. © 2024 The New York Times Company

Keyword: Anorexia & Bulimia; Sexual Behavior
Link ID: 29140 - Posted: 02.08.2024

By Harriet Brown The minute I saw the headline — “Should Patients Be Allowed to Die From Anorexia?” — in The New York Times Magazine last month, my heart sank. Over the last two years, more and more psychiatrists have floated the idea that it’s OK to stop trying to cure some people with anorexia nervosa. People with anorexia develop a deep fear of food and eating, often losing so much weight that they die of starvation or complications. About 20 percent die by suicide. Anorexia is a terrible disease, one that inflicts maximum pain on the person diagnosed and their families and friends. The suffering is continuous and intense, and it gets even worse during recovery. Our family experienced this for eight years. Having to watch my daughter suffer made me realize that anorexia is not a choice or a question of vanity but a tsunami of fear and anxiety that makes one of the most basic human acts, the act of eating, as terrifying as jumping out of a plane without a parachute. It usually takes years of steady, consistent, calorie-dense eating to fully heal the body and brain of a person with anorexia. Without the right kind of support and treatment, it’s nearly impossible. So when psychiatrists suggest that maybe some people can’t recover and should be allowed to stop trying, they’re sidestepping their own responsibility. What they should be saying instead is that current views on and treatments of anorexia are abysmal, and medicine needs to do better. If you’ve never experienced anorexia firsthand, consider yourself blessed. Anorexia has one of the highest mortality rates of any psychiatric illness. People with anorexia are 18 times as likely to die from suicide as their peers. Fewer than half of those with anorexia make a full recovery.

Keyword: Anorexia & Bulimia
Link ID: 29139 - Posted: 02.08.2024

Ian Sample Science editor After a decades-long and largely fruitless hunt for drugs to combat Alzheimer’s disease, an unlikely candidate has raised its head: the erectile dysfunction pill Viagra. Researchers found that men who were prescribed Viagra and similar medications were 18% less likely to develop the most common form of dementia years later than those who went without the drugs. The effect was strongest in men with the most prescriptions, with scientists finding a 44% lower risk of Alzheimer’s in those who received 21 to 50 prescriptions of the erectile dysfunction pills over the course of their study. While the findings are striking, the observational study cannot determine whether Viagra and similar pills protect against Alzheimer’s or whether men who are already less prone to the condition are simply more likely to use the tablets. “We can’t say that the drugs are responsible, but this does give us food for thought on how we move into the future,” said the lead author Dr Ruth Brauer at University College London. “We now need a proper clinical trial to look at the effects of these drugs on Alzheimer’s in women as well as men.” Brauer and her colleagues analysed medical records for more than 260,000 men who were diagnosed with erectile dysfunction but had no evidence of memory or thinking problems. Just over half were taking PDE5 inhibitor drugs, including sildenafil (sold as Viagra), avanafil, vardenafil and tadalafil. The men were followed for an average of five years to record any new cases of Alzheimer’s. © 2024 Guardian News & Media Limited

Keyword: Alzheimers
Link ID: 29138 - Posted: 02.08.2024

By Lisa Sanders, M.D. “We were thinking about going bowling with the kids tomorrow,” the woman told her 43-year-old brother as they settled into their accustomed spots in the living room of their mother’s home in Chicago. It was late — nearly midnight — and he had arrived from Michigan to spend the days between Christmas and New Year’s with this part of his family. She and her husband and her brother grew up together and spent many late nights laughing and talking. She knew her brother was passionate about bowling. He had spent almost every day in his local alley two summers ago. So she was taken by surprise when he answered, “I can’t do that anymore.” Certainly, her brother had had a tough year. It seemed to start with his terrible heartburn. For most of his life, he had what he described as run-of-the-mill heartburn, usually triggered by eating late at night, and he would have to take a couple of antacid tablets. But that year his heartburn went ballistic. His mouth always tasted like metal. And the reflux of food back up the esophagus would get so bad that it would make him vomit. Nothing seemed to help. He quit drinking coffee. Quit drinking alcohol. Stopped eating spicy foods. He told his doctor, who started him on a medication known as a proton pump inhibitor (P.P.I.) to reduce the acid or excess protons his stomach made. That pill provided relief from the burning pain. But he still had the metallic taste in his mouth, still felt sick after eating. He still vomited several times a week. When he discovered that he wouldn’t throw up when he drank smoothies, he almost completely gave up solid foods. When he was still feeling awful after weeks on the P.P.I., his gastroenterologist used a tiny camera to take a look at his esophagus. His stomach looked fine, but the region where the esophagus entered the stomach was a mess. Normally the swallowing tube ends with a tight sphincter that stays closed to protect delicate tissue from the harsh acid of the stomach. 
It opens when swallowing, to let the food pass. But his swallowing tube was wide open and the tissue around the sphincter was red and swollen. © 2024 The New York Times Company

Keyword: Hearing
Link ID: 29137 - Posted: 02.08.2024

Nicholas J. Kelley In the middle of 2023, a study conducted by the HuthLab at the University of Texas sent shockwaves through the realms of neuroscience and technology. For the first time, the thoughts and impressions of people unable to communicate with the outside world were translated into continuous natural language, using a combination of artificial intelligence (AI) and brain imaging technology. This is the closest science has yet come to reading someone’s mind. While advances in neuroimaging over the past two decades have enabled non-responsive and minimally conscious patients to control a computer cursor with their brain, HuthLab’s research is a significant step closer towards accessing people’s actual thoughts. As Alexander Huth, the neuroscientist who co-led the research, explained to the New York Times, the team combined AI and brain-scanning technology to create a non-invasive brain decoder capable of reconstructing continuous natural language among people otherwise unable to communicate with the outside world. The development of such technology – and the parallel development of brain-controlled motor prosthetics that enable paralysed patients to achieve some renewed mobility – holds tremendous prospects for people suffering from neurological diseases including locked-in syndrome and quadriplegia. In the longer term, this could lead to wider public applications such as Fitbit-style health monitors for the brain and brain-controlled smartphones. On January 29, Elon Musk announced that his Neuralink tech startup had implanted a chip in a human brain for the first time. He had previously told followers that Neuralink’s first product, Telepathy, would one day allow people to control their phones or computers “just by thinking”. © 2010–2024, The Conversation US, Inc.

Keyword: Brain imaging
Link ID: 29136 - Posted: 02.08.2024

By Nora Bradford Whenever you’re actively performing a task — say, lifting weights at the gym or taking a hard exam — the parts of your brain required to carry it out become “active” when neurons step up their electrical activity. But is your brain active even when you’re zoning out on the couch? The answer, researchers have found, is yes. Over the past two decades they’ve defined what’s known as the default mode network, a collection of seemingly unrelated areas of the brain that activate when you’re not doing much at all. Its discovery has offered insights into how the brain functions outside of well-defined tasks and has also prompted research into the role of brain networks — not just brain regions — in managing our internal experience. In the late 20th century, neuroscientists began using new techniques to take images of people’s brains as they performed tasks in scanning machines. As expected, activity in certain brain areas increased during tasks — and to the researchers’ surprise, activity in other brain areas declined simultaneously. The neuroscientists were intrigued that during a wide variety of tasks, the very same brain areas consistently dialed back their activity. It was as if these areas had been active when the person wasn’t doing anything, and then turned off when the mind had to concentrate on something external. Researchers called these areas “task negative.” When they were first identified, Marcus Raichle, a neurologist at the Washington University School of Medicine in St. Louis, suspected that these task-negative areas play an important role in the resting mind. “This raised the question of ‘What’s baseline brain activity?’” Raichle recalled. In an experiment, he asked people in scanners to close their eyes and simply let their minds wander while he measured their brain activity. All Rights Reserved © 2024

Keyword: Attention; Consciousness
Link ID: 29135 - Posted: 02.06.2024

By David Marchese Our memories form the bedrock of who we are. Those recollections, in turn, are built on one very simple assumption: This happened. But things are not quite so simple. “We update our memories through the act of remembering,” says Charan Ranganath, a professor of psychology and neuroscience at the University of California, Davis, and the author of the illuminating new book “Why We Remember.” “So it creates all these weird biases and infiltrates our decision making. It affects our sense of who we are.” Rather than being photo-accurate repositories of past experience, Ranganath argues, our memories function more like active interpreters, working to help us navigate the present and future. The implication is that who we are, and the memories we draw on to determine that, are far less fixed than you might think. “Our identities,” Ranganath says, “are built on shifting sand.” What is the most common misconception about memory? People believe that memory should be effortless, but their expectations for how much they should remember are totally out of whack with how much they’re capable of remembering. Another misconception is that memory is supposed to be an archive of the past. We expect that we should be able to replay the past like a movie in our heads. The problem with that assumption is that we don’t replay the past as it happened; we do it through a lens of interpretation and imagination. It’s exceptionally hard to answer the question of how much we can remember. What I’ll say is that we can remember an extraordinary amount of detail that would make you feel at times as if you have a photographic memory. We’re capable of these extraordinary feats. I would argue that we’re all everyday-memory experts, because we have this exceptional semantic memory (the memory of facts and knowledge about the world), which is the scaffold for episodic memory.
I know it sounds squirmy to say, “Well, I can’t answer the question of how much we remember,” but I don’t want readers to walk away thinking memory is all made up. © 2024 The New York Times Company

Keyword: Learning & Memory
Link ID: 29134 - Posted: 02.06.2024

By Shruti Ravindran When preparing to become a butterfly, the Eastern Black Swallowtail caterpillar wraps its bright striped body within a leaf. This leaf is its sanctuary, where it will weave its chrysalis. So when the leaf is disturbed by a would-be predator—a bird or insect—the caterpillar stirs into motion, briefly darting out a pair of fleshy, smelly horns. To humans, these horns might appear yellow—a color known to attract birds and many insects—but from a predator’s-eye-view, they appear a livid, almost neon violet, a color of warning and poison for some birds and insects. “It’s like a jump scare,” says Daniel Hanley, an assistant professor of biology at George Mason University. “Startle them enough, and all you need is a second to get away.” Hanley is part of a team that has developed a new technique to depict on video how the natural world looks to non-human species. The method is meant to capture how animals use color in unique—and often fleeting—behaviors like the caterpillar’s anti-predator display. Most animals, birds, and insects possess their own ways of seeing, shaped by the light receptors in their eyes. Human retinas, for example, are sensitive to three wavelengths of light—blue, green, and red—which enables us to see approximately 1 million different hues in our environment. By contrast, many mammals, including dogs, cats, and cows, sense only two wavelengths. But birds, fish, amphibians, and some insects and reptiles typically can sense four—including ultraviolet light. Their worlds are drenched in a kaleidoscope of color—they can often see 100 times as many shades as humans do. Hanley’s team, which includes not just biologists but multiple mathematicians, a physicist, an engineer, and a filmmaker, claims that their method can translate the colors and gradations of light perceived by hundreds of animals to a range of frequencies that human eyes can comprehend with an accuracy of roughly 90 percent. 
That is, they can simulate the way a scene in a natural environment might look to a particular species of animal, and which shifting shapes and objects might stand out most. The team uses commercially available cameras to record video in four color channels—blue, green, red, and ultraviolet—and then applies open source software to translate the picture according to the mix of light receptor sensitivities a given animal may have. © 2024 NautilusNext Inc.
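The translation step described here, weighting each of the four camera channels by a species’ receptor sensitivities and remapping the result into colors humans can see, amounts to a per-pixel linear transform. A minimal sketch in Python, with an invented weight matrix for illustration (real weights would be derived from measured photoreceptor curves, and the team’s actual software is more sophisticated):

```python
# Hypothetical weights mapping camera channels (UV, blue, green, red)
# to a display RGB "false color" pixel. Each row is one output channel.
BIRD_TO_RGB = [
    (0.7, 0.0, 0.1, 0.2),  # display red, weighted toward UV
    (0.0, 0.2, 0.7, 0.1),  # display green
    (0.1, 0.7, 0.1, 0.1),  # display blue
]

def false_color_pixel(uv, blue, green, red):
    """Map one four-channel sample to a display RGB triple in [0, 1]."""
    sample = (uv, blue, green, red)
    return tuple(
        min(1.0, max(0.0, sum(w * c for w, c in zip(row, sample))))
        for row in BIRD_TO_RGB
    )

# A pure-ultraviolet sample: invisible to humans, but remapped
# here into a channel we can see.
print(false_color_pixel(1.0, 0.0, 0.0, 0.0))  # (0.7, 0.0, 0.1)
```

Applied to every pixel of every frame, a transform like this turns four-channel footage into a video a human viewer can interpret, which is the essence of the pipeline the article describes.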

Keyword: Vision; Evolution
Link ID: 29133 - Posted: 02.06.2024

By Sabrina Malhi Researchers have found a possible link between the common hormone disorder PCOS and cognitive decline later in life. PCOS, which stands for polycystic ovary syndrome, is the most common endocrine disorder among women ages 15 to 44. However, it is often underdiagnosed because many of its symptoms, including abnormal menstrual cycles and excess hair, can be attributed to other causes.

The syndrome was first described in 1935 by American gynecologists Irving F. Stein and Michael L. Leventhal. They published a paper documenting a group of women with lack of periods, excess body hair and enlarged ovaries with multiple cysts. Their work helped identify and characterize PCOS as it is known today. Health experts hypothesize that genetic factors could contribute to the development of the condition, but the exact causes are still unknown. Here’s what to know about PCOS and its potential link to cognitive health.

PCOS is a chronic hormonal disorder characterized by overproduction of androgens, which are typically considered male hormones. High androgen levels can lead to irregular menstrual cycles and fertility issues when excessively produced in women. In the United States, 6 to 12 percent of people assigned female at birth who are of reproductive age are affected by PCOS, according to data from the Centers for Disease Control and Prevention.

The condition is associated with an increased risk of obesity, high blood pressure, high cholesterol and endometrial cancer. PCOS is also often linked to insulin resistance, which can result in elevated blood sugar levels and an escalated risk of Type 2 diabetes. The condition can contribute to various metabolic issues, including high blood pressure, excess abdominal fat, and abnormal cholesterol or triglyceride levels. People with PCOS face an elevated risk of developing cardiovascular problems, such as high blood pressure, high cholesterol levels and an increased risk of heart disease.
A recent study in the journal Neurology found that people with PCOS scored lower than average on a suite of cognitive tests.

Keyword: Hormones & Behavior; Learning & Memory
Link ID: 29132 - Posted: 02.06.2024

By Ashley Juavinett In the 2010 award-winning film “Inception,” Leonardo DiCaprio’s character and others run around multiple layers of someone’s consciousness, trying to implant an idea in the person’s mind. If you can plant something deep enough, the film suggests, you can make them believe it is their own idea. The film was billed as science fiction, but three years later, in 2013, researchers actually did this — in a mouse, at least.

The work focused on the hippocampus, along with its closely interconnected structures, long recognized by scientists to hold our dearest memories. If you damage significant portions of just one region of your hippocampus, the dentate gyrus, you’ll lose the ability to form new memories. How these memories are stored, however, is still up for debate. One early but persistent idea posits that enduring changes in our neural circuitry, or “engrams,” may represent the physical traces of specific memories. An engram is sometimes thought of as a group of cells, along with their synaptic weights and connections throughout the brain. In sum, the engram is what DiCaprio’s character would have had to discreetly manipulate in his target.

In 2012, a team in Susumu Tonegawa’s lab at the Massachusetts Institute of Technology (MIT) showed that you could mark the cells of a real memory engram and reactivate them later. Taking that work one step further, Steve Ramirez, Xu Liu and others in Tonegawa’s lab demonstrated the following year that you can implant a memory of something that never even happened. In doing so, they turned science fiction into reality, one tiny foot shock at a time.

Published in Science, Ramirez and Liu’s study is a breath of fresh air, scientifically speaking.
The abstract starts with one of the shortest sentences you’ll ever find in a scientific manuscript: “Memories can be unreliable.” The entire paper is extremely readable, and there is no shortage of related papers and review articles that you could give your students to read for additional context. © 2024 Simons Foundation

Keyword: Learning & Memory
Link ID: 29131 - Posted: 02.06.2024

By Ernesto Londoño Seizures of psychedelic mushrooms across the nation by law enforcement officials have increased significantly in recent years as attitudes regarding their use have grown more permissive, according to a government-funded study released Tuesday. Researchers found that law enforcement officials confiscated 844 kilos of mushrooms containing psilocybin in 2022, an increase of 273 percent from 2017. Psilocybin is the psychoactive component in the fungi commonly known as magic mushrooms.

Officials at the National Institute on Drug Abuse, which commissioned the study, said that the increase in seizures of magic mushrooms reflected rising use of the drugs, rather than an indication that counternarcotics officials were pursuing the substances more aggressively than before.

The marketplace for magic mushrooms, which are illegal under federal law, has boomed in recent years as several clinical studies have shown that they may be effective as therapies to treat depression and other serious conditions. But many medical professionals say they worry that the hype surrounding psychedelics has moved faster than the science.

Dr. Nora Volkow, the director of the N.I.D.A., said that preliminary clinical studies had shown that psychedelics might one day become an important tool for the treatment of psychiatric disorders, including addiction to other drugs. But she said she worried that many people were self-medicating with psychedelics. “Psychedelic drugs have been promoted as a potential cure for many health conditions without adequate research to support these claims,” Dr. Volkow said. “There are people who are very desperate for mental health care, and there are businesses that are very eager to make money by marketing substances as treatments or cures.” © 2024 The New York Times Company
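As a quick check of the arithmetic, the reported figures imply a 2017 baseline of roughly 226 kilograms (a back-of-envelope sketch; the article itself does not state the 2017 quantity):

```python
# Back out the implied 2017 baseline from the article's figures:
# 844 kg seized in 2022, described as a 273 percent increase over 2017.
seized_2022_kg = 844
percent_increase = 273

# A 273% increase means the 2022 figure is 3.73x the 2017 figure.
baseline_2017_kg = seized_2022_kg / (1 + percent_increase / 100)
print(round(baseline_2017_kg))  # 226
```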

Keyword: Drug Abuse; Depression
Link ID: 29130 - Posted: 02.06.2024