Most Recent Links


Links 1 - 20 of 28323

By Sonia Shah Can a mouse learn a new song? Such a question might seem whimsical. Though humans have lived alongside mice for at least 15,000 years, few of us have ever heard mice sing, because they do so in frequencies beyond the range detectable by human hearing. As pups, their high-pitched songs alert their mothers to their whereabouts; as adults, they sing in ultrasound to woo one another. For decades, researchers considered mouse songs instinctual, the fixed tunes of a windup music box, rather than the mutable expressions of individual minds. But no one had tested whether that was really true. In 2012, a team of neurobiologists at Duke University, led by Erich Jarvis, a neuroscientist who studies vocal learning, designed an experiment to find out. The team surgically deafened five mice and recorded their songs in a mouse-size sound studio, tricked out with infrared cameras and microphones. They then compared sonograms of the songs of deafened mice with those of hearing mice. If the mouse songs were innate, as long presumed, the surgical alteration would make no difference at all. Jarvis and his researchers slowed down the tempo and shifted the pitch of the recordings, so that they could hear the songs with their own ears. Those of the intact mice sounded “remarkably similar to some bird songs,” Jarvis wrote in a 2013 paper that described the experiment, with whistlelike syllables similar to those in the songs of canaries and the trills of dolphins. Not so the songs of the deafened mice: Deprived of auditory feedback, their songs became degraded, rendering them nearly unrecognizable. They sounded, the scientists noted, like “squawks and screams.” Not only did the tunes of a mouse depend on its ability to hear itself and others, but also, as the team found in another experiment, a male mouse could alter the pitch of its song to compete with other male mice for female attention. Inside these murine skills lay clues to a puzzle many have called “the hardest problem in science”: the origins of language. In humans, “vocal learning” is understood as a skill critical to spoken language. Researchers had already discovered the capacity for vocal learning in species other than humans, including in songbirds, hummingbirds, parrots, cetaceans such as dolphins and whales, pinnipeds such as seals, elephants and bats. But given the centuries-old idea that a deep chasm separated human language from animal communications, most scientists understood the vocal learning abilities of other species as unrelated to our own — as evolutionarily divergent as the wing of a bat is to that of a bee. The apparent absence of intermediate forms of language — say, a talking animal — left the question of how language evolved resistant to empirical inquiry. © 2023 The New York Times Company

Keyword: Language; Animal Communication
Link ID: 28921 - Posted: 09.21.2023

COMIC: When, why and how did neurons first evolve? Scientists are piecing together the ancient story. By Tim Vernimmen; illustrated by Maki Naro. 09.14.2023 © 2023 Annual Reviews

Keyword: Evolution; Development of the Brain
Link ID: 28920 - Posted: 09.21.2023

Hannah Devlin Science correspondent The brain circuit that causes the sound of a newborn crying to trigger the release of breast milk in mothers has been uncovered by scientists. The study, in mice, gives fresh insights into sophisticated changes that occur in the brain during pregnancy and parenthood. It found that 30 seconds of continuous crying by mouse pups triggered the release of oxytocin, the brain chemical that controls the breast-milk release response in mothers. “Our findings uncover how a crying infant primes its mother’s brain to ready her body for nursing,” said Habon Issa, a graduate student at NYU Langone Health and co-author of the study. “Without such preparation, there can be a delay of several minutes between suckling and milk flow, potentially leading to a frustrated baby and stressed parent.” The study showed that once prompted, the surge of hormones continued for roughly five minutes before tapering off, enabling mouse mothers to feed their young until they were sated or began crying again. The observation that a mother’s breasts can leak milk when she hears a crying baby is not new. But the latest research is the first to identify the brain mechanisms behind what the scientists described as the “wail-to-milk pipeline”, and could pave the way for a better understanding of the challenges of breastfeeding for many women. The findings, published in Nature, showed that when a mouse pup starts crying, sound information travels to an area of its mother’s brain called the posterior intralaminar nucleus of the thalamus (PIL). This sensory hub then sends signals to oxytocin-releasing brain cells (neurons) in another region called the hypothalamus. Most of the time these hypothalamus neurons are “locked down” to prevent false alarms and wasted milk. However, after 30 seconds of continuous crying, signals from the PIL build up and overpower the in-built inhibitory mechanism, setting off oxytocin release. © 2023 Guardian News & Media Limited

Keyword: Sexual Behavior; Hormones & Behavior
Link ID: 28919 - Posted: 09.21.2023

Mariana Lenharo A letter, signed by 124 scholars and posted online last week, has caused an uproar in the consciousness research community. It claims that a prominent theory describing what makes someone or something conscious — called the integrated information theory (IIT) — should be labelled “pseudoscience”. Since its publication on 15 September in the preprint repository PsyArXiv, the letter has some researchers arguing over the label and others worried it will increase polarization in a field that has grappled with issues of credibility in the past. “I think it’s inflammatory to describe IIT as pseudoscience,” says neuroscientist Anil Seth, director of the Centre for Consciousness Science at the University of Sussex near Brighton, UK, adding that he disagrees with the label. “IIT is a theory, of course, and therefore may be empirically wrong,” says neuroscientist Christof Koch, a meritorious investigator at the Allen Institute for Brain Science in Seattle, Washington, and a proponent of the theory. But he says that it makes its assumptions — for example, that consciousness has a physical basis and can be mathematically measured — very clear. There are dozens of theories that seek to understand consciousness — everything that a human or non-human experiences, including what they feel, see and hear — as well as its underlying neural foundations. IIT has often been described as one of the central theories, alongside others, such as global neuronal workspace theory (GNW), higher-order thought theory and recurrent processing theory. It proposes that consciousness emerges from the way information is processed within a ‘system’ (for instance, networks of neurons or computer circuits), and that systems that are more interconnected, or integrated, have higher levels of consciousness. Hakwan Lau, a neuroscientist at the RIKEN Center for Brain Science in Wako, Japan, and one of the authors of the letter, says that some researchers in the consciousness field are uncomfortable with what they perceive as a discrepancy between IIT’s scientific merit and the considerable attention it receives from the popular media because of how it is promoted by advocates. “Has IIT become a leading theory because of academic acceptance first, or is it because of the popular noise that kind of forced the academics to give it acknowledgement?” Lau asks. © 2023 Springer Nature Limited

Keyword: Consciousness
Link ID: 28918 - Posted: 09.21.2023

Kimberlee D'Ardenne Dopamine seems to be having a moment in the zeitgeist. You may have read about it in the news, seen viral social media posts about “dopamine hacking” or listened to podcasts about how to harness what this molecule is doing in your brain to improve your mood and productivity. But recent neuroscience research suggests that popular strategies to control dopamine are based on an overly narrow view of how it functions. Dopamine is one of the brain’s neurotransmitters – tiny molecules that act as messengers between neurons. It is known for its role in tracking your reaction to rewards such as food, sex, money or answering a question correctly. There are many kinds of dopamine neurons located in the uppermost region of the brainstem that manufacture and release dopamine throughout the brain. Whether neuron type affects the function of the dopamine it produces has been an open question. Recently published research reports a relationship between neuron type and dopamine function, and one type of dopamine neuron has an unexpected function that will likely reshape how scientists, clinicians and the public understand this neurotransmitter. Dopamine is involved with more than just pleasure. Dopamine neuron firing: Dopamine is famous for the role it plays in reward processing, an idea that dates back at least 50 years. Dopamine neurons monitor the difference between the rewards you thought you would get from a behavior and what you actually got. Neuroscientists call this difference a reward prediction error. Eating dinner at a restaurant that just opened and looks likely to be nothing special shows reward prediction errors in action. If your meal is very good, that results in a positive reward prediction error, and you are likely to return and order the same meal in the future. Each time you return, the reward prediction error shrinks until it eventually reaches zero when you fully expect a delicious dinner. But if your first meal was terrible, that results in a negative reward prediction error, and you probably won’t go back to the restaurant. Dopamine neurons communicate reward prediction errors to the brain through their firing rates and patterns of dopamine release, which the brain uses for learning. They fire in two ways. © 2010–2023, The Conversation US, Inc.
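The reward prediction error described above is, at bottom, simple arithmetic: the error is the actual reward minus the expected reward, and the expectation is nudged toward the outcome after each experience. The short sketch below illustrates that textbook update rule for the restaurant example; it is not code from the research discussed here, and the learning rate and reward values are invented for illustration.

```python
# Minimal sketch of a reward prediction error (RPE) for the restaurant example above.
# Illustrative only: the learning rate and reward values are made-up numbers.

def update_expectation(expected, actual, learning_rate=0.5):
    """Return the prediction error and the updated reward expectation."""
    prediction_error = actual - expected              # positive means better than expected
    new_expectation = expected + learning_rate * prediction_error
    return prediction_error, new_expectation

expected_reward = 0.0   # the new restaurant "looks likely to be nothing special"
actual_reward = 1.0     # the meal turns out to be very good

for visit in range(1, 6):
    error, expected_reward = update_expectation(expected_reward, actual_reward)
    print(f"Visit {visit}: prediction error = {error:.2f}, expectation = {expected_reward:.2f}")

# The error starts large and positive and shrinks toward zero as the diner comes to
# fully expect a delicious dinner; a terrible first meal (actual below expected)
# would produce a negative error instead, discouraging a return visit.
```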

Keyword: Drug Abuse; Learning & Memory
Link ID: 28917 - Posted: 09.21.2023

By Jim Crotty The opioid crisis continues to rage across the U.S., but there are some positive, if modest, signs that it may be slowing. Overdose deaths due to opioids are flattening in many places and dropping in others, awareness of the dangers of opioid abuse continues to increase, and more than $50 billion in opioid settlement funds are finally making their way to state and local governments after years of delay. There is still much work to be done, but all public health emergencies eventually subside. Then what? First, it’s important to realize that synthetic opioids like fentanyl will never fully disappear from the drug supply. They are too potent, too addictive, and perhaps most importantly, too lucrative. Opioids, like Covid-19, are here to stay, consistently circulating in the community but at more manageable levels. More alarming is what may take its place. Since 2010, overdoses involving both stimulants and fentanyl have increased 50-fold. Experts suggest this dramatic rise in polysubstance use represents a “fourth wave” in the opioid crisis, but what if it is really the start of a new wave of an emerging stimulant crisis? Substance abuse tends to move in cycles. Periods with high rates of depressant drug use (like opioids) are almost always followed by ones with high rates of stimulant drug use (like methamphetamine and cocaine), and vice versa. The heroin crisis of the 1960s and 1970s was followed by the crack epidemic of the 1980s and 1990s, which gave way to the current opioid epidemic. As the think tank scholar Charles Fain Lehman quipped, “As with fashion, so with drugs — whatever the last generation did, the next generation tends to abhor.” The difference now is the primacy of synthetic drugs — that is, illicit substances created in a lab that are designed to mimic the effects of naturally occurring drugs.

Keyword: Drug Abuse
Link ID: 28916 - Posted: 09.21.2023

By Janet Lee Doing puzzles, playing memory-boosting games, taking classes and reading are activities that we often turn to for help keeping our brains sharp. But research is showing that what you eat, how often you exercise and the type of exercise you do can help lower your risk of dementia to a greater extent than previously thought. Although more studies are needed, “there’s a lot of data that suggests exercise and diet are good for the brain and can prevent or help slow down” cognitive changes, says Jeffrey Burns, co-director of the University of Kansas Alzheimer’s Disease Research Center in Fairway. And living a healthy lifestyle can produce brain benefits no matter what your age. The big diet picture: If you’re already eating in a way that protects your heart — plenty of whole grains, vegetables, and fruit, and little saturated fat, sodium and ultra-processed “junk” foods — there’s good news: You’re also protecting your brain. A healthy cardiovascular system keeps blood vessels open, allowing good blood flow to the brain and reducing the risk of high blood pressure, stroke and dementia. Research suggests that two specific dietary approaches — the Mediterranean diet and the MIND diet (the Mediterranean-DASH Intervention for Neurodegenerative Delay, essentially a combo of two heart-healthy eating plans) — may help stave off cognitive decline. Both diets rely on eating mostly plant foods (fruits, vegetables, whole grains, beans, nuts), olive oil, fish and poultry. The main difference between the two is that the MIND diet emphasizes specific fruits and vegetables, such as berries and leafy greens. Studies show that people who most closely follow either diet have a reduced risk of dementia compared with those who don’t. For example, people eating the Mediterranean way had a 23 percent lower risk of dementia in a nine-year study of more than 60,000 men and women published this year in BMC Medicine.

Keyword: Alzheimers
Link ID: 28915 - Posted: 09.21.2023

By Gina Kolata Tucker Marr’s life changed forever last October. He was on his way to a wedding reception when he fell down a steep flight of metal stairs, banging the right side of his head so hard he went into a coma. He’d fractured his skull, and a large blood clot formed on the left side of his head. Surgeons had to remove a large chunk of his skull to relieve pressure on his brain and to remove the clot. “Getting a piece of my skull taken out was crazy to me,” Mr. Marr said. “I almost felt like I’d lost a piece of me.” But what seemed even crazier to him was the way that piece was restored. Mr. Marr, a 27-year-old analyst at Deloitte, became part of a new development in neurosurgery. Instead of remaining without a piece of skull or getting the old bone put back, a procedure that is expensive and has a high rate of infection, he got a prosthetic piece of skull made with a 3-D printer. But it is not the typical prosthesis used in such cases. His prosthesis, which is covered by his skin, is embedded with an acrylic window that would let doctors peer into his brain with ultrasound. A few medical centers are offering such acrylic windows to patients who had to have a piece of skull removed to treat conditions like a brain injury, a tumor, a brain bleed or hydrocephalus. “It’s very cool,” Dr. Michael Lev, director of emergency radiology at Massachusetts General Hospital, said. But, “it is still early days,” he added. Advocates of the technique say that if a patient with such a window has a headache or a seizure or needs a scan to see if a tumor is growing, a doctor can slide an ultrasound probe on the patient’s head and look at the brain in the office. © 2023 The New York Times Company

Keyword: Brain imaging; Brain Injury/Concussion
Link ID: 28914 - Posted: 09.16.2023

By Kenneth S. Kosik Before our evolutionary ancestors had a brain—before they had any organs—18 different cell types got together to make a sea sponge. Remarkably, some of these cells had many of the genes needed to make a brain, even though the sponge has neither neurons nor a brain. In my neuroscience lab at the University of California, Santa Barbara, my colleagues and collaborators discovered this large repository of brain genes in the sponge. Ever since, we have asked ourselves why this ancient, porous blob of cells would contain a set of neural genes in the absence of a nervous system. What was evolution up to? Sea sponges first show up in the fossil record about 600 million years ago. They live at the bottom of the ocean and are immobile, passive feeders. In fact, early biologists thought they were plants. Often encased by a hard exterior, a row of cells borders a watery center. Each cell has a tiny cilium that gently circulates a rich flow of microorganisms on which the sponge feeds. This seemingly simple organization belies a giant step in evolution. For the previous 3 billion years, single-celled creatures inhabited the planet. In one of evolution’s most creative acts, independent cells joined together, first into a colony and later into a truly inseparable multicellular organism. Colonies of single cells offered the first inkling that not every cell in the colony had to be identical. Cells in the interior might differ subtly from those on the periphery that are subject to the whims of the environment. Colonies offered the advantages of cooperation among many nearly identical cells. The next evolutionary innovation, multicellularity, broke radically from the past. © 2023 NautilusNext Inc.

Keyword: Evolution
Link ID: 28913 - Posted: 09.16.2023

By Darren Incorvaia By now, it’s no secret that the phrase “bird brain” should be a compliment, not an insult. Some of our feathered friends are capable of complex cognitive tasks, including tool use (SN: 2/10/23). Among the brainiest feats that birds are capable of is vocal learning, or the ability to learn to mimic sounds and use them to communicate. In birds, this leads to beautiful calls and songs; in humans, it leads to language. The best avian vocal learners, such as crows and parrots, also tend to be considered the most intelligent birds. So it’s natural to think that the two traits could be linked. But studies with smart birds have found conflicting evidence. Although vocal learning may be linked with greater cognitive capacity in some species, the opposite relationship seems to hold true in others. Now, a massive analysis of 214 birds from 23 species shows that there is indeed a link between vocal learning and at least one advanced cognitive ability — problem-solving. The study, described in the Sept. 15 Science, is the first to analyze multiple bird species instead of just one. More than 200 birds from 23 species were given different cognitive tests to gauge their intelligence. One of the problem-solving tasks asked birds to pull a cork lid off a glass flask to access a tasty treat. Comparing these tests with birds’ ability to learn songs and calls showed that the better vocal learners are also better at problem-solving. To compare species, biologist Jean-Nicolas Audet of the Rockefeller University in New York City and colleagues had to devise a way to assess all the birds’ vocal learning and cognitive abilities. © Society for Science & the Public 2000–2023.

Keyword: Intelligence; Evolution
Link ID: 28912 - Posted: 09.16.2023

Sara Reardon The psychedelic drug MDMA, also known as ecstasy or molly, has passed another key hurdle on its way to regulatory approval as a treatment for mental illness. A second large clinical trial has found that the drug — in combination with psychotherapy — is effective at treating post-traumatic stress disorder (PTSD). The results allow the trial’s sponsor to now seek approval from the US Food and Drug Administration (FDA) for MDMA’s use as a PTSD treatment for the general public, which might come as soon as next year. “It’s an important study,” says Matthias Liechti, a psychopharmacologist who studies MDMA at the University of Basel in Switzerland, but who was not involved with the trial or its sponsor. “It confirms MDMA works.” In June, Australia became the first country to allow physicians to prescribe MDMA for treating psychiatric conditions. MDMA is illegal in the United States and other countries because of the potential for its misuse. But the Multidisciplinary Association for Psychedelic Studies (MAPS), a non-profit organization in San Jose, California, has long been developing a proprietary protocol for using MDMA as a treatment for PTSD and other disorders. MAPS has been campaigning for its legalization — a move that could encourage other countries to follow suit. In 2021, researchers sponsored by MAPS reported the results of a study in which 90 people received a form of psychotherapy developed by the organization alongside either MDMA or a placebo. After three treatment sessions, 67% of those who received MDMA with therapy no longer qualified for a PTSD diagnosis, compared with 32% of those who received therapy and a placebo. The results were widely hailed as promising, but the FDA typically requires two placebo-controlled trials before a drug can be approved. The results of a second trial, involving 104 further individuals with PTSD and published on 14 September in Nature Medicine, were similar to those of the original: 71% of people who received MDMA alongside therapy lost their PTSD diagnosis, compared with 48% of those who received a placebo and therapy. © 2023 Springer Nature Limited

Keyword: Drug Abuse; Stress
Link ID: 28911 - Posted: 09.16.2023

By Jim Davies Think of what you want to eat for dinner this weekend. What popped into mind? Pizza? Sushi? Clam chowder? Why did those foods (or whatever foods you imagined) appear in your consciousness and not something else? Psychologists have long held that when we are making a decision about a particular category of thing, we tend to bring to mind items that are typical or common in our culture or everyday lives, or ones we value the most. On this view, whatever foods you conjured up are likely ones that you eat often, or love to eat. Sounds intuitive. But a recent paper published in Cognition suggests it’s more complicated than that. Tracey Mills, a research assistant working at MIT, led the study along with Jonathan Phillips, a cognitive scientist and philosopher at Dartmouth College. They put over 2,000 subjects, recruited online, through a series of seven experiments that allowed them to test a novel approach for understanding which ideas within a category will pop into our consciousness—and which won’t. In this case, they had subjects think about zoo animals, holidays, jobs, kitchen appliances, chain restaurants, sports, and vegetables. What they found is that what makes a particular thing come to mind—such as a lion when one is considering zoo animals—is determined not by how valuable or familiar it is, but by where it lies in a multidimensional idea grid that could be said to resemble a kind of word cloud. “Under the hypothesis we argue for,” Mills and Phillips write, “the process of calling members of a category to mind might be modeled as a search through feature space, weighted toward certain features that are relevant for that category.” Historical “value” just happens to be one dimension that is particularly relevant when one is talking about dinner, but is less relevant for categories such as zoo animals or, say, crimes, they write. © 2023 NautilusNext Inc., All rights reserved.

Keyword: Attention; Learning & Memory
Link ID: 28910 - Posted: 09.16.2023

By Molly Rains Over the past 50 years, worldwide obesity rates have tripled, creating a public health crisis so widespread and damaging that it is sometimes referred to as an epidemic. Most accounts put the roots of the problem firmly in the modern age. But could it have been brewing since before World War II? That’s one provocative conclusion of a study published today in Science Advances that purports to push the obesity epidemic’s origin back to as early as the 1930s. Historical measurements from hundreds of thousands of Danish youth show that in the decades before the problem was officially recognized, the heaviest members of society were already getting steadily bigger. The findings raise questions about the accepted narrative of the obesity epidemic, says Lindsey Haynes-Maslow, an obesity expert at the University of North Carolina at Chapel Hill who was not involved in the study. “This paper is an opportunity … to say maybe we’ve been looking at this wrong, maybe we should go back to the beginning—or, when was the beginning?” she says. Most epidemiologists trace that beginning to the 1970s, when health officials first observed an uptick in the prevalence of obesity—defined as a body mass index (BMI) above 30—in many Western nations. The crisis is usually blamed on the increased postwar availability of cheap, highly processed, and calorie-rich foods, as well as increasingly sedentary lifestyles and growing portion sizes. But University of Copenhagen epidemiologist Thorkild Sørensen was skeptical of that story. Years of slowly increasing body size typically precede obesity, and might show up in historical data, he suspected. And Sørensen wasn’t convinced that the so-called obesogenic diet and lifestyle were the only factors at play. Historical data, he hoped, could reveal whether other, yet-unknown factors had contributed to the crisis.

Keyword: Obesity
Link ID: 28909 - Posted: 09.16.2023

By Amber Dance We’ve all heard of the five tastes our tongues can detect — sweet, sour, bitter, savory-umami and salty. But the real number is actually six, because we have two separate salt-taste systems. One of them detects the attractive, relatively low levels of salt that make potato chips taste delicious. The other one registers high levels of salt — enough to make overly salted food offensive and deter overconsumption. Exactly how our taste buds sense the two kinds of saltiness is a mystery that’s taken some 40 years of scientific inquiry to unravel, and researchers haven’t solved all the details yet. In fact, the more they look at salt sensation, the weirder it gets. Many other details of taste have been worked out over the past 25 years. For sweet, bitter and umami, it’s known that molecular receptors on certain taste bud cells recognize the food molecules and, when activated, kick off a series of events that ultimately sends signals to the brain. Sour is slightly different: It is detected by taste bud cells that respond to acidity, researchers recently learned. In the case of salt, scientists understand many details about the low-salt receptor, but a complete description of the high-salt receptor has lagged, as has an understanding of which taste bud cells host each detector. “There are a lot of gaps still in our knowledge — especially salt taste. I would call it one of the biggest gaps,” says Maik Behrens, a taste researcher at the Leibniz Institute for Food Systems Biology in Freising, Germany. “There are always missing pieces in the puzzle.” A fine balance: Our dual perception of saltiness helps us to walk a tightrope between the two faces of sodium, an element that’s crucial for the function of muscles and nerves but dangerous in high quantities. To tightly control salt levels, the body manages the amount of sodium it lets out in urine, and controls how much comes in through the mouth. © 2023 Annual Reviews

Keyword: Chemical Senses (Smell & Taste)
Link ID: 28908 - Posted: 09.16.2023

By Sarah Lyall The author Cat Bohannon was a preteen in Atlanta in the 1980s when she saw the film “2001: A Space Odyssey” for the first time. As she took in its famous opening scene, in which a bunch of apes picks up a bunch of bones and quickly begins using them to hit each other, Bohannon was struck by the sheer maleness of the moment. “I thought, ‘Where are the females in this story?’” Bohannon said recently, imagining what those absent females might have been up to at that particular time. “It’s like, ‘Oh, sorry, I see you’re doing something really important with a rock. I’m just going to go over there behind that hill and quietly build the future of the species in my womb.” That realization was just one of what Bohannon, 44, calls “a constellation of moments” that led her to write her new book, “Eve: How the Female Body Drove 200 Million Years of Human Evolution.” A page-turning whistle-stop tour of mammalian development that begins in the Jurassic Era, “Eve” recasts the traditional story of evolutionary biology by placing women at its center. The idea is that by examining how women evolved differently from men, Bohannon argues, we can “provide the latest answers to women’s most basic questions about their bodies.” These include, she says: Why do women menstruate? Why do they live longer? And what is the point of menopause? These are timely questions. Thanks to regulations established in the 1970s, clinical trials in the United States have typically used mostly male subjects, from mice to humans. (This is known as “the male norm.”) Though that changed somewhat in 1994, when the National Institutes of Health updated its rules, even the new protocols are replete with loopholes. For example: “From 1996 to 2006, more than 79 percent of animal studies published in the scientific journal Pain included only male subjects,” she writes. © 2023 The New York Times Company

Keyword: Sexual Behavior; Evolution
Link ID: 28907 - Posted: 09.13.2023

Nicola Davis Science correspondent Whether it’s seeing Jesus in burnt toast, a goofy grin in the grooves of a cheese grater, or simply the man in the moon, humans have long perceived faces in unlikely places. Now researchers say the tendency may not be fixed in adults, suggesting it appears to be enhanced in women who have just given birth. The scientists suggest the finding could be down to postpartum women having higher levels of oxytocin, colloquially referred to as the “love” or “trust” hormone because of its role in social bonding. “These data, collected online, suggest that our sensitivity to face-like patterns is not fixed and may change throughout adulthood,” the team write. Writing in the journal Biology Letters, researchers from Australia’s University of Queensland and the University of the Sunshine Coast describe how they set out to investigate whether the propensity to see faces in inanimate objects – a phenomenon known as face pareidolia – changes during life. Previous research has suggested that when humans are given oxytocin, their ability to recognise certain emotions in faces increases. As a result, the team wanted to explore if the hormone could play a role in how sensitive individuals are towards seeing faces in inanimate objects. The researchers used an online platform to recruit women, with participants asked if they were pregnant or had just given birth – the latter being a period when oxytocin levels are generally increased. The women were each shown 320 images in a random order online and asked to rate on an 11-point scale how easily they could see a face. While 32 of the images were of human faces, 256 were of inanimate objects with patterns that could be said to resemble a face, and 32 depicted inanimate objects with no such facial patterns. The team gathered data from 84 pregnant women, 79 women who had given birth in the past year, and 216 women who did not report being pregnant or having recently had a baby. © 2023 Guardian News & Media Limited

Keyword: Sexual Behavior; Attention
Link ID: 28906 - Posted: 09.13.2023

By Joanna Thompson Like many people, Mary Ann Raghanti enjoys potatoes loaded with butter. Unlike most people, however, she actually asked the question of why we love stuffing ourselves with fatty carbohydrates. Raghanti, a biological anthropologist at Kent State University, has researched the neurochemical mechanism behind that savory craving. As it turns out, a specific brain chemical may be one of the things that not only developed our tendency to overindulge in food, alcohol and drugs but also helped the human brain evolve to be unique from the brains of closely related species. A new study, led by Raghanti and published on September 11 in the Proceedings of the National Academy of Sciences USA, examined the activity of a particular neurotransmitter in a region of the brain that is associated with reward and motivation across several species of primates. The researchers found higher levels of that brain chemical—neuropeptide Y (NPY)—in humans, compared with our closest living relatives. That boost in the reward peptide could explain our love of high-fat foods, from pizza to poutine. The impulse to stuff ourselves with fats and sugars may have given our ancestors an evolutionary edge, allowing them to develop a larger and more complex brain. “I think this is a first bit of neurobiological insight into one of the most interesting things about us as a species,” says Robert Sapolsky, a neuroendocrinology researcher at Stanford University, who was not directly involved in the research but helped review the new paper. Neuropeptide Y is associated with “hedonic eating”—consuming food strictly to experience pleasure rather than to satisfy hunger. It drives individuals to seek out high-calorie foods, especially those rich in fat. Historically, though, NPY has been overlooked in favor of flashier “feel good” chemicals such as dopamine and serotonin. © 2023 Scientific American

Keyword: Obesity; Intelligence
Link ID: 28905 - Posted: 09.13.2023

By Jacqueline Howard and Deidre McPhillips Most families of children with autism may face long wait times to get their child diagnosed with the disorder, and once a diagnosis is made, it sometimes may not be definitive. But now, two studies released Tuesday suggest that a recently developed eye-tracking tool could help clinicians diagnose children as young as 16 months with autism – and with more certainty. “This is not a tool to replace expert clinicians,” said Warren Jones, director of research at the Marcus Autism Center at Children’s Healthcare of Atlanta and Nien Distinguished Chair in Autism at Emory University School of Medicine, who was an author on both studies. Rather, he said, the hope with this eye-tracking technology is that “by providing objective measurements that objectively measure the same thing in each child,” it can help inform the diagnostic process. The tool, called EarliPoint Evaluation, is cleared by the US Food and Drug Administration to help clinicians diagnose and assess autism, according to the researchers. Traditionally, children are diagnosed with autism based on a clinician’s assessment of their developmental history, behaviors and parents’ reports. Evaluations can take hours, and some subtle behaviors associated with autism may be missed, especially among younger children. “Typically, the way we diagnose autism is by rating our impressions,” said Whitney Guthrie, a clinical psychologist and scientist at the Children’s Hospital of Philadelphia’s Center for Autism Research. She was not involved in the new studies, but her research focuses on early diagnosis of autism.

Keyword: Autism; Schizophrenia
Link ID: 28904 - Posted: 09.13.2023

By Phil Jaekl In the mid-1970s, a British researcher named Anthony Barker wanted to measure the speed at which electrical signals travel down the long, slender nerves that can carry signals from the brain to muscles like those in the hand, triggering movement. To find out, he needed a way to stimulate nerves in people. Researchers had already used electrodes placed on the skin to generate a magnetic field that penetrated human tissue — this produced an electric current that activated the peripheral nerves in the limbs. But the technique was painful, burning the skin. Barker, at the University of Sheffield in England, and his colleagues started to work on a better method. In 1985, with promising results under their belts, they tried positioning the coil-shaped magnetic device they’d developed on participants’ heads. The coil emitted rapidly alternating magnetic pulses over the brain region that controls movement, generating weak electrical currents in the brain tissue and activating neurons that control muscles in the hand. After about 20 milliseconds, the participants’ fingers twitched. The technique, now called transcranial magnetic stimulation (TMS), has proved a vital tool for investigating how the human brain works. When targeted to specific brain regions, TMS can temporarily inhibit or enhance various functions – blocking the ability to speak, for instance, or making it easier to commit a series of numbers to memory. And when brain imaging technologies such as functional magnetic resonance imaging (fMRI) emerged in the 1990s, researchers could now “see” inside people’s brains as they received TMS stimulation. They could also observe how neural pathways respond differently to stimulation in psychiatric illnesses like schizophrenia and depression. In recent decades, this fundamental research has yielded new treatments that alter brain activity, with TMS therapies for depression at the fore. In 2008, the US Food and Drug Administration approved NeuroStar, the nation’s first TMS depression device, and many other countries have since sanctioned the approach. Yet even though TMS is now a widely available depression treatment, many questions remain about the method. It’s not clear how long the benefits of TMS can last, for example, or why it appears to work for some people with depression but not others. Another challenge is disentangling the effects of TMS from the placebo effect — when someone believes that they will benefit from treatment and gets better even though they’re receiving a “sham” form of stimulation. © 2023 Annual Reviews

Keyword: Depression
Link ID: 28903 - Posted: 09.10.2023

By Ann Gibbons Go to the Democratic Republic of the Congo, and you’re unlikely to encounter chimps so plump they have trouble climbing trees or vervet monkeys so chubby they huff and puff as they swing from branch to branch. Humans are a different story. Walk down a typical U.S. street and almost half of the people you encounter are likely to have obesity. Scientists have long blamed our status as the “fattest primate” on genes that help us store fat more efficiently or diets overloaded with sugars or fat. But a new study of 40 species of nonhuman primates, ranging from tiny mouse lemurs to hulking gorillas, finds many pack on the pounds just as easily as we do, regardless of diet, habitat, or genetic differences. All they need is extra food. “Lots of primates put on too much weight, the same as humans,” says Herman Pontzer, a biological anthropologist at Duke University and author of the new study, published this week in the Philosophical Transactions of the Royal Society B. “Humans are not special.” Some researchers have suggested our species is prone to obesity because our ancestors evolved to be incredibly efficient at storing calories. The adaptation would have helped our ancient relatives, who often faced famine after the transition to agriculture, get through lean times. This selection pressure for so-called thrifty genes set us apart from other primates, the thinking goes. But other primates can get fat. Kanzi, the first ape to show he understands spoken English, was triple the average weight of his bonobo species after years of being rewarded with bananas, peanuts, and other treats during research; scientists eventually put him on a diet. And then there was Uncle Fatty, an obese macaque who lived on the streets of Bangkok where tourists fed him milkshakes, noodles, and other junk food. He weighed an astonishing 15 kilograms—three times more than the average macaque—before he went to the monkey equivalent of a fat farm. © 2023 American Association for the Advancement of Science.

Keyword: Obesity; Evolution
Link ID: 28902 - Posted: 09.10.2023