Most Recent Links

Jon Hamilton People who have had a stroke appear to regain more hand and arm function if intensive rehabilitation starts two to three months after the injury to their brain. A study of 72 stroke patients suggests this is a "critical period," when the brain has the greatest capacity to rewire, a team reports this week in the journal PNAS. The finding challenges the current practice of beginning rehabilitation as soon as possible after a stroke and suggests intensive rehabilitation should go on longer than most insurance coverage allows, says Elissa Newport, a co-author of the study and director of the Center for Brain Plasticity and Recovery at Georgetown University Medical Center. Newport was speaking in place of the study's lead author, Dr. Alexander Dromerick, who died after the study was accepted but before it was published. If the results are confirmed with other, larger studies, "the clinical protocol for the timing of stroke rehabilitation would be changed," says Li-Ru Zhao, a professor of neurosurgery at Upstate Medical University in Syracuse, N.Y., who was not involved in the research. The study involved patients treated at Medstar National Rehabilitation Hospital in Washington, D.C., most in their 50s and 60s. One of the study participants was Anthony McEachern, who was 45 when he had a stroke in 2017. Just a few hours earlier, McEachern had been imitating Michael Jackson dance moves with his kids. But at home that night he found himself unable to stand up. © 2021 npr

Keyword: Stroke; Learning & Memory
Link ID: 28002 - Posted: 09.22.2021

By Carl Zimmer For half a billion years or so, our ancestors sprouted tails. As fish, they used their tails to swim through the Cambrian seas. Much later, when they evolved into primates, their tails helped them stay balanced as they raced from branch to branch through Eocene jungles. But then, roughly 25 million years ago, the tails disappeared. Charles Darwin first recognized this change in our ancient anatomy. But how and why it happened has remained a mystery. Now a team of scientists in New York say they have pinpointed the genetic mutation that may have erased our tails. When the scientists made this genetic tweak in mice, the animals didn’t grow tails, according to a new study that was posted online last week. This dramatic anatomical change had a profound impact on our evolution. Our ancestors’ tail muscles evolved into a hammock-like mesh across the pelvis. When the ancestors of humans stood up and walked on two legs a few million years ago, that muscular hammock was ready to support the weight of upright organs. Although it’s impossible to definitively prove that this mutation lopped off our ancestors’ tails, “it’s as close to a smoking gun as one could hope for,” said Cedric Feschotte, a geneticist at Cornell who was not involved in the study. Darwin shocked his Victorian audiences by claiming that we descended from primates with tails. He noted that while humans and apes lack a visible tail, they share a tiny set of vertebrae that extend beyond the pelvis — a structure known as the coccyx. “I cannot doubt that it is a rudimentary tail,” he wrote. © 2021 The New York Times Company

Keyword: Evolution; Genes & Behavior
Link ID: 28001 - Posted: 09.22.2021

By Leigh Weingus I’ve struggled with sleep since I was a teenager, and have spent almost as long trying to fix it. I’ve absorbed countless books and articles on getting better sleep that instructed me to go blue-light free at least two hours before bedtime, take nightly baths to lower my body temperature, keep my phone far from my bedroom and avoid caffeine after 12 p.m. In between all my diligent sleep hygiene work, I couldn’t help but feel like there was a larger force at play. My sleep seemed to change throughout my menstrual cycle, for example, getting worse in the days before my period and significantly better afterward. When I was pregnant, I experienced the best sleep of my life, and when I stopped breastfeeding, I didn’t sleep for days. I finally started to ask myself: When we talk about getting better sleep, why aren’t we talking more about hormones? According to the National Sleep Foundation, the lifetime risk of insomnia is 40 percent higher for women than it is for men. Blaming this discrepancy entirely on hormones oversimplifies it — women also tend to take on the bulk of household worrying and emotional labor, and they tend to experience higher levels of anxiety. But according to Mary Jane Minkin, an obstetrician-gynecologist and clinical professor in the Department of Obstetrics, Gynecology, and Reproductive Sciences at the Yale School of Medicine, anecdotal evidence and studies suggest that hormones likely play a role.

Keyword: Sleep; Hormones & Behavior
Link ID: 28000 - Posted: 09.22.2021

By Nicholas Bakalar Electroconvulsive therapy, or ECT, can be effective for the treatment of major depression and is just as safe as antidepressant drugs combined with psychotherapy, a large new study concludes. The procedure, once referred to as electroshock therapy, has a controversial and largely unfavorable history. This was partly due to inaccurate portrayals in popular books and films like “One Flew Over the Cuckoo’s Nest,” and partly the result of real problems with the earliest versions of the procedure, which used strong electrical currents and no anesthesia. Today, ECT is performed under general anesthesia, and the doctor, working with an anesthesiologist and a nurse, applies a weak electric current to the brain (usually about 0.8 amperes at 120 volts) for one to six seconds. This causes a seizure inside the brain, but because of the anesthesia, the patient does not experience muscular contractions. The seizure leads to brain changes that relieve symptoms of depression and certain other mental illnesses. Usually, doctors administer a series of ECT treatments over a period of days or weeks. The only painful part of the procedure is the insertion of an intravenous line before anesthesia. There can be side effects afterward, including temporary memory loss, confusion or transitory headaches and muscle aches. Doctors debate whether ECT can cause long-term memory problems distinct from the memory problems that can be caused by depression itself. For this new study, published in Lancet Psychiatry, Canadian researchers used the records of 10,016 adults whose depression was severe enough that they spent three or more days in the hospital. Half of them had received ECT, while the other half were treated with drugs and psychotherapy. Their average age was 57, and about two-thirds were women. The researchers tracked how each group fared in the 30 days after they were discharged from the hospital. © 2021 The New York Times Company

Keyword: Depression
Link ID: 27999 - Posted: 09.18.2021

Abby Olena Delivering anything therapeutic to the brain has long been a challenge, largely due to the blood-brain barrier, a layer of cells that separates the vessels that supply the brain with blood from the brain itself. Now, in a study published August 12 in Nature Biotechnology, researchers have found that double-stranded RNA-DNA duplexes with attached cholesterol can enter the brains of both mice and rats and change the levels of targeted proteins. The results suggest a possible route to developing drugs that could target the genes implicated in disorders such as muscular dystrophy and amyotrophic lateral sclerosis (ALS). “It’s really exciting to have a study that’s focused on delivery to the central nervous system” with antisense oligonucleotides given systemically, says Michelle Hastings, who investigates genetic disease at the Rosalind Franklin University of Medicine and Science in Chicago and was not involved in the study. The authors “showed that it works for multiple targets, some clinically relevant.” In 2015, Takanori Yokota of Tokyo Medical and Dental University and colleagues published a study showing that a so-called heteroduplex oligonucleotide (HDO)—consisting of a short chain of both DNA and an oligonucleotide with modified bases paired with complementary RNA bound to a lipid on one end—was successful at decreasing target mRNA expression in the liver. Yokota’s team later joined forces with researchers at Ionis Pharmaceuticals to determine whether HDOs could cross the blood-brain barrier and target mRNA in the central nervous system. © 1986–2021 The Scientist.

Keyword: Genes & Behavior; ALS-Lou Gehrig's Disease
Link ID: 27998 - Posted: 09.18.2021

By Joshua Rapp Learn Giraffes don’t fight much, says Jessica Granweiler, a master’s student at the University of Manchester in England who studies nature’s tallest mammals. When they do, look out. “Fighting is extremely rare because it’s extremely violent,” Ms. Granweiler said. When older adult males joust for territory or mating rights, their hornlike pairs of ossicones thrust with the force of their long necks and can cut into their opponents’ flesh, wounding and sometimes even killing a combatant. But some forms of giraffe dueling serve other purposes. In a study published last month in the journal Ethology, Ms. Granweiler and her colleagues reported some discoveries about sparring behavior that helps giraffes establish social hierarchies. They showed that the animals didn’t take advantage of smaller members of their herds, but rather practiced their head butts with males of similar stature in ways that to a human might even appear fair or honorable. Such findings could aid in the conservation of the dwindling populations of the animals. Ms. Granweiler and her colleagues observed social behavior in giraffes at the small Mogalakwena River Reserve in South Africa from November 2016 to May 2017. They began to record the details of these fights — basically a who-fought-who, and how in the giraffe world. They were surprised to find that giraffes, like humans, can be righties or southpaws when it comes to sparring. Even the youngest animals showed a clear preference, although unlike humans it seemed they were evenly split between lefties and righties. The researchers also noticed that the younger males sparred more with each other, and nearly always chose opponents similar in size to themselves — there wasn’t a lot of bullying going on. A bar brawl effect went on as well, where one sparring match seemed to infect the crowd and prompt more fights around them. © 2021 The New York Times Company

Keyword: Aggression; Sexual Behavior
Link ID: 27997 - Posted: 09.18.2021

by Giorgia Guglielmi Severe infections during early childhood are linked to autism — at least in some boys, a new study in mice and people suggests. The findings were published today in Science Advances. Researchers analyzed the health records of millions of children in the United States and found that boys diagnosed with autism are more likely than non-autistic boys to have had an infection requiring medical attention between the ages of 18 months and 4 years. The study also showed that provoking a strong immune response in newborn mice with only one copy of TSC2, a gene tied to autism, leads to social memory problems in adult male rodents. In people, mutations in TSC2 cause tuberous sclerosis, a condition characterized by non-cancerous tumors and skin growths. About half of all people with tuberous sclerosis also have autism. “If the TSC2 mutation was sufficient to cause autism, then everyone with that mutation would have autism — but they don’t,” says senior investigator Alcino Silva, director of the Integrative Center for Learning and Memory at the University of California, Los Angeles. A child’s chances of having autism rise with severe infections in the child or his mother, previous studies show, but not all children who contract serious infections go on to be diagnosed with autism. The new study is the first to examine the relationship between immune activation and a specific genetic variant tied to autism, Silva says. The findings suggest that genetics and severe infection represent a ‘two-hit’ scenario for autism. © 2021 Simons Foundation

Keyword: Autism; Development of the Brain
Link ID: 27996 - Posted: 09.18.2021

Emma Yasinski Some genetic risk factors for alcohol use disorder overlap with those for neurodegenerative diseases like Alzheimer’s, scientists reported in Nature Communications on August 20. The study, which relied on a combination of genetic, transcriptomic, and epigenetic data, also offers insight into the molecular commonalities among these disorders, and their connections to immune dysfunction. “By meshing findings from genome wide association studies . . . with gene expression in brain and other tissues, this new study has prioritized genes likely to harbor regulatory variants influencing risk of Alcohol Use Disorder,” writes David Goldman, a neurogenetics researcher at the National Institute on Alcohol Abuse and Alcoholism (NIAAA), in an email to The Scientist. “Several of these genes are also associated with neurodegenerative disorders—an intriguing connection because of alcohol’s ability to prematurely age the brain.” Over the past several years, researchers have published a handful of massive genome-wide association studies (GWAS) identifying loci—regions of the genome that can contain 10 or more individual genes—that likely influence a person’s risk of developing an alcohol use disorder (AUD). In a study published two years ago, Manav Kapoor, a neuroscientist and geneticist at the Icahn School of Medicine at Mount Sinai and first author on the new paper, and his team found evidence that the immune system might be overactive in people with AUD, but the finding left him with more questions. The first was whether excessive drinking directly causes immune dysfunction, or if instead some people’s genetic makeup puts them at risk for both simultaneously. The second was which of the dozen or so genes at each previous GWAS-identified locus actually influences drinking behaviors. Lastly, he wanted to know if there is a genetic difference between people who consume higher numbers of alcoholic beverages per week but are not diagnosed with AUD and those who have received the diagnosis. © 1986–2021 The Scientist.

Keyword: Alzheimers; Genes & Behavior
Link ID: 27995 - Posted: 09.18.2021

By Kim Tingley It’s simple, we are often told: All you have to do to maintain a healthy weight is ensure that the number of calories you ingest stays the same as the number of calories you expend. If you take in more calories, or energy, than you use, you gain weight; if the output is greater than the input, you lose it. But while we’re often conscious of burning calories when we’re working out, 55 to 70 percent of what we eat and drink actually goes toward fueling all the invisible chemical reactions that take place in our body to keep us alive. “We think about metabolism as just being about exercise, but it’s so much more than that,” says Herman Pontzer, an associate professor of evolutionary anthropology at Duke University. “It’s literally the running total of how busy your cells are throughout the day.” Figuring out your total energy expenditure tells you how many calories you need to stay alive. But it also tells you “how the body is functioning,” Pontzer says. “There is no more direct measure of that than energy expenditure.” Though scientists have been studying metabolism for at least a century, they have not been able to measure it precisely enough — in real-world conditions, in enough people, across a broad-enough age range — to see how it changes throughout the human life span. It is clear that the bigger someone is, the more cells they have, and thus the more total calories they burn per day. But it has been much harder to assess whether variables like age, sex, lifestyle and illness influence our rate of energy expenditure. This lack of data led to assumptions rooted in personal experience: for instance, that significant hormonal changes like those that take place during puberty and menopause cause our metabolism to speed up or slow down, prompting us to burn more or fewer calories per day; or that men have inherently faster metabolisms than women, because they seem able to shed pounds more easily; or that our energy expenditure slows in midlife, initiating gradual and inevitable weight gain. “I’m in my 40s; I feel different than I did in my 20s — I buy it, too,” Pontzer says. “All that intuition was never backed up by data. It just seemed so sure.” © 2021 The New York Times Company

Keyword: Obesity
Link ID: 27994 - Posted: 09.15.2021

David Kleinfeld My colleagues and I recently found that we were able to train mice to voluntarily increase the size and frequency of seemingly random dopamine impulses in their brains. Conventional wisdom in neuroscience has held that dopamine levels change solely in response to cues from the world outside of the brain. Our new research shows that increases in dopamine can also be driven by internally mediated changes within the brain. Dopamine is a small molecule found in the brains of mammals and is associated with feelings of reward and happiness. In 2014, my colleagues and I invented a new method to measure dopamine in real time in different parts of the brains of mice. Using this new tool, my former thesis student, Conrad Foo, found that neurons in the brains of mice release large bursts of dopamine – called impulses – for no easily apparent reason. This occurs at random times, but on average about once a minute. Pavlov was famously able to train his dogs to salivate at the sound of a bell, not the sight of food. Today, scientists believe that the bell sound caused a release of dopamine to predict the forthcoming reward. If Pavlov’s dogs could control their cue-based dopamine responses with a little training, we wondered if our mice could control their spontaneous dopamine impulses. To test this, our team designed an experiment that rewarded mice if they increased the strength of their spontaneous dopamine impulses. The mice were able to not only increase how strong these dopamine releases were, but also how often they occurred. When we removed the possibility of a reward, the dopamine impulses returned to their original levels. In the 1990s, neuroscientist Wolfram Schultz discovered that an animal’s brain will release dopamine if the animal expects a reward, not just when receiving a reward. This showed that dopamine can be produced in response to the expectation of a reward, not just the reward itself – the aforementioned modern version of Pavlov’s dog. © 2010–2021, The Conversation US, Inc.

Keyword: Drug Abuse; Learning & Memory
Link ID: 27993 - Posted: 09.15.2021

by Rachel Zamzow The X chromosome holds stronger-than-expected genetic sway over the structure of several brain regions, a new study finds. The X-linked genes that may underlie this oversized influence have ties to autism and intellectual disability. “There were already hints that the X chromosome was likely to be conspicuous, with how involved it is with the brain,” says lead investigator Armin Raznahan, chief of the section on developmental neurogenomics at the U.S. National Institute of Mental Health. Many X chromosome genes — including those at the root of several autism-related conditions, such as fragile X syndrome and Rett syndrome — are expressed in the brain, for example. But the new findings suggest that the X chromosome, despite containing only 5 percent of the human genome, has a privileged role in shaping the brain — one that may be particularly relevant to developmental conditions. What’s more, this influence may be stronger in men than in women, the study shows. “What they’re showing is X is fundamentally different,” says David Glahn, professor of psychology at Harvard University, who was not involved in the new study. “It’s off the scale.” Research over the past decade has linked genetic variation to shifts in brain features, such as overall size or patterns of connectivity between regions, Glahn says. But “the X chromosome and the Y chromosome are fundamentally understudied,” because including them requires extra analytical legwork, he says. © 2021 Simons Foundation

Keyword: Development of the Brain; Genes & Behavior
Link ID: 27992 - Posted: 09.15.2021

By Husseini Manji, Joseph Hayes Depression affects more than 264 million people of all ages globally. The World Health Organization ranks depression as one of the most debilitating diseases to society. It is the leading cause of disability worldwide and the psychiatric diagnosis most commonly associated with suicide, which accounts for nearly 800,000 deaths globally each year. Individuals suffering from depression may face an inability to manage life’s demands and maintain social connections, affecting all aspects of their experiences, from school and employment to relationships and overall quality of life. When it comes to treatment, approximately one third of those suffering from depression do not respond to two or more antidepressants and are considered treatment-resistant. Treatment-resistant depression is a chronic condition that places an increased emotional, functional and economic burden on the individual, their loved ones and society. It is also associated with greater morbidity, higher health care costs and various comorbid conditions. While a number of antidepressants exist, they all work through changing the levels of brain-signaling molecules called monoaminergic neurotransmitters. New drug development for depression had stalled for a number of years, and many pharmaceutical companies have withdrawn from neuroscience entirely. But recent scientific advances have led to the development of novel antidepressants working via completely different mechanisms. The brain is the most advanced, adaptive information processing system in existence—in large part because of its tremendous plasticity. Scientists have been building upon these neuroscience advances to develop completely novel, rapid-acting antidepressants. In this regard, considerable evidence has demonstrated that the regulation of two receptors—AMPA and NMDA—on many neurons that respond to the neurotransmitter glutamate control changes in the tiny junctions, or synapses, between neurons. © 2021 Scientific American

Keyword: Depression
Link ID: 27991 - Posted: 09.15.2021

Andrew Gregory Health editor Millions of people with eye conditions including age-related macular degeneration, cataracts and diabetes-related eye disease have an increased risk of developing dementia, new research shows. Vision impairment can be one of the first signs of the disease, which is predicted to affect more than 130 million people worldwide by 2050. Previous research has suggested there could be a link between eye conditions that cause vision impairment, and cognitive impairment. However, the incidence of these conditions increases with age, as do systemic conditions such as diabetes, high blood pressure, heart disease, depression and stroke, which are all accepted risk factors for dementia. That meant it was unclear whether eye conditions were linked with a higher incidence of dementia independently of systemic conditions. Now researchers have found that age-related macular degeneration, cataracts and diabetes-related eye disease are independently associated with increased risk of dementia, according to a new study published in the British Journal of Ophthalmology. The research examined data from 12,364 British adults aged 55 to 73, who were taking part in the UK Biobank study. They were assessed in 2006 and again in 2010 with their health information tracked until early 2021. More than 2,300 cases of dementia were documented, according to the international team of experts led by academics from the Guangdong Eye Institute in China. After assessing health data, researchers found those with age-related macular degeneration had a 26% increased risk of developing dementia. Those with cataracts had an 11% increased risk and people with diabetes-related eye disease had a 61% heightened risk. Glaucoma was not linked to a significant increase in risk. © 2021 Guardian News & Media Limited

Keyword: Alzheimers; Vision
Link ID: 27990 - Posted: 09.15.2021

Jon Hamilton The visual impairment known as "lazy eye" can be treated in kids by covering their other eye with a patch. Scientists may have found a way to treat adults with the condition using a pufferfish toxin. MARY LOUISE KELLY, HOST: Children who develop the visual impairment often called lazy eye can be treated by covering their other eye with a patch. Now researchers think they have found a way to treat adults using a toxin found in deadly puffer fish. The approach has only been tried in animals so far, but NPR's Jon Hamilton reports the results are encouraging. JON HAMILTON, BYLINE: A lazy eye isn't really lazy. The term refers to amblyopia, a medical condition that occurs when the brain starts ignoring the signals from one eye. Existing treatments restrict use of the strong eye in order to force the brain to pay attention to the weak one. But Mark Bear, a neuroscientist at MIT, says that approach has limits. MARK BEAR: There are a very significant number of adults with amblyopia where the treatment either didn't work or it was initiated too late. HAMILTON: After a critical period that ends at about age 10, the connections between eye and brain become less malleable. They lose what scientists call plasticity. So for several decades, Bear and a team of researchers have been trying to answer a question. BEAR: How can we rejuvenate these connections? How can they be brought back online? HAMILTON: To find out, Bear's team studied adults with amblyopia who lost their strong eye to a disease or an injury. © 2021 npr

Keyword: Vision; Development of the Brain
Link ID: 27989 - Posted: 09.15.2021

James M. Gaines Young macaques given the popular antidepressant fluoxetine for two years had lower levels of certain fatty acids and other lipids in their brains than ones not given the drug, finds a recent study (July 28) in International Journal of Molecular Sciences. The findings may help explain why younger people sometimes experience adverse side effects when taking the drug. Fluoxetine, often sold under the brand name Prozac, is a prescription medication that can be given to adults as well as children as young as 7 or 8 years old. But there’s not good literature on the long-term impact of fluoxetine and other psychoactive drugs that we use to treat adult symptoms in the young brain, says Bita Moghaddam, a behavioral neuroscientist at Oregon Health & Science University who was not involved in the study, “so [it] was really nice to see that there is this level of focus.” While genes and neurotransmitters may get the lion’s share of the attention in neuroscience research, brains are mostly made up of fats and other lipids. But lipids, it turns out, can be hard to study. So, when University of California Davis brain scientist Mari Golub and her colleagues wanted to know what was going on with the fats in the brains of the monkeys they were studying, they reached out to the brain lab at the Skoltech Institute of Science and Technology in Moscow where Anna Tkachev—the lead author on the new paper—works. “We happen to specialize in lipids in particular,” says Tkachev. For years, Golub and her colleagues had been using macaques to investigate the effects of fluoxetine. The antidepressant can be an effective treatment for maladies such as depression and obsessive-compulsive disorder. However, some studies suggest that the drug can occasionally cause serious, long-term side effects, and perhaps counter-intuitively for an antidepressant, it’s been linked to an increased risk of suicidal thinking and behavior, particularly in young people. © 1986–2021 The Scientist.

Keyword: Depression; Development of the Brain
Link ID: 27988 - Posted: 09.13.2021

Christie Wilcox If it walks like a duck and talks like a person, it’s probably a musk duck (Biziura lobata)—the only waterfowl species known that can learn sounds from other species. The Australian species’ facility for vocal learning had been mentioned anecdotally in the ornithological literature; now, a paper published September 6 in Philosophical Transactions of the Royal Society B reviews and discusses the evidence, which includes 34-year-old recordings made of a human-reared musk duck named Ripper engaging in an aggressive display while quacking “you bloody fool.” The Scientist spoke with the lead author on the paper, Leiden University animal behavior researcher Carel ten Cate, to learn more about these unique ducks and what their unexpected ability reveals about the evolution of vocal learning. The Scientist: What is vocal learning? Carel ten Cate: Vocal learning, as it is used in this case, is that animals and humans, they learn their sounds from experience. So they learn from what they hear around them, which will usually be the parents, but it can also be other individuals. And if they don’t get that sort of exposure, then they will be unable to produce species-specific vocalizations, or in the human case, speech sounds and proper spoken language. © 1986–2021 The Scientist.

Keyword: Language; Evolution
Link ID: 27987 - Posted: 09.13.2021

Abby Olena Most people enjoy umami flavor, which is perceived when a taste receptor called T1R1/T1R3 senses the amino acid glutamate. In some other mammals, such as mice, however, this same receptor is much less sensitive to glutamate. In a new study published August 26 in Current Biology, researchers uncover the molecular basis for this difference. They show that the receptor evolved in humans and some other primates away from mostly binding free nucleotides, which are common in insects, to preferentially binding glutamate, which is abundant in leaves. The authors argue that the change facilitated a major evolutionary shift in these primates toward a plant-heavy diet. “The question always comes up about the evolution of umami taste: In humans, our receptor is narrowly tuned to glutamate, and we never had a good answer for why,” says Maude Baldwin, a sensory biologist at the Max Planck Institute for Ornithology in Germany. She was not involved in the new work, but coauthored a 2014 study with Yasuka Toda, who is also a coauthor on the new paper, showing that the T1R1/T1R3 receptor is responsible for sweet taste in hummingbirds. In the new study, the authors find “that this narrow tuning has evolved convergently multiple times [and] that it’s related to folivory,” she says, calling the paper “a hallmark, fantastic study, and one that will become a textbook example of how taste evolution can relate to diet and how to address these types of questions in a rigorous, comprehensive manner.” In 2011, Toda, who was then at the University of Tokyo and now leads a group at Meiji University in Japan, and Takumi Misaka of the University of Tokyo developed a strategy to use cultured cells to analyze the function of taste receptors. They used the technique to tease out the parts of the human T1R1/T1R3 that differed from that of mice and thus underlie the high glutamate sensitivity in the human receptor, work that they published in 2013. © 1986–2021 The Scientist.

Keyword: Chemical Senses (Smell & Taste); Evolution
Link ID: 27986 - Posted: 09.13.2021

Sophie Fessl The hormone irisin is necessary for the cognitive benefits of exercise in healthy mice and can rescue cognitive decline associated with Alzheimer’s disease, according to a study published August 20 in Nature Metabolism. According to the authors, these results support the hypothesis that irisin undergirds the cognitive benefits of exercise—a link that has been long debated. In addition, this study has “paved the way for thinking whether irisin could be a therapeutic agent against Alzheimer’s disease,” says biologist Steffen Maak with the Leibniz Institute for Farm Animal Biology in Germany, who has been critical of the methods used to study irisin in the past and was not involved in the study. Many studies have found that exercise is good for the brain, but the molecular mechanisms responsible for the cognitive boost have remained elusive. During her postdoctoral studies, neuroscientist Christiane Wrann found that the gene that codes for irisin becomes highly expressed in the brain during exercise—one of the first studies linking irisin with the brain. See “Irisin Skepticism Goes Way Back” When she joined the faculties at Massachusetts General Hospital and Harvard Medical School, she decided to investigate the hormone further. Wrann, who holds a patent related to irisin and is academic cofounder and consultant for Aevum Therapeutics, a company developing drugs that harness the protective molecular mechanisms of exercise to treat neurodegenerative and neuromuscular disorders, began to investigate whether irisin mediates the positive effects of exercise on the brain. © 1986–2021 The Scientist.

Keyword: Learning & Memory; Hormones & Behavior
Link ID: 27985 - Posted: 09.13.2021

By Nicholas Bakalar Many animals are known to use tools, but a bird named Bruce may be one of the most ingenious nonhuman tool inventors of all: He is a disabled parrot who has designed and uses his own prosthetic beak. Bruce is a kea, a species of parrot found only in New Zealand. He is about 9 years old, and when wildlife researchers found him as a baby, he was missing his upper beak, probably because it had been caught in a trap made for rats and other invasive mammals the country was trying to eliminate. This is a severe disability, as kea use their dramatically long and curved upper beaks for preening their feathers to get rid of parasites and to remove dirt and grime. But Bruce found a solution: He has taught himself to pick up pebbles of just the right size, hold them between his tongue and his lower beak, and comb through his plumage with the tip of the stone. Other animals use tools, but Bruce’s invention of his own prosthetic is unique. Researchers published their findings Friday in the journal Scientific Reports. Studies of animal behavior are tricky — the researchers have to make careful, objective observations and always be wary of bias caused by anthropomorphizing, or erroneously attributing human characteristics to animals. “The main criticism we received before publication was, ‘Well, this activity with the pebbles may have been just accidental — you saw him when coincidentally he had a pebble in his mouth,’” said Amalia P.M. Bastos, an animal cognition researcher at the University of Auckland and the study’s lead author. “But no. This was repeated many times. He drops the pebble, he goes and picks it up. He wants that pebble. If he’s not preening, he doesn’t pick up a pebble for anything else.” Dorothy M. Fragaszy, an emerita professor of psychology at the University of Georgia who has published widely on animal behavior but was unacquainted with Bruce’s exploits, praised the study as a model of how to study tool use in animals. © 2021 The New York Times Company

Keyword: Intelligence; Evolution
Link ID: 27984 - Posted: 09.11.2021

By Carolyn Wilke Babies may laugh like some apes a few months after birth before transitioning to chuckling more like human adults, a new study finds. Laughter links humans to great apes, our evolutionary kin (SN: 6/4/09). Human adults tend to laugh while exhaling (SN: 6/10/15), but chimpanzees and bonobos mainly laugh in two ways. One is like panting, with sound produced on both in and out breaths, and the other has outbursts occurring on exhales, like human adults. Less is known about how human babies laugh. So Mariska Kret, a cognitive psychologist at Leiden University in the Netherlands, and colleagues scoured the internet for videos with laughing 3- to 18-month-olds, and asked 15 speech sound specialists and thousands of novices to judge the babies’ laughs. After evaluating dozens of short audio clips, experts and nonexperts alike found that younger infants laughed during inhalation and exhalation, while older infants laughed more on the exhale. That finding suggests that infants’ laughter becomes less apelike with age, the researchers report in the September Biology Letters. Humans start to laugh around 3 months of age, but early on, “it hasn’t reached its full potential,” Kret says. Both babies’ maturing vocal tracts and their social interactions may influence the development of the sounds, the researchers say.

Keyword: Language; Evolution
Link ID: 27983 - Posted: 09.11.2021