Most Recent Links

Follow us on Facebook or subscribe to our mailing list to receive news updates. Learn more.


Links 21 - 40 of 28050

By Rodrigo Pérez Ortega Was Tyrannosaurus rex as smart as a baboon? Scientists don’t like to compare intelligence between species (everyone has their own talents, after all), but a controversial new study suggests some dino brains were as densely packed with neurons as those of modern primates. If so, that would mean they were very smart—more than researchers previously thought—and could have achieved feats only humans and other very intelligent animals have, such as using tools. The findings, reported last week in the Journal of Comparative Neurology, are making waves among paleontologists on social media and beyond. Some are applauding the paper as a good first step toward better understanding dinosaur smarts, whereas others argue the neuron estimates are flawed, undercutting the study’s conclusions. Measuring dinosaur intelligence has never been easy. Historically, researchers have used something called the encephalization quotient (EQ), which measures an animal’s brain size relative to what is expected for its body size. A T. rex, for example, had an EQ of about 2.4, compared with 3.1 for a German shepherd dog and 7.8 for a human—leading some to assume it was at least somewhat smart. EQ is hardly foolproof, however. In many animals, body size evolves independently from brain size, says Ashley Morhardt, a paleoneurologist at Washington University School of Medicine in St. Louis who wasn’t involved in the study. “EQ is a fraught metric, especially when studying extinct species.” Looking for a more trustworthy alternative, Suzana Herculano-Houzel, a neuroanatomist at Vanderbilt University, turned to a different measure: the density of neurons in the cortex, the wrinkly outer brain area critical to most intelligence-related tasks. She had previously estimated the number of neurons in many animal species, including humans, by making “brain soup”—dissolving brains in a detergent solution—and counting the neurons in different parts of the brain.
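The EQ mentioned above is a simple ratio of observed to expected brain mass. A minimal sketch, assuming Jerison's classic allometric expectation (expected brain mass ≈ 0.12 × body mass^(2/3), with masses in grams); the function names are ours and the exact constants vary between published studies, so the numbers are illustrative only:

```python
def expected_brain_mass(body_mass_g: float) -> float:
    """Jerison's expected brain mass (grams) for a given body mass (grams)."""
    return 0.12 * body_mass_g ** (2 / 3)

def eq(brain_mass_g: float, body_mass_g: float) -> float:
    """Encephalization quotient: observed brain mass / expected brain mass."""
    return brain_mass_g / expected_brain_mass(body_mass_g)

# Rough, illustrative human figures: ~1350 g brain, ~65 kg body.
print(round(eq(1350, 65_000), 1))
```

With a different fitted exponent or constant, the same animal lands at a different EQ, which is one reason published values (like the 2.4 and 7.8 above) differ between sources.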
© 2023 American Association for the Advancement of Science.

Keyword: Evolution
Link ID: 28627 - Posted: 01.12.2023

Xiaofan Lei What comes to mind when you think of someone who stutters? Is that person male or female? Are they weak and nervous, or powerful and heroic? If you have a choice, would you like to marry them, introduce them to your friends or recommend them for a job? Your attitudes toward people who stutter may depend partly on what you think causes stuttering. If you think that stuttering is due to psychological causes, such as being nervous, research suggests that you are more likely to distance yourself from those who stutter and view them more negatively. I am a person who stutters and a doctoral candidate in speech, language and hearing sciences. Growing up, I tried my best to hide my stuttering and to pass as fluent. I avoided sounds and words that I might stutter on. I avoided ordering the dishes I wanted to eat at the school cafeteria to avoid stuttering. I asked my teacher to not call on me in class because I didn’t want to deal with the laughter from my classmates when they heard my stutter. Those experiences motivated me to investigate stuttering so that I can help people who stutter, including myself, to better cope with the condition. In writing about what the scientific field has to say about stuttering and its biological causes, I hope I can reduce the stigma and misunderstanding surrounding the disorder. The most recognizable characteristics of developmental stuttering are the repetitions, prolongations and blocks in people’s speech. People who stutter may also experience muscle tension during speech and exhibit secondary behaviors, such as tics and grimaces. © 2010–2023, The Conversation US, Inc.

Keyword: Language
Link ID: 28626 - Posted: 01.12.2023

By Oliver Whang Hod Lipson, a mechanical engineer who directs the Creative Machines Lab at Columbia University, has shaped most of his career around what some people in his industry have called the c-word. On a sunny morning this past October, the Israeli-born roboticist sat behind a table in his lab and explained himself. “This topic was taboo,” he said, a grin exposing a slight gap between his front teeth. “We were almost forbidden from talking about it — ‘Don’t talk about the c-word; you won’t get tenure’ — so in the beginning I had to disguise it, like it was something else.” That was back in the early 2000s, when Dr. Lipson was an assistant professor at Cornell University. He was working to create machines that could note when something was wrong with their own hardware — a broken part, or faulty wiring — and then change their behavior to compensate for that impairment without the guiding hand of a programmer, much as a dog that loses a leg in an accident can teach itself to walk again in a different way. This sort of built-in adaptability, Dr. Lipson argued, would become more important as we became more reliant on machines. Robots were being used for surgical procedures, food manufacturing and transportation; the applications for machines seemed pretty much endless, and any error in their functioning, as they became more integrated with our lives, could spell disaster. “We’re literally going to surrender our life to a robot,” he said. “You want these machines to be resilient.” One way to do this was to take inspiration from nature. Animals, and particularly humans, are good at adapting to changes. This ability might be a result of millions of years of evolution, as resilience in response to injury and changing environments typically increases the chances that an animal will survive and reproduce. Dr. Lipson wondered whether he could replicate this kind of natural selection in his code, creating a generalizable form of intelligence that could learn about its body and function no matter what that body looked like, and no matter what that function was. © 2023 The New York Times Company

Keyword: Consciousness; Robotics
Link ID: 28625 - Posted: 01.07.2023

By Elizabeth Pennisi Biologists have long known that new protein-coding genes can arise through the duplication and modification of existing ones. But some protein genes can also arise from stretches of the genome that once encoded aimless strands of RNA instead. How new protein genes surface this way has been a mystery, however. Now, a study identifies mutations that transform seemingly useless DNA sequences into potential genes by endowing their encoded RNA with the skill to escape the cell nucleus—a critical step toward becoming translated into a protein. The study’s authors highlight 74 human protein genes that appear to have arisen in this de novo way—more than half of which emerged after the human lineage branched off from chimpanzees. Some of these newcomer genes may have played a role in the evolution of our relatively large and complex brains. When added to mice, one made the rodent brains grow bigger and more humanlike, the authors report this week in Nature Ecology & Evolution. “This work is a big advance,” says Anne-Ruxandra Carvunis, an evolutionary biologist at the University of Pittsburgh, who was not involved with the research. It “suggests that de novo gene birth may have played a role in human brain evolution.” Although some genes encode RNAs that have structural or regulatory purposes themselves, those that encode proteins instead create an intermediary RNA. Made in the nucleus like other RNAs, these messenger RNAs (mRNAs) exit into the cytoplasm and travel to organelles called ribosomes to tell them how to build the gene’s proteins. A decade ago, Chuan-Yun Li, an evolutionary biologist at Peking University, and colleagues discovered that some human protein genes bore a striking resemblance to DNA sequences in rhesus monkeys that got transcribed into long noncoding RNAs (lncRNAs), which didn’t make proteins or have any other apparent purpose. 
Li couldn’t figure out what it had taken for those stretches of monkey DNA to become true protein-coding genes in humans. © 2023 American Association for the Advancement of Science.

Keyword: Development of the Brain; Genes & Behavior
Link ID: 28624 - Posted: 01.07.2023

by Giorgia Guglielmi About five years ago, Catarina Seabra made a discovery that led her into uncharted scientific territory. Seabra, then a graduate student in Michael Talkowski’s lab at Harvard University, found that disrupting the autism-linked gene MBD5 affects the expression of other genes in the brains of mice and in human neurons. Among those genes, several are involved in the formation and function of primary cilia — hair-like protrusions on the cell’s surface that sense its external environment. “This got me intrigued, because up to that point, I had never heard of primary cilia in neurons,” Seabra says. She wondered if other researchers had linked cilia defects to autism-related conditions, but the scientific literature offered only sparse evidence, mostly in mice. Seabra, now a postdoctoral researcher in the lab of João Peça at the Center for Neuroscience and Cell Biology at the University of Coimbra in Portugal, is spearheading an effort to look for a connection in people: The Peça lab established a biobank of dental stem cells obtained from baby teeth of 50 children with autism or other neurodevelopmental conditions. And the team plans to look at neurons and brain organoids made from those cells to see if their cilia show any defects in structure or function. Other neuroscientists, too, are working to understand the role of cilia during neurodevelopment. Last September, for example, researchers working with tissue samples from mice discovered that cilia on the surface of neurons can form junctions, or synapses, with other neurons — which means cilia defects could, at least in theory, hinder the development of neural circuitry and activity. Other teams have connected several additional autism-related genes, beyond MBD5, to the tiny cell antennae. © 2023 Simons Foundation

Keyword: Autism
Link ID: 28623 - Posted: 01.07.2023

By Laurie McGinley The Food and Drug Administration on Friday approved an Alzheimer’s drug that slowed cognitive decline in a major study, offering patients desperately needed hope — even as doctors sharply debated the safety of the drug and whether it provides a significant benefit. The FDA said the drug, called lecanemab, is for patients with mild cognitive impairment or early dementia because of Alzheimer’s. The accelerated approval was based on a mid-stage trial that showed the treatment effectively removed a sticky protein called amyloid beta — considered a hallmark of the illness — from the brain. A larger trial, conducted more recently, found the drug, which will be sold under the brand name Leqembi, slowed the progression of Alzheimer’s disease by 27 percent. “This treatment option is the latest therapy to target and affect the underlying disease process of Alzheimer’s, instead of only treating the symptoms of the disease,” Billy Dunn, director of the FDA’s Office of Neuroscience, said in a statement. The approval followed a barrage of criticism endured by the FDA for its 2021 approval of Aduhelm, another amyloid-targeting drug that had been panned by the agency’s outside experts. Lecanemab is getting a warmer reception but disagreements remain. Many neurologists and advocates hailed lecanemab, given intravenously twice a month, as an important advance — one that follows years of failure involving Alzheimer’s drugs. They said the treatment will allow patients to stay longer in the milder stages of the fatal, neurodegenerative disorder, which afflicts more than 6 million people in the United States.

Keyword: Alzheimers
Link ID: 28622 - Posted: 01.07.2023

McKenzie Prillaman The hotel ballroom was packed to near capacity with scientists when Susan Yanovski arrived. Despite being 10 minutes early, she had to manoeuvre her way to one of the few empty seats near the back. The audience at the ObesityWeek conference in San Diego, California, in November 2022, was waiting to hear the results of a hotly anticipated drug trial. The presenters — researchers affiliated with pharmaceutical company Novo Nordisk, based in Bagsværd, Denmark — did not disappoint. They described the details of an investigation of a promising anti-obesity medication in teenagers, a group that is notoriously resistant to such treatment. The results astonished researchers: a weekly injection for almost 16 months, along with some lifestyle changes, reduced body weight by at least 20% in more than one-third of the participants1. Previous studies2,3 had shown that the drug, semaglutide, was just as impressive in adults. The presentation concluded like no other at the conference, says Yanovski, co-director of the Office of Obesity Research at the US National Institute of Diabetes and Digestive and Kidney Diseases in Bethesda, Maryland. Sustained applause echoed through the room “like you were at a Broadway show”, she says. This energy has pervaded the field of obesity medicine for the past few years. After decades of work, researchers are finally seeing signs of success: a new generation of anti-obesity medications that drastically diminish weight without the serious side effects that have plagued previous efforts. These drugs are arriving in an era in which obesity is growing exponentially. Worldwide obesity has tripled since 1975; in 2016, about 40% of adults were considered overweight and 13% had obesity, according to the World Health Organization (WHO). With extra weight often comes heightened risk of health conditions such as type 2 diabetes, heart disease and certain cancers. 
The WHO recommends healthier diets and physical activity to reduce obesity, but medication might help when lifestyle changes aren’t enough. The new drugs mimic hormones known as incretins, which lower blood sugar and curb appetite. Some have already been approved for treating type 2 diabetes, and they are starting to win approval for inducing weight loss. © 2023 Springer Nature Limited
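The "overweight" and "obesity" categories cited above are defined from body-mass index (BMI: weight in kilograms divided by height in metres squared). A minimal sketch using the WHO's general adult cut-offs; these thresholds come from standard WHO guidance rather than the article itself, and the function names are ours:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body-mass index: kg / m^2."""
    return weight_kg / height_m ** 2

def who_category(bmi_value: float) -> str:
    """WHO general adult BMI categories."""
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25.0:
        return "normal"
    if bmi_value < 30.0:
        return "overweight"
    return "obese"

# 85 kg at 1.75 m gives a BMI of about 27.8, in the overweight range.
print(who_category(bmi(85, 1.75)))
```

Note that the WHO also publishes population-specific cut-offs, so these general thresholds are a simplification.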

Keyword: Obesity
Link ID: 28621 - Posted: 01.04.2023

By Freda Kreier Living through the COVID-19 pandemic may have matured teens’ brains beyond their years. From online schooling and social isolation to economic hardship and a mounting death count, the last few years have been rough on young people. For teens, the pandemic and its many side effects came during a crucial window in brain development. Now, a small study comparing brain scans of young people from before and after 2020 reveals that the brains of teens who lived through the pandemic look about three years older than expected, scientists say. This research, published December 1 in Biological Psychiatry: Global Open Science, is the first to look at the impact of the pandemic on brain aging. The finding reveals that “the pandemic hasn’t been bad just in terms of mental health for adolescents,” says Ian Gotlib, a clinical neuroscientist at Stanford University. “It seems to have altered their brains as well.” The study can’t link those brain changes to poor mental health during the pandemic. But “we know there is a relationship between adversity and the brain as it tries to adapt to what it’s been given,” says Beatriz Luna, a developmental cognitive neuroscientist at the University of Pittsburgh, who wasn’t involved in the research. “I think this is a very important study that sets the ball rolling for us to look at this.” The roots of this study date back to nearly a decade ago, when Gotlib and his colleagues launched a project in California’s Bay Area to study depression in adolescents. The researchers were collecting information on the mental health of the kids in the study, and did MRI scans of their brains. © Society for Science & the Public 2000–2023.

Keyword: Development of the Brain; Stress
Link ID: 28620 - Posted: 01.04.2023

By Ellen Barry The effect of social media use on children is a fraught area of research, as parents and policymakers try to ascertain the results of a vast experiment already in full swing. Successive studies have added pieces to the puzzle, fleshing out the implications of a nearly constant stream of virtual interactions beginning in childhood. A new study by neuroscientists at the University of North Carolina tries something new, conducting successive brain scans of middle schoolers between the ages of 12 and 15, a period of especially rapid brain development. The researchers found that children who habitually checked their social media feeds at around age 12 showed a distinct trajectory, with their sensitivity to social rewards from peers heightening over time. Teenagers with less engagement in social media followed the opposite path, with a declining interest in social rewards. The study, published on Tuesday in JAMA Pediatrics, is among the first attempts to capture changes to brain function correlated with social media use over a period of years. The study has important limitations, the authors acknowledge. Because adolescence is a period of expanding social relationships, the brain differences could reflect a natural pivot toward peers, which could be driving more frequent social media use. “We can’t make causal claims that social media is changing the brain,” said Eva H. Telzer, an associate professor of psychology and neuroscience at the University of North Carolina, Chapel Hill, and one of the authors of the study. But, she added, “teens who are habitually checking their social media are showing these pretty dramatic changes in the way their brains are responding, which could potentially have long-term consequences well into adulthood, sort of setting the stage for brain development over time.” © 2023 The New York Times Company

Keyword: Development of the Brain; Stress
Link ID: 28619 - Posted: 01.04.2023

By Erin Blakemore Can the human body betray a lie? In the 1920s, inventors designed a device they said could detect deception by monitoring a subject’s breathing and blood pressure. “The Lie Detector,” an American Experience documentary that premieres Tuesday on PBS, delves into the history of the infamous device. In the century after its invention, the lie detector’s popularity skyrocketed. And despite a checkered legacy, polygraph tests are still regularly used by law enforcement and some employers. The documentary tells a story of honest intentions and sinister consequences. John Larson, one of its inventors, was a medical student and law enforcement officer in search of more humane methods of policing and interrogation. He piggybacked off new scientific and psychological concepts to create the device in 1921. The technologies Larson and his co-inventors used were still in their infancy, and the idea that people produce measurable, consistent physical symptoms when they lie was unproved. It still is. Polygraph protocols have evolved, but the devices’ detractors say they measure only anxiety, not truthfulness. And even as major organizations have raised questions about the scientific validity of the tests and federal laws have prohibited most private employers from requiring them, the idea that dishonesty can be measured through physical testing remains widespread. The documentary suggests that the polygraph tests’ popularity was tied more to publicity than accuracy — and over time, Larson’s vision was turned on its head as polygraphs were used to intimidate, incarcerate and interrogate people. With the help of expert interviews and a kaleidoscope of historical footage and imagery, director Rob Rapley tracks the tale of an invention its own creator compared to Frankenstein’s monster.

Keyword: Stress
Link ID: 28618 - Posted: 01.04.2023

By Andrew Jacobs PORTLAND, Ore. — The curriculum was set, the students were enrolled and Oregon officials had signed off on nearly every detail of training for the first class of “magic” mushroom facilitators seeking state certification. But as the four-day session got underway inside a hotel conference room in early December, an important pedagogical tool was missing: the mushrooms themselves. That’s because state officials, two years after Oregon voters narrowly approved the adult use of psilocybin, were still hammering out the regulatory framework for the production and sale of the tawny hallucinogenic fungi. Instead, the students, most of them seasoned mental health professionals, would have to role play with one another using meditation or intensive breathing practices that could lead to altered states of consciousness — the next best thing to the kind of psychedelic trip they would encounter as licensed guides. Not that anyone was complaining. Like many of the two dozen students who paid nearly $10,000 for the course, Jason Wright, 48, a hospital psychiatric nurse in Portland, said he was thrilled to be part of a bold experiment with national implications. “It’s incredible to be on the front lines of something that has the potential to change our relationship with drugs that should never have been criminalized in the first place,” he said. On Jan. 1, Oregon became the first state in the nation to legalize the adult use of psilocybin, a naturally occurring psychedelic that has shown significant promise for treating severe depression, post-traumatic stress disorder and end-of-life anxiety among the terminally ill, among other mental health conditions. © 2023 The New York Times Company

Keyword: Drug Abuse; Depression
Link ID: 28617 - Posted: 01.04.2023

Linda Geddes Science correspondent Scientists have developed a blood test to diagnose Alzheimer’s disease without the need for expensive brain imaging or a painful lumbar puncture, where a sample of cerebrospinal fluid (CSF) is drawn from the lower back. If validated, the test could enable faster diagnosis of the disease, meaning therapies could be initiated earlier. Alzheimer’s is the most common form of dementia, but diagnosis remains challenging – particularly during the earlier stages of the disease. Current guidelines recommend detection of three distinct markers: abnormal accumulations of amyloid and tau proteins, as well as neurodegeneration – the slow and progressive loss of neuronal cells in specified regions of the brain. This can be done through a combination of brain imaging and CSF analysis. However, a lumbar puncture can be painful and people may experience headaches or back pain after the procedure, while brain imaging is expensive and takes a long time to schedule. Prof Thomas Karikari at the University of Pittsburgh, in Pennsylvania, US, who was involved in the study, said: “A lot of patients, even in the US, don’t have access to MRI and PET scanners. Accessibility is a major issue.” The development of a reliable blood test would be an important step forwards. “A blood test is cheaper, safer and easier to administer, and it can improve clinical confidence in diagnosing Alzheimer’s and selecting participants for clinical trial and disease monitoring,” Karikari said. Although current blood tests can accurately detect abnormalities in amyloid and tau proteins, detecting markers of nerve cell damage that are specific to the brain has been harder. Karikari and his colleagues around the world focused on developing an antibody-based blood test that would detect a particular form of tau protein called brain-derived tau, which is specific to Alzheimer’s disease. © 2022 Guardian News & Media Limited

Keyword: Alzheimers
Link ID: 28616 - Posted: 12.28.2022

Miryam Naddaf Stimulating neurons that are linked to alertness helps rats with cochlear implants learn to quickly recognize tunes, researchers have found. The results suggest that activity in a brain region called the locus coeruleus (LC) improves hearing perception in deaf rodents. Researchers say the insights are important for understanding how the brain processes sound, but caution that the approach is a long way from helping people. “It’s like we gave them a cup of coffee,” says Robert Froemke, an otolaryngologist at New York University School of Medicine and a co-author of the study, published in Nature on 21 December1. Cochlear implants use electrodes in the inner-ear region called the cochlea, which is damaged in people who have severe or total hearing loss. The device converts acoustic sounds into electrical signals that stimulate the auditory nerve, and the brain learns to process these signals to make sense of the auditory world. Some people with cochlear implants learn to recognize speech within hours of the device being implanted, whereas others can take months or years. “This problem has been around since the dawn of cochlear implants, and it shows no signs of being resolved,” says Gerald Loeb at the University of Southern California in Los Angeles, who helped to develop one of the first cochlear implants. Researchers say that a person’s age, the duration of their hearing loss and the type of processor and electrodes in the implant don’t account for this variation, but suggest that the brain could be the source of the differences. “It’s sort of the black box,” says Daniel Polley, an auditory neuroscientist at Harvard Medical School in Boston, Massachusetts. Most previous research has focused on improving the cochlear device and the implantation procedure. Attempts to improve the brain’s ability to use the device open up a way to improve communication between the ear and the brain, says Polley. © 2022 Springer Nature Limited

Keyword: Hearing
Link ID: 28615 - Posted: 12.28.2022

Will Stone Maybe this happens to you sometimes, too: You go to bed with some morning obligation on your mind, maybe a flight to catch or an important meeting. The next morning, you wake up on your own and discover you've beat your alarm clock by just a minute or two. What's going on here? Is it pure luck? Or perhaps you possess some uncanny ability to wake up precisely on time without help? It turns out many people have come to Dr. Robert Stickgold over the years wondering about this phenomenon. "This is one of those questions in the study of sleep where everybody in the field seems to agree that what's obviously true couldn't be," says Stickgold, who's a cognitive neuroscientist at Harvard Medical School and Beth Israel Deaconess Medical Center. Stickgold even remembers bringing it up to his mentor when he was just starting out in the field — only to be greeted with a dubious look and a far from satisfactory explanation. "I can assure you that all of us sleep researchers say 'balderdash, that's impossible,' " he says. And yet Stickgold still believes there is something to it. "This kind of precision waking is reported by hundreds and thousands of people,'" he says, including himself. "I can wake up at 7:59 and turn off the alarm clock before my wife wakes up." At least, sometimes. Of course, it's well known that humans have an elegant and intricate system of internal processes that help our bodies keep time. Somewhat shaped by our exposure to sunlight, caffeine, meals, exercise and other factors, these processes regulate our circadian rhythms throughout the roughly 24-hour cycle of day and night, and this affects when we go to bed and wake up. © 2022 npr

Keyword: Biological Rhythms
Link ID: 28614 - Posted: 12.28.2022

By Kelsey Ables Persistent loss of smell has left some covid-19 survivors yearning for the scent of their freshly bathed child or a waft of their once-favorite meal. It’s left others inured to the stink of garbage and accidentally drinking spoiled milk. “Anosmia,” as experts call it, is one of long covid’s strangest symptoms — and researchers may be one step closer to figuring out what causes it and how to fix it. A small study published online on Wednesday in Science Translational Medicine and led by researchers at Duke University, Harvard and the University of California San Diego offers a theory, and new insight, into lingering smell loss. Scientists analyzed samples of olfactory epithelial tissue — where smell cells live — from 24 biopsies, nine of which were from post-covid patients struggling with persistent loss of smell. Although the sample was small, the results suggest that the sensory deficit is linked to an ongoing immune attack on cells responsible for smell — which endures even after the virus is gone — and a decline in the number of olfactory nerve cells. Bradley Goldstein, associate professor in Duke’s Department of Head and Neck Surgery and Communication Sciences and the Department of Neurobiology, an author on the paper, called the results “striking” and said in a statement, “It’s almost resembling a sort of autoimmune-like process in the nose.” While there has been research that looks at short-term smell loss and uses animal models, the new study is notable because it focuses on persistent smell loss and uses high-tech molecular analysis on human tissue. The study reflects enduring interest in the mysterious symptom. In July, researchers estimated that at least 5.6 percent of covid-19 patients develop chronic smell problems. That study, published in the peer-reviewed medical trade publication BMJ, also suggested that women as well as those who had more severe initial dysfunction were less likely to recover their sense of smell.
Seniors are also especially vulnerable, The Post has reported.

Keyword: Chemical Senses (Smell & Taste)
Link ID: 28613 - Posted: 12.28.2022

By Shayla Love On Valentine’s Day in 2016, Anne Lantoine received not flowers, but divorce papers. In the months preceding, she had been preparing for her family’s move from France to Canada—or so she thought. She arrived in Quebec early with one of her three children, who was preparing to start college there, while the other two remained in Europe for school. Her husband stayed behind to manage the sale of their house in Marseille. Then the realtors began to complain, through a barrage of calls and emails, to Lantoine. Her husband was not acting like a man who wanted his house sold. He wasn’t answering phone calls and was never available for showings. In January 2016, Lantoine called him after yet another complaint from a realtor. The next morning, he sent her an email with a notice for a court hearing, and she discovered her husband had actually filed for divorce, without telling her, months earlier. That February, she finally got the paperwork, not from her husband, but from her real estate agent. “It was not my last shock,” Lantoine, now 59, recalls. “I also discovered that my husband’s mistress was living in my home.” These revelations were a huge blow practically: It disrupted the immigration paperwork, and Lantoine and her daughter lost their visa applications. But the searing pain was in the betrayal and deceit. “I became very anxious and had constant nightmares,” she says. “I was tired all the time and had panic attacks each time I opened my mail or my emails, or when I had an unidentified phone call.” Though the details of each case vary, romantic betrayal through infidelity, abandonment, or emotional manipulation can upend one’s life in an instant. For Lantoine, her future plans, and the person they were attached to, were suddenly gone, and her functioning along with them. © 2022 NautilusThink Inc, All rights reserved.

Keyword: Stress; Learning & Memory
Link ID: 28612 - Posted: 12.28.2022

By Deborah Blum Back in the year 2000, sitting in his small home office in California’s Mill Valley, surrounded by stacks of spreadsheets, Jay Rosner hit one of those dizzying moments of dismay. An attorney and the executive director of The Princeton Review Foundation, the philanthropic arm of the private test-preparation and tutoring company, The Princeton Review, Rosner was scheduled to give testimony in a highly charged affirmative action lawsuit against the University of Michigan. He knew the case, Grutter v. Bollinger, was eventually headed to the U.S. Supreme Court, but as he reviewed the paperwork, he discovered a daunting gap in his argument.  Rosner had been asked to explore potential racial and cultural biases baked into standardized testing. He believed such biases, which critics had been surfacing for years prior, were real, but in that moment, he felt himself coming up short. “I suddenly realized that I would be deposed on this issue,” he recalled, “and I had no data to support my hypothesis, only deductive reasoning.”   The punch of that realization still resonates. Rosner is the kind of guy who really likes data to stand behind his points, and he recalls an anxiety-infused hunt for some solid facts. Rosner was testifying about an entrance exam for law school, the LSAT, for which he could find no particulars. But he knew that a colleague had data on how students of different racial backgrounds answered specific questions on another powerful standardized test, the SAT, long used to help decide undergraduate admission to colleges — given in New York state. He decided he could use that information to make a case by analogy. The two scholars agreed to crunch some numbers.  Based on past history of test results, he knew that White students would overall have higher scores than Black students. Still, Rosner expected Black students to perform better on some questions. To his shock, he found no trace of such balance. 
The results were “incredibly uniform,” he said, skewing almost entirely in favor of White students. “Every single question except one in the New York state data on four SATs favored Whites over Blacks,” Rosner recalled.

Keyword: Intelligence; Genes & Behavior
Link ID: 28611 - Posted: 12.24.2022

By Susan Milius As tiny glass frogs fall asleep for the day, they take almost 90 percent of their red blood cells out of circulation. The colorful cells cram into hideaway pockets inside the frog liver, which disguises the cells behind a mirrorlike surface, a new study finds. Biologists have known that glass frogs have translucent skin, but temporarily hiding bold red blood brings a new twist to vertebrate camouflage (SN: 6/23/17). "The heart stopped pumping red, which is the normal color of blood, and only pumped a bluish liquid," says evolutionary biochemist Carlos Taboada of Duke University, one of the discoverers of the hidden blood. What may be even more amazing to humans — prone to circulatory sludge and clogs — is that the frogs hold almost all their red blood cells packed together for hours with no blood clots, says co-discoverer Jesse Delia, now at the American Museum of Natural History in New York City. Wake the frog up, and the cells just unpack themselves and get circulating again. Hiding those red blood cells can double or triple the transparency of glass frogs, Taboada, Delia and colleagues report in the Dec. 23 Science. That greenish transparency can matter a lot for the snack-sized frogs, which spend the day hiding like little shadows on the undersides of leaves high in the forest canopy. [Photos accompanying the article show a sleeping female glass frog, most of her red blood cells tucked into her liver, beside the same frog awake, blood circulating and noticeably less transparent.] What got Delia wondering about transparency was a photo emergency. He had studied glass frog behavior but had never even seen the frogs asleep. "They go to bed, I go to bed — that was my life for years," he says. When he needed some charismatic portraits, however, he put some frogs in lab dishes and at last saw how the animals sleep the day away. © Society for Science & the Public 2000–2022.

Keyword: Sleep; Aggression
Link ID: 28610 - Posted: 12.24.2022

By Tom Siegfried Survival of the fittest often means survival of the fastest. But fastest doesn’t necessarily mean the fastest moving. It might mean the fastest thinking. When faced with the approach of a powerful predator, for instance, a quick brain can be just as important as quick feet. After all, it is the brain that tells the feet what to do — when to move, in what direction, how fast and for how long. And various additional mental acrobatics are needed to evade an attacker and avoid being eaten. A would-be meal’s brain must decide whether to run or freeze, outrun or outwit, whether to keep going or find a place to hide. It also helps if the brain remembers where the best hiding spots are and recalls past encounters with similar predators. All in all, a complex network of brain circuitry must be engaged, and neural commands executed efficiently, to avert a predatory threat. And scientists have spent a lot of mental effort themselves trying to figure out how the brains of prey enact their successful escape strategies. Studies in animals as diverse as mice and crabs, fruit flies and cockroaches are discovering the complex neural activity — in both the primitive parts of the brain and in more cognitively advanced regions — that underlies the physical behavior guiding escape from danger and the search for safety. Lessons learned from such studies might not only illuminate the neurobiology of escape, but also provide insights into how evolution has shaped other brain-controlled behaviors. This research “highlights an aspect of neuroscience that is really gaining traction these days,” says Gina G. Turrigiano of Brandeis University, past president of the Society for Neuroscience. “And that is the idea of using ethological behaviors — behaviors that really matter for the biology of the animal that’s being studied — to unravel brain function.” © 2022 Annual Reviews

Keyword: Aggression; Attention
Link ID: 28609 - Posted: 12.24.2022

Jon Hamilton Time is woven into our personal memories. Recall a childhood fall from a bike and the brain replays the entire episode in excruciating detail: the glimpse of wet leaves on the road ahead, the moment of weightless dread, and then the painful impact. This exact sequence is embedded in the memory, thanks to special neurons known as time cells. When the brain detects a notable event, time cells begin a highly orchestrated performance, says Marc Howard, who directs the Brain, Behavior, and Cognition program at Boston University. "What we find is that the cells fire in a sequence," he says. "So cell one might fire immediately, but cell two waits a little bit, followed by cell three, cell four, and so on." As each cell fires, it places a sort of time stamp on an unfolding experience. And the same cells fire in the same order when we retrieve a memory of the experience, even something mundane. "If I remember being in my kitchen and making a cup of coffee," Howard says, "the time cells that were active at that moment are re-activated." They recreate the grinder's growl, the scent of Arabica, the curl of steam rising from a fresh mug — and the neurons replay these moments in sequence every time the memory is summoned. This system appears to explain how we are able to virtually travel back in time and play mental movies of our life experiences. There are also hints that time cells play a critical role in imagining future events. Without time cells, our memories would lack order. In an experiment at the University of California, San Diego, scientists gave several groups of people a tour of the campus. The tour included 11 planned events, including finding change in a vending machine and drinking from a water fountain. © 2022 npr

Keyword: Attention; Learning & Memory
Link ID: 28608 - Posted: 12.21.2022