Most Recent Links



Links 21 - 40 of 28654

by Alex Blasdel Patient One was 24 years old and pregnant with her third child when she was taken off life support. It was 2014. A couple of years earlier, she had been diagnosed with a disorder that caused an irregular heartbeat, and during her two previous pregnancies she had suffered seizures and fainting spells. Four weeks into her third pregnancy, she collapsed on the floor of her home. Her mother, who was with her, called 911. By the time an ambulance arrived, Patient One had been unconscious for more than 10 minutes. Paramedics found that her heart had stopped. After being driven to a hospital where she couldn’t be treated, Patient One was taken to the emergency department at the University of Michigan. There, medical staff had to shock her chest three times with a defibrillator before they could restart her heart. She was placed on an external ventilator and pacemaker, and transferred to the neurointensive care unit, where doctors monitored her brain activity. She was unresponsive to external stimuli, and had a massive swelling in her brain. After she lay in a deep coma for three days, her family decided it was best to take her off life support. It was at that point – after her oxygen was turned off and nurses pulled the breathing tube from her throat – that Patient One became one of the most intriguing scientific subjects in recent history. For several years, Jimo Borjigin, a professor of neurology at the University of Michigan, had been troubled by the question of what happens to us when we die. She had read about the near-death experiences of certain cardiac-arrest survivors who had undergone extraordinary psychic journeys before being resuscitated. Sometimes, these people reported travelling outside of their bodies towards overwhelming sources of light where they were greeted by dead relatives. Others spoke of coming to a new understanding of their lives, or encountering beings of profound goodness. 
Borjigin didn’t believe the content of those stories was true – she didn’t think the souls of dying people actually travelled to an afterworld – but she suspected something very real was happening in those patients’ brains. In her own laboratory, she had discovered that rats undergo a dramatic storm of many neurotransmitters, including serotonin and dopamine, after their hearts stop and their brains lose oxygen. She wondered if humans’ near-death experiences might spring from a similar phenomenon, and if it was occurring even in people who couldn’t be revived. © 2024 Guardian News & Media Limited

Keyword: Consciousness; Attention
Link ID: 29236 - Posted: 04.02.2024

By Tina Hesman Saey Atoosa Samani started learning about pigeon genetics at a young age. She grew up surrounded by pet pigeons in Isfahan, a city in central Iran famed for its pigeon towers. Her favorite was an all-white bird. But 6- or 7-year-old Samani noticed that this particular pigeon never fathered all-white offspring. She learned that white coloring is a recessive genetic trait — one that shows up only when an individual inherits two broken copies of a gene (SN: 2/7/22). In this case, the pigeon had two broken copies of a gene that normally makes pigment to color feathers, so his feathers were white. But his offspring inherited a normal, pigment-producing version of the gene from their mothers and had colored feathers. That early lesson in pigeon heredity stuck with Samani and fueled her desire to learn more about genetics. When she moved to the United States to study at the University of Utah in Salt Lake City, it seemed only natural to join Michael Shapiro’s lab to investigate why some pigeons (Columba livia) do backward somersaults (SN: 1/31/13). These roller pigeons come in two varieties: Flying rollers such as Birmingham rollers, which fly but do long tumbling runs toward the ground before resuming flight, and parlor rollers, which can’t fly but instead backflip along the ground. Many Persian poems say the pigeons perform the acrobatics because the birds are happy, but Samani says the truth is darker. “This is definitely a movement disorder, and it does not have any good aspects to it,” she says. The disorder is progressive, appearing soon after hatching and gradually getting worse until the birds can’t fly. © Society for Science & the Public 2000–2024.

Keyword: Movement Disorders; Genes & Behavior
Link ID: 29235 - Posted: 04.02.2024

By Paula Span The phone awakened Doug Nordman at 3 a.m. A surgeon was calling from a hospital in Grand Junction, Colo., where Mr. Nordman’s father had arrived at the emergency room, incoherent and in pain, and then lost consciousness. At first, the staff had thought he was suffering a heart attack, but a CT scan found that part of his small intestine had been perforated. A surgical team repaired the hole, saving his life, but the surgeon had some questions. “Was your father an alcoholic?” he asked. The doctors had found Dean Nordman malnourished, his peritoneal cavity “awash with alcohol.” The younger Mr. Nordman, a military personal finance author living on Oahu, Hawaii, explained that his 77-year-old dad had long been a classic social drinker: a Scotch and water with his wife before dinner, which got topped off during dinner, then another after dinner, and perhaps a nightcap. Having three to four drinks daily exceeds current dietary guidelines, which define moderate consumption as two drinks a day for men and one for women, or less. But “that was the normal drinking culture of the time,” said Doug Nordman, now 63. At the time of his 2011 hospitalization, though, Dean Nordman, a retired electrical engineer, was widowed, living alone and developing symptoms of dementia. He got lost while driving, struggled with household chores and complained of a “slipping memory.” He had waved off his two sons’ offers of help, saying he was fine. During that hospitalization, however, Doug Nordman found hardly any food in his father’s apartment. Worse, reviewing his father’s credit card statements, “I saw recurring charges from the Liquor Barn and realized he was drinking a pint of Scotch a day,” he said. Public health officials are increasingly alarmed by older Americans’ drinking. 
The annual number of alcohol-related deaths from 2020 through 2021 exceeded 178,000, according to recently released data from the Centers for Disease Control and Prevention: more deaths than from all drug overdoses combined. © 2024 The New York Times Company

Keyword: Drug Abuse; Alzheimers
Link ID: 29234 - Posted: 04.02.2024

Linda Geddes Science correspondent If you have wondered why your partner always beats you at tennis or one child always crushes the other at Fortnite, it seems there is more to it than pure physical ability. Some people are effectively able to see more “images per second” than others, research suggests, meaning they’re innately better at spotting or tracking fast-moving objects such as tennis balls. The rate at which our brains can discriminate between different visual signals is known as temporal resolution, and influences the speed at which we are able to respond to changes in our environment. Previous studies have suggested that animals with high visual temporal resolution tend to be species with fast-paced lives, such as predators. Human research has also suggested that this trait tends to decrease as we get older, and dips temporarily after intense exercise. However, it was not clear how much it varies between people of similar ages. One way of measuring this trait is to identify the point at which someone stops perceiving a flickering light to flicker, and sees it as a constant or still light instead. Clinton Haarlem, a PhD candidate at Trinity College Dublin, and his colleagues tested this in 80 men and women between the ages of 18 and 35, and found wide variability in the threshold at which this happened. The research, published in Plos One, found that some people reported a light source as constant when it was in fact flashing about 35 times a second, while others could still detect flashes at rates of greater than 60 times a second. © 2024 Guardian News & Media Limited

Keyword: Vision
Link ID: 29233 - Posted: 04.02.2024

By James Gaines A lethal, incurable malady similar to mad cow disease is sweeping across deer species in North America and starting to spread around the world. First identified in a single herd of captive mule deer in Colorado in 1967, chronic wasting disease — CWD — has now been found in captive and wild mule deer, white-tailed deer, elk, moose and reindeer. It’s been found in 32 states and has crossed international boundaries into Canada, South Korea and Norway, among other countries. The disease — caused by a rogue protein known as a prion — has not yet been shown to infect humans, though fears remain. But even if that never happens, CWD could kill off large numbers of deer and possibly wipe out individual populations. Wildlife management agencies may, in turn, introduce stricter hunting rules, and the fear of contaminated meat could scare away potential hunters, affecting the United States’ roughly $23 billion deer hunting industry. Since CWD’s emergence, scientists have been working to understand the disease and how it might be brought under control. Over the years, three potential mitigation strategies have emerged, but each has significant challenges. Nicholas Haley, a veterinary microbiologist at Midwestern University in Arizona, coauthored an overview of chronic wasting disease in the 2015 Annual Review of Animal Biosciences and has been working on the problem ever since. Knowable Magazine spoke with Haley about the options and whether we can ever contain the disease. What’s a prion disease? CWD isn’t caused by a bacterium or virus, but by a naturally occurring protein in our cells twisting out of shape.

Keyword: Prions
Link ID: 29232 - Posted: 04.02.2024

By Markham Heid The human hand is a marvel of nature. No other creature on Earth, not even our closest primate relatives, has hands structured quite like ours, capable of such precise grasping and manipulation. But we’re doing less intricate hands-on work than we used to. A lot of modern life involves simple movements, such as tapping screens and pushing buttons, and some experts believe our shift away from more complex hand activities could have consequences for how we think and feel. “When you look at the brain’s real estate — how it’s divided up, and where its resources are invested — a huge portion of it is devoted to movement, and especially to voluntary movement of the hands,” said Kelly Lambert, a professor of behavioral neuroscience at the University of Richmond in Virginia. Dr. Lambert, who studies effort-based rewards, said that she is interested in “the connection between the effort we put into something and the reward we get from it” and that she believes working with our hands might be uniquely gratifying. In some of her research on animals, Dr. Lambert and her colleagues found that rats that used their paws to dig up food had healthier stress hormone profiles and were better at problem solving compared with rats that were given food without having to dig. She sees some similarities in studies on people, which have found that a whole range of hands-on activities — such as knitting, gardening and coloring — are associated with cognitive and emotional benefits, including improvements in memory and attention, as well as reductions in anxiety and depression symptoms. These studies haven’t determined that hand involvement, specifically, deserves the credit. The researchers who looked at coloring, for example, speculated that it might promote mindfulness, which could be beneficial for mental health. Those who have studied knitting said something similar. 
“The rhythm and repetition of knitting a familiar or established pattern was calming, like meditation,” said Catherine Backman, a professor emeritus of occupational therapy at the University of British Columbia in Canada who has examined the link between knitting and well-being. © 2024 The New York Times Company

Keyword: Learning & Memory; Stress
Link ID: 29231 - Posted: 04.02.2024

By Erin Blakemore More than three-quarters of sudden infant deaths involved multiple unsafe sleep practices, including co-sleeping, a recent analysis suggests. A study published in the journal Pediatrics looked at 7,595 sudden infant death cases in a Centers for Disease Control and Prevention registry between 2011 and 2020. The majority of deaths occurred in babies less than 3 months old. The statistics revealed that 59.5 percent of the infants who died suddenly were sharing a sleep surface at the time of death, and 75.9 percent were in an adult bed when they died. Though some demographic factors such as sex and length of gestation were not clinically significant, the researchers found that the babies sharing a sleep surface were more likely to be Black and publicly insured than those who didn’t share sleep surfaces. Soft bedding was common among all the infants who died, and 76 percent of the cases involved multiple unsafe practices. The analysis mirrors known risk factors for sudden infant death. Current recommendations direct parents and other caretakers to provide infants with firm, flat, level sleep surfaces that contain nothing but a fitted sheet. Though room sharing reduces the risk of sudden infant death, CDC officials discourage parents from sharing a sleep surface with their child. Exposure to cigarette smoke during pregnancy was more common among infants who shared surfaces when they died. Though most infants were supervised by an adult when they died, the supervisor was more likely to be impaired by drug and alcohol use among those who shared a sleeping surface.

Keyword: Sleep
Link ID: 29230 - Posted: 04.02.2024

By Marta Zaraska The renowned Polish piano duo Marek and Wacek didn’t use sheet music when playing live concerts. And yet onstage the pair appeared perfectly in sync. On adjacent pianos, they playfully picked up various musical themes, blended classical music with jazz and improvised in real time. “We went with the flow,” said Marek Tomaszewski, who performed with Wacek Kisielewski until Wacek’s death in 1986. “It was pure fun.” The pianists seemed to read each other’s minds by exchanging looks. It was, Marek said, as if they were on the same wavelength. A growing body of research suggests that might have been literally true. Dozens of recent experiments studying the brain activity of people performing and working together — duetting pianists, card players, teachers and students, jigsaw puzzlers and others — show that their brain waves can align in a phenomenon known as interpersonal neural synchronization, also known as interbrain synchrony. “There’s now a lot of research that shows that people interacting together display coordinated neural activities,” said Giacomo Novembre, a cognitive neuroscientist at the Italian Institute of Technology in Rome, who published a key paper on interpersonal neural synchronization last summer. The studies have come out at an increasing clip over the past few years — one as recently as last week — as new tools and improved techniques have honed the science and theory. They’re finding that synchrony between brains has benefits. It’s linked to better problem-solving, learning and cooperation, and even with behaviors that help others at a personal cost. What’s more, recent studies in which brains were stimulated with an electric current hint that synchrony itself might cause the improved performance observed by scientists. © 2024 the Simons Foundation.

Keyword: Attention
Link ID: 29229 - Posted: 03.30.2024

By Jake Buehler Much like squirrels, black-capped chickadees hide their food, keeping track of many thousands of little treasures wedged into cracks or holes in tree bark. When a bird returns to one of their many food caches, a particular set of nerve cells in the memory center of their brains gives a brief flash of activity. When the chickadee goes to another stash, a different combination of neurons lights up. These neural combinations act like bar codes, and identifying them may give key insights into how episodic memories — accounts of specific past events, like what you did on your birthday last year or where you’ve left your wallet — are encoded and recalled in the brain, researchers report March 29 in Cell. This kind of memory is challenging to study in animals, says Selmaan Chettih, a neuroscientist at Columbia University. “You can’t just ask a mouse what memories it formed today.” But chickadees’ very precise behavior provides a golden opportunity for researchers. Every time a chickadee makes a cache, it represents a single, well-defined moment logged in the hippocampus, a structure in the vertebrate brain vital for memory. To study the birds’ episodic memory, Chettih and his colleagues built a special arena made of 128 small, artificial storage sites. The team inserted small probes into five chickadees’ brains to track the electrical activity of individual neurons, comparing that activity with detailed recordings of the birds’ body positions and behaviors. © Society for Science & the Public 2000–2024.

Keyword: Learning & Memory
Link ID: 29228 - Posted: 03.30.2024

By Saugat Bolakhe For desert ants, Earth’s magnetic field isn’t just a compass: It may also sculpt their brains. Stepping outside their nest for the first time, young ants need to learn how to forage. The ants train partly by walking a loop near their nests for the first three days. During this stroll, they repeatedly pause and then pirouette to gaze back at the nest entrance, learning how to find their way back home. But when the magnetic field around the nest entrance was disturbed, ant apprentices couldn’t figure out where to look, often gazing in random directions, researchers report in the Feb. 20 Proceedings of the National Academy of Sciences. What’s more, the altered magnetic field seemed to affect connections between neurons in the learning and memory centers in the young ants’ brains. The finding “may make it easier to better understand how magnetic fields are sensed [in animals]” as scientists now know one way that magnetic fields can influence brain development, says Robin Grob, a biologist at the Norwegian University of Science and Technology in Trondheim. For years, scientists have known that some species of birds, fishes, turtles, moths and butterflies rely on Earth’s magnetic field to navigate (SN: 4/3/18). In 2018, Grob and other scientists added desert ants to that list. Young ants first appeared to use the magnetic field as a reference while learning how to use landmarks and the sun as guides to orient themselves in the right direction to gaze back toward the nest with its small, hard-to-see entrance. However, knowing where in the brain magnetic cues are processed has proved challenging. © Society for Science & the Public 2000–2024.

Keyword: Animal Migration; Development of the Brain
Link ID: 29227 - Posted: 03.30.2024

By Angie Voyles Askham For Christopher Zimmerman, it was oysters: After a bout of nausea on a beach vacation, he could hardly touch the mollusks for months. For others, that gut-lurching trigger is white chocolate, margaritas or spicy cinnamon candy. Whatever the taste, most people know the feeling of not being able to stomach a food after it has caused—or seemed to cause—illness. That response helps us learn which foods are safe, making it essential for survival. But how the brain links an unpleasant gastric event to food consumed hours prior has long posed a mystery, says Zimmerman, who is a postdoctoral fellow in Ilana Witten’s lab at Princeton University. The time scale for this sort of conditioned food aversion is an order of magnitude different from other types of learning, which involve delays of only a few seconds, says Peter Dayan, director of computational neuroscience at the Max Planck Institute for Biological Cybernetics, who was not involved in the work. “You need to have something that bridges that gap in time” between eating and feeling ill, he says. A newly identified neuronal circuit can do just that. Neurons in the mouse brainstem that respond to drug-induced nausea reactivate a specific subset of cells in the animals’ central amygdala that encode information about a recently tasted food. And that reactivation happens with novel—but not familiar—flavors, according to work that Zimmerman presented at the annual COSYNE meeting in Lisbon last month. With new flavors, animals seem primed to recall a recent meal if they get sick, Zimmerman says. As he put it in his talk, “it suggests that the common phrase we associate with unexpected nausea, that ‘it must be something I ate,’ is literally built into the brain in the form of this evolutionarily hard-wired prior.” © 2024 Simons Foundation

Keyword: Learning & Memory; Evolution
Link ID: 29226 - Posted: 03.30.2024

By Catherine Offord Bone marrow transplants between mice can transmit symptoms and pathology associated with Alzheimer’s disease, according to a controversial study published today in Stem Cell Reports. Its authors found that healthy mice injected with marrow from a mouse strain carrying an extremely rare, Alzheimer’s-linked genetic mutation later developed cognitive problems and abnormal clumping of proteins in the brain. In claims that other scientists in the field have criticized as overstated, the team says its findings demonstrate “Alzheimer’s disease transmission” and support screening of human bone marrow, organ, and blood donors for mutations related to neurodegeneration. “The findings are not by any means conclusive,” says Lary Walker, a neuroscientist at Emory University. Although the team’s approach offers an interesting way to study potential causes of neurodegeneration, he says, “the mice do not have Alzheimer’s disease,” only certain symptoms that mimic those of the disorder and require further study. He and other scientists stress that the new findings should not deter people who medically need bone marrow or other transplants. Alzheimer’s is partly characterized by so-called plaques of beta amyloid, a fragment of a larger protein called APP, around cells in the brain. Although there are rare, early-onset versions of the disease driven by specific mutations in the gene coding for APP or related proteins, most cases arise in people over age 65 and don’t have a single known cause. Some research hints that in very unusual scenarios, Alzheimer’s could be transmitted via human tissue or medical equipment contaminated with disease-causing proteins. Earlier this year, for example, U.K. scientists described dementia and beta amyloid buildup in several people who had received injections of growth hormone from the brains of deceased donors. (The procedure was once a medical treatment for certain childhood disorders but was abandoned in the 1980s.)

Keyword: Alzheimers; Hormones & Behavior
Link ID: 29225 - Posted: 03.30.2024

Andrew Gregory Health editor Previous evidence has suggested a link between high body mass index (BMI) in adolescence and an increased risk of multiple sclerosis (MS). But most of these studies were retrospective in design and used self-reported data. Researchers involved with the new study sought to prospectively evaluate the risk of developing MS in a large cohort of obese children compared with the general population. Academics analysed data from the Swedish Childhood Obesity Treatment Register. The database, known as Boris, is one of the world’s largest registries for treatment of childhood obesity. The research team looked at data on children aged two to 19 who joined the registry between 1995 and 2020, and compared their information with that of children in the general population. The study included data on more than 21,600 children with obesity, who started treatment for obesity when they were an average age of 11, and more than 100,000 children without obesity. Children involved in the study were tracked for an average of six years. During the follow-up period, MS was diagnosed in 28 of those with obesity (0.13% of the group) and 58 in the group without obesity (0.06%). © 2024 Guardian News & Media Limited

Keyword: Multiple Sclerosis; Obesity
Link ID: 29224 - Posted: 03.30.2024

By Max Kozlov Neurons (shown in a coloured scanning electron micrograph accompanying the original article; credit: Ted Kinsman/Science Photo Library) mend broken DNA during memory formation. When a long-term memory forms, some brain cells experience a rush of electrical activity so strong that it snaps their DNA. Then, an inflammatory response kicks in, repairing this damage and helping to cement the memory, a study in mice shows. The findings, published on 27 March in Nature, are “extremely exciting”, says Li-Huei Tsai, a neurobiologist at the Massachusetts Institute of Technology in Cambridge who was not involved in the work. They contribute to the picture that forming memories is a “risky business”, she says. Normally, breaks in both strands of the double helix DNA molecule are associated with diseases including cancer. But in this case, the DNA damage-and-repair cycle offers one explanation for how memories might form and last. It also suggests a tantalizing possibility: this cycle might be faulty in people with neurodegenerative diseases such as Alzheimer’s, causing a build-up of errors in a neuron’s DNA, says study co-author Jelena Radulovic, a neuroscientist at the Albert Einstein College of Medicine in New York City. This isn’t the first time that DNA damage has been associated with memory. In 2021, Tsai and her colleagues showed that double-stranded DNA breaks are widespread in the brain, and linked them with learning. To better understand the part these DNA breaks play in memory formation, Radulovic and her colleagues trained mice to associate a small electrical shock with a new environment, so that when the animals were once again put into that environment, they would ‘remember’ the experience and show signs of fear, such as freezing in place. Then the researchers examined gene activity in neurons in a brain area key to memory — the hippocampus. They found that some genes responsible for inflammation were active in a set of neurons four days after training. 
Three weeks after training, the same genes were much less active. © 2024 Springer Nature Limited

Keyword: Learning & Memory; Genes & Behavior
Link ID: 29223 - Posted: 03.28.2024

By Ingrid Wickelgren You see a woman on the street who looks familiar—but you can’t remember how you know her. Your brain cannot attach any previous experiences to this person. Hours later, you suddenly recall the party at a friend’s house where you met her, and you realize who she is. In a new study in mice, researchers have discovered the place in the brain that is responsible for both types of familiarity—vague recognition and complete recollection. Both, moreover, are represented by two distinct neural codes. The findings, which appeared on February 20 in Neuron, showcase the use of advanced computer algorithms to understand how the brain encodes concepts such as social novelty and individual identity, says study co-author Steven Siegelbaum, a neuroscientist at the Mortimer B. Zuckerman Mind Brain Behavior Institute at Columbia University. The brain’s signature for strangers turns out to be simpler than the one used for old friends—which makes sense, Siegelbaum says, given the vastly different memory requirements for the two relationships. “Where you were, what you were doing, when you were doing it, who else [was there]—the memory of a familiar individual is a much richer memory,” Siegelbaum says. “If you’re meeting a stranger, there’s nothing to recollect.” The action occurs in a small sliver of a brain region called the hippocampus, known for its importance in forming memories. The sliver in question, known as CA2, seems to specialize in a certain kind of memory used to recall relationships. “[The new work] really emphasizes the importance of this brain area to social processing,” at least in mice, says Serena Dudek, a neuroscientist at the National Institute of Environmental Health Sciences, who was not involved in the study. © 2024 Scientific American

Keyword: Attention; Learning & Memory
Link ID: 29222 - Posted: 03.28.2024

By Dennis Normile By the time a person shows symptoms of Parkinson’s disease, neurons in a part of their brain key to movement have already quietly died. To learn how this process unfolds, identify warning signs, and test treatments, researchers have long wanted an animal model of the disease’s early stages. Now, they may have one: a cohort of transgenic marmosets, described at a conference on nonhuman primate models in Hong Kong last month. The animals, which neuroscientist Hideyuki Okano of Keio University and colleagues created using a mutated protein that seems to drive Parkinson’s in some people, closely mimic the disease’s onset and progression. And they have enabled Okano’s team to identify what could be an early, predictive sign of disease in brain imaging. The model could be “transformative” for Parkinson’s studies, says neurobiologist Peter Strick of the University of Pittsburgh, who attended the meeting, organized by the Hong Kong University of Science and Technology, Stanford University, and the University of California San Francisco. “We desperately need nonhuman primate models that recapitulate the natural onset and progression” of conditions like Parkinson’s, he says. Parkinson’s, which afflicts an estimated 8.5 million people, is thought to be triggered by a combination of genetic and environmental factors, such as exposure to toxic chemicals. It sets in as neurons that produce the chemical messenger dopamine in the substantia nigra, an area of the brain that controls movement, die off. Early symptoms include tremors, muscle stiffness, and hesitant motions. The disease can later affect cognition and lead to dementia. Researchers think one cause of neuronal death may be abnormal versions of a protein called alpha-synuclein that misfold and form toxic clumps in the brain years before symptoms emerge. © 2024 American Association for the Advancement of Science.

Keyword: Parkinsons; Genes & Behavior
Link ID: 29221 - Posted: 03.28.2024

By Charles Digges My default mode for writing term papers during my student days was the all-night slog, and I recall the giddy, slap-happy feeling that would steal over me as the sun rose. There was a quality of alert focus that came with it, as well as a gregariousness that would fuel bonding sessions with my other all-night companions. After we’d turned in the products of our midnight oil to our professors, we would all head out for pancakes. Then I’d go home and sleep the magic off. For years, I’d wondered if there was any basis for this temporary euphoria that I—though certainly not all my classmates—experienced after those sleepless nights. That I should feel so expansive and goofy after skipping sleep while many of them turned into drowsy grouches seemed to defy logic. Going without sleep isn’t supposed to be a good thing, especially for folks who experience depression, as I have. But it turns out this paradox has been the subject of inquiry for at least two centuries. In 1818, University of Leipzig psychiatrist Johann Christian August Heinroth was reportedly the first to suggest that partial or total sleep deprivation could be temporarily effective against “melancholia,” as depression was called in those days. He found this to be true only in a certain subset of patients—around 60 percent. More than a hundred years later, in the 1970s, evidence emerged that a “resynchronization” of disturbed circadian rhythms could be responsible for the improved moods of depressed patients after a night without sleep. And more recently, researchers have found that a neurotransmitter involved in reward known as dopamine may play a role in this effect, as may neuroplasticity—the nervous system’s ability to rearrange itself in response to stimuli. But the precise neural mechanisms responsible have remained unclear. © 2024 NautilusNext Inc.

Keyword: Sleep; Depression
Link ID: 29220 - Posted: 03.28.2024

By Ian Sample, Science editor Two nights of broken sleep are enough to make people feel years older, according to researchers, who said consistent, restful slumber was a key factor in helping to stave off feeling one’s true age. Psychologists in Sweden found that, on average, volunteers felt more than four years older when they were restricted to only four hours of sleep for two consecutive nights, with some claiming the sleepiness made them feel decades older. The opposite was seen when people were allowed to stay in bed for nine hours, though the effect was more modest, with participants in the study claiming to feel on average three months younger than their real age after ample rest. “Sleep has a major impact on how old you feel and it’s not only your long-term sleep patterns,” said Dr Leonie Balter, a psychoneuroimmunologist at the Karolinska Institute in Stockholm and first author on the study. “Even when you only sleep less for two nights that has a real impact on how you feel.” Beyond simply feeling more decrepit, the perception of being many years older may affect people’s health, Balter said, by encouraging unhealthy eating, reducing physical exercise, and making people less willing to socialise and engage in new experiences. The researchers ran two studies. In the first, 429 people aged 18 to 70 answered questions about how old they felt and on how many nights, if any, they had slept badly in the past month. Their sleepiness was also rated according to a standard scale used in psychology research. For each day of poor sleep the volunteers felt on average three months older, the scientists found, while those who reported no bad nights in the preceding month felt on average nearly six years younger than their true age. It was unclear, however, whether bad sleep made people feel older or vice versa. © 2024 Guardian News & Media Limited

Keyword: Sleep
Link ID: 29219 - Posted: 03.28.2024

By Robert D. Hershey Jr. Daniel Kahneman, who never took an economics course but who pioneered a psychologically based branch of that field that led to a Nobel in economic science in 2002, died on Wednesday. He was 90. His death was confirmed by his partner, Barbara Tversky. She declined to say where he died. Professor Kahneman, who was long associated with Princeton University and lived in Manhattan, employed his training as a psychologist to advance what came to be called behavioral economics. The work, done largely in the 1970s, led to a rethinking of issues as far-flung as medical malpractice, international political negotiations and the evaluation of baseball talent, all of which he analyzed, mostly in collaboration with Amos Tversky, a Stanford cognitive psychologist who did groundbreaking work on human judgment and decision-making. (Ms. Tversky, also a professor of psychology at Stanford, had been married to Professor Tversky, who died in 1996. She and Professor Kahneman became partners several years ago.) As opposed to traditional economics, which assumes that human beings generally act in fully rational ways and that any exceptions tend to disappear as the stakes are raised, the behavioral school is based on exposing hard-wired mental biases that can warp judgment, often with counterintuitive results. “His central message could not be more important,” the Harvard psychologist and author Steven Pinker told The Guardian in 2014, “namely, that human reason left to its own devices is apt to engage in a number of fallacies and systematic errors, so if we want to make better decisions in our personal lives and as a society, we ought to be aware of these biases and seek workarounds. That’s a powerful and important discovery.” © 2024 The New York Times Company

Keyword: Attention
Link ID: 29218 - Posted: 03.28.2024

By Jyoti Madhusoodanan When the Philadelphia-based company Bioquark announced a plan in 2016 to regenerate neurons in brain-dead people, their proposal elicited skepticism and backlash. Researchers questioned the scientific merits of the planned study, which sought to inject stem cells and other materials into recently deceased subjects. Ethicists said it bordered on quackery and would exploit grieving families. Bioquark has since folded. But quietly, a physician who was involved in the controversial proposal, Himanshu Bansal, has continued the research. Bansal recently told Undark that he has been conducting self-funded work with his research team at a private hospital in Rudrapur, India, experimenting mostly with young adults who have succumbed to traffic accidents. He said he has data for 20 subjects for the first phase of the study and 11 for the second — some of whom showed glimmers of renewed electrical activity — and he plans to expand the study to include several more. Bansal said he has submitted his results to peer-reviewed journals over the past several years but has yet to find one that would publish them. Bansal may be among the more controversial figures conducting research with people who have been declared brain dead, but not by any stretch is he the only one. In recent years, high-profile experiments implanting non-human organs into human bodies, a procedure known as xenotransplantation, have fueled rising interest in using brain-dead subjects to study procedures that are too risky to perform on living people. With the support of a ventilator and other equipment, a person’s heart, kidneys, immune system, and other body parts can function for days, sometimes weeks or more, after brain death.
For researchers who seek to understand drug delivery, organ transplantation, and other complexities of human physiology, these bodies can provide a more faithful simulacrum of a living human being than could be achieved with animals or lab-grown cells and tissues.

Keyword: Consciousness
Link ID: 29217 - Posted: 03.26.2024