Most Recent Links




The membranes surrounding our brains are in a never-ending battle against deadly infections, as germs constantly try to elude watchful immune cells and sneak past a special protective barrier called the meninges. In a study involving mice and human autopsy tissue, researchers at the National Institutes of Health and Cambridge University have shown that some of these immune cells are trained to fight these infections by first spending time in the gut. “This finding opens a new area of neuroimmunology, showing that gut-educated antibody-producing cells inhabit and defend regions that surround the central nervous system,” said Dorian McGavern, Ph.D., senior investigator at the National Institute of Neurological Disorders and Stroke (NINDS) and co-senior author of the study, which was published in Nature. The central nervous system (CNS) is protected from pathogens both by a three-membrane barrier called the meninges and by immune cells within those membranes. The CNS is also walled off from the rest of the body by specialized blood vessels that are tightly sealed by the blood-brain barrier. This is not the case, however, in the dura mater, the outermost layer of the meninges. Blood vessels in this compartment are not sealed, and large venous structures, referred to as the sinuses, carry slow-moving blood back to the heart. The combination of slow blood flow and proximity to the brain requires strong immune protection to stop potential infections in their tracks. “The immune system has invested heavily in the dura mater,” said Dr. McGavern. “The venous sinuses within the dura act like drainage bins, and, consequently, are a place where pathogens can accumulate and potentially enter the brain. It makes sense that the immune system would set up camp in this vulnerable area.”

Keyword: Neuroimmunology
Link ID: 27569 - Posted: 11.07.2020

By Gretchen Reynolds Roiled by concerns about the pandemic and politics? Lifting weights might help, according to a timely new study of anxiety and resistance training. The study, which involved healthy young adults, barbells and lunges, indicates that regular weight training substantially reduces anxiety, a finding with particular relevance during these unsettling, bumpy days. We already have plenty of evidence that exercise helps stave off depression and other mental ills, and that exercise can elevate feelings of happiness and contentment. But most past studies of exercise and moods have looked at the effects of aerobic exercise, like running on a treadmill or riding a stationary bike. Scientists have only recently begun to investigate whether and how weight training might also affect mental health. A 2018 review of studies, for instance, concluded that adults who lift weights are less likely to develop depression than those who never lift. In another study, women with clinical anxiety disorders reported fewer symptoms after taking up either aerobic or weight training. But many of these studies involved frequent and complicated sessions of resistance exercise performed under the eyes of researchers, which is not how most of us are likely to work out. They also often focused on somewhat narrow groups, such as men or women with a diagnosed mental health condition like depression or an anxiety disorder, limiting their applicability. So for the new study, which was published in October in Scientific Reports, researchers at the University of Limerick in Ireland and other institutions decided to see if a simple version of weight training could have benefits for mood in people who already were in generally good mental health.

Keyword: Stress
Link ID: 27568 - Posted: 11.07.2020

Elena Renken More than a century ago, the zoologist Richard Semon coined the term “engram” to designate the physical trace a memory must leave in the brain, like a footprint. Since then, neuroscientists have made progress in their hunt for exactly how our brains form memories. They have learned that specific brain cells activate as we form a memory and reactivate as we remember it, strengthening the connections among the neurons involved. That change ingrains the memory and lets us keep memories we recall more often, while others fade. But the precise physical alterations within our neurons that bring about these changes have been hard to pin down — until now. In a study published last month, researchers at the Massachusetts Institute of Technology tracked an important part of the memory-making process at the molecular scale in engram cells’ chromosomes. Neuroscientists already knew that memory formation is not instantaneous, and that the act of remembering is crucial to locking a memory into the brain. These researchers have now discovered some of the physical embodiment of that mechanism. The MIT group worked with mice that had a fluorescent marker spliced into their genome to make their cells glow whenever they expressed the gene Arc, which is associated with memory formation. The scientists placed these mice in a novel location and trained them to fear a specific noise, then returned them to this location several days later to reactivate the memory. In the brain area called the hippocampus, the engram cells that formed and recalled this memory lit up with color, which made it easy to sort them out from other brain cells under the microscope during a postmortem examination.

Keyword: Learning & Memory; Stress
Link ID: 27567 - Posted: 11.04.2020

By Veronique Greenwood Some 230 million years ago, in the forests of what humans would eventually call Brazil, a small bipedal dinosaur zipped after its prey. It had a slender head, a long tail and sharp teeth, and it was about the size of a basset hound. Buriolestes schultzi, as paleontologists have named the creature, is one of the earliest known relatives of more famous dinosaurs that emerged 100 million years later: the lumbering brachiosaurus, up to 80 feet long and weighing up to 80 metric tons, the likewise massive diplodocus, as well as other sauropod dinosaurs. By the time the Jurassic period rolled around and the time of Buriolestes had passed, these quadrupedal cousins had reached tremendous size. They also had tiny brains around the size of a tennis ball. Buriolestes’s brain was markedly different, scientists who built a 3-D reconstruction of the inside of its skull report in a paper published Tuesday in the Journal of Anatomy. The brain was larger relative to its body size, and it had structures that were much more like those of predatory animals. The findings suggest that the enormous herbivores of later eras, whose ancestors probably looked a lot like Buriolestes, lost these features as they transitioned to their ponderous new lifestyle. It’s also a rare glimpse into dinosaurs’ neural anatomy at a very early moment in their evolution. In 2009, Rodrigo Müller of the Universidade Federal de Santa Maria and colleagues discovered the first partial Buriolestes fossil in southern Brazil. In 2015, they uncovered another Buriolestes nearby — and this time, to their excitement, the dinosaur’s skull was nearly all there. They used computed tomography scanning to get a peek inside, drawing inferences about the brain from the contours of the cavity left behind. They found that one portion of the cerebellum, the floccular lobe, was particularly large in Buriolestes.

Keyword: Evolution
Link ID: 27566 - Posted: 11.04.2020

Amber Dance Gerald Maguire has stuttered since childhood, but you might not guess it from talking to him. For the past 25 years, he has been treating his disorder with antipsychotic medications not officially approved for the condition. Only with careful attention might you discern his occasional stumble on multisyllabic words like "statistically" and "pharmaceutical." Maguire has plenty of company: More than 70 million people worldwide, including about 3 million Americans, stutter — they have difficulty with the starting and timing of speech, resulting in halting and repetition. That number includes approximately 5 percent of children (many of whom outgrow the condition) and 1 percent of adults. Their ranks include presidential candidate Joe Biden, deep-voiced actor James Earl Jones, and actress Emily Blunt. Though they and many others, including Maguire, have achieved career success, stuttering can contribute to social anxiety and draw ridicule or discrimination. Maguire, a psychiatrist at the University of California, Riverside, has been treating people who stutter, and researching potential treatments, for decades. He's now embarking on a clinical trial of a new medication, ecopipam, that streamlined speech and improved quality of life in a small pilot study in 2019. Others, meanwhile, are delving into the root causes of stuttering. In past decades, therapists mistakenly attributed stuttering to defects of the tongue and voice box, to anxiety, trauma, or even poor parenting — and some still do. Yet others have long suspected that neurological problems might underlie stuttering, says J. Scott Yaruss, a speech-language pathologist at Michigan State University. The first data to back up that hunch came in 1991, when researchers reported altered blood flow in the brains of people who stuttered. Since then, research has made it more apparent that stuttering is all in the brain. "We are in the middle of an absolute explosion of knowledge being developed about stuttering," Yaruss says.

Keyword: Language
Link ID: 27565 - Posted: 11.04.2020

By Carolyn Wilke Fish fins aren’t just for swimming. They’re feelers, too. The fins of round gobies can detect textures with a sensitivity similar to that of the pads on monkeys’ fingers, researchers report November 3 in the Journal of Experimental Biology. Compared with what we know about land animals, little is known about aquatic animals’ sense of touch. And for fish, “we used to only think of fins as motor structures,” says Adam Hardy, a neuroscientist at the University of Chicago. “But it’s really becoming increasingly clear that fins play important sensory roles.” Studying those sensory roles can hint at ways to mimic nature for robotics and provide a window into the evolution of touch. The newfound parallels between primates and fish suggest that limbs that sense physical forces emerged early, before splits in the vertebrate evolutionary tree led to animals with fins, arms and legs, says Melina Hale, a neurobiologist and biomechanist also at the University of Chicago. “These capabilities arose incredibly early and maybe set the stage for what we can do with our hands now and what fish can do with their fins in terms of touch.” Hardy and Hale measured the activity of nerves in the fins of bottom-dwelling round gobies (Neogobius melanostomus) to get a sense of what fish learn about texture from their fins. In the wild, round gobies brush against the bottom surface and rest there on their large pectoral fins. “They’re really well suited to testing these sorts of questions,” Hardy says. Working with fins from six euthanized gobies, the researchers recorded electrical spikes from the nerves as a bumpy plastic ring attached to a motor rolled lightly over each fin. A salt solution keeps the nerves functioning as they would in a live fish, Hardy says.

Keyword: Pain & Touch; Evolution
Link ID: 27564 - Posted: 11.04.2020

By Nicholas Bakalar Some studies have found that older people who consistently engage in leisure activities are less likely to develop dementia than those who do not, suggesting that failure to participate in such pastimes could spur cognitive deterioration. A new study suggests another explanation: Failure to participate in leisure activities may be a consequence of dementia, not a cause. Researchers studied 8,280 people, average age 56, who were free of dementia at the start of the analysis. Over the next 18 years, the participants underwent periodic physical and psychological examinations, while researchers tracked their involvement in 13 leisure activities — listening to music, gardening, attending cultural events, playing cards, using a home computer and others. By the end of the project, 360 had developed dementia. The study, in Neurology, controlled for smoking, physical activity, education, coronary heart disease and other health and behavioral characteristics that are tied to dementia risk. The researchers found no association between engagement in leisure activities at age 56 and the incidence of dementia over the following 18 years, and concluded that actively pursuing leisure activities may not provide protection against developing dementia. “Dementia develops over a long period of time, so it’s possible that some changes happen before the diagnosis of dementia,” said the lead author, Andrew Sommerlad, a researcher at University College London. “Elderly people withdrawing from activities that they previously enjoyed may be developing early signs of dementia.”

Keyword: Alzheimers
Link ID: 27563 - Posted: 11.04.2020

Anil Ananthaswamy In the winter of 2011, Daniel Yamins, a postdoctoral researcher in computational neuroscience at the Massachusetts Institute of Technology, would at times toil past midnight on his machine vision project. He was painstakingly designing a system that could recognize objects in pictures, regardless of variations in size, position and other properties — something that humans do with ease. The system was a deep neural network, a type of computational device inspired by the neurological wiring of living brains. “I remember very distinctly the time when we found a neural network that actually solved the task,” he said. It was 2 a.m., a tad too early to wake up his adviser, James DiCarlo, or other colleagues, so an excited Yamins took a walk in the cold Cambridge air. “I was really pumped,” he said. It would have counted as a noteworthy accomplishment in artificial intelligence alone, one of many that would make neural networks the darlings of AI technology over the next few years. But that wasn’t the main goal for Yamins and his colleagues. To them and other neuroscientists, this was a pivotal moment in the development of computational models for brain functions. DiCarlo and Yamins, who now runs his own lab at Stanford University, are part of a coterie of neuroscientists using deep neural networks to make sense of the brain’s architecture. In particular, scientists have struggled to understand the reasons behind the specializations within the brain for various tasks. They have wondered not just why different parts of the brain do different things, but also why the differences can be so specific: Why, for example, does the brain have an area for recognizing objects in general but also one for faces in particular? Deep neural networks are showing that such specializations may be the most efficient way to solve problems.

Keyword: Learning & Memory
Link ID: 27562 - Posted: 10.31.2020

Jon Hamilton If you fall off a bike, you'll probably end up with a cinematic memory of the experience: the wind in your hair, the pebble on the road, then the pain. That's known as an episodic memory. And now researchers have identified cells in the human brain that make this sort of memory possible, a team reports in the journal Proceedings of the National Academy of Sciences. The cells are called time cells, and they place a sort of time stamp on memories as they are being formed. That allows us to recall sequences of events or experiences in the right order. "By having time cells create this indexing across time, you can put everything together in a way that makes sense," says Dr. Bradley Lega, the study's senior author and a neurosurgeon at the University of Texas Southwestern Medical Center in Dallas. Time cells were discovered in rodents decades ago. But the new study is critical because "the final arbitrator is always the human brain," says Dr. György Buzsáki, Biggs Professor of Neuroscience at New York University. Buzsáki is not an author of the study but did edit the manuscript. Lega and his team found the time cells by studying the brains of 27 people who were awaiting surgery for severe epilepsy. As part of their pre-surgical preparation, these patients had electrodes placed in the hippocampus and another area of the brain involved in navigation, memory and time perception. In the experiment, the patients studied sequences of 12 or 15 words that appeared on a laptop screen during a period of about 30 seconds. Then, after a break, they were asked to recall the words they had seen.

Keyword: Learning & Memory
Link ID: 27561 - Posted: 10.31.2020

By Jonathan Lambert Octopus arms have minds of their own. Each of these eight supple yet powerful limbs can explore the seafloor in search of prey, snatching crabs from hiding spots without direction from the octopus’ brain. But how each arm can tell what it’s grasping has remained a mystery. Now, researchers have identified specialized cells not seen in other animals that allow octopuses to “taste” with their arms. Embedded in the suckers, these cells enable the arms to do double duty of touch and taste by detecting chemicals produced by many aquatic creatures. This may help an arm quickly distinguish food from rocks or poisonous prey, Harvard University molecular biologist Nicholas Bellono and his colleagues report online October 29 in Cell. The findings provide another clue about the unique evolutionary path octopuses have taken toward intelligence. Instead of being concentrated in the brain, two-thirds of the nerve cells in an octopus are distributed among the arms, allowing the flexible appendages to operate semi-independently (SN: 4/16/15). “There was a huge gap in knowledge of how octopus [arms] actually collect information about their environment,” says Tamar Gutnick, a neurobiologist at the Hebrew University of Jerusalem who studies octopuses and was not involved in the study. “We’ve known that [octopuses] taste by touch, but knowing it and understanding how it’s actually working is a very different thing.” Working out the specifics of how arms sense and process information is crucial for understanding octopus intelligence, she says. “It’s really exciting to see someone taking a comprehensive look at the cell types involved,” and how they work.

Keyword: Chemical Senses (Smell & Taste); Evolution
Link ID: 27560 - Posted: 10.31.2020

By Lucy Hicks Ogre-faced spiders might be an arachnophobe’s worst nightmare. The enormous eyes that give them their name allow them to see 2000 times better than we can at night. And these creepy crawlers are lightning-fast predators, snatching prey in a fraction of a second with mini, mobile nets. Now, new research suggests these arachnids use their legs not only to scuttle around, but also to hear. In light of their excellent eyesight, this auditory skill “is a surprise,” says George Uetz, who studies the behavioral ecology of spiders at the University of Cincinnati and wasn’t involved in the new research. Spiders don’t have ears—generally a prerequisite for hearing. So, despite the vibration-sensing hairs and receptors on most arachnids’ legs, scientists long thought spiders couldn’t hear sound as it traveled through the air, but instead felt vibrations through surfaces. The first clue they might be wrong was a 2016 study that found that a species of jumping spider can sense vibrations in the air from sound waves. Enter the ogre-faced spider. Rather than build a web and wait for their prey, these fearsome hunters “take a much more active role,” says Jay Stafstrom, a sensory ecologist at Cornell University. The palm-size spiders hang upside down from small plants on a silk line and create a miniweb across their four front legs, which they use as a net to catch their next meal. The spiders either lunge at bugs wandering below or flip backward to ensnare flying insects in midair.

Keyword: Hearing; Evolution
Link ID: 27559 - Posted: 10.31.2020

By Laura Sanders Nearly 2,000 years ago, a cloud of scorching ash from Mount Vesuvius buried a young man as he lay on a wooden bed. That burning ash quickly cooled, turning some of his brain to glass. This confluence of events in A.D. 79 in the town of Herculaneum, which lay at the western base of the volcano, preserved the usually delicate neural tissue in a durable, glassy form. New scrutiny of this tissue has revealed signs of nerve cells with elaborate tendrils for sending and receiving messages, scientists report October 6 in PLOS ONE. That the young man once possessed these nerve cells, or neurons, is no surprise; human brains are packed with roughly 86 billion neurons (SN: 8/7/19). But samples from ancient brains are sparse. Those that do exist have become a soaplike substance or mummified, says Pier Paolo Petrone, a biologist and forensic anthropologist at the University of Naples Federico II in Italy. But while studying the Herculaneum site, Petrone noticed something dark and shiny inside this man’s skull. He realized that those glassy, black fragments “had to be the remains of the brain.” Petrone and colleagues used scanning electron microscopy to study glassy remains from both the man’s brain and spinal cord. The researchers saw tubular structures as well as cell bodies that were the right sizes and shapes to be neurons. In further analyses, the team found layers of tissue wrapped around tendrils in the brain tissue. This layering appears to be myelin, a fatty substance that speeds signals along nerve fibers. The preserved tissue was “something really astonishing and incredible,” Petrone says, because the conversion of objects to glass, a process called vitrification, is relatively rare in nature. “This is the first ever discovery of ancient human brain remains vitrified by hot ash during a volcanic eruption.”

Keyword: Development of the Brain
Link ID: 27558 - Posted: 10.31.2020

By Sam Roberts Chris Pendergast, a Long Island teacher who defied the odds by surviving 27 years with Lou Gehrig’s disease, leading marathon “rides for life” for hundreds of miles from his motorized wheelchair to publicize the plight of fellow patients and raise $10 million for research, died on Oct. 14 at his home in Miller Place, N.Y. He was 71. His wife, Christine Pendergast, said the cause was complications of amyotrophic lateral sclerosis, the medical term for the disease that ended the career of Gehrig, the Yankee first baseman who, after playing in 2,130 consecutive games, proclaimed himself “the luckiest man on the face of the earth.” Gehrig died two years later, shortly before his 38th birthday. Mr. Pendergast was a 44-year-old teacher of gifted students at Dickinson Avenue elementary school in East Northport, on Long Island, when his eyes and hands began twitching and he started getting muscle spasms. On Oct. 13, 1993, he received the diagnosis: He had A.L.S., a degenerative disease, which diminishes muscle function and eventually the ability to breathe. The prognosis: He had three to five years to live. But Mr. Pendergast proved to be indomitable. He recast himself as the disease’s self-described squeaky wheel — “Since there’s no surviving constituency for A.L.S., there’s no squeaky wheel,” he told The New York Times in 2008. He founded the A.L.S. Ride for Life in 1997. The following year it mounted a 350-mile, two-week cavalcade from Yankee Stadium in the Bronx to Washington, with Mr. Pendergast leading it from his wheelchair. Subsequent annual rides went from Long Island’s East End to Manhattan with a small group of fellow patients. “We are dying men riding for life,” he told The Baltimore Sun in 2000.

Keyword: ALS-Lou Gehrig's Disease
Link ID: 27557 - Posted: 10.31.2020

By Abby Goodnough PHILADELPHIA — Steven Kelty had been addicted to crack cocaine for 32 years when he tried a different kind of treatment last year, one so basic in concept that he was skeptical. He would come to a clinic twice a week to provide a urine sample, and if it was free of drugs, he would get to draw a slip of paper out of a fishbowl. Half contained encouraging messages — typically, “Good job!” — but the other half were vouchers for prizes worth between $1 and $100. “I’ve been to a lot of rehabs, and there were no incentives except for the idea of being clean after you finished,” said Mr. Kelty, 61, of Winfield, Pa. “Some of us need something to motivate us — even if it’s a small thing — to live a better life.” The treatment is called contingency management, because the rewards are contingent on staying abstinent. A number of clinical trials have found it highly effective in getting people addicted to stimulants like cocaine and methamphetamine to stay in treatment and to stop using the drugs. But outside the research arena and the Department of Veterans Affairs, where Mr. Kelty is a patient, it is nearly impossible to find programs that offer such treatment — even as overdose deaths involving meth, in particular, have soared. There were more than 16,500 such deaths last year, according to preliminary data, more than twice as many as in 2016. Early data suggests that overdoses have increased even more during the coronavirus pandemic, which has forced most treatment programs to move online. Researchers say that one of the biggest obstacles to contingency management is a moral objection to the idea of rewarding someone for staying off drugs. That is one reason publicly funded programs like Medicaid, which provides health coverage for the poor, do not cover the treatment. Some treatment providers are also wary of giving prizes that they say patients could sell or trade for drugs. Greg Delaney, a pastor and the outreach coordinator at Woodhaven, a residential treatment center in Ohio, said, “Until you’re at the point where you can say, ‘I can make a good decision with this $50,’ it’s counterproductive.”

Keyword: Drug Abuse; Learning & Memory
Link ID: 27556 - Posted: 10.28.2020

R. Douglas Fields As I opened my copy of Science at home one night, an unfamiliar word in the title of a new study caught my eye: dopaminylation. The term refers to the brain chemical dopamine’s ability, in addition to transmitting signals across synapses, to enter a cell’s nucleus and control specific genes. As I read the paper, I realized that it completely upends our understanding of genetics and drug addiction. The intense craving for addictive drugs like alcohol and cocaine may be caused by dopamine controlling genes that alter the brain circuitry underlying addiction. Intriguingly, the results also suggest an answer to why drugs that treat major depression must typically be taken for weeks before they’re effective. I was shocked by the dramatic discovery, but to really understand it, I first had to unlearn some things. “Half of what you learned in college is wrong,” my biology professor, David Lange, once said. “Problem is, we don’t know which half.” How right he was. I was taught to scoff at Jean-Baptiste Lamarck and his theory that traits acquired through life experience could be passed on to the next generation. The silly traditional example is the mama giraffe stretching her neck to reach food high in trees, resulting in baby giraffes with extra-long necks. Then biologists discovered we really can inherit traits our parents acquired in life, without any change to the DNA sequence of our genes. It’s all thanks to a process called epigenetics — a form of gene regulation that can be inherited but isn’t written into the genetic code itself. This is where it turns out that brain chemicals like dopamine play a role. All genetic information is encoded in the DNA sequence of our genes, and traits are passed on in the random swapping of genes between egg and sperm that sparks a new life. Genetic information and instructions are coded in a sequence of four different molecules (nucleotides abbreviated A, T, G and C) on the long double-helix strand of DNA. The linear code is quite lengthy (about 6 feet long per human cell), so it’s stored neatly wound around protein bobbins, similar to how magnetic tape is wound around spools in cassette tapes.

Keyword: Drug Abuse; Epigenetics
Link ID: 27555 - Posted: 10.28.2020

By Elizabeth Pennisi When Ian Ausprey outfitted dozens of birds with photosensor-containing backpacks, the University of Florida graduate student was hoping to learn how light affected their behavior. The unusual study, which tracked 15 species in Peru’s cloud forest, has now found that eye size can help predict where birds breed and feed—the bigger the eye, the smaller the prey or the darker the environment. The study also suggests birds with big eyes are especially at risk as humans convert forests into farmland. The study reveals a “fascinating new area of sensory biology,” says Richard Prum, an evolutionary biologist at Yale University who was not involved in the new work. It also shows the size of a bird’s eye says a lot about its owner, adds Matthew Walsh, an evolutionary ecologist at the University of Texas at Arlington, also not involved with the work. Light matters—not just for plants, but also for animals. Large eyes have long been associated with the need to see in dim conditions, but very little research has looked in depth at light’s impact on behavior. Recently, scientists have shown that the relative size of frogs’ eyes corresponds to where they live, hunt, and breed. And several studies published in the past 3 years suggest the eyes of killifish and water fleas vary in size depending on the presence of predators. With no predators, even slightly larger eyes offer a potential survival advantage. To find out how eye size might matter for birds, Ausprey and his adviser, Scott Robinson, an ecologist at the Florida Museum of Natural History, turned to the 240 species they had identified in one of Peru’s many cloud forests. The study area included a range of habitats—dense stands of trees, farms with fencerows, shrubby areas, and open ground. Because light can vary considerably by height—for example, in the tropics, the forest floor can have just 1% of the light at the tops of the trees—they included species living from the ground to the treetops.

Keyword: Vision; Evolution
Link ID: 27554 - Posted: 10.28.2020

By Lisa Sanders, M.D. The 61-year-old woman put on her reading glasses to try to decipher the tiny black squiggles on the back of the package of instant pudding. Was it two cups of milk? Or three? The glasses didn’t seem to help. The fuzzy, faded marks refused to become letters. The right side of her head throbbed — as it had for weeks. The constant aggravation of the headache made everything harder, and it certainly wasn’t helping her read this label. She rubbed her forehead, then brought her hand down to cover her right eye. The box disappeared into darkness. She could see only the upper-left corner of the instructions. Everything else was black. She quickly moved her hand to cover her left eye. The tiny letters sprang into focus. She moved back to the right: blackness. Over to the left: light and letters. That scared her. For the past few months, she’d had one of the worst headaches she had ever experienced in her lifetime of headaches. One that wouldn’t go away no matter how much ibuprofen she took. One that persisted through all the different medications she was given for her migraines. Was this terrible headache now affecting her vision? The neurologists she saw over the years always asked her about visual changes. She’d never had them, until now. “Should I take you to the hospital?” her husband asked anxiously when she told him about her nearly sightless left eye. “This could be serious.” She thought for a moment. No, tomorrow was Monday; her neurologist’s office would be open, and the doctor would see her right away. She was always reliable that way. The patient had bad headaches for most of her adult life. They were always on the right side. They were always throbbing. They could last for days, or weeks, or sometimes months. Loud noises were always bothersome. With really bad headaches, her eye would water and her nose would run, just on that side. Bending over was agony. For the past few weeks, her headache had been so severe that if she dropped something on the floor, she had to leave it there. When she bent down, the pounding was excruciating.

Keyword: Pain & Touch; Vision
Link ID: 27553 - Posted: 10.28.2020

Sara Reardon In Alysson Muotri’s laboratory, hundreds of miniature human brains, the size of sesame seeds, float in Petri dishes, sparking with electrical activity. These tiny structures, known as brain organoids, are grown from human stem cells and have become a familiar fixture in many labs that study the properties of the brain. Muotri, a neuroscientist at the University of California, San Diego (UCSD), has found some unusual ways to deploy his. He has connected organoids to walking robots, modified their genomes with Neanderthal genes, launched them into orbit aboard the International Space Station, and used them as models to develop more human-like artificial-intelligence systems. Like many scientists, Muotri has temporarily pivoted to studying COVID-19, using brain organoids to test how drugs perform against the SARS-CoV-2 coronavirus. But one experiment has drawn more scrutiny than the others. In August 2019, Muotri’s group published a paper in Cell Stem Cell reporting the creation of human brain organoids that produced coordinated waves of activity, resembling those seen in premature babies [1]. The waves continued for months before the team shut the experiment down. This type of brain-wide, coordinated electrical activity is one of the properties of a conscious brain. The team’s finding led ethicists and scientists to raise a host of moral and philosophical questions about whether organoids should be allowed to reach this level of advanced development, whether ‘conscious’ organoids might be entitled to special treatment and rights not afforded to other clumps of cells and the possibility that consciousness could be created from scratch. The idea of bodiless, self-aware brains was already on the minds of many neuroscientists and bioethicists. Just a few months earlier, a team at Yale University in New Haven, Connecticut, announced that it had at least partially restored life to the brains of pigs that had been killed hours earlier. By removing the brains from the pigs’ skulls and infusing them with a chemical cocktail, the researchers revived the neurons’ cellular functions and their ability to transmit electrical signals [2].

Keyword: Consciousness; Development of the Brain
Link ID: 27552 - Posted: 10.28.2020

By Nicholas Bakalar Long-term exposure to noise may be linked to an increased risk for Alzheimer’s disease and other forms of dementia. Researchers did periodic interviews with 5,227 people 65 and older participating in a study on aging. They assessed them with standard tests of orientation, memory and language, and tracked average daytime noise levels in their neighborhoods for the five years preceding the cognitive assessments. About 11 percent had Alzheimer’s disease, and 30 percent had mild cognitive impairment, which often progresses to full-blown dementia. Residential noise levels varied widely, from 51 to 78 decibels, or from the level of a relatively quiet suburban neighborhood to that of an urban setting near a busy highway. The study is in Alzheimer’s & Dementia. After controlling for education, race, smoking, alcohol consumption, neighborhood air pollution levels and other factors, the researchers found that each 10-decibel increase in community noise level was associated with a 36 percent higher likelihood of mild cognitive impairment, and a 29 percent increased risk for Alzheimer’s disease. The associations were strongest in poorer neighborhoods, which also had higher noise levels. The reasons for the connection are unknown, but the lead author, Jennifer Weuve, an associate professor of epidemiology at Boston University, suggested that excessive noise can cause sleep deprivation, hearing loss, increased heart rate, constriction of the blood vessels and elevated blood pressure, all of which are associated with an increased risk for dementia.

Keyword: Alzheimers; Hearing
Link ID: 27551 - Posted: 10.28.2020

By Jane E. Brody Do you have the heart to safely smoke pot? Maybe not, a growing body of medical reports suggests. Currently, increased smoking of marijuana in public, even in cities like New York where recreational use remains illegal (though no longer prosecuted), has reinforced a popular belief that this practice is safe, even health-promoting. “Many people think that they have a free pass to smoke marijuana,” Dr. Salomeh Keyhani, professor of medicine at the University of California, San Francisco, told me. “I even heard a suggestion on public radio that tobacco companies should switch to marijuana because then they’d be selling life instead of selling death.” But if you already are a regular user of recreational marijuana or about to become one, it would be wise to consider medical evidence that contradicts this view, especially for people with underlying cardiovascular diseases. Compared with tobacco, marijuana smoking causes a fivefold greater impairment of the blood’s oxygen-carrying capacity, Dr. Keyhani and colleagues reported. In a review of medical evidence, published in January in the Journal of the American College of Cardiology, researchers described a broad range of risks to the heart and blood vessels associated with the use of marijuana. The authors, led by Dr. Muthiah Vaduganathan, cardiologist at Brigham and Women’s Hospital in Boston, point out that “marijuana is becoming increasingly potent, and smoking marijuana carries many of the same cardiovascular health hazards as smoking tobacco.” Edible forms of marijuana have also been implicated as a possible cause of a heart attack, especially when high doses of the active ingredient THC are consumed.

Keyword: Drug Abuse
Link ID: 27550 - Posted: 10.26.2020