Most Recent Links




By Jane E. Brody Do you have the heart to safely smoke pot? Maybe not, a growing body of medical reports suggests. Increased smoking of marijuana in public, even in cities like New York where recreational use remains illegal (though no longer prosecuted), has reinforced a popular belief that the practice is safe, even health-promoting. “Many people think that they have a free pass to smoke marijuana,” Dr. Salomeh Keyhani, professor of medicine at the University of California, San Francisco, told me. “I even heard a suggestion on public radio that tobacco companies should switch to marijuana because then they’d be selling life instead of selling death.” But if you already are a regular user of recreational marijuana, or are about to become one, it would be wise to consider the medical evidence that contradicts this view, especially for people with underlying cardiovascular diseases. Compared with tobacco, marijuana smoking causes a fivefold greater impairment of the blood’s oxygen-carrying capacity, Dr. Keyhani and colleagues reported. In a review of medical evidence published in January in the Journal of the American College of Cardiology, researchers described a broad range of risks to the heart and blood vessels associated with the use of marijuana. The authors, led by Dr. Muthiah Vaduganathan, a cardiologist at Brigham and Women’s Hospital in Boston, point out that “marijuana is becoming increasingly potent, and smoking marijuana carries many of the same cardiovascular health hazards as smoking tobacco.” Edible forms of marijuana have also been implicated as a possible cause of heart attacks, especially when high doses of the active ingredient THC are consumed. © 2020 The New York Times Company

Keyword: Drug Abuse
Link ID: 27550 - Posted: 10.26.2020

By Scott Barry Kaufman Do you get excited and energized by the possibility of learning something new and complex? Do you get turned on by nuance? Do you get really stimulated by new ideas and imaginative scenarios? If so, you may have an influx of dopamine in your synapses, but not where we traditionally think of this neurotransmitter flowing. In general, the potential for growth from disorder has been encoded deeply into our DNA. We evolved not only the capacity to regulate our defensive and destructive impulses, but also the capacity to make sense of the unknown. Engaging in exploration allows us to integrate novel or unexpected events with existing knowledge and experiences, a process necessary for growth. Dopamine production is essential for growth. But there are many misconceptions about the role of dopamine in cognition and behavior. Dopamine is often labeled the “feel-good molecule,” but this is a gross mischaracterization of this neurotransmitter. As personality neuroscientist Colin DeYoung (a close colleague of mine) notes, dopamine is actually the “neuromodulator of exploration.” Dopamine’s primary role is to make us want things, not necessarily like things. We get the biggest rush of dopamine coursing through our brains at the possibility of reward, but this rush is no guarantee that we’ll actually like or even enjoy the thing once we get it. Dopamine is a huge energizing force in our lives, driving our motivation to explore and facilitating the cognitive and behavioral processes that allow us to extract the most delight from the unknown. If dopamine is not all about feeling good, then why does the feel-good myth persist in the public imagination? I think it’s because so much research on dopamine has focused on its role in motivating exploration toward our more primal “appetitive” rewards, such as chocolate, social attention, social status, sexual partners, gambling or drugs like cocaine. © 2020 Scientific American

Keyword: Attention; Drug Abuse
Link ID: 27549 - Posted: 10.26.2020

By Perri Klass, M.D. In a new report on pediatric pain in the British medical journal The Lancet, a commission of experts, including scientists, doctors, psychologists, parents and patients, challenged those who take care of children to end what they described as the common undertreatment of pain in children, starting at birth. Isabel Jordan, of Squamish, British Columbia, took part as a parent partner, along with her son Zachary, 19, who has a genetic condition and lives with chronic pain. “Pain matters with every child and at every intersection with the health care system,” she said. But for her son, “it didn’t matter with many providers, doctors, nurses, phlebotomists, and that made for worse outcomes.” “The professionals had a wealth of knowledge and experience, but what they lacked was the knowledge of what was really impacting patients in day-to-day life; they didn’t know how impactful poorly managed procedural pain was to patients,” especially children like her son who have ongoing medical issues, Ms. Jordan said. “He’s got a rare disease and has had a lifetime of chronic pain and also procedure pain.” Although we often pride ourselves, in pediatrics, on taking a kinder and gentler approach to our patients, pain experts feel that children’s pain is often taken for granted, and that simple and reliable strategies to mitigate it are disregarded, such as the 2015 World Health Organization recommendations that infants should be held by parents and perhaps breastfed during immunizations, and that distraction techniques should be used with older children. Christopher Eccleston, a professor of pain science and medical psychology at the University of Bath, where he directs the Centre for Pain Research, was the lead author on the report. He became interested in pediatric pain through working with adults with chronic pain, he said, and realizing that many of them had pain going back into adolescence, which had not been treated. © 2020 The New York Times Company

Keyword: Pain & Touch
Link ID: 27548 - Posted: 10.26.2020

By Stephani Sutherland Many of the symptoms experienced by people infected with SARS-CoV-2 involve the nervous system. Patients complain of headaches, muscle and joint pain, fatigue and “brain fog,” or loss of taste and smell—all of which can last from weeks to months after infection. In severe cases, COVID-19 can also lead to encephalitis or stroke. The virus has undeniable neurological effects. But the way it actually affects nerve cells still remains a bit of a mystery. Can immune system activation alone produce symptoms? Or does the novel coronavirus directly attack the nervous system? Some studies—including a recent preprint paper examining mouse and human brain tissue—show evidence that SARS-CoV-2 can get into nerve cells and the brain. The question remains whether it does so routinely or only in the most severe cases. Once the immune system kicks into overdrive, the effects can be far-ranging, even leading immune cells to invade the brain, where they can wreak havoc. Some neurological symptoms are far less serious yet seem, if anything, more perplexing. One symptom—or set of symptoms—that illustrates this puzzle and has gained increasing attention is an imprecise diagnosis called “brain fog.” Even after their main symptoms have abated, it is not uncommon for COVID-19 patients to experience memory loss, confusion and other mental fuzziness. What underlies these experiences is still unclear, although they may also stem from the body-wide inflammation that can go along with COVID-19. Many people, however, develop fatigue and brain fog that last for months even after a mild case that does not spur the immune system to rage out of control. Another widespread symptom, anosmia (loss of smell), might also originate from changes that happen without nerves themselves getting infected. Olfactory neurons, the cells that transmit odors to the brain, lack the primary docking site, or receptor, for SARS-CoV-2, and they do not seem to get infected. Researchers are still investigating how loss of smell might result from an interaction between the virus and another receptor on the olfactory neurons or from its contact with nonnerve cells that line the nose. © 2020 Scientific American

Keyword: Learning & Memory; Chemical Senses (Smell & Taste)
Link ID: 27547 - Posted: 10.24.2020

The plant compound apigenin improved the cognitive and memory deficits usually seen in a mouse model of Down syndrome, according to a study by researchers at the National Institutes of Health and other institutions. Apigenin is found in chamomile flowers, parsley, celery, peppermint and citrus fruits. The researchers fed the compound to pregnant mice carrying fetuses with Down syndrome characteristics and then to the animals after they were born and as they matured. The findings raise the possibility that a treatment to lessen the cognitive deficits seen in Down syndrome could one day be offered to pregnant women whose fetuses have been diagnosed with Down syndrome through prenatal testing. The study appears in the American Journal of Human Genetics. Down syndrome is a set of symptoms resulting from an extra copy or piece of chromosome 21. The intellectual and developmental disabilities accompanying the condition are believed to result from decreased brain growth caused by increased inflammation in the fetal brain. Apigenin is not known to have any toxic effects, and previous studies have indicated that it is an antioxidant that reduces inflammation. Unlike many compounds, it is absorbed through the placenta and the blood-brain barrier, the cellular layer that prevents potentially harmful substances from entering the brain. Compared with mice with Down syndrome characteristics whose mothers were not fed apigenin, those exposed to the compound showed improvements on tests of developmental milestones and in spatial and olfactory memory. Tests of gene activity and protein levels showed the apigenin-treated mice had less inflammation and increased blood vessel and nervous system growth. Guedj, F. et al. Apigenin as a candidate prenatal treatment for Trisomy 21: effects in human amniocytes and the Ts1Cje mouse model. American Journal of Human Genetics, 2020.

Keyword: Development of the Brain; Genes & Behavior
Link ID: 27546 - Posted: 10.24.2020

Jon Hamilton Medical research was an early casualty of the COVID-19 pandemic. After cases began emerging worldwide, thousands of clinical trials unrelated to COVID-19 were paused or canceled amid fears that participants would be infected. But now some researchers are finding ways to carry on in spite of the coronavirus. "It's been a struggle of course," says Joshua Grill, who directs the Institute for Memory Impairments and Neurological Disorders at the University of California, Irvine. "But I think there's an imperative for us to find ways to move forward." Grill got a close-up view of the challenge in July, when COVID-19 cases were spiking nationwide just as he was trying to launch a study. UC Irvine and dozens of other research centers had just begun enrolling participants in the AHEAD study, a global effort that will test whether an investigational drug can slow down the earliest brain changes associated with Alzheimer's disease. Finding individuals willing and able to sign up for this sort of research is difficult even without a pandemic, says Grill, who also co-directs recruitment for the Alzheimer's Clinical Trials Consortium, funded by the National Institute on Aging. "We're asking people to do a lot, including enrolling in long studies that require numerous visits," he says, "and in the AHEAD study, taking an investigational drug or placebo that's injected into a vein." Participants will receive either a placebo or a drug called BAN2401, made by Eisai, which is meant to reduce levels of amyloid, a toxic protein associated with Alzheimer's. People in the study will also have positron emission tomography, or PET, scans of their brains to measure changes in amyloid and another toxic protein called tau. © 2020 npr

Keyword: Alzheimers
Link ID: 27545 - Posted: 10.24.2020

By Jeremy Hsu Artificial intelligence could soon help screen for Alzheimer’s disease by analyzing writing. A team from IBM and Pfizer says it has trained AI models to spot early signs of the notoriously stealthy illness by looking at linguistic patterns in word usage. Other researchers have already trained various models to look for signs of cognitive impairments, including Alzheimer’s, by using different types of data, such as brain scans and clinical test results. But the latest work stands out because it used historical information from the multigenerational Framingham Heart Study, which has been tracking the health of more than 14,000 people from three generations since 1948. If the new models’ ability to pick up trends in such data holds up in forward-looking studies of bigger and more diverse populations, researchers say they could predict the development of Alzheimer’s a number of years before symptoms become severe enough for typical diagnostic methods to pick up. And such a screening tool would not require invasive tests or scans. The results of the Pfizer-funded and IBM-run study were published on Thursday in EClinicalMedicine. The new AI models provide “an augmentation to expert practitioners in how you would see some subtle changes earlier in time, before the clinical diagnosis has been achieved,” says Ajay Royyuru, vice president of health care and life sciences research at IBM. “It might actually alert you to some changes that [indicate] you ought to then go do a more complete exam.” To train these models, the researchers used digital transcriptions of handwritten responses from Framingham Heart Study participants who were asked to describe a picture of a woman who is apparently preoccupied with washing dishes while two kids raid a cookie jar behind her back. These descriptions did not preserve the handwriting from the original responses, says Rhoda Au, director of neuropsychology at the Framingham study and a professor at Boston University. © 2020 Scientific American

Keyword: Alzheimers; Language
Link ID: 27544 - Posted: 10.24.2020

By Bruce Bower A type of bone tool generally thought to have been invented by Stone Age humans got its start among hominids that lived hundreds of thousands of years before Homo sapiens evolved, a new study concludes. A set of 52 previously excavated but little-studied animal bones from East Africa’s Olduvai Gorge includes the world’s oldest known barbed bone point, an implement probably crafted by now-extinct Homo erectus at least 800,000 years ago, researchers say. Made from a piece of a large animal’s rib, the artifact features three curved barbs and a carved tip, the team reports in the November Journal of Human Evolution. Among the Olduvai bones, biological anthropologist Michael Pante of Colorado State University in Fort Collins and colleagues identified five other tools from more than 800,000 years ago as probable choppers, hammering tools or hammering platforms. The previous oldest barbed bone points were from a central African site and dated to around 90,000 years ago (SN: 4/29/95), and were assumed to reflect a toolmaking ingenuity exclusive to Homo sapiens. Those implements include carved rings around the base of the tools where wooden shafts were presumably attached. Barbed bone points found at H. sapiens sites were likely used to catch fish and perhaps to hunt large land prey. The Olduvai Gorge barbed bone point, which had not been completed, shows no signs of having been attached to a handle or shaft. Ways in which H. erectus used the implement are unclear, Pante and his colleagues say. © Society for Science & the Public 2000–2020.

Keyword: Evolution; Learning & Memory
Link ID: 27543 - Posted: 10.24.2020

By James Gorman It’s good to have friends, for humans and chimpanzees. But the nature and number of those friends change over time. In young adulthood, humans tend to have a lot of friendships. But as they age, social circles narrow, and people tend to keep a few good friends around and enjoy them more. This trend holds across many cultures, and one explanation has to do with awareness of one’s own mortality. Zarin P. Machanda, an anthropologist at Tufts University, and her own good friend, Alexandra G. Rosati, a psychologist and anthropologist at the University of Michigan, wondered whether chimpanzees, which they both study, would show a similar pattern even though they don’t seem to have anything like a human sense of their own inevitable death. The idea, in humans, Dr. Machanda said, is that as we get older we think, “I don’t have time for these negative people in my life, or I don’t want to waste my time with all of this negativity.” So we concentrate on a few good friends and invest in them. This explanation is called socioemotional selectivity theory. Dr. Rosati and Dr. Machanda, who is the director of long-term research at the Kibale Chimpanzee Project in Uganda, drew on many years of observations of chimps at Kibale. Along with several colleagues, they reported Thursday in the journal Science that male chimps, at least, display the very same inclinations as humans. The team looked only at interactions of male chimpanzees because males are quite gregarious and form a lot of friendships, whereas females are more tied to family groups. So male relationships were easier to analyze. The finding doesn’t prove or disprove anything about whether knowledge of death is what drives the human behavior. But it does show that our closest primate relative displays the same bonding habits for some other reason, perhaps something about aging that the two species have in common. At the very least, the finding raises questions about humans. © 2020 The New York Times Company

Keyword: Aggression; Stress
Link ID: 27542 - Posted: 10.24.2020

By Meagan Cantwell Although bird brains are tiny, they’re packed with neurons, especially in areas responsible for higher level thinking. Two studies published last month in Science explore the structure and function of avian brains—revealing they are organized similarly to mammals’ and are capable of conscious thought. © 2020 American Association for the Advancement of Science.

Keyword: Evolution; Learning & Memory
Link ID: 27541 - Posted: 10.24.2020

Ashley Yeager Tiroyaone Brombacher sat in her lab at the University of Cape Town watching a video of an albino mouse swimming around a meter-wide tub filled with water. The animal, which lacked an immune protein called interleukin 13 (IL-13), was searching for a place to rest but couldn’t find the clear plexiglass stand that sat at one end of the pool, just beneath the water’s surface. Instead, it swam and swam, crisscrossing the tub several times before finally finding the platform on which to stand. Over and over, in repeated trials, the mouse failed to learn where the platform was located. Meanwhile, wildtype mice learned fairly quickly and repeatedly swam right to the platform. “When you took out IL-13, [the mice] just could not learn,” says Brombacher, who studies the intersection of psychology, neuroscience, and immunology. Curious as to what was going on, Brombacher decided to dissect the mice’s brains and the spongy membranes, called the meninges, that separate neural tissue from the skull. She wanted to know if the nervous system and the immune system were communicating using proteins such as IL-13. While the knockout mice had no IL-13, she reported in 2017 that the meninges of wildtype mice were chock full of the cytokine. Sitting just outside the brain, the immune protein did, in fact, seem to be playing a critical role in learning and memory, Brombacher and her colleagues concluded. As far back as 2004, studies in rodents suggested that neurons and their support cells release signals that allow the immune system to passively monitor the brain for pathogens, toxins, and debris that might form during learning and memory-making, and that, in response, molecules of the immune system could communicate with neurons to influence learning, memory, and social behavior. Together with research on the brain’s resident immune cells, called microglia, the work overturned a dogma, held since the 1940s, that the brain was “immune privileged,” cut off from the immune system entirely. © 1986–2020 The Scientist.

Keyword: Neuroimmunology
Link ID: 27540 - Posted: 10.21.2020

By Rachel Nuwer With their bright saucer eyes, button noses and plump, fuzzy bodies, slow lorises — a group of small, nocturnal Asian primates — resemble adorable, living stuffed animals. But their innocuous looks belie a startling aggression: They pack vicious bites loaded with flesh-rotting venom. Even more surprising, new research reveals that the most frequent recipients of their toxic bites are other slow lorises. “This very rare, weird behavior is happening in one of our closest primate relatives,” said Anna Nekaris, a primate conservationist at Oxford Brookes University and lead author of the findings, published Monday in Current Biology. “If the killer bunnies on Monty Python were a real animal, they would be slow lorises — but they would be attacking each other.” Even before this new discovery, slow lorises already stood out as an evolutionary oddity. Scientists know of just five other types of venomous mammals: vampire bats, two species of shrew, platypuses and solenodons (an insectivorous mammal found in Cuba, the Dominican Republic and Haiti). Researchers are just beginning to untangle the many mysteries of slow loris venom. One key component resembles the protein found in cat dander that triggers allergies in humans. But other unidentified compounds seem to lend additional toxicity and cause extreme pain. Strangely, to produce the venom, the melon-sized primates raise their arms above their head and quickly lick venomous oil-secreting glands located on their upper arms. The venom then pools in their grooved canines, which are sharp enough to slice into bone. “The result of their bite is really, really horrendous,” Dr. Nekaris says. “It causes necrosis, so animals may lose an eye, a scalp or half their face.” © 2020 The New York Times Company

Keyword: Aggression; Neurotoxins
Link ID: 27539 - Posted: 10.21.2020

By Jake Buehler Naked mole-rats — with their subterranean societies made up of a single breeding pair and an army of workers — seem like mammals trying their hardest to live like insects. Nearly 300 of the bald, bucktoothed, nearly blind rodents can scoot along a colony’s labyrinth of tunnels. New research suggests there’s brute power in those numbers: Like ants or termites, the mole-rats go to battle with rival colonies to conquer their lands. Wild naked mole-rats (Heterocephalus glaber) will invade nearby colonies to expand their territory, sometimes abducting pups to incorporate them into their own ranks, researchers report September 28 in the Journal of Zoology. This behavior may put smaller, less cohesive colonies at a disadvantage, potentially supporting the evolution of bigger colonies. Researchers stumbled across this phenomenon by accident while monitoring naked mole-rat colonies in Kenya’s Meru National Park. The team was studying the social structure of this extreme form of group living among mammals (SN: 6/20/06). Over more than a decade, the team trapped and marked thousands of mole-rats from dozens of colonies by either implanting small radio-frequency transponder chips under their skin, or clipping their toes. One day in 1994, while marking mole-rats in a new colony, researchers were surprised to find in its tunnels mole-rats from a neighboring colony that had already been marked. The queen in the new colony had wounds on her face from the ravages of battle. It looked like a war was playing out down in the soil. © Society for Science & the Public 2000–2020.

Keyword: Evolution; Sexual Behavior
Link ID: 27538 - Posted: 10.21.2020

Catherine Offord Overactivation of the brain’s immune cells, called microglia, may play a role in cognitive impairments associated with Down syndrome, according to research published today (October 6) in Neuron. Researchers in Italy identified elevated numbers of the cells in an inflammation-promoting state in the brains of mice with a murine version of the syndrome as well as in postmortem brain tissue from people with the condition. The team additionally showed that drugs that reduce the number of activated microglia in juvenile mice could boost the animals’ performance on cognitive tests. “This is a fabulous study that gives a lot of proof of principle to pursuing some clinical trials in people,” says Elizabeth Head, a neuroscientist at the University of California, Irvine, who was not involved in the work. “The focus on microglial activation, I thought, was very novel and exciting,” she adds, noting that more research will be needed to see how the effects of drugs used in the study might translate from mice to humans. Down syndrome is caused by an extra copy of part or all of human chromosome 21, and is the most commonly occurring chromosomal condition in the US. Children with Down syndrome often experience cognitive delays compared to typically developing children, although there’s substantial variation and the effects are usually mild or moderate. People with the syndrome also have a higher risk of certain medical conditions, including Alzheimer’s disease. © 1986–2020 The Scientist.

Keyword: Development of the Brain; Glia
Link ID: 27537 - Posted: 10.21.2020

By Sundas Hashmi It was the afternoon of Jan. 31. I was preparing for a dinner party and adding final touches to my cheese platter when everything suddenly went dark. I woke up feeling baffled in a hospital bed. My husband filled me in: Apparently, I had suffered a massive seizure a few hours before our guests were to arrive at our Manhattan apartment. Our children’s nanny found me and I was rushed to the hospital. That had been three days earlier. My husband and I were both mystified: I was 37 years old and had always been in excellent health. In due course, a surgeon dropped by and told me I had a glioma, a type of brain tumor. It was relatively huge but operable. I felt sick to my stomach. Two weeks later, I was getting wheeled to the operating theater. I wouldn’t know the pathology until much later. I said my goodbyes to everyone — most importantly to my children, Sofia, 6, and Nyle, 2 — and prepared to die. But right before the surgery, in a very drugged state, I asked the surgeon to please get photos of me and my brother from my husband. I wanted the surgeon to see them. My brother had died two decades earlier from a different kind of brain tumor — a glioblastoma. I was 15 at the time, and he was 18. He died within two years of being diagnosed. Those two years were the worst period of my life. Doctors in my home country of Pakistan refused to take him, saying his case was fatal. So, my parents gathered their savings and flew him to Britain, where he was able to get a biopsy (his tumor was in an inoperable location) and radiation. Afterward, we had to ask people for donations so he could get the gamma knife treatment in Singapore that my parents felt confident would save him. In the end, nothing worked, and he died, taking 18 years of memories with him. © 2020 The New York Times Company

Keyword: Glia
Link ID: 27536 - Posted: 10.21.2020

Shawna Williams In Greek mythology, Orpheus descends to the underworld and persuades Hades to allow him to take his dead wife, Eurydice, back to the realm of the living. Hades agrees, but tells Orpheus that he must not look back until he has exited the underworld. Despite the warning, Orpheus glances behind him on his way out to check whether Eurydice is indeed following him—and loses her forever. The story hints at a dark side to curiosity, a drive to seek certain kinds of knowledge even when doing so is risky—and even if the information serves no practical purpose at the time. In fact, the way people pursue information they’re curious about can resemble the drive to attain more tangible rewards such as food—a parallel that hasn’t been lost on scientists. To investigate the apparent similarity between curiosity and hunger, researchers led by Kou Murayama of the University of Reading in the UK recently devised an experiment to compare how the brain processes desires for food and knowledge, and the risks people are willing to take to satisfy those desires. Beginning in 2016, the team recruited 32 volunteers and instructed them not to eat for at least two hours before coming into the lab. After they arrived, the volunteers’ fingers were hooked up to electrodes that could deliver a weak current, and researchers calibrated the level of electricity to what each participant reported was uncomfortable, but not painful. Then, still hooked up to the electrodes, the volunteers were asked to gamble: they viewed either a photo of a food item or a video of a magician performing a trick, followed by a visual depiction of their odds of “winning” that round (which ranged from 1:6 to 5:6). © 1986–2020 The Scientist.

Keyword: Attention; Obesity
Link ID: 27535 - Posted: 10.21.2020

By Nicholas Bakalar A mother’s psychological distress during pregnancy may increase the risk for asthma in her child, a new study suggests. Researchers had the parents of 4,231 children fill out well-validated questionnaires on psychological stress in the second trimester of pregnancy, and again three years later. The mothers also completed questionnaires at two and six months after giving birth. The study, in the journal Thorax, found that 362 of the mothers and 167 of the fathers had clinically significant psychological distress during the mothers’ pregnancies. When the children were 10 years old, parents reported whether their child had ever been diagnosed with asthma. As an extra measure, the researchers tested the children using forced expiratory volume, or FEV, a standard clinical test of lung function. After controlling for age, smoking during pregnancy, body mass index, a history of asthma and other factors, they found that maternal depression and anxiety during pregnancy were significantly associated with both diagnoses of asthma and poorer lung function in their children. There was no association between childhood asthma and parents’ psychological distress in the years after pregnancy, and no association with paternal psychological distress at any time. “Of course, this could be only one of many causes of asthma,” said the lead author, Dr. Evelien R. van Meel of Erasmus University in Rotterdam, “but we corrected for many confounders, and we saw the effect only in mothers. This seems to suggest that there’s something going on in the uterus. But this is an observational study, and we can’t say that it’s a causal effect.” © 2020 The New York Times Company

Keyword: Depression; Development of the Brain
Link ID: 27534 - Posted: 10.21.2020

By Pam Belluck A potential therapy for amyotrophic lateral sclerosis, a fatal neurological disorder, may allow patients to live several months longer than they otherwise would have, according to a study published Friday. The two-drug combination, dreamed up by two college students, is one of several potential treatments raising the hopes of patients with A.L.S., also known as Lou Gehrig’s disease. The paralytic condition steals people’s ability to walk, speak, eat and ultimately breathe, typically causing death within two to five years. There are only two approved A.L.S. medications, neither tremendously effective. But advocacy efforts by patients and organizations, along with the Ice Bucket Challenge, a highly successful fundraising campaign, have galvanized research into more than 20 therapies that are currently in clinical trials. The two-drug combination, called AMX0035, was conceived seven years ago by Joshua Cohen and Justin Klee, then a junior and senior at Brown University, with the goal of preventing the destruction of neurons that occurs in many brain disorders. It is a combination of an existing supplement and a medication for a pediatric urea cycle disorder. Last month, a study of 137 patients reported that AMX0035 slowed progression of A.L.S. paralysis by about 25 percent more than a placebo. Measuring patients using a scale of physical function, researchers found that those receiving a placebo declined in 18 weeks to a level that patients receiving the treatment didn’t reach until 24 weeks, according to the study’s principal investigator, Dr. Sabrina Paganoni. But because that trial was conducted for only 24 weeks, it left unanswered a crucial question: whether the treatment extended survival for the patients receiving the therapy. After that study ended, 98 of the participants, who had not been told whether they had received placebo or therapy, were given the option of taking the therapy for up to 30 months, a format called an open-label extension study. © 2020 The New York Times Company

Keyword: ALS-Lou Gehrig's Disease
Link ID: 27533 - Posted: 10.19.2020

By Laurie Archbald-Pannone The number of cases of dementia in the United States is rising as baby boomers age, raising questions for boomers themselves and also for their families, caregivers and society. Dementia, which is not technically a disease but a term for impaired ability to think, remember or make decisions, is one of the most feared impairments of old age. Incidence increases dramatically as people move into their 90s. About 5 percent of those 71 to 79 have dementia, and about 37 percent of those around age 90 live with it. Older people may worry about their own loss of function as well as the cost and toll of caring for someone with dementia. A 2018 study estimated the lifetime cost of care for a person with Alzheimer’s, the most common form of dementia, to be $329,360. That figure, too, will no doubt rise, putting even more burdens on families, Medicare and Medicaid. There’s also been a good deal of talk and reporting about dementia in recent months because of the presidential election. Some voters have asked whether one or both candidates might have dementia. But is this even a fair question to ask? When these types of questions are posed — adding further stigma to people with dementia — they can unfairly further isolate those living with the condition and those caring for them. We need to understand dementia and the impact it has on the more than 5 million people in the United States who now live with it, as well as on their caregivers. That number is expected to triple by 2060. First, it is important to know that dementia cannot be diagnosed from afar or by someone who is not a doctor. A person needs a detailed doctor’s exam for a diagnosis. Sometimes, brain imaging is required. And forgetting an occasional word — or even where you put your keys — does not mean a person has dementia. There are different types of memory loss and they can have different causes, such as other medical conditions, falls or even medication, including herbals, supplements and anything over-the-counter. © 1996-2020 The Washington Post

Keyword: Alzheimers
Link ID: 27532 - Posted: 10.19.2020

By John Horgan One of the most impressive, disturbing works of science journalism I’ve encountered is Anatomy of an Epidemic: Magic Bullets, Psychiatric Drugs, and the Astonishing Rise of Mental Illness in America, published in 2010. In the book, which I review here, award-winning journalist Robert Whitaker presents evidence that medications for mental illness, over time and in the aggregate, cause net harm. In 2012, I brought Whitaker to my school to give a talk, in part to check him out. He struck me as a smart, sensible, meticulous reporter whose in-depth research had led him to startling conclusions. Since then, far from encountering persuasive rebuttals of Whitaker’s thesis, I keep finding corroborations of it. If Whitaker is right, modern psychiatry, together with the pharmaceutical industry, has inflicted iatrogenic harm on millions of people. Reports of surging mental distress during the pandemic have me thinking once again about Whitaker’s views and wondering how they have evolved. Below he answers some questions. —John Horgan
Horgan: When and why did you start reporting on mental health?

Whitaker: It came about in a very roundabout way. In 1994, I had co-founded a publishing company called CenterWatch that covered the business aspects of the “clinical trials industry,” and I soon became interested in writing about how financial interests were corrupting drug trials. Risperdal and Zyprexa had just come to market, and after I used a Freedom of Information request to obtain the FDA’s review of those two drugs, I could see that psychiatric drug trials were a prime example of that corruption. In addition, I had learned of NIMH-funded research that seemed abusive of schizophrenia patients, and in 1998, I co-wrote a series for the Boston Globe on abuses of patients in psychiatric research. My interest was in that broader question of corruption and abuse in research settings, and not specific to psychiatry. © 2020 Scientific American

Keyword: Depression; Schizophrenia
Link ID: 27531 - Posted: 10.19.2020