Most Recent Links




Scientists say they have discovered a possible underlying cause of the neurological disorder motor neurone disease (MND). The University of Exeter team says it has found evidence that MND is linked to an imbalance of cholesterol and other fats in cells, and that the research could lead to more accurate diagnosis and new treatments. MND affects around 5,000 people in the UK and causes more than 2,000 deaths a year.

What is MND? Motor neurone disease is a group of diseases that affect the nerve cells in the brain and spinal cord that tell your muscles what to do. Also known as ALS, it causes muscle weakness and stiffness. Eventually people with the disease are unable to move, talk, swallow and finally, breathe. There is no cure and the exact causes are unclear - it's been variously linked to genes, exposure to heavy metals and agricultural pollution.

What did the researchers find? Scientists at the University of Exeter say they had a "eureka moment" when they realised that 13 genes - which, if altered, can cause the condition - were directly involved in processing cholesterol. They say their theory could help predict the course and severity of the disease in patients and monitor the effect of potential new drugs. The theory is outlined in a paper published in Brain: A Journal of Neurology.

Lead author Prof Andrew Crosby said: "For years, we have known that a large number of genes are involved in motor neurone disease, but so far it hasn't been clear if there's a common underlying pathway that connects them." The finding particularly relates to what are known as the "spastic paraplegias", where the malfunction is in the upper part of the spinal cord. Dr Emma Baple, also from the University of Exeter Medical School, said: "Currently, there are no treatments available that can reverse or prevent progression of this group of disorders. Patients who are at high risk of motor neurone disease really want to know how their disease may progress and the age at which symptoms may develop, but that's very difficult to predict." © 2019 BBC

Keyword: ALS-Lou Gehrig's Disease
Link ID: 26902 - Posted: 12.18.2019

By Gretchen Reynolds Top athletes’ brains are not as noisy as yours and mine, according to a fascinating new study of elite competitors and how they process sound. The study finds that the brains of fit, young athletes dial down extraneous noise and attend to important sounds better than those of other young people, suggesting that playing sports may change brains in ways that alter how well people sense and respond to the world around them. For most of us with normal hearing, of course, listening to and processing sounds are such automatic mental activities that we take them for granted. But “making sense of sound is actually one of the most complex jobs we ask of our brains,” says Nina Kraus, a professor and director of the Auditory Neuroscience Laboratory at Northwestern University in Evanston, Ill., who oversaw the new study. Sound processing also can be a reflection of broader brain health, she says, since it involves so many interconnected areas of the brain that must coordinate to decide whether any given sound is familiar, what it means, if the body should respond and how a particular sound fits into the broader orchestration of other noises that constantly bombard us. For some time, Dr. Kraus and her collaborators have been studying whether some people’s brains perform this intricate task more effectively than others. By attaching electrodes to people’s scalps and then playing a simple sound, usually the spoken syllable “da,” at irregular intervals, they have measured and graphed electrical brain wave activity in people’s sound-processing centers. © 2019 The New York Times Company

Keyword: Attention
Link ID: 26901 - Posted: 12.18.2019

By Virginia Morell Dogs may not be able to count to 10, but even the untrained ones have a rough sense of how many treats you put in their food bowl. That’s the finding of a new study, which reveals that our canine pals innately understand quantities in much the same way we do. The study is “compelling and exciting,” says Michael Beran, a psychologist at Georgia State University in Atlanta who was not involved in the research. “It further increases our confidence that [these representations of quantity in the brain] are ancient and widespread among species.” The ability to rapidly estimate the number of sheep in a flock or ripened fruits on a tree is known as the “approximate number system.” Previous studies have suggested monkeys, fish, bees, and dogs have this talent. But much of this research has used trained animals that receive multiple tests and rewards. That leaves open the question of whether the ability is innate in these species, as it is in humans. In the new study, Gregory Berns, a neuroscientist at Emory University in Atlanta, and colleagues recruited 11 dogs from various breeds, including border collies, pitbull mixes, and Labrador golden retriever mixes, to see whether they could find brain activity associated with a sensitivity to numbers. The team, which pioneered canine brain scanning (by getting dogs to voluntarily enter a functional magnetic resonance imaging scanner and remain motionless), had their subjects enter the scanner, rest their heads on a block, and fix their eyes on a screen at the opposite end (see video, above). On the screen was an array of light gray dots on a black background whose number changed every 300 milliseconds. If dogs, like humans and nonhuman primates, have a dedicated brain region for representing quantities, their brains should show more activity there when the number of dots was dissimilar (three small dots versus 10 large ones) than when they were constant (four small dots versus four large dots). 
© 2019 American Association for the Advancement of Science.

Keyword: Attention; Evolution
Link ID: 26900 - Posted: 12.18.2019

Joshua Schrock You know what it’s like to be sick. You feel fatigued, maybe a little depressed, less hungry than usual, more easily nauseated and perhaps more sensitive to pain and cold. The fact that illness comes with a distinct set of psychological and behavioral features is not a new discovery. In medical terminology, the symptom of malaise encompasses some of the feelings that come with being ill. Animal behaviorists and neuroimmunologists use the term sickness behavior to describe the observable behavior changes that occur during illness. Health care providers often treat these symptoms as little more than annoying side effects of having an infectious disease. But as it turns out, these changes may actually be part of how you fight off infection. I’m an anthropologist interested in how illness and infection have shaped human evolution. My colleagues and I propose that all these aspects of being sick are features of an emotion that we call “lassitude.” And it’s an important part of how human beings work to recover from illness. The human immune system is a complex set of mechanisms that help you suppress and eliminate organisms – such as bacteria, viruses and parasitic worms – that cause infection. Activating the immune system, however, costs your body a lot of energy. This presents a series of problems that your brain and body must solve to fight against infection most effectively. Where will this extra energy come from? What should you do to avoid additional infections or injuries that would increase the immune system’s energy requirements even more? © 2010–2019, The Conversation US, Inc.

Keyword: Neuroimmunology; Emotions
Link ID: 26899 - Posted: 12.18.2019

Nicola Slawson When Lynn Enright had a hysteroscopy to examine the inside of the womb, her searing pain was dismissed by medical professionals. She only understood why when she started working on her book on female anatomy, Vagina: A Re-education. She was looking for research on pain and women’s health, only to be shocked by how little data she found. It wasn’t just the topic of pain that was poorly researched. The lack of evidence was a problem she encountered time and time again, which is no surprise when you look at the research gap: less than 2.5% of publicly funded research is dedicated solely to reproductive health, despite the fact that one in three women in the UK will suffer from a reproductive or gynaecological health problem. There is five times more research into erectile dysfunction, which affects 19% of men, than into premenstrual syndrome, which affects 90% of women. “Women have been woefully neglected in studies on pain. Most of our understanding of ailments comes from the perspective of men; it is overwhelmingly based on studies of men, carried out by men,” Enright says. Her book is one of several in the past year about the female body and the impact a lack of knowledge can have on diagnosis and treatment. They include Emma Barnett’s Period, Eleanor Morgan’s Hormonal, and Gabrielle Jackson’s Pain and Prejudice, which draws on her experience of being diagnosed with endometriosis, a chronically underfunded condition. Given that in the US, which produces a lot of medical research, research trials weren’t required by the National Institutes of Health to include women until 1993, the lack of knowledge is perhaps no surprise. Traditionally this was justified by the idea that women’s bodies were seen to be too complex due to fluctuating hormones, so clinical trials often excluded them. © 2019 Guardian News & Media Limited

Keyword: Sexual Behavior; Hormones & Behavior
Link ID: 26898 - Posted: 12.18.2019

By Abby Goodnough TULSA, Okla. — The teenager had pink cheeks from the cold and a matter-of-fact tone as she explained why she had started using methamphetamine after becoming homeless last year. “Having nowhere to sleep, nothing to eat — that’s where meth comes into play,” said the girl, 17, who asked to be identified by her nickname, Rose. “Those things aren’t a problem if you’re using.” She stopped two months ago, she said, after smoking so much meth over a 24-hour period that she hallucinated and nearly jumped off a bridge. Deaths associated with meth use are climbing here in Oklahoma and in many other states, an alarming trend for a nation battered by the opioid epidemic, and one that public health officials are struggling to fully explain. The meth problem has sneaked up on state and national leaders. In Oklahoma, meth and related drugs, including prescription stimulants, now play a role in more deaths than all opioids combined, including painkillers, heroin and fentanyl, according to the Centers for Disease Control and Prevention. The spending package that lawmakers agreed on this week includes legislation from Senators Jeanne Shaheen, Democrat of New Hampshire, and Rob Portman, Republican of Ohio, that would allow states to address the resurgence of meth and cocaine by using some of the billions of dollars that Congress had appropriated to combat opioid addiction. Meth use first ballooned in the United States from the 1990s into the early 2000s, when it was often made in small home labs with pseudoephedrine, the main ingredient in many drugstore cold medicines. But today’s meth, largely imported from Mexico, is far more potent. © 2019 The New York Times Company

Keyword: Drug Abuse
Link ID: 26897 - Posted: 12.18.2019

By Jennifer Couzin-Frankel Andrea VonMarkle arrived in Madison by helicopter ambulance 2 years ago, her life hanging in the balance. One month earlier she'd been a healthy 21-year-old juggling community college, waitressing, and weightlifting at a local gym. But after several weeks of feeling vaguely ill and forgetful, she was struck by a terrifying crisis. On New Year's Eve, VonMarkle's aunt had returned to the home they shared in northern Michigan to find her niece in trouble. "The door was open, and our dog was running down the street," VonMarkle says of the scene that greeted her aunt. "I just kept saying, ‘I don't know what's going on, and I don't know why I don't know.’" Then she started seizing. The seizures, which VonMarkle had never experienced before, didn't stop. Doctors at a local hospital were unable to quell her brain's electrical storm with powerful antiseizure medications. Because unremitting seizures can destroy brain tissue and damage other organs, the doctors put her into a medically induced coma. "They didn't know what to do with me," she says, "so they flew me to Madison," where the University of Wisconsin hospital had more resources and staff. VonMarkle, unconscious for weeks, wouldn't find out until much later what a stroke of luck that was. She became one of the first people whose sudden-onset, life-threatening epilepsy would be treated in a whole new way: not with standard antiseizure medications, but by disabling the deeper roots of the disease. For her, that meant a drug normally used for arthritis that seemed to soothe the inflammation powering her seizures. © 2019 American Association for the Advancement of Science.

Keyword: Epilepsy
Link ID: 26896 - Posted: 12.13.2019

By Tina Hesman Saey WASHINGTON — Clumps of misfolded proteins cause traffic jams in brain cells, and those jams may have deadly consequences in neurodegenerative diseases. Clusters of prions block passage of crucial cargo along intracellular roadways in brain cells, cell biologist Tai Chaiamarit of the Scripps Research Institute in La Jolla, Calif., reported December 10 at the joint annual meeting of the American Society for Cell Biology and the European Molecular Biology Organization. Prions, misshapen versions of a normal brain protein, clump together in large aggregates that are hallmarks of degenerative brain diseases, such as mad cow disease in cattle, chronic wasting disease in deer and Creutzfeldt-Jakob disease in people. It's unclear why those clumpy proteins are so deadly to nerve cells called neurons, but the new study may provide clues about what goes wrong in these diseases. Axons, the long stringlike projections of nerve cells that carry electrical signals to other nerves, are the sites of prion traffic jams, Chaiamarit and colleagues found. As more prions clump together, they cause swollen bulges that make the axon look like a snake that has just swallowed a big meal. Through a microscope, Chaiamarit and colleagues watched mitochondria being transported toward the cell's farthest reaches get derailed at the bulges. Mitochondria, cells' energy-generating organelles, are carried outbound from the main body of the cell by a motor protein called kinesin-1, which motors along molecular rails called microtubules. A different motor protein, dynein, transports mitochondria back toward the cell body along those same rails. © Society for Science & the Public 2000–2019

Keyword: Prions
Link ID: 26895 - Posted: 12.13.2019

Christof Koch A future where the thinking capabilities of computers approach our own is quickly coming into view. We feel ever more powerful machine-learning (ML) algorithms breathing down our necks. Rapid progress in coming decades will bring about machines with human-level intelligence capable of speech and reasoning, with a myriad of contributions to economics, politics and, inevitably, warcraft. The birth of true artificial intelligence will profoundly affect humankind’s future, including whether it has one. The following quotes provide a case in point: “From the time the last great artificial intelligence breakthrough was reached in the late 1940s, scientists around the world have looked for ways of harnessing this ‘artificial intelligence’ to improve technology beyond what even the most sophisticated of today’s artificial intelligence programs can achieve.” “Even now, research is ongoing to better understand what the new AI programs will be able to do, while remaining within the bounds of today’s intelligence. Most AI programs currently programmed have been limited primarily to making simple decisions or performing simple operations on relatively small amounts of data.” These two paragraphs were written by GPT-2, a language bot I tried last summer. Developed by OpenAI, a San Francisco–based institute that promotes beneficial AI, GPT-2 is an ML algorithm with a seemingly idiotic task: presented with some arbitrary starter text, it must predict the next word. The network isn’t taught to “understand” prose in any human sense. Instead, during its training phase, it adjusts the internal connections in its simulated neural networks to best anticipate the next word, the word after that, and so on. Trained on eight million Web pages, its innards contain more than a billion connections that emulate synapses, the connecting points between neurons. 
When I entered the first few sentences of the article you are reading, the algorithm spewed out two paragraphs that sounded like a freshman’s effort to recall the gist of an introductory lecture on machine learning during which she was daydreaming. The output contains all the right words and phrases—not bad, really! Primed with the same text a second time, the algorithm comes up with something different. © 2019 Scientific American,
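Koch's description of GPT-2's objective, predicting the next word from the words before it, can be illustrated at toy scale with a simple bigram frequency model. This is a hypothetical sketch in plain Python (the corpus, function names and counting scheme are illustrative only); GPT-2 learns over a billion connection weights rather than lookup counts, but the prediction task is the same:

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count, for each word, how often each other word follows it."""
    model = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def predict_next(model, word):
    """Return the continuation seen most often in training, or None."""
    if word not in model:
        return None
    return model[word].most_common(1)[0][0]

model = train_bigram("the cat sat on the mat and the cat slept")
print(predict_next(model, "the"))  # "cat" follows "the" twice, "mat" once
```

Like GPT-2, this predictor only echoes statistics of its training text; primed with an unseen word it has nothing to say, which is one reason the essay's question about genuine understanding arises.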

Keyword: Consciousness; Robotics
Link ID: 26894 - Posted: 12.12.2019

Thomas R. Sawallis and Louis-Jean Boë Sound doesn’t fossilize. Language doesn’t either. Even when writing systems have developed, they’ve represented full-fledged and functional languages. Rather than preserving the first baby steps toward language, they’re fully formed, made up of words, sentences and grammar carried from one person to another by speech sounds, like any of the perhaps 6,000 languages spoken today. So if you believe, as we linguists do, that language is the foundational distinction between humans and other intelligent animals, how can we study its emergence in our ancestors? Happily, researchers do know a lot about language – words, sentences and grammar – and speech – the vocal sounds that carry language to the next person’s ear – in living people. So we should be able to compare language with less complex animal communication. And that’s what we and our colleagues have spent decades investigating: How do apes and monkeys use their mouth and throat to produce the vowel sounds in speech? Spoken language in humans is an intricately woven string of syllables with consonants appended to the syllables’ core vowels, so mastering vowels was a key to speech emergence. We believe that our multidisciplinary findings push back the date for that crucial step in language evolution by as much as 27 million years. The sounds of speech Say “but.” Now say “bet,” “bat,” “bought,” “boot.” The words all begin and end the same. It’s the differences among the vowel sounds that keep them distinct in speech. © 2010–2019, The Conversation US, Inc.

Keyword: Language; Evolution
Link ID: 26893 - Posted: 12.12.2019

By Rachel E. Gross In the 1960s, manufacturers of the new birth-control pill imagined their ideal user as feminine, maternal and forgetful. She wanted discretion. She was married. And she wanted visible proof that her monthly cycle was normal and that she wasn’t pregnant. In 2019, the user of the pill is perceived as an altogether different person. She’s unwed, probably would prefer to skip her period and is more forthright about when it’s that time of the month. As such, many birth-control brands now come in brightly colored rectangular packs that make no effort to be concealed. But one part of the equation remains: the week of placebo pills, in which hormones are abruptly withdrawn and a woman experiences what looks and feels a lot like her regular period — blood, cramps and all — but isn’t. Physicians have widely described this pseudoperiod as medically unnecessary. So why do millions still endure it? That’s largely the legacy of two men: John Rock and David Wagner. First there’s Rock, a Harvard fertility expert and a developer of the pill. There’s a longstanding myth that Rock, a Catholic, designed the pill in the 1950s with the church in mind and included a week of hormonal withdrawal — and therefore bleeding — to make his invention seem more natural. In fact, the thought never crossed his mind, the Rutgers University historian Margaret Marsh says. Instead, it was Gregory (Goody) Pincus, the other developer of the pill, who suggested that the pill be given as a 20-days-on, 5-days-off regimen. Pincus wanted to provide women in his trials with reassurance that they weren’t pregnant, and to know himself that the pill was working as a contraceptive. Rock agreed. After the F.D.A. approved the pill in 1960, however, those few days of light bleeding took on a new significance. Anticipating the church’s opposition, Rock became not just a researcher but also an advocate. 
In his 1963 book “The Time Has Come: A Catholic Doctor’s Proposals to End the Battle Over Birth Control,” he argued that the pill was merely a scientific extension of the church-sanctioned “rhythm method.” It “completely mimics” the body’s own hormones, he wrote, to extend the “safe period” in which a woman could have intercourse and not become pregnant. © 2019 The New York Times Company

Keyword: Hormones & Behavior; Sexual Behavior
Link ID: 26892 - Posted: 12.12.2019

By Diana Kwon MDMA, or ecstasy, once had the reputation of exclusively being an illicit party drug popular at raves and dance clubs. That view has changed in recent years. The substance, known for its ability to produce feelings of euphoria and affection for others, has developed a new identity as a promising therapeutic tool. Researchers are currently investigating MDMA-assisted therapy as a potential treatment for post-traumatic stress disorder in late-stage clinical trials. The drug’s capacity to enhance sociability has also led to studies investigating its benefits for other conditions, such as social anxiety in individuals with autism spectrum disorder. Despite the promise of its therapeutic benefits, concern persists among some scientists that MDMA could be abused because its pleasurable effects can make it addictive. “By no means [does the drug] have the addictive liability of methamphetamine or certain opioids,” says Robert Malenka, a professor of psychiatry and behavioral sciences at Stanford University. “But it does have abuse potential.” A new study by Malenka and his team suggests it may be possible to circumvent this risk. The findings, published today in Science Translational Medicine, reveal that MDMA’s sociability-enhancing abilities and its pleasurable properties are controlled by distinct pathways in the brain—at least in mice. That insight opens the possibility of developing a safer version of the drug. Previous research by Malenka’s group and others had revealed that MDMA stimulated the release of both serotonin and dopamine in the brain. The existing evidence suggested the drug’s effects on sociability were linked to serotonin and its addictive potential to dopamine, but the extent to which these pathways were distinct was unknown. “Separating out the prosocial from the addictive effects has tremendous implications for drug development,” says Boris Heifets, an anesthesiologist at Stanford and lead author of the latest study. 
A key question is, “Can we make something with the same kind of prosocial effect that maybe isn’t as prone to abuse?” © 2019 Scientific American

Keyword: Drug Abuse; Depression
Link ID: 26891 - Posted: 12.12.2019

By Nicholas Bakalar Sleeping a lot may increase the risk for stroke, a new study has found. Chinese researchers followed 31,750 men and women whose average age was 62 for an average of six years, using physical examinations and self-reported data on sleep. They found that compared with sleeping (or being in bed trying to sleep) seven to eight hours a night, sleeping nine or more hours increased the relative risk for stroke by 23 percent. Sleeping less than six hours a night had no effect on stroke incidence. The study, in Neurology, also found that midday napping for more than 90 minutes a day was associated with a 25 percent increased risk for stroke compared with napping 30 minutes or less. And people who both slept more than nine hours and napped more than 90 minutes were 85 percent more likely to have a stroke. The study controlled for smoking, drinking, exercise, family history of stroke, body mass index and other health and behavioral characteristics. The reason for the association is unclear, but long sleep duration is associated with increased inflammation, unfavorable lipid profiles and increased waist circumference, factors known to increase cardiovascular risk. © 2019 The New York Times Company

Keyword: Stroke; Sleep
Link ID: 26890 - Posted: 12.12.2019

By Sharon Begley, STAT Even allowing for the fact that these were lilliputian brains, they were not behaving at all according to plan. From the first days of the tiny lab-grown organs’ development, primitive “progenitor cells” romped out of their birthplaces in the deep interior and quickly turned into neurons and glia, specialized cells that do the brain’s heavy lifting, from thinking and feeling and moving to boring old neurological housekeeping. But the cells were jumping the gun. In healthy developing human brains, progenitor cells spend a good chunk of prenatal existence simply reproducing, vastly increasing their numbers and postponing becoming other brain cells. The impatient progenitor cells, however, were in cerebral organoids—minuscule 3-D versions of the brain—created from the cells of people with Huntington’s disease in hopes of mimicking the patients’ actual brain development decades earlier. It was new evidence that, in their understanding of this devastating genetic illness, scientists know only half the story: In addition to being a neurodegenerative disease, it is also neurodevelopmental, starting in the womb. These recent findings and other research are spurring a radical rethinking of Huntington’s, with implications for the age when any potential cure is likely to be most effective. “It’s not conclusive, but there is suggestive evidence that neurodevelopment is altered in Huntington’s disease,” said neurobiologist Mahmoud Pouladi of the National University of Singapore, who led the organoid work. If so, then if scientists discover a way to repair the mutant gene or remove the aberrant molecules it makes, “the earlier you intervene the better it should be.” In contrast, today’s most-watched clinical trials in Huntington’s include only adults, often middle-aged ones, reflecting the belief that most mutation carriers can reach their 30s or beyond cerebrally unscathed. 
In fact, doctors and advocacy groups strongly discourage genetic testing for Huntington’s in anyone under 18, presuming there’s nothing to be gained. According to the genetic-testing guidelines from the Huntington’s Disease Society of America, “Predictive testing of minors currently has no medical benefits and the possibility for psychosocial harm and lowered self-esteem is high.” © 2019 Scientific American

Keyword: Huntingtons
Link ID: 26889 - Posted: 12.11.2019

New results from the largest long-term study of brain development and children's health raise provocative questions about obesity and brain function. Does excess body weight somehow reduce brain regions that regulate planning and impulse control? Is obesity a result of that brain difference? Or are eating habits, lifestyle, family circumstances and genetics to blame? Previous studies in children and adults have had conflicting results. The new research doesn't settle the matter, and outside experts cautioned that misinterpreting it could unfairly perpetuate weight stigma. But an editorial published with the study Monday in JAMA Pediatrics called it an important addition to mounting evidence of a link between weight, brain structure and mental function. If follow-up research confirms the findings, it could lead to new ways to prevent obesity that target improved brain function. "We don't know which direction these relationships go, nor do they suggest that people with obesity are not as smart as people at a healthy weight," said Dr. Eliana Perrin, a Duke University pediatrics professor who co-wrote the editorial. The federally funded study involved 3,190 U.S. children aged 9 and 10. They had height and weight measurements, MRI brain scans and computer-based tests of mental function including memory, language, reasoning and impulse control. Nearly 1,000 kids — almost 1 in 3 — were overweight or obese, similar to national statistics. ©2019 CBC/Radio-Canada

Keyword: Obesity
Link ID: 26888 - Posted: 12.11.2019

By Eva Frederick Many human grandmothers love to spoil their grandchildren with attention and treats, and for good reason: Studies have shown that having a living grandmother increases a child’s chance of survival. Now, new research shows the same may be true for killer whales. By providing young animals with some freshly caught salmon now and then—or perhaps with knowledge on where to find it—grannies increase their grand-offspring’s chance of survival. The new study is the first direct evidence in nonhuman animals of the “grandmother hypothesis.” The idea posits that females of some species live long after they stop reproducing to provide extra care for their grandchildren. “It’s very cool that these long-lived cetaceans have what looks like a postfertile life stage,” says Kristen Hawkes, an anthropologist at the University of Utah in Salt Lake City who has dedicated much of her career to studying the grandmother effect; she was not involved in the new study. Women usually go through menopause between ages 45 and 55, even though they may live to age 80, 90, or older. Studies in modern-day hunter-gatherer communities as well as in populations in Finland and Canada show that older women can help increase the number of children their daughters have, and boost the survival rates of their grandchildren. Dan Franks, a computer scientist and biologist at the University of York in the United Kingdom, wanted to know whether this grandmother effect occurs in other species as well. © 2019 American Association for the Advancement of Science

Keyword: Sexual Behavior; Evolution
Link ID: 26887 - Posted: 12.10.2019

By Nayef Al-Rodhan Facebook recently announced it had acquired CTRL-Labs, a U.S. start-up working on wearable tech that allows people to control digital devices with their brain. The social media company is only the latest in a long string of firms investing in what has come to be termed “neurotechnology.” Earlier this year Neuralink, a company backed by Elon Musk, announced that it hopes to begin human trials for computerized brain implants. These projects may seem like science fiction, but this drive to get more out of our brains is nothing new—from tea, caffeine and nicotine, to amphetamines and the narcolepsy drug Modafinil, drugs have long been used as rudimentary attempts at cognitive enhancement. And in our tech-driven world, the drive to cognitively enhance is stronger than ever—and is leading us to explore new and untested methods. In today’s hypercompetitive world, everyone is looking for an edge. Improving memory, focus or just the ability to work longer hours are all key to getting ahead, and a drug exists to improve each of them. In 2017, 30 percent of Americans said they had used “smart drug” supplements, known as nootropics, at least once that year, even if studies repeatedly demonstrate that they have a negligible effect on intellect. For some, however, nootropics are not enough, and so they turn to medical-grade stimulants. The most famous of these is Adderall, which boosts focus and productivity far more than commercial nootropics. A well-established black market thrives on university campuses and in financial centers, supplying these drugs to people desperate to gain a competitive edge. © 2019 Scientific American

Keyword: Learning & Memory; Drug Abuse
Link ID: 26886 - Posted: 12.10.2019

By Jonah Engel Bromwich

Pete Frates, a former college baseball player whose participation in the social media phenomenon known as the Ice Bucket Challenge helped raise more than $100 million toward fighting amyotrophic lateral sclerosis, commonly known as A.L.S. or Lou Gehrig’s disease, died on Monday at his home in Beverly, Mass. He was 34.

His death was announced in a statement by Boston College, his alma mater. Quoting his family, it said he died “after a heroic battle with A.L.S.”

Mr. Frates learned he had the disease in 2012. A.L.S. attacks the body’s nerve cells and leads to full paralysis. Patients are typically expected to live for two to five years from the time of diagnosis.

Mr. Frates did not create the Ice Bucket Challenge, in which participants dumped buckets of ice water over their heads while pledging to donate money to fight A.L.S. But a Facebook video in July 2014 showing him doing his version of the challenge — in which he bobbed his head to Vanilla Ice’s song “Ice Ice Baby” — prompted a surge in participation that summer, turning it into a viral sensation. LeBron James, Bill Gates, Oprah Winfrey and other celebrities stepped forward to be drenched, and millions of others followed suit.

Mr. Frates became one of the most visible supporters of the effort, and in August 2014 he completed the challenge again (this time with ice water) at Fenway Park, along with members of the Boston Red Sox organization.

The videos were inescapable for anyone on Facebook, and the A.L.S. Association, a Washington-based nonprofit that works to fight the disease, received more than $115 million. In 2015 the organization released an infographic showing how those funds were being spent. About $77 million, or 67 percent, of the money was used for research that ultimately identified the NEK1 gene, which contributes to the disease. The finding gave scientists guidance in developing treatment drugs.

© 2019 The New York Times Company

Keyword: ALS-Lou Gehrig's Disease
Link ID: 26885 - Posted: 12.10.2019

By Andrea Petersen

Anne Firmender, 74, was working with her psychologist to come up with a list of her positive attributes.

“I cook for others,” said Ms. Firmender.

“It’s giving,” encouraged the psychologist, Dimitris Kiosses.

“Good kids,” continued Ms. Firmender, who has four grown children and four grandchildren.

“And great mother,” added Dr. Kiosses. Ms. Firmender smiled.

Dr. Kiosses typed up the list and handed a printout to Ms. Firmender to take home. “When you’re feeling down and hard on yourself, you can remind yourself of your strengths,” he told her.

Ms. Firmender, who has a history of mental health problems, was in therapy for depression. But she also has mild cognitive impairment and can have trouble remembering what day it is. So Dr. Kiosses was treating her with a novel approach called Problem Adaptation Therapy, or PATH. The therapy, developed at Weill Cornell Medicine in New York City and White Plains, N.Y., focuses on solving tangible problems that fuel feelings of sadness and hopelessness. It incorporates tools, like checklists, calendars, signs and videos, to make it accessible for people with memory issues. A caregiver is often involved.

The approach is one of several new psychotherapies to treat anxiety and depression in people with cognitive impairments, including early to moderate dementia. Another, the Peaceful Mind program, developed by researchers at Baylor College of Medicine and elsewhere for patients with anxiety and dementia, simplifies traditional cognitive behavioral therapy and focuses on scheduling pleasurable activities and skills, like deep breathing. Therapy sessions are short and take place in patients’ homes.

A program designed by researchers at University College London gives cards to patients to take home to remind them of key strategies. One that says “Stop and Think” prompts them to pause when they have panicky and unhelpful thoughts to help keep those thoughts from spiraling and creating more anxiety.

© 2019 The New York Times Company

Keyword: Alzheimers; Depression
Link ID: 26884 - Posted: 12.09.2019

Andrew Anthony

Katrina Karkazis, a senior research fellow at Yale University, is a cultural anthropologist working at the intersection of science, technology, gender studies and bioethics. With Rebecca Jordan-Young, a sociomedical scientist, she has written Testosterone: An Unauthorised Biography. It is a critique of both popular and scientific understandings of the hormone, and how they have been used to explain, or even defend, inequalities of power.

You suggest that testosterone is understood as an exclusively male hormone, even though it’s also found in women. But surely no scientist believes this.

No, what we’re saying is that the hormone has a century-long biography and identity that continues to be that of a male sex hormone. That language is used by authoritative sources in the US like the National Library of Medicine, but also in many media articles. It’s an argument that has to do with how the hormone is understood, which then shapes the kinds of research questions that get asked, what kinds of research get done or not done. There’s actually almost no research on the relationship between testosterone and aggression in women. That is a consequence of the framing of the hormone as having to do with men, masculinity, behaviours understood and framed as masculine. It’s the idea that because men generally have more testosterone, somehow that makes it more relevant in men. But the truth is we know very little about it.

You write that testosterone’s authorised biography is about libido, aggression and masculinity. Does this mean that testosterone is not about these things?

I think that it’s still very widely understood as the driver of all things masculine. When people think about testosterone, aggression is one of the first things that comes to mind. But when you look at the evidence, there’s not good evidence at all. In fact, it’s very weak regarding the relationship between endogenous testosterone [ie testosterone that originates within an organism] and aggression. So it’s an artefact of the ideology of testosterone that we continue to believe that it drives aggression, because aggression has been framed as a masculine behaviour and testosterone has been framed as a masculine hormone.

© 2019 Guardian News & Media Limited

Keyword: Hormones & Behavior; Sexual Behavior
Link ID: 26883 - Posted: 12.09.2019