Most Recent Links




Ashley Yeager
DNA might reveal how dogs became man’s best friend. A new study shows that some of the same genes linked to the behavior of extremely social people can also make dogs friendlier. The result, published July 19 in Science Advances, suggests that dogs’ domestication may be the result of just a few genetic changes rather than hundreds or thousands of them. “It is great to see initial genetic evidence supporting the self-domestication hypothesis or ‘survival of the friendliest,’” says evolutionary anthropologist Brian Hare of Duke University, who studies how dogs think and learn. “This is another piece of the puzzle suggesting that humans did not create dogs intentionally, but instead wolves that were friendliest toward humans were at an evolutionary advantage as our two species began to interact.” Not much is known about the underlying genetics of how dogs became domesticated. In 2010, evolutionary geneticist Bridgett vonHoldt of Princeton University and colleagues published a study comparing dogs’ and wolves’ DNA. The biggest genetic differences gave clues to why dogs and wolves don’t look the same. But major differences were also found in WBSCR17, a gene linked to Williams-Beuren syndrome in humans. Williams-Beuren syndrome leads to delayed development, impaired thinking ability and hypersociability. VonHoldt and colleagues wondered if changes to the same gene in dogs would make the animals more social than wolves, and whether that might have influenced dogs’ domestication. © Society for Science & the Public 2000 - 2017.

Keyword: Aggression; Genes & Behavior
Link ID: 23855 - Posted: 07.20.2017

How well cancer patients fared after chemotherapy was affected by their social interaction with other patients during treatment, according to a new study by researchers at the National Human Genome Research Institute (NHGRI), part of the National Institutes of Health, and the University of Oxford in the United Kingdom. Cancer patients were a little more likely to survive for five years or more after chemotherapy if they interacted during chemotherapy with other patients who also survived for five years or more. Patients were a little more likely to die in less than five years after chemotherapy when they interacted during chemotherapy with those who died in less than five years. The findings were published online July 12, 2017, in the journal Network Science. “People model behavior based on what’s around them,” said Jeff Lienert, lead author in NHGRI’s Social and Behavioral Research Branch and a National Institutes of Health Oxford-Cambridge Scholars Program fellow. “For example, you will often eat more when you’re dining with friends, even if you can’t see what they’re eating. When you’re bicycling, you will often perform better when you’re cycling with others, regardless of their performance.” Lienert set out to see if the impact of social interaction extended to cancer patients undergoing chemotherapy. Joining this research effort were Lienert’s adviser, Felix Reed-Tsochas, Ph.D., at Oxford’s CABDyN Complexity Centre at the Saïd Business School, Laura Koehly, Ph.D., chief of NHGRI’s Social and Behavioral Research Branch, and Christopher Marcum, Ph.D., a staff scientist also in the Social and Behavioral Research Branch at NHGRI.

Keyword: Stress
Link ID: 23854 - Posted: 07.20.2017

By LISA SANDERS, M.D. The 35-year-old man lay on the bed with his eyes closed, motionless except for the regular jerking of his abdomen and chest — what is known medically as a singultus (from the Latin for ‘‘sob’’) but popularly and onomatopoeically as a hiccup. The man was exhausted. He couldn’t eat, could barely drink and hadn’t slept much since the hiccups began, nearly three weeks earlier. At first it was just annoying — these spasms that interrupted his life every 10 to 12 seconds. Friends and family suggested remedies, and he tried them all: holding his breath, drinking cold water, drinking hot water, drinking out of the wrong side of the glass, drinking water while holding his nose. Sometimes they even worked for a while. He would find himself waiting for the next jerk, and when it didn’t come, he’d get this tiny sense of triumph that the ridiculous ordeal was over. But after 15 minutes, maybe 30, they would suddenly return: hiccup, hiccup, hiccup. His neck, stomach and chest muscles ached from the constant regular contractions. On this evening, the man had one of the all too rare breaks from the spasms and fell asleep. When his wife heard the regular sound start up again, she came into their bedroom to check on him. He looked awful — thin, tired and uncomfortable. And suddenly she was scared. They needed to go to the hospital, she told him. He was too weak, he told her, ‘‘and so very tired.’’ He would go, but first he’d rest. They had been to the emergency room several times already. During their first visit — nearly two weeks earlier — the doctors at the local hospital in their Queens neighborhood gave him a medication, chlorpromazine, an antipsychotic that has been shown to stop hiccups, though it’s not clear why. It was like a miracle; the rhythmic spasms stopped. But a few hours later, when the drug wore off, the hiccups returned. The couple went back a few days later because he started throwing up while hiccupping. Those doctors offered an acid reducer for his stomach and more chlorpromazine. They encouraged the man to have patience. Sometimes these things can last, they said. © 2017 The New York Times Company

Keyword: Miscellaneous
Link ID: 23853 - Posted: 07.20.2017

Xiaomeng (Mona) Xu, assistant professor of experimental psychology, and Ariana Tart-Zelvin
If you have experienced the evolution from having a crush to falling in love, it may seem like the transition happens naturally. But have you ever wondered how we make such a huge emotional leap? In other words, what changes take place in our brains that allow us to fall deeply in love? Stephanie Cacioppo, a psychologist at the University of Chicago who has studied the neuroscience of romantic love for the past decade, explains that the process involves several complex changes, particularly in the brain’s reward system. More specifically, in a 2012 review of the love research literature, Lisa Diamond and Janna Dickenson, psychologists at the University of Utah, found romantic love is most consistently associated with activity in two brain regions—the ventral tegmental area (VTA) and the caudate nucleus. These areas play an essential role in our reward pathway and regulate the “feel good” neurotransmitter dopamine. In other words, during the early stages of love you crave the person because he or she makes you feel so good. And over time these feelings persist. Our neuroimaging research and that of others suggests that once you are in love—as long as the relationship remains satisfying—simply thinking about your partner not only makes you feel good but can also buffer against pain, stress and other negative feelings. © 2017 Scientific American,

Keyword: Emotions; Sexual Behavior
Link ID: 23852 - Posted: 07.20.2017

By Diana Kwon Using CRISPR, researchers have successfully treated congenital muscular dystrophy type 1A (MDC1A), a rare disease that can lead to severe muscle wasting and paralysis, in mice. The team was able to restore muscle function by correcting a splicing site mutation that causes the disorder, according to a study published today (July 17) in Nature Medicine. “Instead of inserting the corrected piece of information, we used CRISPR to cut DNA in two strategic places,” study coauthor Dwi Kemaladewi, a research fellow at the Hospital for Sick Children (Sick Kids) in Toronto, explains in a statement. “This tricked the two ends of the gene to come back together and create a normal splice site.” By targeting both the skeletal muscles and peripheral nerves, the team was able to improve the animals’ motor function and mobility. “This is important because the development of therapeutic strategies for muscular dystrophies have largely focused on improving the muscle conditions,” Kemaladewi says in the release. “Experts know the peripheral nerves are important, but the skeletal muscles have been perceived as the main culprit in MDC1A and have traditionally been the focus of treatment options.” “The robustness of the correction we see in animal models to me is very encouraging,” Amy Wagers, a biologist at Harvard University who was not involved in this study, tells the Toronto Star. © 1986-2017 The Scientist

Keyword: Muscles
Link ID: 23851 - Posted: 07.19.2017

By Jack Turban Fourteen-year-old Nicole, whose name I changed for her privacy, told her mother every day for years that she wanted to end her own life. Between suicide attempts were more psychiatric hospital visits than she or her mother could count. She refused to get out of bed, shower, or go to school, missing sixty school days in a single year. In one visit with her therapist, she admitted to praying every night that she would not wake up the next morning. After countless psychiatrists and psychotherapists were unable to improve her depression, her mother converted a bathroom cabinet into a locked safe, containing all of the sharp objects and pills in the house. Her parents were certain it was only a matter of time until Nicole killed herself. Today, a now seventeen-year-old Nicole greets me with a big smile. Her blonde hair is pulled back into a ponytail to reveal her bright blue eyes. She tells me she hasn’t missed a day of school and is preparing for college. Blushing, she lets me know that her first date is coming up, a prom date to be precise. For the first time in years, she is happy and wants to live. What happened to cause this dramatic change? In December, Nicole started infusions of a psychedelic drug called ketamine. Though she had failed to respond to endless medication trials for her depression (selective serotonin reuptake inhibitors, mirtazapine, topiramate, antipsychotics, and lithium to name just a few), ketamine cleared her depression within hours. The effect lasts about two weeks before she needs a new infusion. © 2017 Scientific American

Keyword: Depression; Development of the Brain
Link ID: 23850 - Posted: 07.19.2017

Laurel Hamers A common blood sugar medication or an extra dose of a thyroid hormone can reverse signs of cognitive damage in rats exposed in utero to alcohol. Both affect an enzyme that controls memory-related genes in the hippocampus, researchers report July 18 in Molecular Psychiatry. That insight might someday help scientists find an effective human treatment for fetal alcohol spectrum disorders, which can cause lifelong problems with concentration, learning and memory. “At this moment, there’s really no pharmaceutical therapy,” says R. Thomas Zoeller, a neurobiologist at the University of Massachusetts Amherst. Fetal alcohol spectrum disorders may affect up to 5 percent of U.S. kids, according to estimates from the U.S. Centers for Disease Control and Prevention. Scientists don’t know exactly why alcohol has such a strong effect on developing brains. But the lower thyroid hormone levels commonly induced by alcohol exposure might be one explanation, suggests study coauthor Eva Redei, a psychiatrist at Northwestern University Feinberg School of Medicine in Chicago. “The mother has to supply the thyroid hormones for brain development,” says Redei. So, pregnant women who drink might not be providing their fetuses with enough hormones for normal brain development. That could disrupt the developing hippocampus, a brain region involved in learning and memory. To counter alcohol’s effects, Redei and her colleagues gave doses of thyroxine, a thyroid hormone, to newborn rats that had been exposed to alcohol before birth. (That timing coincides developmentally with the third trimester of pregnancy in humans.) The amount of alcohol fed to the rat moms corresponded roughly to a woman drinking a glass or two of wine a day. © Society for Science & the Public 2000 - 2017

Keyword: Drug Abuse; Development of the Brain
Link ID: 23849 - Posted: 07.19.2017

By Tara Bahrampour A significant portion of people with mild cognitive impairment or dementia who are taking medication for Alzheimer’s may not actually have the disease, according to interim results of a major study currently underway to see how PET scans could change the nature of Alzheimer’s diagnosis and treatment. The findings, presented Wednesday at the Alzheimer’s Association International Conference in London, come from a four-year study launched in 2016 that is testing over 18,000 Medicare beneficiaries with MCI or dementia to see if their brains contain the amyloid plaques that are one of the two hallmarks of the disease. So far, the results have been dramatic. Among 4,000 people tested so far in the Imaging Dementia-Evidence for Amyloid Scanning (IDEAS) study, researchers from the Memory and Aging Center at the University of California, San Francisco, found that just 54.3 percent of MCI patients and 70.5 percent of dementia patients had the plaques. A positive test for amyloid does not mean someone has Alzheimer’s, though its presence precedes the disease and increases the risk of progression. But a negative test definitively means a person does not have it. The findings could change the way doctors treat people in these hard-to-diagnose groups and save money currently being spent on inappropriate medication. “If someone had a putative diagnosis of Alzheimer’s Disease, they might be on an Alzheimer’s drug like Aricept or Namenda,” said James Hendrix, the Alzheimer’s Association’s director of global science initiatives, who co-presented the findings. “What if they had a PET scan and it showed that they didn’t have amyloid in their brain? Their physician would take them off that drug and look for something else.” © 1996-2017 The Washington Post

Keyword: Alzheimers; Brain imaging
Link ID: 23848 - Posted: 07.19.2017

Shirley S. Wang In nursing homes and residential facilities around the world, health care workers are increasingly asking dementia patients questions: What are your interests? How do you want to address us? What should we do to celebrate the life of a friend who has passed away? The questions are part of an approach to care aimed at giving people with memory loss and other cognitive problems a greater sense of control and independence. At its core is the idea that an individual with dementia should be treated as a whole person and not "just" a patient. Scientists sometimes call this approach an ecopsychosocial intervention. The goal is to create environments that better meet patients' psychological and emotional needs through strategies other than medication. At the Alzheimer's Association International Conference this week in London, researchers from the U.S., the U.K. and Israel presented data from four trials demonstrating that such interventions significantly improve residents' mood and quality of life. The interventions can also reduce their use of antipsychotic drugs and improve their ability to care for themselves. Taken together, these studies and others suggest that relatively simple and potentially cost-effective interventions can yield significant benefits for people with dementia, even those in residential facilities in the later stages of disease. As the population continues to age, and the number of people with dementia continues to rise, these interventions are likely to increase in importance as well. © 2017 npr

Keyword: Alzheimers; Attention
Link ID: 23847 - Posted: 07.19.2017

By LISA FELDMAN BARRETT Imagine that a bully threatens to punch you in the face. A week later, he walks up to you and breaks your nose with his fist. Which is more harmful: the punch or the threat? The answer might seem obvious: Physical violence is physically damaging; verbal statements aren’t. “Sticks and stones can break my bones, but words will never hurt me.” But scientifically speaking, it’s not that simple. Words can have a powerful effect on your nervous system. Certain types of adversity, even those involving no physical contact, can make you sick, alter your brain — even kill neurons — and shorten your life. Your body’s immune system includes little proteins called proinflammatory cytokines that cause inflammation when you’re physically injured. Under certain conditions, however, these cytokines themselves can cause physical illness. What are those conditions? One of them is chronic stress. Your body also contains little packets of genetic material that sit on the ends of your chromosomes. They’re called telomeres. Each time your cells divide, their telomeres get a little shorter, and when they become too short, you die. This is normal aging. But guess what else shrinks your telomeres? Chronic stress. If words can cause stress, and if prolonged stress can cause physical harm, then it seems that speech — at least certain types of speech — can be a form of violence. But which types? This question has taken on some urgency in the past few years, as professed defenders of social justice have clashed with professed defenders of free speech on college campuses. Student advocates have protested vigorously, even violently, against invited speakers whose views they consider not just offensive but harmful — hence the desire to silence, not debate, the speaker. “Trigger warnings” are based on a similar principle: that discussions of certain topics will trigger, or reproduce, past trauma — as opposed to merely challenging or discomfiting the student. The same goes for “microaggressions.” © 2017 The New York Times Company

Keyword: Aggression; Brain imaging
Link ID: 23846 - Posted: 07.18.2017

Katherine Hobson The theory behind artificial sweeteners is simple: If you use them instead of sugar, you get the joy of sweet-tasting beverages and foods without the downer of extra calories, potential weight gain and related health issues. In practice, it's not so simple, as a review of the scientific evidence on non-nutritive sweeteners published Monday shows. After looking at two types of scientific research, the authors conclude that there is no solid evidence that sweeteners like aspartame and sucralose help people manage their weight. And observational data suggest that the people who regularly consume these sweeteners are also more likely to develop future health problems, though those studies can't say those problems are caused by the sweeteners. The health effects of artificial sweeteners are important to study, because so many people use them. Another study published earlier this year found that a quarter of U.S. children and 41 percent of adults reported consuming them, most of them once per day. Even more people may be consuming them unwittingly in products such as granola bars or yogurt. "We were really interested in the everyday person who is consuming these products not to lose weight, but because they think it's the healthier choice, for many years on end," says Meghan Azad, lead author of the review and a research scientist at the University of Manitoba. While more research needs to be done, from what we know now, "there is no clear benefit for weight loss, and there's a potential association with increased weight gain, diabetes and other negative cardiovascular outcomes," says Azad. © 2017 npr

Keyword: Obesity
Link ID: 23845 - Posted: 07.18.2017

By PERRI KLASS, M.D. We want to believe we’re raising our kids to think for themselves, and not to do dumb or unhealthy things just because the cool kids are doing them. But research shows that when it comes to smoking, children are heavily influenced by some of the folks they consider the coolest of the cool: actors in movies. “There’s a dose-response relationship: The more smoking kids see onscreen, the more likely they are to smoke,” said Dr. Stanton Glantz, a professor and director of the University of California, San Francisco, Center for Tobacco Control Research and Education. He is one of the authors of a new study that found that popular movies are showing more tobacco use onscreen. “The evidence shows it’s the largest single stimulus,” for smoking, he said; “it overpowers good parental role modeling, it’s more powerful than peer influence or even cigarette advertising.” He said that epidemiological studies have shown that if you control for all the other risk factors of smoking (whether parents smoke, attitudes toward risk taking, socioeconomic status, and so on), younger adolescents who are more heavily exposed to smoking on film are two to three times as likely to start smoking, compared with the kids who are more lightly exposed. Those whose parents smoke are more likely to smoke, he said, but exposure to smoking in movies can overcome the benefit of having nonsmoking parents. In one study, the children of nonsmoking parents with heavy exposure to movie smoking were as likely to smoke as the children of smoking parents with heavy movie exposure. To Dr. Glantz, and the other people who study this topic, that makes smoking in movies an “environmental toxin,” a factor endangering children. © 2017 The New York Times Company

Keyword: Drug Abuse; Attention
Link ID: 23844 - Posted: 07.18.2017

By TALYA MINSBERG When Marti Noxon set out to make “To the Bone,” a film about a 20-year-old battling an eating disorder, she initially faced the question: Was the topic too niche? The answer came in the form of a rousing premiere in January at the Sundance Film Festival, Netflix’s reported $8 million purchase of the film, a trailer that went viral with 54 million views in the first week and arguments over whether it glamorized excessive thinness. The film debuted on Netflix on Friday. The film is loosely based on Ms. Noxon’s experience with an eating disorder. She and its star, Lily Collins, are among the 30 million Americans — a third of them men — who have struggled with one. Ms. Collins plays Ellen, an anorexia patient who enters her fifth eating disorder treatment center, an unorthodox group home run by a doctor played by Keanu Reeves. Many of those reacting to the film’s trailer worried that watching it could trigger unhealthy thoughts in viewers who may be prone to eating disorders or already struggling with them. Indeed, some experts said that people who have had eating disorders should consider the state of their health before watching the film. “If you don’t feel solid in your recovery, don’t watch it right now. It could be triggering at any part of your life if you aren’t feeling strong and solid in your recovery,” Dr. Dena Cabera, executive clinical director at Rosewood Centers for Eating Disorders, said. “It will always be there; you can look it up later.” Others say the film may help spur action. Eating disorders have the highest mortality rate of any psychiatric disorder, and can affect individuals across every demographic. “If the film helps raise awareness and more people seek treatment, that would be a success that we can be pleased with,” Dr. S. Bryn Austin, a professor at Boston Children’s Hospital and Harvard T.H. Chan School of Public Health, said. “Eating disorders can be successfully treated, they just need to take the first step in reaching out for care.” © 2017 The New York Times Company

Keyword: Anorexia & Bulimia
Link ID: 23843 - Posted: 07.17.2017

Tim Adams Henry Marsh made the decision to become a neurosurgeon after he had witnessed his three-month-old son survive the complex removal of a brain tumour. For two decades he was the senior consultant in the Atkinson Morley wing at St George’s hospital in London, one of the country’s largest specialist brain surgery units. He pioneered techniques in operating on the brain under local anaesthetic and was the subject of the BBC documentary Your Life in Their Hands. His first book, Do No Harm: Stories of Life, Death, and Brain Surgery, was published in 2014 to great acclaim, and became a bestseller across the world. Marsh retired from full-time work at St George’s in 2015, though he continues with long-standing surgical roles at hospitals in the Ukraine and Nepal. He is also an avid carpenter. Earlier this year he published a second volume of memoir, Admissions: a Life in Brain Surgery, in which he looks back on his career as he takes up a “retirement project” of renovating a decrepit lock-keeper’s cottage near where he grew up in Oxfordshire. He lives with his second wife, the social anthropologist and author Kate Fox. They have homes in Oxford, and in south London, which is where the following conversation took place. Have you officially retired now? Well, I still do one day a week for the NHS, though apparently they want a “business case” for it, so I’m not getting paid at present. Yes, well, people talk about the mind-matter problem – it’s not a problem for me: mind is matter. That’s not being reductionist. It is actually elevating matter. We don’t even begin to understand how electrochemistry and nerve cells generate thought and feeling. We have not the first idea. The relation of neurosurgery to neuroscience is a bit like the relationship between plumbing and quantum mechanics.

Keyword: Consciousness
Link ID: 23842 - Posted: 07.17.2017

Jon Hamilton Harsh life experiences appear to leave African-Americans vulnerable to Alzheimer's and other forms of dementia, researchers reported Sunday at the Alzheimer's Association International Conference in London. Several teams presented evidence that poverty, disadvantage and stressful life events are strongly associated with cognitive problems in middle age and dementia later in life among African-Americans. The findings could help explain why African-Americans are twice as likely as white Americans to develop dementia. And the research suggests genetic factors are not a major contributor. "The increased risk seems to be a matter of experience rather than ancestry," says Megan Zuelsdorff, a postdoctoral fellow in the Health Disparities Research Scholars Program at the University of Wisconsin-Madison. Scientists have struggled to understand why African-Americans are so likely to develop dementia. They are more likely to have conditions like high blood pressure and diabetes, which can affect the brain. And previous research has found some evidence that African-Americans are more likely to carry genes that raise the risk. But more recent studies suggest those explanations are incomplete, says Rachel Whitmer, an epidemiologist with Kaiser Permanente's Division of Research in Northern California. Whitmer has been involved in several studies that accounted for genetic and disease risks when comparing dementia in white and black Americans. "And we still saw these [racial] differences," she says. "So there is still something there that we are trying to get at." © 2017 npr

Keyword: Alzheimers; Stress
Link ID: 23841 - Posted: 07.17.2017

Nicola Davis People who experience hearing loss could be at greater risk of memory and thinking problems later in life than those without auditory issues, research suggests. The study focused on people who were at risk of Alzheimer’s disease, revealing that those who were diagnosed with hearing loss had a higher risk of “mild cognitive impairment” four years later. “It’s really not mild,” said Clive Ballard, professor of age-related disease at the University of Exeter. “They are in the lowest 5% of cognitive performance and about 50% of those individuals will go on to develop dementia.” Presented at the Alzheimer’s Association International Conference in London, researchers from the US looked at the memory and thinking skills of 783 cognitively healthy participants in late middle age, more than two-thirds of whom had at least one parent who had been diagnosed with Alzheimer’s disease. The team carried out a range of cognitive tests on the participants over a four-year period, aimed at probing memory and mental processing, revealing that those who had hearing loss at the start of the study were more than twice as likely to be found to have mild cognitive impairment four years later than those with no auditory problems, once a variety of other risk factors were taken into account. Taylor Fields, a PhD student at the University of Wisconsin who led the research, said that the findings suggest hearing loss could be an early warning sign that an individual might be at greater risk of future cognitive impairment - but added more research was necessary to unpick the link. “There is something here and it should be looked into,” she said. © 2017 Guardian News and Media Limited

Keyword: Hearing; Alzheimers
Link ID: 23840 - Posted: 07.17.2017

Pagan Kennedy In 2011, Ben Trumble emerged from the Bolivian jungle with a backpack containing hundreds of vials of saliva. He had spent six weeks following indigenous men as they tramped through the wilderness, shooting arrows at wild pigs. The men belonged to the Tsimane people, who live as our ancestors did thousands of years ago — hunting, foraging and farming small plots of land. Dr. Trumble had asked the men to spit into vials a few times a day so that he could map their testosterone levels. In return, he carried their kills and helped them field-dress their meat — a sort of roadie to the hunters. Dr. Trumble wanted to find out whether the hunters who successfully shot an animal would be rewarded with a spike in testosterone. (They were.) As a researcher with the Tsimane Health and Life History Project, he had joined a long-running investigation into human well-being and aging in the absence of industrialization. That day when he left the jungle, he stumbled across a new and more urgent question about human health. He dropped his backpack, called his mom and heard some terrible news: His 64-year-old uncle had learned he had dementia, probably Alzheimer’s. In just a few short years, his uncle, a vibrant former lawyer, would stop speaking, stop eating and die. “I couldn’t help my uncle,” Dr. Trumble said, but he was driven to understand the disease that killed him. He wondered: Do the Tsimane suffer from Alzheimer’s disease like we do? And if not, what can we learn from them about treating or preventing dementia? “There is really no cure yet for Alzheimer’s,” Dr. Trumble told me. “We have nothing that can undo the damage already done.” Why, he wondered, had billions of dollars and decades of research yielded so little? Perhaps major clues were being missed. Dr. Trumble was trained as an anthropologist, and his field — evolutionary medicine — taught him to see our surroundings as a blip in the timeline of human history. He thinks it’s a problem that medical research focuses almost exclusively on “people who live in cities like New York or L.A.” Scientists often refer to these places as “Weird” — Western, educated, industrialized, rich and democratic — and point out that our bodies are still designed for the not-Weird environment in which our species evolved. Yet we know almost nothing about how dementia affected humans during the 50,000 years before developments like antibiotics and mechanized farming. Studying the Tsimane, Dr. Trumble believes, could shed light on this modern plague. © 2017 The New York Times Company

Keyword: Alzheimers
Link ID: 23839 - Posted: 07.14.2017

Brandie Jefferson It wasn't long ago that there were no treatments for multiple sclerosis. In the 1970s, some doctors used chemotherapy to treat the degenerative neurological disease. Since then, more than a dozen drugs have been developed or approved, including infusions, oral medications and self-administered shots. None of these are a magic bullet for a disease that can be disabling and deadly. But now there is a new drug, Ocrevus, that looks like a game-changer. It uses a novel approach to blocking the inflammation that drives the disease and looks as if it's spectacularly effective. It also costs $65,000 a year. I have MS. Should I take Ocrevus? That, I discovered, is not a simple question to answer. But because I'm an MS patient and a science journalist, I was determined to try to figure it out. In March, the FDA approved Ocrevus (ocrelizumab) for the treatment of relapsing-remitting multiple sclerosis, the most common form of the disease. People with RRMS tend to have flare-ups when their symptoms worsen, followed by periods of remission and, in some cases, a full or partial recovery. In two clinical trials sponsored by the drug's eventual manufacturer, F. Hoffmann-La Roche, RRMS patients who were given ocrelizumab had about 50 percent fewer relapses and up to 95 percent fewer new lesions on the brain and spinal cord than those who were given Rebif, a common therapy. MS is an autoimmune disease, meaning the body attacks itself. The body's nerve endings and the fatty tissue that coats them, called myelin, bear the brunt of the immune system's attacks. As a result, the central nervous system has difficulty communicating with the nerves, leading to a disease that manifests itself in different ways, such as pain, fatigue, disability and slurred speech. © 2017 npr

Keyword: Multiple Sclerosis
Link ID: 23838 - Posted: 07.14.2017

By Aaron Reuben, Jonathan Schaefer Most of us know at least one person who has struggled with a bout of debilitating mental illness. Despite their familiarity, however, these kinds of episodes are typically considered unusual, and even shameful. New research, from our lab and from others around the world, however, suggests mental illnesses are so common that almost everyone will develop at least one diagnosable mental disorder at some point in their lives. Most of these people will never receive treatment, and their relationships, job performance and life satisfaction will likely suffer. Meanwhile the few individuals who never seem to develop a disorder may offer psychology a new avenue of study, allowing researchers to ask what it takes to be abnormally, enduringly, mentally well. Epidemiologists have long known that, at any given point in time, roughly 20 to 25 percent of the population suffers from a mental illness, which means they experience psychological distress severe enough to impair functioning at work, school or in their relationships. Extensive national surveys, conducted from the mid-1990s through the early 2000s, suggested that a much higher percentage, close to half the population, would experience a mental illness at some point in their lives. These surveys were large, involving thousands of participants representative of the U.S. in age, sex, social class and ethnicity. They were also, however, retrospective, which means they relied on survey respondents’ accurate recollection of feelings and behaviors months, years and even decades in the past. Human memory is fallible, and modern science has demonstrated that people are notoriously inconsistent reporters about their own mental health history, leaving the final accuracy of these studies up for debate. Of further concern, up to a third of the people contacted by the national surveys failed to enroll in the studies. Follow-up tests suggested that these “nonresponders” tended to have worse mental health. © 2017 Scientific American

Keyword: Schizophrenia; Depression
Link ID: 23837 - Posted: 07.14.2017

Hannah Devlin, science correspondent
Brash, brawny and keen to impose their will on anyone who enters their sphere of existence: the alpha male in action is unmistakable. Now scientists claim to have pinpointed the biological root of domineering behaviour. New research has located a brain circuit that, when activated in mice, transformed timid individuals into bold alpha mice that almost always prevailed in aggressive social encounters. In some cases, the social ranking of the subordinate mice soared after the scientists’ intervention, hinting that it might be possible to acquire “alphaness” simply by adopting the appropriate mental attitude. Or as Donald Trump might put it: “My whole life is about winning. I almost never lose.” Prof Hailan Hu, a neuroscientist at Zhejiang University in Hangzhou, China, who led the work, said: “We stimulate this brain region and we can make lower ranked mice move up the social ladder.” The brain region, called the dorsal medial prefrontal cortex (dmPFC), was already known to light up during social interactions involving decisions about whether to be assertive or submissive with others. But brain imaging alone could not determine whether the circuit was ultimately controlling how people behave. The latest findings answer the question, showing that when the circuit was artificially switched on, low-ranking mice were immediately emboldened. “It’s not aggressiveness per se,” Hu said. “It increases their perseverance, motivational drive, grit.” © 2017 Guardian News and Media Limited

Keyword: Aggression; Sexual Behavior
Link ID: 23836 - Posted: 07.14.2017