Most Recent Links
By GARY TAUBES The first time the sugar industry felt compelled to “knock down reports that sugar is fattening,” as this newspaper put it, it was 1956. Papers had run a photograph of President Dwight D. Eisenhower sweetening his coffee with saccharin, with the news that his doctor had advised him to avoid sugar if he wanted to remain thin. The industry responded with a national advertising campaign based on what it believed to be solid science. The ads explained that there was no such thing as a “fattening food”: “All foods supply calories and there is no difference between the calories that come from sugar or steak or grapefruit or ice cream.” More than 60 years later, the sugar industry is still making the same argument, or at least paying researchers to do it for them. The stakes have changed, however, with a near tripling of the prevalence of obesity in the intervening decades and what the Centers for Disease Control and Prevention figures reveal to be an almost unimaginable 655 percent increase in the percentage of Americans with diabetes diagnoses. When it comes to weight gain, the sugar industry and purveyors of sugary beverages still insist, a calorie is a calorie, regardless of its source, so guidelines that single out sugar as a dietary evil are not evidence-based. Surprisingly, the scientific consensus is technically in agreement. It holds that obesity is caused “by a lack of energy balance,” as the National Institutes of Health website explains — in other words, by our taking in more calories than we expend. Hence, the primary, if not the only, way that foods can influence our body weight is through their caloric content. Another way to say this is that what we eat doesn’t matter; it’s only how much — just as the sugar industry would have us believe. 
A 2014 article in an American Diabetes Association journal phrased the situation this way: “There is no clear or convincing evidence that any dietary or added sugar has a unique or detrimental impact relative to any other source of calories on the development of obesity or diabetes.” © 2017 The New York Times Company
Link ID: 23105 - Posted: 01.14.2017
Charles Q. Choi Prions, the infectious agents best known for causing degenerative brain disorders such as ‘mad cow’ disease, may have been spotted in bacteria. A section of a protein in Clostridium botulinum, the microbe that causes botulism, can behave like a prion when it is inserted into yeast and Escherichia coli bacteria, researchers report in the 13 January issue of Science. Prions are formed by proteins that can fold in a number of structurally distinct ways. A prion version of a protein can perpetuate itself in an infectious manner by converting normal forms of that protein into the prion version. Scientists first discovered prions in the 1980s as the agents behind fatal brain disorders known as transmissible spongiform encephalopathies. Since then, researchers have found the misfolded proteins in mammals, insects, worms, plants and fungi, and learned that not all prions harm their hosts. But until now, prions were only seen in the cells of eukaryotic organisms, a group that includes animals, plants and fungi. In the latest study, researchers analysed roughly 60,000 bacterial genomes using software trained to recognize prion-forming proteins in yeast. They focused on a section of the bacterial protein Rho. In many bacteria, such as C. botulinum and E. coli, Rho is a global regulator of gene expression, meaning that it can control the activity of many genes. © 2017 Macmillan Publishers Limited
Link ID: 23104 - Posted: 01.14.2017
Mi Zhang, David Mohr, Jingbo Meng Depression is the leading mental health issue on college campuses in the U.S. In 2015, a survey of more than 90,000 students at 108 American colleges and universities found that during the previous year, more than one-third of them had felt so depressed at some point that it was difficult to function. More than two-thirds had felt hopeless in the preceding academic year. Today’s college students are dealing with depression at an alarmingly high rate, and are increasingly seeking help from on-campus mental health services. Depression is also an underlying cause of other common problems on college campuses, including alcohol and substance abuse, eating disorders, self-injury, suicide and dropping out of school. But university counseling centers, the primary sources for students to get mental health care, are struggling to meet this rising demand. First, it can take a long time for clinicians to gain a full picture of what students are experiencing: Depressed students’ accounts of their symptoms are often inaccurate and incomplete. In addition, budget constraints and limited office hours mean the number of clinicians on campus has not grown, and in some cases has shrunk, despite increasing demand. There simply are not enough university clinicians available to serve every student – and few, if any, at critical times like nights and weekends. The number of students on counseling waiting lists doubled from 2010 to 2012. This can leave students waiting long periods without help. In the worst cases, this can have lifelong – or life-ending – consequences. Using mobile technology for mental illness diagnosis and treatment has become an active research area because of the pervasiveness of mobile devices and their behavior-tracking capabilities. Building on others’ work, we have found a way to enhance counseling services with mobile technology and big data analytics.
It can help students and clinicians alike, by offering a new tool for assessing depression that may shed increased light on a condition that is challenging to study. © 2010–2017, The Conversation US, Inc.
Link ID: 23103 - Posted: 01.14.2017
Jonathan Sadowsky Carrie Fisher’s ashes are in an urn designed to look like a Prozac pill. It’s fitting that in death she continues to be both brash and wryly funny about a treatment for depression. The public grief over Carrie Fisher’s death was not only for an actress who played one of the most iconic roles in film history. It was also for one who spoke with wit and courage about her struggle with mental illness. In a way, the fearless General Leia Organa on screen was not much of an act. Fisher’s bravery, though, was not just in fighting the stigma of her illness, but also in declaring in her memoir “Shockaholic” her voluntary use of a stigmatized treatment: electroconvulsive therapy (ECT), often known as shock treatment. Many critics have portrayed ECT as a form of medical abuse, and depictions in film and television are usually scary. Yet many psychiatrists, and more importantly, patients, consider it to be a safe and effective treatment for severe depression and bipolar disorder. Few medical treatments have such disparate images. I am a historian of psychiatry, and I have published a book on the history of ECT. I had, like many people, been exposed only to the frightening images of ECT, and I grew interested in the history of the treatment after learning how many clinicians and patients consider it a valuable treatment. My book asks the question: Why has this treatment been so controversial? © 2010–2017, The Conversation US, Inc.
Link ID: 23102 - Posted: 01.14.2017
By LISA SANDERS, M.D. “You don’t look well,” the man at the gas station told the older woman in the car. He’d known her for years, always thinking of her as a lively, robust woman. But that day she looked pale and tired. Her sharp blue eyes seemed dim. She gave a feeble smile. “I don’t feel well at all,” she told him. There’s an urgent-care clinic just up the street, he said. Could she make it there? She was nearly 45 minutes away from her home in Halifax, Nova Scotia. Stopping just up the street seemed a much better option. At the clinic, the doctor took one look at her, put a blood pressure cuff around her arm and told her assistant to call an ambulance. The rest of the day was a blur. The woman remembers being bundled onto a stretcher and one of the E.M.T.s saying her blood pressure was very low. It was an odd thing to hear, because her blood pressure was usually high enough to require three medications. She was taken to the emergency room at the Queen Elizabeth II Health Sciences Center in Halifax. She remembers being fussed over — having blood drawn, receiving intravenous fluids, feeling sticky snaps being placed on her chest that connected her to a continuous heart monitor. She had been a nurse for many years when she was younger, yet seeing herself at the center of these familiar activities was strange. A blood test indicated that there may have been damage to her heart. The doctor told her she was having a heart attack, she recalls. You’ve got the wrong patient, she thought to herself. Sure, she had a little high blood pressure, a little asthma, a little back pain. But problems with her heart? Never. The patient used a cane, but she had no difficulty getting up on the exam table — an important test of mobility. © 2017 The New York Times Company
Keyword: Hormones & Behavior
Link ID: 23101 - Posted: 01.14.2017
Susan Milius NEW ORLEANS — The self-cleaning marvel known as earwax may turn the dust particles it traps into agents of their own disposal. Earwax, secreted in the ear canal, protects ears from building up dunes of debris from particles wafting through the air. The wax creates a sticky particle-trapper inside the canal, explained Zac Zachow January 6 at the annual meeting of the Society for Integrative and Comparative Biology. The goo coats hairs and haphazardly pastes them into a loose net. Then, by a process not yet fully understood, bits of particle-dirtied wax leave the ear, taking their burden of debris with them. Earwax may accomplish such a feat because trapping more and more dust turns it from gooey to crumbly, Zachow said. Working with Alexis Noel in David Hu’s lab at Georgia Tech in Atlanta, he filmed a rough demonstration of this idea: Mixing flour into a gob of pig’s earwax eventually turned the lump from stickier to drier, with crumbs fraying away at the edges. Jaw motions might help shake loose these crumbs, Zachow said. A video inside the ear of someone eating a doughnut showed earwax bucking and shifting. This dust-to-crumb scenario needs more testing, but Noel points out that earwax might someday inspire new ways of reducing dust buildup in machinery such as home air-filtration systems. Z. Zachow, A. Noel and D.L. Hu. Earwax has properties like paint, enabling self-cleaning. Annual meeting of the Society for Integrative and Comparative Biology, New Orleans, January 6, 2017. © Society for Science & the Public 2000 - 2017
Link ID: 23100 - Posted: 01.14.2017
Jon Hamilton Mice that kill at the flip of a switch may reveal how hunting behavior evolved hundreds of millions of years ago. The mice became aggressive predators when two sets of neurons in the amygdala were activated with laser light, a team reported Thursday in the journal Cell. "The animals become very efficient in hunting," says Ivan de Araujo, an associate professor of psychiatry at Yale University and an associate fellow at The John B. Pierce Laboratory in New Haven. "They pursue the prey [a live cricket] faster and they are more capable of capturing and killing it." Activating the neurons even caused the mice to attack inanimate objects, including sticks, bottle caps and an insectlike toy. "The animals intensively bite the toy and use their forepaws in an attempt to kill it," De Araujo says. But the aggressive behavior is reserved for prey. Mice didn't attack each other, even when both sets of neurons were activated. The results hint at how the brain changed hundreds of millions of years ago when the first animals with jaws began to appear. This new ability to pursue and kill prey "must have influenced the way the brain is wired up in a major way," De Araujo says. Specifically, the brain needed to develop hunting circuits that would precisely coordinate the movements of a predator's jaw and neck. "This is a very complex and demanding task," De Araujo says. © 2017 npr
Link ID: 23099 - Posted: 01.13.2017
Bruce Bower Marijuana’s medical promise deserves closer, better-funded scientific scrutiny, a new state-of-the-science report concludes. The report, released January 12 by the National Academies of Sciences, Engineering and Medicine in Washington, D.C., calls for expanding research on potential medical applications of cannabis and its products, including marijuana and chemical components called cannabinoids. Big gaps in knowledge remain about health effects of cannabis use, for good or ill. Efforts to study these effects are hampered by federal classification of cannabis as a Schedule 1 drug, meaning it has no accepted medical use and a high potential for abuse. Schedule 1 status makes it difficult for researchers to access cannabis. The new report recommends reclassifying the substance to make it easier to study. Recommendations from the 16-member committee that authored the report come at a time of heightened acceptance of marijuana and related substances. Cannabis is a legal medical treatment in 28 states and the District of Columbia. Recreational pot use is legal in eight of those states and the District. “The legalization and commercialization of cannabis has allowed marketing to get ahead of science,” says Raul Gonzalez, a psychologist at Florida International University in Miami who reviewed the report before publication. While the report highlights possible medical benefits, Gonzalez notes that it also underscores negative consequences of regular cannabis use. These include certain respiratory and psychological problems. |© Society for Science & the Public 2000 - 2017.
Alison Abbott Bats have brain cells that keep track of their angle and distance to a target, researchers have discovered. The neurons, called ‘vector cells’, are a key piece of the mammalian brain’s complex navigation system — and something that neuroscientists have been seeking for years. Our brain’s navigation system has many types of cells, but a lot of them seem designed to keep track of where we are. Researchers know of ‘place’ cells, for example, which fire when animals are in a particular location, and ‘head direction’ cells that fire in response to changes in the direction the head is facing. Bats also have a kind of neuronal compass that enables them to orient themselves as they fly. The vector cells, by contrast, keep spatial track of where we are going. They are in the brain’s hippocampus, which is also where ‘place’ and ‘head-direction’ cells were discovered. That’s a surprise, considering how well this area has been studied by researchers, says Nachum Ulanovsky, who led the team at the Weizmann Institute of Science in Rehovot, Israel, that discovered the new cells. His team published their findings in Science on 12 January. Finding the cells "was one of those very rare discovery moments in a researcher’s life,” says Ulanovsky. “My heart raced, I started jumping around.” The trick to finding them was a simple matter of experimental design, he says. © 2017 Macmillan Publishers Limited
By Virginia Morell Only three known species go through menopause: killer whales, short-finned pilot whales, and humans. Two years ago, scientists suggested whales do this to focus their attention on the survival of their families rather than on birthing more offspring. But now this same team reports there’s another—and darker—reason: Older females enter menopause because their eldest daughters begin having calves, leading to fights over resources. The findings might also apply to humans, the scientists say. “What an interesting paper,” says Phyllis Lee, a behavioral ecologist at the University of Stirling in the United Kingdom, who was not involved in the study. “It brings two perspectives on menopause neatly together, and provides an elegant model for its rarity.” The new work came about when Darren Croft, a behavioral ecologist at the University of Exeter in the United Kingdom, and his colleagues looked back on their 2015 killer whale menopause study. “That showed how they helped and why they lived so long after menopause, but it didn’t explain why they stop reproducing,” he says, noting that in other species, such as elephants, older females also share wisdom and knowledge with their daughters, but continue to have calves. © 2017 American Association for the Advancement of Science.
By Peter Godfrey-Smith Adapted from Other Minds: The Octopus, the Sea and the Deep Origins of Consciousness, by Peter Godfrey-Smith. Copyright © 2016 by Peter Godfrey-Smith. Someone is watching you, intently, but you can't see them. Then you notice, drawn somehow by their eyes. You're amid a sponge garden, the seafloor scattered with shrublike clumps of bright orange sponge. Tangled in one of these sponges and the gray-green seaweed around it is an animal about the size of a cat. Its body seems to be everywhere and nowhere. The only parts you can keep a fix on are a small head and the two eyes. As you make your way around the sponge, so, too, do those eyes, keeping their distance, keeping part of the sponge between the two of you. The creature's color perfectly matches the seaweed, except that some of its skin is folded into tiny, towerlike peaks with tips that match the orange of the sponge. Eventually it raises its head high, then rockets away under jet propulsion. A second meeting with an octopus: this one is in a den. Shells are strewn in front, arranged with some pieces of old glass. You stop in front of its house, and the two of you look at each other. This one is small, about the size of a tennis ball. You reach forward a hand and stretch out one finger, and one octopus arm slowly uncoils and comes out to touch you. The suckers grab your skin, and the hold is disconcertingly tight. It tugs your finger, tasting it as it pulls you gently in. The arm is packed with sensors, hundreds of them in each of the dozens of suckers. The arm itself is alive with neurons, a nest of nervous activity. Behind the arm, large round eyes watch you the whole time. © 2017 Scientific American
Parkinson’s disease, a chronic, progressive movement disorder characterized by tremors and stiffness, is not considered a fatal disease in and of itself, though it may reduce life expectancy by a modest amount. It is often said that people die “with” Parkinson’s rather than “of” the disease. “People who are healthy when diagnosed will generally live about as long as other people in their age cohort,” said James Beck, the vice president for scientific affairs at the Parkinson’s Disease Foundation, which is involved in research, education and advocacy. “It is not a death sentence.” Since Parkinson’s generally affects people later in life — patients are typically given a diagnosis in their 60s — patients often die of unrelated age-related diseases like cancer, heart disease or stroke. But the most common cause of death in those with Parkinson’s is pneumonia, because the disease impairs patients’ ability to swallow, putting them at risk for inhaling or aspirating food or liquids into their lungs, leading to aspiration pneumonia. Since Parkinson’s also impairs mobility and balance, those with the disease are also at high risk for falls and accidents, which can trigger a cascade of medical problems, including being bedridden and developing pneumonia, Dr. Beck said. In its advanced stages, the disease can make walking and talking difficult and cause other problems not related to movement, including cognitive impairment. Patients often cannot care for themselves and need assistance carrying out simple activities of daily living. One long-term study followed a group of 142 Parkinson’s patients after they were given their diagnosis; their mean age at diagnosis was around 70. The researchers found that 23 percent were generally doing well 10 years later, meaning they could maintain their balance and did not have dementia. But over half of the patients in the original group had died, with the most common cause related to Parkinson’s being pneumonia. 
© 2017 The New York Times Company
Link ID: 23094 - Posted: 01.13.2017
By Anthony Warner Other things being equal, you’d think the strongest influence on expanding midriffs might be fizzy drinks or fried food. But a study out yesterday reinforces the growing idea that poverty is a bigger factor. It found socio-economic status offered the best explanation for greater weight gain when comparing people in the UK with the same genetic vulnerability to obesity (International Journal of Epidemiology, DOI: 10.1093/ije/dyw337). Mounting evidence of poverty’s role in this health crisis makes even more repulsive the rise in vile and deeply offensive prejudice based solely on a failure to fit with the physical ideals of privileged society. This is no longer just about random acts of unkindness. It is everywhere. These views were aired without challenge at a large food and health conference recently. I heard open expression of the idea that obese people should be banned from working in the public sector or that food prices should be increased to force poorer people to eat less. This is the respectable face of prejudice and it has crept into just about every walk of life, stoked by extreme media commentators. It risks creating bigger divides within already fragmented societies. In countries battling obesity, such vitriol extends to repeated talk of denying access to healthcare. It seems this prejudice is OK if its intention is to help people lose weight and often portrays them as slovenly, lazy, lacking self-control, a drain on our health system and morally weak. © Copyright Reed Business Information Ltd.
Link ID: 23093 - Posted: 01.13.2017
By Amy Ellis Nutt Martin M. Katz might never have begun his groundbreaking scientific career were it not for a quirk in his vision: He was colorblind. As a budding chemist in college, that flaw forced him to reconsider his options. The result, eventually, was a PhD in psychology from the University of Texas in 1955. He went on to become a key figure in neuropsychopharmacology. Katz, who died Jan. 12 at age 89, spent more than two decades at the National Institute of Mental Health. Among his accomplishments: In a multi-institutional collaborative project at NIMH, developing a behavioral methodology to study the effects of new antidepressant drugs; designing the Katz Adjustment Scales, which created an easy-to-use checkoff method for laypeople to observe and measure over time the symptoms of mentally ill patients and track their behavioral changes from treatment; and creating the multivantage model of measurement, which insisted on the necessity of assessing patient, family, and professional views of patient symptoms and experience. The Post spoke with Katz last month. Q: You’ve said you think a lot of your success was fortuitous. How so? A: I was looking for a job in California [after graduate school], but I didn’t want to do clinical work. That was my problem. So I went back to Texas to do a postdoc. A woman who was the dean of the school was experimenting with nutrition of underfed Latino kids in Texas schools. She wanted to get a psychometric background on these kids. That was really the beginning of my career.
Link ID: 23092 - Posted: 01.13.2017
Russell Poldrack Sex, Lies, and Brain Scans: How fMRI Reveals What Really Goes on in our Minds Barbara J. Sahakian & Julia Gottwald Oxford University Press: 2017. Since its 1992 debut, functional magnetic resonance imaging (fMRI) has revolutionized our ability to view the human brain in action and understand the processes that underlie mental functions such as decision-making. As brain-imaging technologies have grown more powerful, their influence has seeped from the laboratory into the real world. In Sex, Lies, and Brain Scans, clinical neuropsychologist Barbara Sahakian and neuroscientist Julia Gottwald give a whistle-stop tour of some ways in which neuroimaging has begun to affect our views on human behaviour and society. Their discussion balances a rightful enthusiasm for fMRI with a sober appreciation of its limitations and risks. After the obligatory introduction to fMRI, which measures blood oxygenation to image neural activity, Sahakian and Gottwald address a question at the heart of neuroimaging: can it read minds? The answer largely depends on one's definition of mind-reading. As the authors outline, in recent years fMRI data have been used to decode the contents of thoughts (such as words viewed by a study participant) and mental states (such as a person's intention to carry out an action), even in sleep. These methods don't yet enable researchers to decode the 'language of thought', which is what mind-reading connotes for many. But given the growing use of advanced machine-learning methods such as deep neural networks to analyse neuroimaging data, that may just be a matter of time. © 2017 Macmillan Publishers Limited
Keyword: Brain imaging
Link ID: 23091 - Posted: 01.13.2017
Rachel Ehrenberg A protein that sounds the alarm when the body encounters something painful also helps put out the fire. Called Nav1.7, the protein sits on pain-sensing nerves and has long been known for sending a red alert to the brain when the body has a brush with pain. Now, experiments in rodent cells reveal another role for Nav1.7: Its activity triggers the production of pain-relieving molecules. The study, published online January 10 in Science Signaling, suggests a new approach to pain management that takes advantage of this protein’s dual role. “This is very interesting research,” says neuroscientist Munmun Chattopadhyay of Texas Tech University Health Sciences Center El Paso. The findings suggest that when opiates are given for certain kinds of pain relief, also targeting Nav1.7 might lessen the need for those pain relievers, Chattopadhyay says. That could reduce opiate use and their associated side effects. The new research also solves a puzzle that has frustrated researchers and pharmaceutical companies alike. People with rare mutations in the gene for making Nav1.7 feel no pain at all. That discovery, made more than a decade ago, suggested that Nav1.7 was an ideal target for controlling pain. If a drug could block Nav1.7 activity, some kinds of pain might be eradicated (SN: 6/30/12, p 22). Yet drugs designed to do just that didn’t wipe out people’s pain. “It seemed so obvious and simple,” says study leader Tim Hucho, a neuroscientist at the University Hospital Cologne in Germany. “But it was not so simple.” |© Society for Science & the Public 2000 - 2017
Keyword: Pain & Touch
Link ID: 23090 - Posted: 01.12.2017
By Tanya Lewis To the untrained listener, a bunch of babbling baboons may not sound like much. But sharp-eared experts have now found that our primate cousins can actually produce humanlike vowel sounds. The finding suggests the last common ancestor of humans and baboons may have possessed the vocal machinery for speech—hinting at a much earlier origin for language than previously thought. Researchers from the National Center for Scientific Research (CNRS) and Grenoble Alpes University, both in France, and their colleagues recorded baboons in captivity, finding the animals were capable of producing five distinct sounds that have the same characteristic frequencies as human vowels. As reported today in PLoS ONE, the animals could make these sounds despite the fact that, as dissections later revealed, they possess high voice boxes, or larynxes, an anatomical feature long thought to be an impediment to speech. “This breaks a serious logjam” in the study of language, says study co-author Thomas Sawallis, a linguist at the University of Alabama. “Theories of language evolution have developed based on the idea that full speech was only available to anatomically modern Homo sapiens,” approximately 70,000 to 100,000 years ago, he says, but in fact, “we could have had the beginnings of speech 25 million years ago.” The evolution of language is considered one of the hardest problems in science, because the process left no fossil evidence behind. One practical approach, however, is to study the mechanics of speech. Language consists roughly of different combinations of vowels and consonants. Notably, humans possess low larynxes, which makes it easier to produce a wide range of vowel sounds (and as Darwin observed, also makes it easier for us to choke on food). 
A foundational theory of speech production, developed by Brown University cognitive scientist Philip Lieberman in the 1960s, states that the high larynxes, and thus shorter vocal tracts, of most nonhuman primates prevent them from producing vowel-like sounds. Yet recent research calls Lieberman’s hypothesis into question. © 2017 Scientific American
By Ashley P. Taylor Neurodegenerative diseases are often associated with aging. To learn what happens within the aging brain and potentially gain information relevant to human health, researchers examined gene-expression patterns in postmortem brain samples. Overall, the researchers found, gene expression of glial cells changed more with age than did that of neurons. These gene-expression changes were most significant in the hippocampus and substantia nigra, regions damaged in Alzheimer’s and Parkinson’s diseases, respectively, according to the study published today (January 10) in Cell Reports. “Typically we have concentrated on neurons for studies of dementia, as they are the cells involved in brain processing and memories. [This] study demonstrates that glia are likely to be equally important,” study coauthors Jernej Ule and Rickie Patani of the Francis Crick Institute and University College London wrote in an email to The Scientist. “The authors’ effort in this comprehensive work is a ‘genomic tour de force,’ showing that, overall, non-neuronal cells undergo gene expression changes at a larger scale than previously thought in aging,” Andras Lakatos, a neuroscientist at the University of Cambridge, U.K., who was not involved in the study, wrote in an email. “This finding puts glial cells again at the center stage of functional importance in neurodegenerative conditions in which aging carries a proven risk.” © 1986-2017 The Scientist
Being stressed out increases our risk of heart disease and stroke, and the key to countering it may lie in calming the brain, a new medical study suggests. Psychological stress has long been considered a source of sickness. But personal stress levels are difficult to measure and there isn't direct evidence of the link, even though population studies finger stress as a risk factor for cardiovascular disease just like smoking and hypertension. "I think that this relatively vague or insufficient link reduced our enthusiasm of taking stress seriously as an important risk factor," said Dr. Ahmed Tawakol, a cardiologist at Massachusetts General Hospital in Boston. Tawakol led a study published in Wednesday's online issue of The Lancet that sheds light on how the amygdala — a key part of the brain that is more active during emotional, stressful times — is linked to a greater risk of cardiovascular disease such as heart attacks and strokes. The researchers gave 293 patients aged 30 or older without cardiovascular disease PET/CT brain imaging scans, mainly for cancer screening, and followed them over time. After an average of nearly four years, activity in the amygdala was significantly associated with cardiovascular events such as heart attacks, heart failure and strokes, after taking other factors into account. People with more amygdala activity also tended to suffer the events sooner, Tawakol said. ©2017 CBC/Radio-Canada.
Link ID: 23087 - Posted: 01.12.2017
Sarah Boseley Health editor No new drugs for depression are likely in the next decade, even though those such as Prozac work for little more than half of those treated and there have been concerns over their side-effects, say scientists. Leading psychiatrists, some of whom have been involved in drug development, say criticism of the antidepressants of the Prozac class, called the SSRIs (selective serotonin reuptake inhibitors), is partly responsible for the pharmaceutical industry’s reluctance to invest in new drugs – even though demand is steadily rising. But the main reason, said Guy Goodwin, professor of psychiatry at Oxford University, is that the NHS and healthcare providers in other countries do not want to pay the bill for new drugs that will have to go through expensive trials. The antidepressants that GPs currently prescribe work for only about 58% of people, but they are cheap because they are out of patent. “We are not going to get any more new drugs for depression in the next decade simply because the pharmaceutical industry is not investing in research,” said Goodwin. “It can’t make money on these drugs. It costs approximately $1bn to do all the trials before you launch a new drug. “There is also a failure of the science. It has to get more understanding of how these things work before they can improve them.” © 2017 Guardian News and Media Limited
Link ID: 23086 - Posted: 01.12.2017