Most Recent Links



Links 41 - 60 of 20063

By Nathan Seppa Ask anybody — stress is bad news. The negative view of stress has been expressed so consistently that the concept is now built into our vernacular, which is spiced with advice on avoiding it: Take it easy. Calm down. Chill. Of course, a good case of stress comes in handy during an encounter with a grizzly bear on a hiking trail. In that situation, a stress reaction delivers a burst of hormones that revs up the heart and sharpens attention. This automatic response has served humans well throughout evolution, improving our odds of seeing another day. Problems arise, however, when stress becomes a feature of daily life. Chronic stress is the kind that comes from recurring pain, post-traumatic memories, unemployment, family tension, poverty, childhood abuse, caring for a sick spouse or just living in a sketchy neighborhood. Nonstop, low-grade stress contributes directly to physical deterioration, adding to the risk of heart attack, stroke, infection and asthma. Even recovery from cancer becomes harder. Scientists have now identified many of the biological factors linking stress to these medical problems. The evidence centers on nagging inflammation and genetic twists that steer cells off a healthy course, resulting in immune changes that allow ailments to take hold or worsen. Despite the bad rap stress has acquired throughout history, researchers have only recently been able to convince others that it’s dangerous. “It’s taken much more seriously now,” says Janice Kiecolt-Glaser, a clinical psychologist at Ohio State University in Columbus. “In the 1980s, we were still in the dark ages on this stuff.” © Society for Science & the Public 2000 - 2015

Keyword: Stress; Neuroimmunology
Link ID: 20599 - Posted: 02.21.2015

By Elizabeth Pennisi Researchers have increased the size of mouse brains by giving the rodents a piece of human DNA that controls gene activity. The work provides some of the strongest genetic evidence yet for how the human intellect surpassed that of all other apes. "[The DNA] could easily be a huge component in how the human brain expanded," says Mary Ann Raghanti, a biological anthropologist at Kent State University in Ohio, who was not involved with the work. "It opens up a whole world of possibilities about brain evolution." For centuries, biologists have wondered what made humans human. Once the human and chimp genomes were deciphered about a decade ago, they realized they could now begin to pinpoint the molecular underpinnings of our big brain, bipedalism, varied diet, and other traits that have made our species so successful. By 2008, almost two dozen computerized comparisons of human and ape genomes had come up with hundreds of pieces of DNA that might be important. But rarely have researchers taken the next steps to try to prove that a piece of DNA really made a difference in human evolution. "You could imagine [their roles], but they were just sort of 'just so' stories," says Greg Wray, an evolutionary biologist at Duke University in Durham, North Carolina. Wray is particularly interested in DNA segments called enhancers, which control the activity of genes nearby. He and Duke graduate student Lomax Boyd scanned the genomic databases and combed the scientific literature for enhancers that were different between humans and chimps and that were near genes that play a role in the brain. Out of more than 100 candidates, they and Duke developmental neurobiologist Debra Silver tested a half-dozen. They first inserted each enhancer into embryonic mice to learn whether it really did turn genes on. 
Then for HARE5, the most active enhancer in an area of the brain called the cortex, they made minigenes containing either the chimp or human version of the enhancer linked to a “reporter” gene that caused the developing mouse embryo to turn blue wherever the enhancer turned the gene on. Embryos’ developing brains turned blue sooner and over a broader expanse if they carried the human version of the enhancer, Silver, Wray, and their colleagues report online today in Current Biology. © 2015 American Association for the Advancement of Science

Keyword: Development of the Brain; Genes & Behavior
Link ID: 20598 - Posted: 02.21.2015

by Sarah Zielinski No one would be shocked to find play behavior in a mammal species. Humans love to play — as do our cats and dogs. It’s not such a leap to believe that, say, a red kangaroo would engage in mock fights. But somehow that behavior seems unlikely in animals other than mammals. It shouldn’t, though. Researchers have documented play behavior in an astonishing range of animals, from insects to birds to mammals. The purpose of such activities isn’t always clear — and not all scientists are convinced that play even exists — but play may help creatures establish social bonds or learn new skills. Here are five non-mammals you may be surprised to find hard at play:

Crocodilians

Alligators and crocodiles might seem more interested in lurking near the water and chomping on their latest meal, but these frightening reptiles engage in play, Vladimir Dinets of the University of Tennessee in Knoxville reports in the February Animal Behavior and Cognition. Dinets combined 3,000 hours of observations of wild and captive crocodilians with published reports and information gathered from other people who work with the animals. He found examples of all three types of play:

Locomotor play: This is movement without any apparent reason or stimulus. Young, captive American alligators, for instance, have been spotted sliding down slopes of water over and over. And a 2.5-meter-long crocodile was seen surfing the waves near a beach in Australia.

Object play: Animals like toys, too. A Cuban crocodile at a Miami zoo picked up and pushed around flowers floating in its pool for several days of observation. And like a cat playing with a mouse, a Nile crocodile was photographed as it repeatedly threw a dead hippo into the air. Object play is recognized as so important to crocodilian life “that many zoo caretakers now provide various objects as toys for crocodilians as part of habitat enrichment programs,” Dinets notes. © Society for Science & the Public 2000 - 2015.

Keyword: Development of the Brain
Link ID: 20597 - Posted: 02.21.2015

Maanvi Singh Your tongue doubtless knows the difference between a high-fat food and the low-fat alternative. Full-fat ice cream and cream cheese feel silkier and more sumptuous. Burgers made with fatty meat are typically juicier than burgers made with lean meat. OK, so, we've long known fat gives food a desirable texture. But some scientists are now making the case that we should also think of fat as the sixth primary taste, along with sweet, salty, sour, bitter and umami. Early in February, researchers from Deakin University in Australia published a paper in the journal Flavour arguing that "the next 5 to 10 years should reveal, conclusively, whether fat can be classified as the sixth taste." So what would it take for fat to become an official taste? "Strictly speaking, taste is a chemical function," Russell Keast, a sensory scientist at Deakin and lead author of the paper, tells The Salt. He says that when a chemical substance – a salt or sugar crystal, for example — comes into contact with sensory cells in our mouths, it triggers a series of reactions. The cells in our mouths tell other nerve cells that they're perceiving something sweet or salty and those nerve cells eventually pass this information on to the brain. According to the paper, there are five criteria that need to be met to call something a primary taste. It starts with a chemical stimulus (like sugar or salt), which then triggers specific receptors on our taste buds. Then, there has to be a viable pathway between these receptors and our brains, and we've got to be able to perceive and process the taste in the brain. And finally, this whole process has to trigger downstream effects in the body. © 2015 NPR

Keyword: Chemical Senses (Smell & Taste)
Link ID: 20596 - Posted: 02.21.2015

By Christie Aschwanden Paul Offit likes to tell a story about how his wife, pediatrician Bonnie Offit, was about to give a child a vaccination when the kid was struck by a seizure. Had she given the injection a minute sooner, Paul Offit says, it would surely have appeared as though the vaccine had caused the seizure and probably no study in the world would have convinced the parent otherwise. (The Offits have such studies at the ready — Paul is the director of the Vaccine Education Center at the Children’s Hospital of Philadelphia and author of “Deadly Choices: How the Anti-Vaccine Movement Threatens Us All.”) Indeed, famous anti-vaxxer Jenny McCarthy has said her son’s autism and seizures are linked to “so many shots” because vaccinations preceded his symptoms. But, as Offit’s story suggests, the fact that a child became sick after a vaccine is not strong evidence that the immunization was to blame. Psychologists have a name for the cognitive bias that makes us prone to assigning a causal relationship to two events simply because they happened one after the other: the “illusion of causality.” A study recently published in the British Journal of Psychology investigates how this illusion influences the way we process new information. Its finding: Causal illusions don’t just cement erroneous ideas in the mind; they can also prevent new information from correcting them. Helena Matute, a psychologist at Deusto University in Bilbao, Spain, and her colleagues enlisted 147 college students to take part in a computer-based task in which they each played a doctor who specializes in a fictitious rare disease and assessed whether new medications could cure it. ©2015 ESPN Internet Ventures.

Keyword: Attention; Emotions
Link ID: 20595 - Posted: 02.19.2015

Tom Stafford Trusting your instincts may help you to make better decisions than thinking hard, a study suggests. It is a common misconception that we know our own minds. As I move around the world, walking and talking, I experience myself thinking thoughts. "What shall I have for lunch?", I ask myself. Or I think, "I wonder why she did that?" and try and figure it out. It is natural to assume that this experience of myself is a complete report of my mind. It is natural, but wrong. There's an under-mind, all psychologists agree – an unconscious which does a lot of the heavy lifting in the process of thinking. If I ask myself what is the capital of France the answer just comes to mind – Paris! If I decide to wiggle my fingers, they move back and forth in a complex pattern that I didn't consciously prepare, but which was delivered for my use by the unconscious. The big debate in psychology is exactly what is done by the unconscious, and what requires conscious thought. Or to use the title of a notable paper on the topic, 'Is the unconscious smart or dumb?' One popular view is that the unconscious can prepare simple stimulus-response actions, deliver basic facts, recognise objects and carry out practised movements. Complex cognition involving planning, logical reasoning and combining ideas, on the other hand, requires conscious thought. A recent experiment by a team from Israel scores points against this position. Ran Hassin and colleagues used a neat visual trick called Continuous Flash Suppression to put information into participants’ minds without them becoming consciously aware of it.

Keyword: Attention
Link ID: 20594 - Posted: 02.19.2015

by Catherine Lawson Over the last six years Adam Gazzaley's research has undergone a transformation. He's moved from studying how the brain works, to studying the brain as it ages, then into the domain of applying methodology he's developed to improve the brain's functions. At WIRED Health 2015 he'll outline his vision of the future, one where "we're thinking about software and hardware as medicine". In particular, Gazzaley plans to talk to the WIRED Health audience about video games "that are custom-designed to challenge the brain in a very particular way". Gazzaley's team at University of California, San Francisco previously demonstrated that a custom-designed video game can be highly effective in treating a specific cognitive deficit. They developed NeuroRacer, a driving game aimed at improving multi-tasking skills in older people. The success of NeuroRacer propelled Gazzaley into new partnerships, giving him access to resources that further advance his games development program into areas like motion capture and virtual reality. He's excited about coupling his games with mobile devices that will allow them to function outside the lab. Gazzaley will talk about four new games he's working on, in particular a meditation-inspired one. Meditrain is the product of his collaboration with Buddhist author and teacher Jack Kornfield. Developed for the iPad, he hopes to demonstrate part of it at WIRED Health.

Keyword: Learning & Memory
Link ID: 20593 - Posted: 02.19.2015

Boer Deng Smoking marijuana may stoke a yearning for crisps, but understanding how it affects hunger is relevant not just to those who indulge in it. The drug has yielded a ripe target for scientists who seek to stimulate or suppress appetite: the receptor CB1, found in cells throughout the body. When activated by the anti-nausea drug dronabinol — which is also a component of marijuana (Cannabis sativa) — CB1 prompts the release of hunger-promoting hormones [1]. And suppressing its activity is thought to aid in weight loss [2]. But the mechanism by which the receptor kills or kindles appetite is not entirely understood. Now neuroscientist Tamas Horvath, of Yale University in New Haven, and colleagues report in Nature that nerve cells called pro-opiomelanocortin (POMC) neurons play a key role in this process [3]. POMC had generally been thought to promote satiation, but Horvath's team found that POMC neurons in the brain release not just a hunger-suppressing hormone, but also one that promotes appetite. Which hormone is secreted is regulated by a protein in the cells' mitochondria, structures that regulate energy levels. When the CB1 receptor is activated, this mitochondrial protein induces POMC to switch from secreting the substance that suppresses gorging to one that encourages it. The finding is intriguing, says Uberto Pagotto, a neuroscientist at the University of Bologna who has studied cannabinoids for many years. “It gives us a different starting point to look at CB1 receptors and the mitochondria,” he says. © 2015 Nature Publishing Group

Keyword: Drug Abuse; Obesity
Link ID: 20592 - Posted: 02.18.2015

Catherine Brahic THE nature versus nurture debate is getting a facelift this week, with the publication of a genetic map that promises to tell us which bits of us are set in stone by our DNA, and which bits we can affect by how we live our lives. The new "epigenomic" map doesn't just look at genes, but also the instructions that govern them. Compiled by a consortium of biologists and computer scientists, this information will allow doctors to pinpoint precisely which cells in the body are responsible for various diseases. It might also reveal how to adjust your lifestyle to counter a genetic predisposition to a particular disease. "The epigenome is the additional information our cells have on top of genetic information," says lead researcher Manolis Kellis of the Massachusetts Institute of Technology. It is made of chemical tags that are attached to DNA and its packaging. These tags act like genetic controllers, influencing whether a gene is switched on or off, and play an instrumental role in shaping our bodies and disease. Researchers are still figuring out exactly how and when epigenetic tags are added to our DNA, but the process appears to depend on environmental cues. We inherit some tags from our parents, but what a mother eats during pregnancy, for instance, might also change her baby's epigenome. Other tags relate to the environment we are exposed to as children and adults. "The epigenome sits in a very special place between nature and nurture," says Kellis. Each cell type in our body has a different epigenome – in fact, the DNA tags are the reason why our cells come in such different shapes and sizes despite having exactly the same DNA. So for its map, the Roadmap Epigenomics Consortium collected thousands of cells from different adult and embryonic tissues, and meticulously analysed all the tags. © Copyright Reed Business Information Ltd.

Keyword: Epigenetics; Genes & Behavior
Link ID: 20591 - Posted: 02.18.2015

By Abigail Zuger, M.D. I had intended to discuss President Obama’s plans for personalized precision medicine with my patient Barbara last week, but she missed her appointment. Or, more accurately, she arrived two hours late, made the usual giant fuss at the reception desk and had to be rescheduled. I was disappointed. Barbara has some insight into the vortex of her own complications, and I thought she might help organize my thoughts. Mr. Obama announced last month that his new budget included $215 million toward the creation of a national databank of medical information, intended to associate specific gene patterns with various diseases and to predict what genetic, lifestyle and environmental factors correlate with successful treatment. Once all those relationships are clarified, the path will open to drugs or other interventions that firm up the good links and interrupt the bad ones. This step up the scientific ladder of medicine has many advocates. Researchers who sequence the genome are enthusiastic, as are those with a financial interest in the technology. Also celebrating are doctors and patients in the cancer community, where genetic data already informs some treatment choices and where the initial thrust of the initiative and much of its funding will be directed. Skeptics point out that genetic medicine, for all its promise, has delivered relatively few clinical benefits. And straightforward analyses of lifestyle and environment effects on health may occasionally lead to clear-cut advice (don’t smoke), but more often sow confusion, as anyone curious about the best way to lose weight or the optimal quantity of dietary salt knows. Without Barbara’s presence, I was left to ponder her medical record, a 20-year saga that might be titled “Genes, Lifestyle and Environment” and published as a cautionary tale. © 2015 The New York Times Company

Keyword: Drug Abuse; Genes & Behavior
Link ID: 20590 - Posted: 02.18.2015

Claire Ainsworth As a clinical geneticist, Paul James is accustomed to discussing some of the most delicate issues with his patients. But in early 2010, he found himself having a particularly awkward conversation about sex. A 46-year-old pregnant woman had visited his clinic at the Royal Melbourne Hospital in Australia to hear the results of an amniocentesis test to screen her baby's chromosomes for abnormalities. The baby was fine — but follow-up tests had revealed something astonishing about the mother. Her body was built of cells from two individuals, probably from twin embryos that had merged in her own mother's womb. And there was more. One set of cells carried two X chromosomes, the complement that typically makes a person female; the other had an X and a Y. Halfway through her fifth decade and pregnant with her third child, the woman learned for the first time that a large part of her body was chromosomally male [1]. “That's kind of science-fiction material for someone who just came in for an amniocentesis,” says James. Sex can be much more complicated than it at first seems. According to the simple scenario, the presence or absence of a Y chromosome is what counts: with it, you are male, and without it, you are female. But doctors have long known that some people straddle the boundary — their sex chromosomes say one thing, but their gonads (ovaries or testes) or sexual anatomy say another. Parents of children with these kinds of conditions — known as intersex conditions, or differences or disorders of sex development (DSDs) — often face difficult decisions about whether to bring up their child as a boy or a girl. Some researchers now say that as many as 1 person in 100 has some form of DSD [2]. © 2015 Nature Publishing Group

Keyword: Sexual Behavior
Link ID: 20589 - Posted: 02.18.2015

By Kate Baggaley A buildup of rare versions of genes that control the activity of nerve cells in the brain increases a person’s risk for bipolar disorder, researchers suggest in a paper posted online the week of February 16 in Proceedings of the National Academy of Sciences. “There are many different variants in many different genes that contribute to the genetic risk,” says coauthor Jared Roach, a geneticist at the Institute for Systems Biology in Seattle. “We think that most people with bipolar disorder will have inherited several of these…risk variants.” The bulk of a person’s risk for bipolar disorder comes from genetics, but only a quarter of that risk can be explained by common variations in genes. Roach’s team sequenced the genomes of 200 people from 41 families with a history of bipolar disorder. They then identified 164 rare forms of genes that show up more often in people with the condition. People with bipolar disorder had, on average, six of these rare forms, compared with just one in their healthy relatives and the general population. The identified genes control the ability of ions, or charged particles, to enter or leave nerve cells, or neurons. This affects neurons’ ability to pass information through the brain. Some of the gene variants probably increase how much neurons fire while others decrease it, the researchers say. Future research will need to explain what role these brain changes play in bipolar disorder.

Citations: S.A. Ament et al. Rare variants in neuronal excitability genes influence risk for bipolar disorder. Proceedings of the National Academy of Sciences. Published online the week of February 16, 2015. doi:10.1073/pnas.1424958112. © Society for Science & the Public 2000 - 2015

Keyword: Schizophrenia; Genes & Behavior
Link ID: 20588 - Posted: 02.18.2015

By Warren Cornwall The green wings of the luna moth, with their elegant, long tails, aren’t just about style. New research finds they also help save the insect from becoming a snack for a bat. The fluttering tails appear to create an acoustic signal that is attractive to echolocating bats, causing the predators to zero in on the wings rather than more vital body parts. Scientists pinned down the tails’ lifesaving role by taking 162 moths and plucking the tails off 75 of them. They used fishing line to tether two moths—one with tails, the other without—to the ceiling of a darkened room. Then, they let loose a big brown bat. The bats caught 81% of the tailless moths, but just 35% of those with fully intact wings, they report in a study published online today in the Proceedings of the National Academy of Sciences. High-speed cameras helped show why. In 55% of attacks on moths with tails, the bats went after the tails, often missing the body. It’s the first well-documented example of an organism using body shape to confuse predators that use echolocation, the researchers say—the equivalent of fish and insects that display giant eyespots for visual trickery. © 2015 American Association for the Advancement of Science

Keyword: Hearing; Evolution
Link ID: 20587 - Posted: 02.18.2015

Berit Brogaard On popular websites, we read headlines such as “Scientists are finding that love really is a chemical addiction between people.” Love, of course, is not literally a chemical addiction. It’s a drive perhaps, or a feeling or an emotion, but not a chemical addiction or even a chemical state. Nonetheless, romantic love, no doubt, often has a distinct physiological, bodily, and chemical profile. When you fall in love, your body chemicals go haywire. The exciting, scary, almost paranormal and unpredictable elements of love stem, in part, from hyper-stimulation of the limbic brain’s fear center known as the amygdala. It’s a tiny, almond-shaped brain region in the temporal lobe on the side of your head. In terms of evolutionary history, this brain region is old. It developed millions of years before the neocortex, the part of the brain responsible for logical thought and reasoning. While it has numerous biological functions, the prime role of the amygdala is to process negative emotional stimuli. Significant changes to normal amygdala activation are associated with serious psychological disorders. For example, human schizophrenics have significantly less activation in the amygdala and the memory system (the hippocampus), which is due to a substantial reduction in the size of these areas. People with depression, anxiety, and attachment insecurity, on the other hand, have significantly increased blood flow in the amygdala and memory system. Neuroscientist Justin Feinstein and his colleagues (2010) studied a woman whose amygdala was destroyed after a rare brain condition. They exposed her to pictures of spiders and snakes, took her on a tour of the world’s scariest haunted house, and had her take notes about her emotional state when she heard a beep from a random beeper that had been attached to her. After three months of investigation, the researchers concluded that the woman could not experience fear. 
This is very good evidence for the idea that the amygdala is the main center for fear processing. (The chief competing hypothesis is that fear is processed in a brain region that receives its main information from the amygdala.) © 2015 Salon Media Group, Inc.

Keyword: Emotions; Drug Abuse
Link ID: 20586 - Posted: 02.16.2015

By Emily Underwood SAN JOSE, CALIFORNIA—If you've ever had a migraine, you know it's no ordinary headache: In addition to throbbing waves of excruciating pain, symptoms often include nausea, visual disturbances, and acute sensitivity to sounds, smells, and light. Although there's no cure for the debilitating headaches, which affect roughly 10% of people worldwide, researchers are starting to untangle their cause and find more effective treatments. Here today at the annual meeting of AAAS (which publishes Science), Science sat down with Teshamae Monteith, a clinical neurologist at the University of Miami Health System in Florida, to discuss the latest advances in the field.

Q: How is our understanding of migraine evolving?

A: It's more complicated than we thought. In the past, researchers thought of migraine as a blood vessel disorder, in part because some patients can feel a temple pulsation during a migraine attack. Now, migraine is considered a sensory perceptual disorder, because so many of the sensory systems—light, sound, smell, hearing—are altered. During an attack, patients have concentration impairments, appetite changes, mood changes, and sleeping is off. What fascinates me is that patients are often bothered by manifestations of migraine, such as increased sensitivity to light, in between attacks, suggesting that they may be wired differently, or their neurobiology may be altered. About two-thirds of patients with acute migraine attacks have allodynia, a condition that makes people so sensitive to certain stimuli that even steam from a shower can be incredibly painful. One way to view it is that migraineurs at baseline are at a different threshold for sensory stimuli. © 2015 American Association for the Advancement of Science.

Keyword: Pain & Touch
Link ID: 20585 - Posted: 02.16.2015

By Gary Stix Implantation of electrodes deep within the brain is now commonly performed for treatment of the neurological disorders Parkinson’s disease and essential tremor. But the use of deep-brain stimulation, as it is known, is expanding. It is now being assessed in as many as 200 patients for major depression—and is being considered for other disorders such as anorexia. Helen Mayberg, a neurologist from Emory University, has pioneered the use of imaging techniques to understand the functioning of different brain circuits to determine how to tailor various treatments for depression, including deep-brain stimulation, to a patient’s needs. Learn about her work below in “Deep-Brain Stimulation: A Decade of Progress with Helen Mayberg,” a Webinar put on by the Brain & Behavior Research Foundation. © 2015 Scientific American

Keyword: Depression
Link ID: 20584 - Posted: 02.16.2015

By Rachel Ehrenberg SAN JOSE, Calif. — New moms suffering from postpartum depression change their activity on Facebook, suggesting that the social media site could help detect the onset of the baby blues. Many new parents share pictures and videos of their babies on Facebook and use the site to interact with friends they might be too busy to see in person. But compared with most typical new moms, those suffering from postpartum depression are less active on the social media site, Munmun De Choudhury of Georgia Tech reported February 14 at the annual meeting of the American Association for the Advancement of Science. She and her colleagues at Microsoft Research in Redmond, Wash., conducted an elaborate study that included a depression screening questionnaire, interviews and an analysis of Facebook activity and interactions of 165 mothers both before and after they had their babies. These women also tend to keep a stiff upper lip on the site, refraining from reporting on their emotional well-being and instead posting objective content geared toward getting feedback or advice on a specific matter, De Choudhury and her colleagues discovered. The scientists also found they could train a computer program to identify which moms had the blues. Such research might help with designing interventions, whereby moms could be warned that they might be sinking into depression and encouraged to reach out for social support or medical attention. M. De Choudhury. Online Social Dynamics and Emotional Wellbeing. American Association for the Advancement of Science Annual Meeting, San Jose, Calif., February 14, 2015. © Society for Science & the Public 2000 - 2015
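The talk does not describe the actual model De Choudhury's team trained. Purely as a hedged illustration of how a screening classifier of this general kind can work, the sketch below uses a nearest-centroid rule on invented Facebook-activity features (posts per week, interactions received, share of posts about one's own emotional state); every feature, number, and label here is hypothetical, not taken from the study.

```python
# Hypothetical sketch only: nearest-centroid screening on made-up
# activity features [posts/week, interactions/week, emotional-post share].
# The real study's features and model are not public.

def centroid(rows):
    """Mean feature vector of a list of equal-length feature rows."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def classify(x, centroids):
    """Return the label whose centroid is closest (squared Euclidean) to x."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda label: dist2(x, centroids[label]))

# Invented training data for two groups of mothers.
typical = [[14, 40, 0.30], [12, 35, 0.25], [16, 50, 0.35]]
at_risk = [[4, 10, 0.05], [6, 12, 0.10], [3, 8, 0.08]]

centroids = {"typical": centroid(typical), "at_risk": centroid(at_risk)}
print(classify([5, 9, 0.07], centroids))  # → at_risk
```

The real system would need many more features (and careful validation), but the principle is the same: reduced activity and reduced emotional disclosure shift a mother's feature vector toward the at-risk profile.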

Keyword: Depression; Hormones & Behavior
Link ID: 20583 - Posted: 02.16.2015

By Lizzie Wade SAN JOSE, CALIFORNIA—Humans have been using cannabis for more than 5000 years. So why don’t scientists know more about it? Three experts gathered here at the annual meeting of AAAS (which publishes Science) to discuss what scientists and doctors know about the drug and what they still need to learn. “By the end of this session, you’ll know more about cannabis than your physician does,” said Mark Ware, a family physician at the McGill University Health Center in Montreal, Canada, who organized the talk.

How does marijuana work?

Our brains are primed to respond to marijuana, because “there are chemicals in our own bodies that act like THC [the psychoactive ingredient in pot]” and other compounds in cannabis called cannabinoids, explained Roger Pertwee, a neuropharmacologist at the University of Aberdeen in the United Kingdom who has studied cannabinoids since the 1960s. Cannabinoids produced by our bodies or ingested through marijuana use react with a series of receptors in our brains called the endocannabinoid system, which is involved in appetite, mood, memory, and pain sensation. Scientists have discovered 104 cannabinoids so far, but “the pharmacology of most of them has yet to be investigated,” Pertwee said.

What are the known medical uses of marijuana?

Marijuana has been used for decades to stimulate appetite and treat nausea and vomiting, especially in patients undergoing chemotherapy. Its success in easing the symptoms of multiple sclerosis patients led to the development of Sativex, a drug manufactured by GW Pharmaceuticals that includes THC and cannabidiol (CBD), a cannabinoid that isn’t psychoactive. © 2015 American Association for the Advancement of Science

Keyword: Drug Abuse; Multiple Sclerosis
Link ID: 20582 - Posted: 02.16.2015

Smoking potent cannabis was linked to 24% of new psychosis cases analysed in a study by King's College London. The research suggests the risk of psychosis is three times higher for users of potent "skunk-like" cannabis than for non-users. The study of 780 people was carried out by KCL's Institute of Psychiatry, Psychology and Neuroscience. A Home Office spokesman said the report underlines the reasons why cannabis is illegal. Scientists found the risk of psychosis was five times higher for those who use it every day compared with non-users. They also concluded the use of hash, a milder form of the drug, was not associated with increased risk of psychosis. Psychosis refers to delusions or hallucinations that can be present in certain psychiatric conditions such as schizophrenia and bipolar disorder. "Compared with those who had never tried cannabis, users of high potency skunk-like cannabis had a threefold increase in risk of psychosis," said Dr Marta Di Forti, lead author on the research. She added: "The results show that psychosis risk in cannabis users depends on both the frequency of use and cannabis potency." Dr Di Forti told BBC Radio 4's Today programme that the availability of skunk-like cannabis was becoming more widespread. "In London, it's very difficult to find anything else," she said. "There were lots of reports from police across the UK saying we have become a great producer of skunk. And not only do we use it locally but we export, so this is a Made in England product." Someone suffering from psychosis would often be "extremely paranoid and become very suspicious" about the people around them, she added. She has called for "a clear public message" to cannabis users, comparable to medical advice on alcohol and tobacco. © 2015 BBC

Keyword: Drug Abuse; Schizophrenia
Link ID: 20581 - Posted: 02.16.2015

By Jane E. Brody Bereavement — how one responds and adjusts to the death of a loved one — is a very individual matter. It is natural to experience a host of negative reactions in the weeks and months following the loss of a loved one: among them, sadness, difficulty sleeping, painful reminders of the person, difficulty enjoying activities once shared, even anger. Grief is a normal human reaction, not a disease, and there is no one right way to get through it. Most often, within six months of a death, survivors adjust and are more or less able to resume usual activities, experience joy, and remember their loved ones without intense pain. But sometimes, even when the loss is neither sudden nor unexpected, as is true in the majority of deaths in the United States, survivors close to the deceased can experience extremely disruptive grief reactions that persist far longer. In a report last month in The New England Journal of Medicine, Dr. M. Katherine Shear presents a composite portrait of what is known as complicated grief, an extreme, unrelenting reaction to loss that persists for more than six months and can result in a serious risk to health. She describes a 68-year-old widow who continued to be seriously impaired by grief four years after her husband died. The woman slept on the couch because she could not bear to sleep in the bed she had shared with him. She found it too painful to engage in activities they used to do together. She no longer ate regular meals because preparing them was a too-distressing reminder of her loss. And she remained alternately angry with the medical staff who cared for him and with herself for not recognizing his illness earlier. Symptoms of complicated grief commonly include intense yearning, longing or emotional pain; frequent preoccupying, intrusive thoughts and memories of the person lost; a feeling of disbelief or inability to accept the loss; and difficulty imagining a meaningful life without that person. © 2015 The New York Times Company

Keyword: Emotions; Depression
Link ID: 20580 - Posted: 02.16.2015