Chapter 12. Psychopathology: Biological Basis of Behavioral Disorders
by Michael Slezak "Cannabis catastrophic for young brains" screamed the headline on an Australian medical news website this week. The article, and others like it, were reporting on a study linking teenage cannabis use with school dropouts, addiction and suicide, published in The Lancet Psychiatry. Echoing the research findings, the articles declared that if teenagers smoke cannabis daily, it makes them seven times more likely to commit suicide compared with non-users. Indeed, "there is no safe level of use", most reported. They also urged caution to legislators around the world who are gingerly taking steps towards weakening prohibition of cannabis, lest young people get easier access to it. So does smoking pot cause suicide? The Lancet authors say it probably does. Their study combined data from three previous longitudinal studies which together tracked cannabis use in more than 3000 people in Australia and New Zealand over many years. The authors looked for associations between the frequency of cannabis use before the age of 17 and outcomes, such as high school completion, until the people reached the age of 30. They found that those who smoked cannabis daily before they were 17 had lower odds of finishing high school and getting a degree than people who had never used cannabis. Larger increased odds were associated with cannabis dependence later in life, trying other illicit drugs and suicide attempts. But longitudinal studies only show correlation, not causation. The difficulty is that people take drugs for a reason, and that reason could be what's causing the outcome. In the case of school dropout, suicide and daily pot smoking, it is not hard to imagine what else could be going on in someone's life to engender these behaviours. © Copyright Reed Business Information Ltd
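A note on the statistics behind the "seven times more likely" headline: longitudinal studies like this one typically report odds ratios, which are not the same as relative risk and can overstate "times more likely" when an outcome is common. The sketch below uses entirely made-up counts (not the study's data) to show how the two measures are computed from a 2×2 table and why they diverge.

```python
# Illustrative only: hypothetical cohort counts, NOT figures from the
# Lancet Psychiatry study. Exposure = daily cannabis use before 17.

def odds_ratio(exposed_cases, exposed_noncases, unexposed_cases, unexposed_noncases):
    """Odds ratio from a 2x2 table: (a/b) / (c/d)."""
    return (exposed_cases / exposed_noncases) / (unexposed_cases / unexposed_noncases)

def relative_risk(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Relative risk: outcome rate in exposed divided by rate in unexposed."""
    return (exposed_cases / exposed_total) / (unexposed_cases / unexposed_total)

# Hypothetical: 100 daily users (30 with the outcome), 2900 never-users (300 with it)
a, b = 30, 70
c, d = 300, 2600

print(round(odds_ratio(a, b, c, d), 2))             # 3.71 -- the "headline" number
print(round(relative_risk(a, a + b, c, c + d), 2))  # 2.9  -- the actual risk multiple
```

With these made-up numbers the odds ratio (3.71) is noticeably larger than the relative risk (2.9), which is one reason "X times more likely" headlines built on odds ratios deserve the caution the article urges.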
By MICHAEL HEDRICK I can remember the early days of having schizophrenia. I was so afraid of the implications of subtle body language, like a lingering millisecond of eye contact, the way my feet hit the ground when I walked or the way I held my hands to my side. It was a struggle to go into a store or, really, anywhere I was bound to see another living member of the human species. With a simple scratch of the head, someone could be telling me to go forward, or that what I was doing was right or wrong, or that they were acknowledging the symbolic crown on my head that made me a king or a prophet. It’s not hard to imagine that I was having a tough time in the midst of all the anxiety and delusions. Several months after my diagnosis, I took a job at a small town newspaper as a reporter. I sat in on City Council meetings, covering issues related to the lowering water table and interviewing local business owners for small blurbs in the local section, all the while wondering if I was uncovering some vague connections to an international conspiracy. The nights were altogether different. Every day, I would come home to my apartment and smoke pot, then lie on my couch watching television or head out to the bar and get so hammered that I couldn’t walk. It’s hard to admit, but the only time I felt relaxed was when I was drunk. I eventually lost my newspaper job, but that wasn’t the catalyst for change. It all came to a head one night in July. I had been out drinking all night and, in a haze, I decided it would be a good idea to drive the two miles back to my apartment. This is something I had done several times before, but it had never dawned on me that it was a serious deal. I thought I was doing well, not swerving and being only several blocks from my house, when I saw flashing lights behind me. What started as a trip to the bar to unwind ended with me calling my parents to bail me out of jail at 3 a.m. © 2014 The New York Times Company
Being bullied regularly by a sibling could put children at risk of depression when they are older, a study led by the University of Oxford suggests. Around 7,000 children aged 12 were asked if they had experienced a sibling saying hurtful things, hitting, ignoring or lying about them. The children were followed up at 18 and asked about their mental health. A charity said parents should deal with sibling rivalry before it escalates. Previous research has suggested that victims of peer bullying can be more susceptible to depression, anxiety and self-harm. This study claims to be the first to examine bullying by brothers or sisters during childhood for the same psychiatric problems in early adulthood. Researchers from the Universities of Oxford, Warwick and Bristol and University College London sent questionnaires to thousands of families with 12-year-old children in 2003-04 and went back to them six years later to assess their mental health. If they had siblings they were asked about bullying by brothers and sisters. The questionnaire said: "This means when a brother or sister tries to upset you by saying nasty and hurtful things, or completely ignores you from their group of friends, hits, kicks, pushes or shoves you around, tells lies or makes up false rumours about you." Most children said they had not experienced bullying. Of these, at 18, 6.4% had depression scores in the clinically significant range, 9.3% experienced anxiety and 7.6% had self-harmed in the previous year. The 786 children who said they had been bullied by a sibling several times a week were found to be twice as likely to have depression, self-harm and anxiety as the other children. BBC © 2014
By S. Matthew Liao As many as 20 percent of war veterans return from combat in Afghanistan and Iraq with post-traumatic stress disorder (PTSD) or major depression, according to a 2008 report from the RAND Corporation. Many experience constant nightmares and flashbacks and many can’t live normal lives. For a significant number of veterans, available medications do not seem to help. In 2010, at least 22 veterans committed suicide each day, according to the Department of Veterans Affairs. In her book, Demon Camp, the author Jen Percy describes damaged veterans who have even resorted to exorcism to alleviate their PTSD symptoms. As part of President Obama’s BRAIN Initiative, the federal Defense Advanced Research Projects Agency (DARPA) plans to spend more than $70 million over five years to develop novel devices that would address neurological disorders such as PTSD. DARPA is particularly interested in a technology called Deep Brain Stimulation (DBS). DBS involves inserting a thin electrode through a small opening in the skull into a specific area in the brain; the electrode is then connected by an insulated wire to a battery pack underneath the skin; the battery pack then sends electrical pulses via the wire to the brain. About 100,000 people around the world today have a DBS implant to ameliorate the effects of Parkinson’s disease, epilepsy and major depression. There is evidence that DBS can also help with PTSD. Functional neuroimaging studies indicate that amygdala hyperactivity is responsible for the symptoms of PTSD and that DBS can functionally reduce the activity of the amygdala. In animal PTSD models, DBS has been found to be more effective than current treatment using selective serotonin reuptake inhibitors. © 2014 Scientific American
On 5th May, 1953, the novelist Aldous Huxley dissolved four-tenths of a gram of mescaline in a glass of water, drank it, then sat back and waited for the drug to take effect. Huxley took the drug in his California home under the direct supervision of psychiatrist Humphry Osmond, to whom Huxley had volunteered himself as “a willing and eager guinea pig”. Osmond was one of a small group of psychiatrists who pioneered the use of LSD as a treatment for alcoholism and various mental disorders in the early 1950s. He coined the term psychedelic, meaning ‘mind manifesting’ and although his research into the therapeutic potential of LSD produced promising initial results, it was halted during the 1960s for social and political reasons. Born in Surrey in 1917, Osmond studied medicine at Guy’s Hospital, London. He served in the navy as a ship’s psychiatrist during World War II, and afterwards worked in the psychiatric unit at St. George’s Hospital, London, where he became a senior registrar. While at St. George’s, Osmond and his colleague John Smythies learned about Albert Hofmann’s discovery of LSD at the Sandoz Pharmaceutical Company in Basel, Switzerland. Osmond and Smythies started their own investigation into the properties of hallucinogens and observed that mescaline produced effects similar to the symptoms of schizophrenia, and that its chemical structure was very similar to that of the hormone and neurotransmitter adrenaline. This led them to postulate that schizophrenia was caused by a chemical imbalance in the brain, but these ideas were not favourably received by their colleagues. In 1951 Osmond took a post as deputy director of psychiatry at the Weyburn Mental Hospital in Saskatchewan, Canada and moved there with his family. Within a year, he began collaborating on experiments using LSD with Abram Hoffer. Osmond tried LSD himself and concluded that the drug could produce profound changes in consciousness. 
Osmond and Hoffer also recruited volunteers to take LSD and theorised that the drug was capable of inducing a new level of self-awareness which may have enormous therapeutic potential. © 2014 Guardian News and Media Limited
By Greta Kaul Stanford researchers say poor sleep may be an independent risk factor for suicide in adults over 65. Researchers used data from a previous epidemiological study to compare the sleep quality of 20 older adults who committed suicide and 400 who didn't, over 10 years. Researchers found that those who didn't sleep well were 1.4 times more likely to commit suicide within a decade. Older adults have disproportionately high suicide rates in the first place, especially older men. The Stanford researchers believe that on its own, sleeping poorly could be a risk factor for suicide later in life. It may even be a more powerful predictor of suicide risk than symptoms of depression. They found that the strongest predictor of suicide was the combination of bad sleep and depression. Unlike many biological, psychological and social risk factors for suicide, sleep disorders tend to be treatable, said Rebecca Bernert, the lead author of the study. Sleep disorders are also less stigmatized than other suicide risk factors. Bernert is now studying whether treating insomnia is effective in preventing depression and suicide. The study was published in JAMA Psychiatry in August. © 2014 Hearst Communications, Inc.
By RONI CARYN RABIN Pregnant women often go to great lengths to give their babies a healthy start in life. They quit smoking, skip the chardonnay, switch to decaf, forgo aspirin. They say no to swordfish and politely decline Brie. Yet they rarely wean themselves from popular selective serotonin reuptake inhibitor antidepressants like Prozac, Celexa and Zoloft despite an increasing number of studies linking prenatal exposure to birth defects, complications after birth and even developmental delays and autism. Up to 14 percent of pregnant women take antidepressants, and the Food and Drug Administration has issued strong warnings that one of them, paroxetine (Paxil), may cause birth defects. But the prevailing attitude among doctors has been that depression during pregnancy is more dangerous to mother and child than any drug could be. Now a growing number of critics are challenging that assumption. “If antidepressants made such a big difference, and women on them were eating better, sleeping better and taking better care of themselves, then one would expect to see better birth outcomes among the women who took medication than among similar women who did not,” said Barbara Mintzes, an associate professor at the University of British Columbia School of Population and Public Health. “What’s striking is that there’s no research evidence showing that.” On the contrary, she said, “when you look for it, all you find are harms.” S.S.R.I.s are believed to work in part by blocking reabsorption (or reuptake) of serotonin, altering levels of this important neurotransmitter in the brain and elsewhere in the body. Taken by a pregnant woman, the drugs cross the placental barrier, affecting the fetus. © 2014 The New York Times Company
By GARY GREENBERG Joel Gold first observed the Truman Show delusion — in which people believe they are the involuntary subjects of a reality television show whose producers are scripting the vicissitudes of their lives — on Halloween night 2003 at Bellevue Hospital, where he was the chief attending psychiatrist. “Suspicious Minds,” which he wrote with his brother, Ian, an associate professor of philosophy and psychology at McGill University, is an attempt to use this delusion, which has been observed by many clinicians, to pose questions that have gone out of fashion in psychiatry over the last half-century: Why does a mentally ill person have the delusions he or she has? And, following the lead of the medical historian Roy Porter, who once wrote that “every age gets the lunatics it deserves,” what can we learn about ourselves and our times from examining the content of madness? The Golds’ answer is a dual broadside: against a psychiatric profession that has become infatuated with neuroscience as part of its longstanding attempt to establish itself as “real medicine,” and against a culture that has become too networked for its own good. Current psychiatric practice is to treat delusions as the random noise generated by a malfunctioning (and mindless) brain — a strategy that would be more convincing if doctors had a better idea of how the brain produced madness and how to cure it. According to the Golds, ignoring the content of delusions like T.S.D. can only make mentally ill people feel more misunderstood, even as it distracts the rest of us from the true significance of the delusion: that we live in a society that has put us all under surveillance. T.S.D. sufferers may be paranoid, but that does not mean they are wrong to think the whole world is watching. This is not to say they aren’t crazy. Mental illness may be “just a frayed, weakened version of mental health,” but what is in tatters for T.S.D. 
patients is something crucial to negotiating social life, and that, according to the Golds, is the primary purpose toward which our big brains have evolved: the ability to read other people’s intentions or, as cognitive scientists put it, to have a theory of mind. This capacity is double-edged. “The better you are at ToM,” they write, “the greater your capacity for friendship.” © 2014 The New York Times Company
One of the best things about being a neuroscientist used to be the aura of mystery around it. It was once so mysterious that some people didn’t even know it was a thing. When I first went to university and people asked what I studied, they thought I was saying I was a “Euroscientist”, which is presumably someone who studies the science of Europe. I’d get weird questions such as “what do you think of Belgium?” and I’d have to admit that, in all honesty, I never think of Belgium. That’s how mysterious neuroscience was, once. Of course, you could say this confusion was due to my dense Welsh accent, or the fact that I only had the confidence to talk to strangers after consuming a fair amount of alcohol, but I prefer to go with the mystery. It’s not like that any more. Neuroscience is “mainstream” now, to the point where the press coverage of it can be studied extensively. When there’s such a thing as Neuromarketing (well, there isn’t actually such a thing, but there’s a whole industry that would claim otherwise), it’s impossible to maintain that neuroscience is “cool” or “edgy”. It’s a bad time for us neurohipsters (which are the same as regular hipsters, except the designer beards are on the frontal lobes rather than the jaw-line). One way that we professional neuroscientists could maintain our superiority was by correcting misconceptions about the brain, but lately even that avenue looks to be closing to us. The recent film Lucy is based on the most classic brain misconception: that we only use 10% of our brain. But it’s had a considerable amount of flak for this already, suggesting that many people are wise to this myth. We also saw the recent release of Susan Greenfield’s new book Mind Change, all about how technology is changing (damaging?) our brains. This is a worryingly evidence-free but very common claim by Greenfield. Depressingly common, as this blog has pointed out many times. But now even the non-neuroscientist reviewers aren’t buying her claims. 
© 2014 Guardian News and Media Limited
By Roni Jacobson Almost immediately after Albert Hofmann discovered the hallucinogenic properties of LSD in the 1940s, research on psychedelic drugs took off. These consciousness-altering drugs showed promise for treating anxiety, depression, post-traumatic stress disorder (PTSD), obsessive-compulsive disorder (OCD) and addiction, but increasing government conservatism caused a research blackout that lasted decades. Lately, however, there has been a resurgence of interest in psychedelics as possible therapeutic agents. This past spring Swiss researchers published results from the first drug trial involving LSD in more than 40 years. Although the freeze on psychedelic research is thawing, scientists say that restrictive drug policies are continuing to hinder their progress. In the U.S., LSD, psilocybin, MDMA, DMT, peyote, cannabis and ibogaine (a hallucinogen derived from an African shrub) are all classified as Schedule I illegal drugs, which the U.S. Drug Enforcement Administration defines as having a high potential for abuse and no currently accepted medical applications—despite extensive scientific evidence to the contrary. In a joint report released in June, the Drug Policy Alliance and the Multidisciplinary Association for Psychedelic Studies catalogue several ways in which they say that the DEA has unfairly obstructed research on psychedelics, including by overruling an internal recommendation in 1986 that MDMA be placed on a less restrictive schedule. The DEA and the U.S. Food and Drug Administration maintain that there is insufficient research to justify recategorization. This stance creates a catch-22 by basing the decision on the need for more research while limiting the ability of scientists to conduct that research. © 2014 Scientific American
By James Gallagher Health editor, BBC News website Breastfeeding can halve the risk of post-natal depression, according to a large study of 14,000 new mothers. However, there is a large increase in the risk of depression in women planning to breastfeed who are then unable to do so. The study, published in the journal Maternal and Child Health, called for more support for women unable to breastfeed. A parenting charity said mental health was a "huge issue" for many mothers. The health benefits of breastfeeding to the baby are clear-cut and the World Health Organization recommends feeding a child nothing but breast milk for the first six months. However, researchers at the University of Cambridge said the impact on the mother was not as clearly understood. 'Highest risk' One in 10 women will develop depression after the birth of their child. The researchers analysed data from 13,998 births in the south-west of England. It showed that, out of women who were planning to breastfeed, there was a 50% reduction in the risk of post-natal depression if they started breastfeeding. But the risk of depression more than doubled among women who wanted to, but were not able to, breastfeed. Dr Maria Iacovou, one of the researchers, told the BBC: "Breastfeeding does appear to have a protective effect, but there's the other side of the coin as well. "Those who wanted to and didn't end up breastfeeding had the highest risk of all the groups." BBC © 2014
By Claudia M. Gold When I hear debate over the association between SSRI’s (selective serotonin re-uptake inhibitors, a class of antidepressant medication) and suicidal behavior in children and adolescents, I am immediately brought back to a night in the early 2000's. As the covering pediatrician I was called to the emergency room to see a young man, a patient of a pediatrician in a neighboring town, who had attempted suicide by taking a nearly lethal overdose. That night, as I watched over him in the intensive care unit, I learned that he was a high achieving student and athlete who, struggling under the pressures of the college application process, had been prescribed an SSRI by his pediatrician. His parents described a transformation in his personality over the months preceding the suicide attempt that was so dramatic that I ordered a CT scan to see if he had a brain tumor. It was normal. When, in the coming years, the data emerged about increasing suicidal behavior following use of SSRI's, I recognized in retrospect that his change in behavior was a result of the medication. But at the time I knew nothing of these serious side effects. At that time, coinciding with the pharmaceutical industry's aggressive marketing campaign directed at the public as well as a professional audience, these drugs were becoming increasingly popular with pediatricians. As the possible serious side effects of these medications came increasingly into awareness, the FDA issued the controversial "black box warning" that the drugs carried an increased risk of suicidal behavior. Following the black box warning, pediatricians, myself included, became reluctant to prescribe these medications. We did not have the time or experience to provide the recommended increased monitoring and close follow-up.
By Rachel Feltman At every waking moment, your brain is juggling two very different sets of information. Input from the world around you, like sights and smells, has to be processed. But so does internal information — your memories and thoughts. Right now, for example, I’m looking at a peach: It’s yellow and pink, and has a lot of fuzz. But I also know that it smells nice (a personal assessment) and I’m imagining how good it will taste, based on my previous experience with fragrant pink fruits. The brain’s ability to handle these different signals is key to cognitive function. In some disorders, particularly autism and schizophrenia, this ability is disrupted. The brain has difficulty keeping internal and external input straight. In a new study published Thursday in Cell, researchers observe the switching method in action for the first time. While the research used mice, not humans, principal investigator and NYU Langone Medical Center assistant professor Michael Halassa sees this as a huge step toward understanding and manipulating the same functions in humans. “This is one of the few moments in my life where I’d actually say yes, absolutely this is going to translate to humans,” Halassa said. “This isn’t something based on genes or molecules that are specific to one organism. The underlying principles of how the brain circuitry works are likely to be very similar in humans and mice.” That circuitry has been hypothesized for decades. Neurologists know that the cortex of the brain is responsible for higher cognitive functions, like music and language. And the thalamus, which is an egg-like structure in the center of the brain, works to direct the flow of internal and external information before it gets to the cortex.
By Lenny Bernstein Comedian Robin Williams was grappling with severe depression when he committed suicide Monday, and on Thursday we learned that he also was in the early stages of Parkinson's disease. Sadly, the two conditions are often found together. In a 2012 study conducted by the National Parkinson Foundation, 61 percent of 5,557 Parkinson's patients surveyed reported that they also suffered from depression, with symptoms that ranged from mild to severe. Both conditions are associated with a shortage of dopamine, a neurotransmitter that helps regulate movement and control the brain's pleasure center. "Dopamine is a feel-good chemical. If you are low in dopamine, you are not going to feel so good," said Joyce Oberdorf, president and CEO of the National Parkinson Foundation. "There are [also] other neurotransmitters that can be low." A separate study published Friday found that newly-diagnosed Parkinson's patients have higher rates of depression, anxiety, fatigue, and apathy than a control group of people without Parkinson's. Researchers from the Raymond and Ruth Perelman School of Medicine at the University of Pennsylvania found that 13.9 percent of patients had symptoms of depression when they were diagnosed with Parkinson's, a proportion that rose to 18.7 percent after 24 months. Just 6.6 percent of people without the disease had depression, and that dropped to just 2.4 percent after 24 months. Despite their depressive symptoms, most of the Parkinson's patients who also had that condition were not treated with anti-depressants at any point in the two-year study. The findings were published in the journal Neurology.
By Rebecca Boyle Like a dog wagging its tail in anticipation of treats to come, dolphins and belugas squeal with pleasure at the prospect of a fish snack, according to a new study. It’s the first direct demonstration of an excitement call in these animals, says Peter Madsen, a biologist at Aarhus University in Denmark who was not involved in the study. To hunt and communicate, dolphins and some whale species produce a symphony of clicks, whistles, squeaks, brays, and moans. Sam Ridgway, a longtime marine biologist with the U.S. Navy’s Marine Mammal Program, says he heard distinctive high-pitched squeals for the first time in May 1963 while training newly captured dolphins at the Navy’s facility in Point Mugu, California. “We were throwing fish in, and each time they would catch a fish, they would make this sound,” he says. He describes it as a high-pitched “eeee,” like a child squealing in delight. Ridgway and his collaborators didn’t think much of the sound until later in the 1960s, when dolphins trained to associate a whistle tone with a task or behavior also began making it. Trainers teach animals a task by rewarding them with a treat and coupling it with a special noise, like a click or a whistle. Eventually only the sound is used, letting the animal know it will get a treat later. The whistle was enough to provoke a victory squeal, Ridgway says. Meanwhile, beluga whales would squeal after diving more than 600 meters to switch off an underwater speaker broadcasting tones. “As soon as the tone went off, they would make this same sound,” Ridgway says, “despite the fact that they’re not going to get a reward for five minutes.” He also heard the squeal at marine parks in response to trainers’ whistles. © 2014 American Association for the Advancement of Science.
By Greta Kaul It was a rainy day, and earthworms wriggled out of the ground and began to arrange themselves on the pavement as Julian Plumadore walked to his community college zoology class in 1991. They spelled out messages only he could read. "I was very frightened to be a custodian of that kind of cosmic information and be able to do absolutely nothing about it," Plumadore said. Other times, there were voices - demons screaming - telling him he was going to hell. Plumadore was eventually diagnosed as having schizoaffective disorder, a psychosis that combines the hallucinations of schizophrenia with a mood disorder like depression. People with psychotic disorders, of which schizophrenia is the most severe, have hallucinations, like the voices Plumadore was hearing, that are divorced from reality. Now, a Stanford researcher suggests that the voices he experienced might have been different if he had grown up somewhere other than the U.S. If he were from India, he might have heard family members telling him to do household chores. If he were from Ghana, he might have heard the voice of God guiding him. For a study published in June, Tanya Luhrmann, a Stanford anthropologist, and other researchers interviewed 60 people who met the criteria for schizophrenia: 20 from in and around San Mateo, 20 from India and 20 from Ghana. Though the patients heard both positive and negative voices no matter where they were from, those in India and in Ghana tended to have less negative experiences than Americans: They could more often identify who was talking to them and had less violent hallucinations. Though the study isn't conclusive, Luhrmann believes the differences in voice-hearing between cultures may be a clue into how social expectations and environment shape the way people hear those imaginary voices. © 2014 Hearst Communications, Inc.
Hearing voices is an experience that is very distressing for many people. Voices – or “auditory verbal hallucinations” – are one of the most common features of schizophrenia and other psychiatric disorders. But for a small minority of people, voice-hearing is a regular part of their lives, an everyday experience that isn’t associated with being unwell. It is only in the past 10 years that we have begun to understand what might be going on in “non-clinical” voice-hearing. Most of what we know comes from a large study conducted by Iris Sommer and colleagues at UMC Utrecht in the Netherlands. In 2006 they launched a nationwide attempt to find people who had heard voices before but didn’t have any sort of psychiatric diagnosis. From an initial response of over 4,000 people, they eventually identified a sample of 103 who heard voices at least once a month, but didn’t have psychosis. Their voice-hearing was also not caused by misuse of drugs or alcohol. Twenty-one of the participants were also given an MRI scan. When this group was compared with voice-hearers who did have psychosis, many of the same brain regions were active for both groups while they were experiencing auditory hallucinations, including the inferior frontal gyrus (involved in speech production) and the superior temporal gyrus (linked to speech perception). Subsequent studies with the same non-clinical voice-hearers have also highlighted differences in brain structure and functional connectivity (the synchronisation between different brain areas) compared with people who don’t hear voices. These results suggest that, on a neural level, the same sort of thing is going on in clinical and non-clinical voice-hearing. We know from first-person reports that the voices themselves can be quite similar, in terms of how loud they are, where they are coming from, and whether they speak in words or sentences. © 2014 Guardian News and Media Limited
The news of Robin Williams’s suicide has brought mental health into the spotlight this week. According to data from the Massachusetts Violent Death Reporting System at the department of public health, the number of deaths per year as a result of suicide in the state has increased 4 percent per year since 2003. The rate increased from 424 suicides in 2003 to a peak of 600 in 2010, before dropping back down to 588. That’s 8.9 suicides per 100,000, a total of 4,500 deaths for this preventable public health problem. There are many biological, sociological, and psychological risk factors that can increase an individual’s risk for committing suicide. But did you know that poor sleep could be a major factor pushing people over the edge, even if they aren’t depressed? We all know that when we’re short on sleep, we aren’t quite ourselves, but according to the Substance Abuse and Mental Health Services Administration, sleep complaints are actually one of the top 10 warning signs for suicide. A study published today in JAMA Psychiatry is the first research of its kind to draw a correlation between poor sleep habits and an increased risk for death by suicide by controlling for signs of depression. Stanford University School of Medicine researchers have found that over a 10 year observation period, people with poor sleep quality and no other depressive symptoms demonstrated a 1.2 times greater risk for death by suicide.
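"Controlling for signs of depression" is the crux of the Stanford finding: because depression travels with both poor sleep and suicide risk, a crude comparison can inflate the sleep effect. A minimal sketch of the idea, using entirely invented counts (not the study's data, which used different methods) and a simple Mantel-Haenszel stratified risk ratio as the adjustment:

```python
# Illustrative sketch with made-up numbers: exposure = poor sleep, and each
# stratum of the confounder (depressed / not depressed) is a tuple of
# (exposed_events, exposed_total, unexposed_events, unexposed_total).

def risk_ratio(e_ev, e_n, u_ev, u_n):
    """Crude risk ratio: event rate in the exposed over rate in the unexposed."""
    return (e_ev / e_n) / (u_ev / u_n)

def mantel_haenszel_rr(strata):
    """Mantel-Haenszel summary risk ratio, adjusting for the stratifier."""
    num = sum(e_ev * u_n / (e_n + u_n) for e_ev, e_n, u_ev, u_n in strata)
    den = sum(u_ev * e_n / (e_n + u_n) for e_ev, e_n, u_ev, u_n in strata)
    return num / den

depressed     = (12, 200, 5, 100)  # 6.0% vs 5.0% -> stratum risk ratio 1.2
not_depressed = (2, 100, 5, 300)   # 2.0% vs 1.7% -> stratum risk ratio 1.2

# Pooling everyone hides the fact that depression goes with both poor sleep
# and the outcome, so the crude ratio overstates the sleep effect:
crude = risk_ratio(12 + 2, 200 + 100, 5 + 5, 100 + 300)   # ~1.87
adjusted = mantel_haenszel_rr([depressed, not_depressed])  # 1.2
print(round(crude, 2), round(adjusted, 2))
```

In this toy example the pooled ratio (~1.87) shrinks to 1.2 once depression is held fixed, which is the kind of adjusted estimate the article's "1.2 times greater risk" refers to.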
By EDWARD LARKIN and IRENE HURFORD PHILADELPHIA — A FEW months ago, a patient came to our hospital, seeking help. One of us, Edward, was on the team that treated him. He was pleasant, if slightly withdrawn, and cogent. He was a college graduate in his 20s and had recently been fired from his job as a high school math teacher, because of unexpected absences. He had come to believe that government agents were conspiring against him, and he had taken to living out of a truck and sleeping in different parking lots. By the time he came to us, he was exhausted. A diagnosis became clear: he had schizophrenia. We admitted him to the hospital, and after a few days, with his symptoms under control, we released him. Unfortunately, we prescribed a medication for him that could cause significant, permanent harm, instead of an equally effective drug with milder side effects — all because he was uninsured. Schizophrenia, which affects 1 percent of the population and emerges in the late teens to early 20s, is deeply misunderstood. People who suffer from it are often suspected of being dangerous, but this is not usually the case, and antipsychotic drugs are very effective. Our patient was exactly the kind of person who, with the right treatment, could have weakened the stigma surrounding schizophrenia. Antipsychotic drugs fall into two classes: the older ones, like Haldol, and newer ones, like Abilify and Latuda. Both classes are equally effective at treating some of the worst symptoms of schizophrenia, specifically the hallucinations, delusions and paranoia that cause social alienation. (They’re not effective for treating “negative symptoms,” like low motivation.) But the older drugs can cause a multitude of serious side effects, including a potentially devastating one called tardive dyskinesia. This condition involves unsettling, animalistic smacking and wagging of the lips and tongue. At its extreme, it can affect the entire body. 
It occurs in 20 percent or more of patients who take the drugs long-term, and it tends to start so mildly that patients can’t identify it in time to stop taking the drugs. It is often irreversible. © 2014 The New York Times Company
By MICHAEL CIEPLY and BROOKS BARNES LOS ANGELES — Peering through his camera at Robin Williams in 2012, the cinematographer John Bailey thought he glimpsed something not previously evident in the comedian’s work. They were shooting the independent film “The Angriest Man in Brooklyn,” and Mr. Williams was playing a New York lawyer who, facing death, goes on a rant against the injustice and banality of life. His performance, Mr. Bailey said Tuesday, was a window into the “Swiftian darkness of Robin’s heart.” The actor, like his character, was raging against the storm. That defiance gave way on Monday to the personal demons that had long tormented Mr. Williams. With his suicide at age 63, Mr. Williams forever shut the window on a complicated soul that was rarely visible through the cracks of an astonishingly intact career. Given his well-publicized struggles with depression, addiction and alcoholism, and a major heart operation in 2009, Mr. Williams should have had a résumé filled with mysterious gaps. Instead, he worked nonstop. At the very least — if his life had followed the familiar script of troubled actors — there would have been whispers of on-set antics: lateness, forgotten lines, the occasional flared temper. Not so with Mr. Williams. “He was ready to work, he was the first one on the set,” said Mr. Bailey, speaking of Mr. Williams’s contribution to “The Angriest Man in Brooklyn,” of which he was the star. “Robin was always 1,000 percent reliable,” said a senior movie agent, speaking on the condition of anonymity to conform to the wishes of Mr. Williams’s family. “He was almost impossibly high functioning.” As Hollywood struggled on Tuesday to understand how Mr. Williams — effervescent in the extreme — could take his own life, authorities released details of his death. A clothed Mr. Williams hanged himself with a belt from a door frame in his bedroom in Tiburon, Calif., according to Lt. Keith Boyd, assistant deputy chief coroner for Marin County.
© 2014 The New York Times Company