Chapter 11. Emotions, Aggression, and Stress
When it comes to fight or flight for brawling crickets, a chemical in the brain is in charge. Being roughed up in a skirmish can trigger nerve cells in Mediterranean field crickets (Gryllus bimaculatus) to release nitric oxide, making the losing cricket run away, scientists report online March 13 in Science Advances. When a losing cricket hits its limit, it flees the fight. In a second bout, the loser then tries to avoid the winner. Nitric oxide prompts this continued submissive behavior, which lasts several hours before a cricket’s will to fight returns. “If you block nitric oxide they recover quickly, and if you give them nitric oxide they don’t,” says Paul Stevenson, a coauthor of the new research and a behavioral neurobiologist at Leipzig University in Germany. “It’s a very simple algorithm for controlling a very complicated social situation.” P. Stevenson and J. Rillich. Adding up the odds—Nitric oxide signaling underlies the decision to flee and post-conflict depression of aggression. Science Advances. Published online March 13, 2015. doi: 10.1126/sciadv.1500060. © Society for Science & the Public 2000 - 2015.
Link ID: 20686 - Posted: 03.14.2015
By Nicholas Bakalar People sometimes take Valium or Ativan to relieve anxiety before surgery, but a new study suggests that these benzodiazepine drugs have little beneficial effect and may even delay recovery. Researchers studied 1,062 patients admitted to French hospitals for surgery requiring general anesthesia. A third took 2.5 milligrams of lorazepam (brand name Ativan), a third received a placebo, and a third were given no premedication. Patients completed questionnaires assessing anxiety, pain levels and quality of sleep before and a day after their operations, while researchers recorded the time until patients’ ventilation tubes were removed and until they regained full wakefulness. The study was published in JAMA. Lorazepam was associated with more postsurgery amnesia and a longer time to recover cognitive abilities. Quality of sleep was impaired in the lorazepam group, but not in the others. And ventilation tubes were kept in significantly longer in the lorazepam group. Pain scores did not differ between the lorazepam and the no-medication groups, but there was more pain in the group given the placebo. The lead author, Dr. Axel Maurice-Szamburski, an anesthesiologist at Timone Hospital in Marseille, cited recent surveys showing that benzodiazepines are widely prescribed before surgery. “But until now,” he added, “sedatives have not been evaluated from the patient’s point of view. It’s the patient who should be happy, not the doctor.” © 2015 The New York Times Company
Link ID: 20676 - Posted: 03.10.2015
By TIMOTHY WILLIAMS In January 1972, Cecil Clayton was cutting wood at his family’s sawmill in southeastern Missouri when a piece of lumber flew off the circular saw blade and struck him in the forehead. The impact caved in part of Mr. Clayton’s skull, driving bone fragments into his brain. Doctors saved his life, but in doing so had to remove 20 percent of his frontal lobe, which psychiatrists say led Mr. Clayton to be tormented for years by violent impulses, schizophrenia and extreme paranoia. In 1996, his lawyers say, those impulses drove Mr. Clayton to kill a law enforcement officer. Today, as Mr. Clayton, 74, sits on death row, his lawyers have returned to that 1972 sawmill accident in a last-ditch effort to save his life, arguing that Missouri’s death penalty law prohibits the execution of severely brain-damaged people. Lawyers for Mr. Clayton, who has an I.Q. of 71, say he should be spared because his injury has made it impossible for him to grasp the significance of his death sentence, scheduled for March 17. “There was a profound change in him that he doesn’t understand, and neither did his family,” said Elizabeth Unger Carlyle, one of Mr. Clayton’s lawyers. While several rulings by the United States Supreme Court in recent years have narrowed the criteria for executing people who have a mental illness, states continue to hold wide sway in establishing who is mentally ill. The debate surrounding Mr. Clayton involves just how profoundly his impairment has affected his ability to understand what is happening to him. Mr. Clayton is missing about 7.7 percent of his brain. © 2015 The New York Times Company
By RICHARD A. FRIEDMAN CHANCES are that everyone on this planet has experienced anxiety, that distinct sense of unease and foreboding. Most of us probably assume that anxiety always has a psychological trigger. Yet clinicians have long known that there are plenty of people who experience anxiety in the absence of any danger or stress and haven’t a clue why they feel distressed. Despite years of psychotherapy, many experience little or no relief. It’s as if they suffer from a mental state that has no psychological origin or meaning, a notion that would seem heretical to many therapists, particularly psychoanalysts. Recent neuroscience research explains why, in part, this may be the case. For the first time, scientists have demonstrated that a genetic variation in the brain makes some people inherently less anxious, and more able to forget fearful and unpleasant experiences. This lucky genetic mutation produces higher levels of anandamide — the so-called bliss molecule and our own natural marijuana — in our brains. In short, some people are prone to be less anxious simply because they won the genetic sweepstakes and randomly got a genetic mutation that has nothing at all to do with strength of character. About 20 percent of adult Americans have this mutation. Those who do may also be less likely to become addicted to marijuana and, possibly, other drugs — presumably because they don’t need the calming effects that marijuana provides. One patient of mine, a man in his late 40s, came to see me because he was depressed and lethargic. He told me at our first meeting that he had been using cannabis almost daily for at least the past 15 years. “It became a way of life,” he explained. “Things are more interesting, and I can tolerate disappointments without getting too upset.” © 2015 The New York Times Company
By Will Boggs MD NEW YORK (Reuters Health) - Adolescents with a history of childhood trauma show different neural responses to subjective anxiety and craving, researchers report. "I think the finding of increased activation of insula, anterior cingulate, and prefrontal cortex in response to stress cues in the high- relative to low-trauma group, while arguably not necessarily unexpected, is important as it suggests that youth exposed to higher levels of trauma may experience different brain responses to similar stressors," Dr. Marc N. Potenza from Yale University, New Haven, Connecticut, told Reuters Health by email. Childhood trauma has been associated with anxiety and depression, as well as obesity, risky sexual behavior, and substance use. Previous imaging studies have not investigated neural responses to personalized stimuli, Dr. Potenza and his colleagues write in Neuropsychopharmacology, online January 8. The team used functional MRI to assess regional brain activations to personalized appetitive (favorite food), aversive (stress), and neutral/relaxing cues in 64 adolescents, including 33 in the low-trauma group and 31 in the high-trauma group. Two-thirds of the adolescents had been exposed to cocaine prenatally, with prenatal cocaine exposure being significantly over-represented in the high-trauma group. Compared with the low-trauma group, the high-trauma group showed increased responsivity in several cortical regions in response to stress, as well as decreased activation in the cerebellar vermis and right cerebellum in response to neutral/relaxing cues. But the two groups did not differ significantly in their responses to favorite-food cues, the researchers found. © 2015 Scientific American
By Nicholas Bakalar Gout, a form of arthritis, is extremely painful and associated with an increased risk for cardiovascular problems. But there is a bright side: It may be linked to a reduced risk for Alzheimer’s disease. Researchers compared 59,204 British men and women with gout to 238,805 without the ailment, with an average age of 65. Patients were matched for sex, B.M.I., smoking, alcohol consumption and other characteristics. The study, in The Annals of the Rheumatic Diseases, followed the patients for five years. The researchers found 309 cases of Alzheimer’s among those with gout and 1,942 among those without. Those with gout, whether they were being treated for the condition or not, had a 24 percent lower risk of Alzheimer’s disease. The reason for the connection is unclear. But gout is caused by excessive levels of uric acid in the blood, and previous studies have suggested that uric acid protects against oxidative stress. This may play a role in limiting neuron degeneration. “This is a dilemma, because uric acid is thought to be bad, associated with heart disease and stroke,” said the senior author, Dr. Hyon K. Choi, a professor of medicine at Harvard. “This is the first piece of data suggesting that uric acid isn’t all bad. Maybe there is some benefit. It has to be confirmed in randomized trials, but that’s the interesting twist in this story.” © 2015 The New York Times Company
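The raw counts quoted above can be sanity-checked with a back-of-the-envelope calculation. Note that the crude (unadjusted) risk ratio works out lower than the 24 percent reduction reported in the paper, which is a covariate-adjusted figure; this sketch ignores follow-up time and matching and is not the study's actual analysis:

```python
# Crude incidence check for the gout/Alzheimer's figures quoted in the article.
# The published 24% reduction is a covariate-adjusted estimate; this simple
# ratio of cumulative incidences will therefore differ from it.

gout_cases, gout_n = 309, 59_204           # Alzheimer's cases among gout patients
control_cases, control_n = 1_942, 238_805  # cases among matched non-gout patients

gout_incidence = gout_cases / gout_n            # ~0.52% over five years
control_incidence = control_cases / control_n   # ~0.81% over five years

crude_risk_ratio = gout_incidence / control_incidence
print(f"crude risk ratio: {crude_risk_ratio:.2f}")  # ~0.64, i.e. ~36% lower crude risk
```

The gap between the crude ~36 percent and the reported 24 percent is expected: adjusting for matched characteristics and person-time at risk typically moves such estimates toward the null.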
By Charles Schmidt The notion that the state of our gut governs our state of mind dates back more than 100 years. Many 19th- and early 20th-century scientists believed that accumulating wastes in the colon triggered a state of “auto-intoxication,” whereby poisons emanating from the gut produced infections that were in turn linked with depression, anxiety and psychosis. Patients were treated with colonic purges and even bowel surgeries until these practices were dismissed as quackery. The ongoing exploration of the human microbiome promises to bring the link between the gut and the brain into clearer focus. Scientists are increasingly convinced that the vast assemblage of microfauna in our intestines may have a major impact on our state of mind. The gut-brain axis seems to be bidirectional—the brain acts on gastrointestinal and immune functions that help to shape the gut's microbial makeup, and gut microbes make neuroactive compounds, including neurotransmitters and metabolites that also act on the brain. These interactions could occur in various ways: microbial compounds communicate via the vagus nerve, which connects the brain and the digestive tract, and microbially derived metabolites interact with the immune system, which maintains its own communication with the brain. Sven Pettersson, a microbiologist at the Karolinska Institute in Stockholm, has recently shown that gut microbes help to control leakage through both the intestinal lining and the blood-brain barrier, which ordinarily protects the brain from potentially harmful agents. Microbes may have their own evolutionary reasons for communicating with the brain. They need us to be social, says John Cryan, a neuroscientist at University College Cork in Ireland, so that they can spread through the human population. © 2015 Scientific American
By JULIE HOLLAND WOMEN are moody. By evolutionary design, we are hard-wired to be sensitive to our environments, empathic to our children’s needs and intuitive of our partners’ intentions. This is basic to our survival and that of our offspring. Some research suggests that women are often better at articulating their feelings than men because as the female brain develops, more capacity is reserved for language, memory, hearing and observing emotions in others. These are observations rooted in biology, not intended to mesh with any kind of pro- or anti-feminist ideology. But they do have social implications. Women’s emotionality is a sign of health, not disease; it is a source of power. But we are under constant pressure to restrain our emotional lives. We have been taught to apologize for our tears, to suppress our anger and to fear being called hysterical. The pharmaceutical industry plays on that fear, targeting women in a barrage of advertising on daytime talk shows and in magazines. More Americans are on psychiatric medications than ever before, and in my experience they are staying on them far longer than was ever intended. Sales of antidepressants and antianxiety meds have been booming in the past two decades, and they’ve recently been outpaced by an antipsychotic, Abilify, that is the No. 1 seller among all drugs in the United States, not just psychiatric ones. As a psychiatrist practicing for 20 years, I must tell you, this is insane. At least one in four women in America now takes a psychiatric medication, compared with one in seven men. Women are nearly twice as likely to receive a diagnosis of depression or anxiety disorder than men are. For many women, these drugs greatly improve their lives. But for others they aren’t necessary. The increase in prescriptions for psychiatric medications, often by doctors in other specialties, is creating a new normal, encouraging more women to seek chemical assistance. 
Whether a woman needs these drugs should be a medical decision, not a response to peer pressure and consumerism. © 2015 The New York Times Company
Distinct changes in the immune systems of patients with ME or chronic fatigue syndrome have been found, say scientists. Increased levels of immune molecules called cytokines were found in people during the early stages of the disease, a Columbia University study reported. It said the findings could help improve diagnosis and treatments. UK experts said further refined research was now needed to confirm the results. People with ME (myalgic encephalopathy) or CFS (chronic fatigue syndrome) suffer from exhaustion that affects everyday life and does not go away with sleep or rest. They can also have muscle pain and difficulty concentrating. ME can also cause long-term illness and disability, although many people improve over time. It is estimated that around 250,000 people in the UK have the disease. The US research team, who published their findings in the journal Science Advances, tested blood samples from nearly 300 ME patients and around 350 healthy people. They found specific patterns of immune molecules in patients who had the disease for up to three years. These patients had higher levels of cytokines, particularly one called interferon gamma, which has been linked to the fatigue that follows many viral infections. Healthy participants and those who had the disease for longer than three years did not show the same pattern. Lead author Dr Mady Hornig said this was down to the way viral infections could disrupt the immune system. "It appears that ME/CFS patients are flush with cytokines until around the three-year mark, at which point the immune system shows evidence of exhaustion and cytokine levels drop."
By Matthew Hutson We like to think of our moral judgments as consistent, but they can be as capricious as moods. Research reveals that such judgments are swayed by incidental emotions and perceptions—for instance, people become more moralistic when they feel dirty or sense contamination, such as in the presence of moldy food. Now a series of studies shows that hippies, the obese and “trailer trash” suffer prejudicial treatment because they tend to elicit disgust. Researchers asked volunteers to read short paragraphs about people committing what many consider to be impure acts, such as watching pornography, swearing or being messy. Some of the paragraphs described the individuals as being a hippie, obese or trailer trash—and the volunteers judged these fictional sinners more harshly, according to the paper in the Journal of Experimental Psychology: General. Questionnaires revealed that feelings of disgust toward these groups were driving the volunteers' assessments. A series of follow-up studies solidified the link, finding that these groups also garnered greater praise for purity-related virtues, such as keeping a neat cubicle. If the transgression in question did not involve purity, such as not tipping a waiter, the difference in judgment disappeared. “The assumption people have is that we draw on values that are universal and important,” says social psychologist E. J. Masicampo of Wake Forest University, who led the study, “but something like mentioning that a person is overweight can really push that judgment around. It's triggering these gut-level emotions.” The researchers also looked for real-world effects. © 2015 Scientific American
By Christian Jarrett Imagine a politician from your party is in trouble for alleged misdemeanors. He’s been assessed by an expert who says he likely has early-stage Alzheimer’s. If this diagnosis is correct, your politician will have to resign, and he’ll be replaced by a candidate from an opposing party. This was the scenario presented to participants in a new study by Geoffrey Munro and Cynthia Munro. A vital twist was that half of the 106 student participants read a version of the story in which the dementia expert based his diagnosis on detailed cognitive tests; the other half read a version in which he used a structural MRI brain scan. All other story details were matched, such as the expert’s years of experience in the field, and the detail provided for the different techniques he used. Overall, the students found the MRI evidence more convincing than the cognitive tests. For example, 69.8 percent of those given the MRI scenario said the evidence the politician had Alzheimer’s was strong and convincing, whereas only 39.6 percent of students given the cognitive tests scenario said the same. MRI data was also seen to be more objective, valid and reliable. Focusing on just those students in both conditions who showed skepticism, over 15 percent who read the cognitive tests scenario mentioned the unreliability of the evidence; none of the students given the MRI scenario cited this reason. In reality, a diagnosis of probable Alzheimer’s will always be made with cognitive tests, with brain scans used to rule out other explanations for any observed test impairments. The researchers said their results are indicative of naive faith in the trustworthiness of brain imaging data. 
“When one contrasts the very detailed manuals accompanying cognitive tests to the absences of formalized operational criteria to guide the clinical interpretation of structural brain MRI in diagnosing disease, the perception that brain MRI is somehow immune to problems of reliability becomes even more perplexing,” they said. WIRED.com © 2015 Condé Nast.
By Francis Shen and Dena Gromet Neuroscience is appearing everywhere. And the legal system is taking notice. The past few years have seen the emergence of “neurolaw.” A spread in the NYT Magazine, a best-selling NYT book, a primetime PBS documentary, the first Law and Neuroscience casebook, and a multimillion-dollar investment from the MacArthur Foundation to fund a Research Network on Law and Neuroscience have all fueled interest in how neuroscience might revolutionize the law. The potential implications of neurolaw are broad. For example, future developments in brain science might allow: criminal law to better identify recidivists; tort law to better differentiate between those in real pain and those who are faking; insurance law to more accurately and adequately compensate those with mental illness; and end-of-life law to more ethically treat patients who might be able to communicate only through their thoughts. Increasingly, courts, including the U.S. Supreme Court, and legislatures are citing brain evidence. But despite the media coverage, and much enthusiasm from science and legal elites, our new research shows that Americans know very little about neurolaw, and that Republicans and independents may diverge from Democrats in their support for neuroscience-based legal reforms. In our study, we conducted an experiment within a national survey of Americans (more details about the survey are in our article). Everyone in the survey was told that, “Recently developed neuroscientific techniques allow researchers to see inside the human brain as never before.”
Julie Beck When Paul Ekman was a grad student in the 1950s, psychologists were mostly ignoring emotions. Most psychology research at the time was focused on behaviorism—classical conditioning and the like. Silvan Tomkins was the one other person Ekman knew of who was studying emotions, and he’d done a little work on facial expressions that Ekman saw as extremely promising. “To me it was obvious,” Ekman says. “There’s gold in those hills; I have to find a way to mine it.” For his first cross-cultural studies in the 1960s, he traveled around the U.S., Chile, Argentina, and Brazil. In each location, he showed people photos of different facial expressions and asked them to match the images with six different emotions: happiness, sadness, anger, surprise, fear, and disgust. “There was very high agreement,” Ekman says. People tended to match smiling faces with “happiness,” furrow-browed, tight-lipped faces with “anger,” and so on. But these responses could have been influenced by culture. The best way to test whether emotions were truly universal, he thought, would be to repeat his experiment in a totally remote society that hadn’t been exposed to Western media. So he planned a trip to Papua New Guinea, his confidence bolstered by films he’d seen of the island’s isolated cultures: “I never saw an expression I wasn’t familiar with in our culture,” he says. Once there, he showed locals the same photos he’d shown his other research subjects. He gave them a choice between three photos and asked them to pick images that matched various stories (such as “this man’s child has just died”). Adult participants chose the expected emotion between 28 and 100 percent of the time, depending on which photos they were choosing among. (The 28 percent was a bit of an outlier: That was when people had to choose between fear, surprise, and sadness. The next lowest rate was 48 percent.) © 2014 by The Atlantic Monthly Group.
Link ID: 20619 - Posted: 02.26.2015
By Sandhya Sekar It’s stressful being a low-ranking hyena—so stressful that even their chromosomes feel it. Researchers have discovered that the challenges of African savanna hyena society shorten underdogs’ telomeres, stretches of DNA that bookend chromosomes and protect them from wear and tear during cell replication. The stress may come from the top hyenas getting the best meat, whereas lower ranking individuals have to travel long distances—sometimes to the edges of the group territory—to fend for themselves. With increased stress, higher amounts of stress hormones and cellular byproducts like oxygen ions and peroxides are produced, both of which have been shown to shorten telomeres in other species. When telomeres fall below a certain length, cells go into damage-control mode and kick off biochemical pathways that can result in cell death. The study, the team reports online today in Biology Letters, is the first to show that the stress of social hierarchy can shorten telomeres in a wild species. © 2015 American Association for the Advancement of Science.
Link ID: 20611 - Posted: 02.25.2015
By Barron H. Lerner, M.D. I can’t stand it when someone behind me at a movie chews popcorn with his or her mouth open. I mean, I really can’t stand it. I have misophonia, a condition in which certain sounds can drive someone into a burst of rage or disgust. Although only identified and named in the last 20 years, misophonia has been enthusiastically embraced, with websites, Facebook pages and conferences drawing small armies of frustrated visitors. As a primary care physician, I find that misophonia can present some special challenges: At times, my patients can be the source of annoying sounds. At other times, the condition can be a source of special bonding if I realize that a patient is a fellow sufferer. But some experts question whether misophonia really exists. By naming it, are we giving too much credence to a series of symptoms that are no big deal? Coined by the married researchers Margaret and Pawel Jastreboff of Emory University in 2002, misophonia (“hatred of sound”) is sometimes referred to as selective sound sensitivity syndrome. Like me, those with the disorder identify a series of specific sounds that bother them. A 2013 study by Arjan Schröder and his colleagues at the University of Amsterdam identified the most common irritants as eating sounds, including lip smacking and swallowing; breathing sounds, such as nostril noises and sneezing; and hand sounds, such as typing and pen clicking. The range of responses to these noises is broad, from irritation to disgust to anger. Some sufferers even respond with verbal or physical aggression to those making the noises. One woman reported wanting to strangle her boyfriend in response to his chewing. © 2015 The New York Times Company
By Nathan Seppa Ask anybody — stress is bad news. The negative view of stress has been expressed so consistently that the concept is now built into our vernacular, which is spiced with advice on avoiding it: Take it easy. Calm down. Chill. Of course, a good case of stress comes in handy during an encounter with a grizzly bear on a hiking trail. In that situation, a stress reaction delivers a burst of hormones that revs up the heart and sharpens attention. This automatic response has served humans well throughout evolution, improving our odds of seeing another day. Problems arise, however, when stress becomes a feature of daily life. Chronic stress is the kind that comes from recurring pain, post-traumatic memories, unemployment, family tension, poverty, childhood abuse, caring for a sick spouse or just living in a sketchy neighborhood. Nonstop, low-grade stress contributes directly to physical deterioration, adding to the risk of heart attack, stroke, infection and asthma. Even recovery from cancer becomes harder. Scientists have now identified many of the biological factors linking stress to these medical problems. The evidence centers on nagging inflammation and genetic twists that steer cells off a healthy course, resulting in immune changes that allow ailments to take hold or worsen. Despite the bad rap stress has acquired throughout history, researchers have only recently been able to convince others that it’s dangerous. “It’s taken much more seriously now,” says Janice Kiecolt-Glaser, a clinical psychologist at Ohio State University in Columbus. “In the 1980s, we were still in the dark ages on this stuff.” © Society for Science & the Public 2000 - 2015
By Christie Aschwanden Paul Offit likes to tell a story about how his wife, pediatrician Bonnie Offit, was about to give a child a vaccination when the kid was struck by a seizure. Had she given the injection a minute sooner, Paul Offit says, it would surely have appeared as though the vaccine had caused the seizure and probably no study in the world would have convinced the parent otherwise. (The Offits have such studies at the ready — Paul is the director of the Vaccine Education Center at the Children’s Hospital of Philadelphia and author of “Deadly Choices: How the Anti-Vaccine Movement Threatens Us All.”) Indeed, famous anti-vaxxer Jenny McCarthy has said her son’s autism and seizures are linked to “so many shots” because vaccinations preceded his symptoms. But, as Offit’s story suggests, the fact that a child became sick after a vaccine is not strong evidence that the immunization was to blame. Psychologists have a name for the cognitive bias that makes us prone to assigning a causal relationship to two events simply because they happened one after the other: the “illusion of causality.” A study recently published in the British Journal of Psychology investigates how this illusion influences the way we process new information. Its finding: Causal illusions don’t just cement erroneous ideas in the mind; they can also prevent new information from correcting them. Helena Matute, a psychologist at Deusto University in Bilbao, Spain, and her colleagues enlisted 147 college students to take part in a computer-based task in which they each played a doctor who specializes in a fictitious rare disease and assessed whether new medications could cure it. ©2015 ESPN Internet Ventures.
Berit Brogaard On popular websites, we read headlines such as “Scientists are finding that love really is a chemical addiction between people.” Love, of course, is not literally a chemical addiction. It’s a drive perhaps, or a feeling or an emotion, but not a chemical addiction or even a chemical state. Nonetheless, romantic love, no doubt, often has a distinct physiological, bodily, and chemical profile. When you fall in love, your body chemicals go haywire. The exciting, scary, almost paranormal and unpredictable elements of love stem, in part, from hyper-stimulation of the limbic brain’s fear center known as the amygdala. It’s a tiny, almond-shaped brain region in the temporal lobe on the side of your head. In terms of evolutionary history, this brain region is old. It developed millions of years before the neocortex, the part of the brain responsible for logical thought and reasoning. While it has numerous biological functions, the prime role of the amygdala is to process negative emotional stimuli. Significant changes to normal amygdala activation are associated with serious psychological disorders. For example, human schizophrenics have significantly less activation in the amygdala and the memory system (the hippocampus), which is due to a substantial reduction in the size of these areas. People with depression, anxiety, and attachment insecurity, on the other hand, have significantly increased blood flow in the amygdala and memory system. Neuroscientist Justin Feinstein and his colleagues (2010) studied a woman whose amygdala was destroyed after a rare brain condition. They exposed her to pictures of spiders and snakes, took her on a tour of the world’s scariest haunted house, and had her take notes about her emotional state when she heard a beep from a random beeper that had been attached to her. After three months of investigation, the researchers concluded that the woman could not experience fear.
This is very good evidence for the idea that the amygdala is the main center for fear processing. (The chief competing hypothesis is that fear is processed in a brain region that receives its main information from the amygdala.) © 2015 Salon Media Group, Inc.
By Jane E. Brody Bereavement — how one responds and adjusts to the death of a loved one — is a very individual matter. It is natural to experience a host of negative reactions in the weeks and months following the loss of a loved one: among them, sadness, difficulty sleeping, painful reminders of the person, difficulty enjoying activities once shared, even anger. Grief is a normal human reaction, not a disease, and there is no one right way to get through it. Most often, within six months of a death, survivors adjust and are more or less able to resume usual activities, experience joy, and remember their loved ones without intense pain. But sometimes, even when the loss is neither sudden nor unexpected, as is true in the majority of deaths in the United States, survivors close to the deceased can experience extremely disruptive grief reactions that persist far longer. In a report last month in The New England Journal of Medicine, Dr. M. Katherine Shear presents a composite portrait of what is known as complicated grief, an extreme, unrelenting reaction to loss that persists for more than six months and can result in a serious risk to health. She describes a 68-year-old widow who continued to be seriously impaired by grief four years after her husband died. The woman slept on the couch because she could not bear to sleep in the bed she had shared with him. She found it too painful to engage in activities they used to do together. She no longer ate regular meals because preparing them was too distressing a reminder of her loss. And she remained alternately angry with the medical staff who cared for him and with herself for not recognizing his illness earlier. Symptoms of complicated grief commonly include intense yearning, longing or emotional pain; frequent preoccupying, intrusive thoughts and memories of the person lost; a feeling of disbelief or inability to accept the loss; and difficulty imagining a meaningful life without that person.
© 2015 The New York Times Company
By PAULA SPAN The word “benzodiazepines” and the phrase “widely prescribed for anxiety and insomnia” appear together so frequently that they may remind you of the apparently unbreakable connection between “powerful” and “House Ways and Means Committee.” But now we have a better sense of just how widely prescribed these medications are. A study in this month’s JAMA Psychiatry reports that among 65- to 80-year-old Americans, close to 9 percent use one of these sedative-hypnotics, drugs like Valium, Xanax, Ativan and Klonopin. Among older women, nearly 11 percent take them. “That’s an extraordinarily high rate of use for any class of medications,” said Michael Schoenbaum, a senior adviser at the National Institute of Mental Health and a co-author of the new report. “It seemed particularly striking given the identified clinical concerns associated with benzodiazepine use in anybody, but especially in older adults.” He was referring to decades of warnings about the potentially unhappy consequences of benzodiazepines for older users. The drugs still are recommended for a handful of specific disorders, including acute alcohol withdrawal and, sometimes, seizures and panic attacks. But concerns about the overuse of benzodiazepines have been aired again and again: in the landmark nursing home reform law of 1987, in the American Geriatrics Society’s Choosing Wisely list of questionable practices in 2013, in last year’s study in the journal BMJ suggesting an association with Alzheimer’s disease. Benzodiazepine users face increased risks of falls and fractures, of auto accidents, of reduced cognition. “Even after one or two doses, you have impaired cognitive performance on memory and other neuropsychological tests, compared to a placebo,” said Dr. D.P. Devanand, director of geriatric psychiatry at Columbia University Medical Center. © 2015 The New York Times Company