Chapter 15. Emotions, Aggression, and Stress
On Wednesday morning we woke to the news that a passenger ferry had sunk off the coast of South Korea, with at least four people confirmed dead and 280 unaccounted for. Meanwhile, though the search has continued for the missing Malaysia Airlines plane, relatives' hopes of a safe landing have long since been extinguished. Human tragedies like these are the stuff of daily news, but we rarely hear about the long-term psychological effects on survivors and the bereaved, who may experience the symptoms of post-traumatic stress disorder for years after their experience.

Although most people have heard of PTSD, few will have a clear idea of what it entails. The American Psychiatric Association's Diagnostic and Statistical Manual (DSM) defines a traumatic event as one in which a person "experienced, witnessed, or was confronted with an event or events that involved actual or threatened death or serious injury, or a threat to the physical integrity of self or others". PTSD is marked by four types of responses to the trauma. First, patients repeatedly relive the event, either in the form of nightmares or flashbacks. Second, they seek to avoid any reminder of the traumatic event. Third, they feel constantly on edge. Fourth, they are plagued with negative thoughts and low mood.

According to one estimate, almost 8% of people will develop PTSD during their lifetime. Clearly trauma (and PTSD) can strike anyone, but the risks of developing the condition are not equally distributed. Rates are higher in socially disadvantaged areas, for instance. Women may be twice as likely to develop PTSD as men. This is partly because women are at greater risk of the kinds of trauma that commonly produce PTSD (rape, for example). Nevertheless – and for unknown reasons – when exposed to the same type of trauma, women are more susceptible to PTSD than men. © 2014 Guardian News and Media Limited
Virginia Hughes

Trauma is insidious. It not only increases a person’s risk for psychiatric disorders, but can also spill over into the next generation. People who were traumatized during the Khmer Rouge genocide in Cambodia tended to have children with depression and anxiety, for example, and children of Australian veterans of the Vietnam War have higher rates of suicide than the general population. Trauma’s impact comes partly from social factors, such as its influence on how parents interact with their children. But stress also leaves ‘epigenetic marks’ — chemical changes that affect how DNA is expressed without altering its sequence.

A study published this week in Nature Neuroscience finds that stress in early life alters the production of small RNAs, called microRNAs, in the sperm of mice (K. Gapp et al. Nature Neurosci. http://dx.doi.org/10.1038/nn.3695; 2014). The mice show depressive behaviours that persist in their progeny, which also show glitches in metabolism.

The study is notable for showing that sperm responds to the environment, says Stephen Krawetz, a geneticist at Wayne State University School of Medicine in Detroit, Michigan, who studies microRNAs in human sperm. (He was not involved in the latest study.) “Dad is having a much larger role in the whole process, rather than just delivering his genome and being done with it,” he says. He adds that this is one of a growing number of studies to show that subtle changes in sperm microRNAs “set the stage for a huge plethora of other effects”.

In the new study, Isabelle Mansuy, a neuroscientist at the University of Zurich, Switzerland, and her colleagues periodically separated mother mice from their young pups and exposed the mothers to stressful situations — either by placing them in cold water or physically restraining them. These separations occurred every day but at erratic times, so that the mothers could not comfort their pups (termed the F1 generation) with extra cuddling before separation.
© 2014 Nature Publishing Group
The two marmosets—small, New World monkeys—had been a closely bonded couple for more than 3 years. Then, one fateful day, the female had a terrible accident. She fell out of a tree and hit her head on a ceramic vase that happened to be underneath on the forest floor. Her partner left two of their infants alone in the tree and jumped down, apparently to comfort her, until she died an agonizing death a couple of hours later. According to the researchers who recorded the events with a video camera, this is the first time such compassionate mourning behavior has been observed outside of humans and chimpanzees, and it could indicate that mourning is more widespread among primates than previously thought. Humans mourn their dead, of course, and some recent studies have strongly suggested that chimpanzees do as well. Scientists have recorded cases of adult chimps apparently caring for fellow animals before they die, and chimp mothers have been observed carrying around the bodies of infants for days after their death—although scientists have debated whether the latter behavior represents true grieving or if the mothers didn’t realize their infants were really dead. But there has been little or no evidence that other primates engage in these kinds of behaviors. Indeed, a recent review of the evidence led by anthropologist Peter Fashing of California State University, Fullerton, concluded that there were no convincing observations of “compassionate caretaking” of dying individuals among other nonhuman primates, such as monkeys. © 2014 American Association for the Advancement of Science.
Feeling peeved at your partner? You may want to check your blood sugar. A new study suggests that low levels of glucose in the blood may increase anger and aggression between spouses. The researchers say their findings suggest a connection between glucose and self-control, but other experts disagree about the study’s implications. Glucose is a source of fuel for the body, and its levels in the blood rise and fall throughout the day, as the body metabolizes meals that include carbohydrates. Researchers have suspected since the 1960s that low glucose or swings in glucose may play a role in human aggression. In two 2010 studies, psychologist Brad Bushman of Ohio State University, Columbus, attempted to figure out just what that role is, first by measuring vengefulness among people with symptoms of type 2 diabetes (a disease in which the body can’t regulate glucose levels properly), and then by providing sweetened drinks to strangers competing on a computerized task. Both studies suggested that higher glucose levels can make strangers less likely to treat each other aggressively. Bushman wondered about the relationship between glucose levels and aggression among romantic couples. So he and colleagues at the University of Kentucky and the University of North Carolina recruited 107 married couples and equipped them with blood glucose meters, voodoo dolls, and 51 pins to record their glucose and anger levels over time. For 21 days, the couples used the meters to measure their glucose levels each morning before breakfast and each evening before bed. They also assessed how angry they were at their spouse at the end of each day, by recording how many of the 51 pins they stuck into their voodoo dolls just before bed when their partner wasn’t looking. After 21 days, the couples were invited into the lab. 
There, they played a computer game that allowed them to blast their spouse with an unpleasant noise—a mixture of fingernails scratching a chalkboard, ambulance sirens, and dentist drills—as loudly and for as long as he or she wanted, as a proxy for their willingness to act aggressively and make their partner suffer. © 2014 American Association for the Advancement of Science.
In an op-ed in the Sunday edition of this newspaper, Barbara Ehrenreich, card-carrying liberal rationalist, writes about her own mystical experiences (the subject of her new book), and argues that the numinous deserves more cutting-edge scientific study: I appreciate the spirit (if you will) of this argument, but I am very doubtful as to its application. The trouble is that in its current state, cognitive science has a great deal of difficulty explaining “what happens” when “those wires connect” for non-numinous experience, which is why mysterian views of consciousness remain so potent even among thinkers whose fundamental commitments are atheistic and materialistic. (I’m going to link to the internet’s sharpest far-left scold for a good recent polemic on this front.) That is to say, even in contexts where it’s very easy to identify the physical correlative to a given mental state, and to get the kind of basic repeatability that the scientific method requires — show someone an apple, ask them to describe it; tell them to bite into it, ask them to describe the taste; etc. — there is no kind of scientific or philosophical agreement on what is actually happening to produce the conscious experience of the color “red,” the conscious experience of the crisp McIntosh taste, etc. So if we can’t say how this “normal” conscious experience works, even when we can easily identify the physical stimuli that produce it, it seems exponentially harder to scientifically investigate the invisible, maybe-they-exist and maybe-they-don’t stimuli — be they divine, alien, or panpsychic — that Ehrenreich hypothesizes might produce more exotic forms of conscious experience. © 2014 The New York Times Company
Jyoti Madhusoodanan

Growing up in a stressful social environment leaves lasting marks on young chromosomes, a study of African American boys has revealed. Telomeres, repetitive DNA sequences that protect the ends of chromosomes from fraying over time, are shorter in children from poor and unstable homes than in children from more nurturing families. When researchers examined the DNA of 40 boys from major US cities at age 9, they found that the telomeres of children from harsh home environments were 19% shorter than those of children from advantaged backgrounds. The length of telomeres is often considered to be a biomarker of chronic stress.

The study, published today in the Proceedings of the National Academy of Sciences, brings researchers closer to understanding how social conditions in childhood can influence long-term health, says Elissa Epel, a health psychologist at the University of California, San Francisco, who was not involved in the research. Participants’ DNA samples and socio-economic data were collected as part of the Fragile Families and Child Wellbeing Study, an effort funded by the US National Institutes of Health to track nearly 5,000 children, the majority of whom were born to unmarried parents in large US cities in 1998–2000. Children's environments were rated on the basis of their mother's level of education; the ratio of a family’s income to needs; harsh parenting; and whether family structure was stable, says lead author Daniel Notterman, a molecular biologist at Pennsylvania State University in Hershey. © 2014 Nature Publishing Group
By Stephanie Pappas

A little stress may be a good thing for teenagers learning to drive. In a new study, teens whose levels of the stress hormone cortisol increased more during times of stress got into fewer car crashes or near crashes in their first months of driving than their less-stress-responsive peers did. The study suggests that biological differences may affect how teens learn to respond to crises on the road, the researchers reported today (April 7) in the journal JAMA Pediatrics. Efforts to reduce teen car accidents include graduated driver licensing programs, safety messages and increased parental management, but these efforts seem to work better for some teens than others, the researchers said. Alternatives, such as in-vehicle technologies aimed at reducing accidents, may be especially useful for teens with a "neurological basis" for their increased risk of getting into an accident, they said. Automobile accidents are the No. 1 cause of death of teenagers in the United States, according to the Centers for Disease Control and Prevention. Car crashes also kill more 15- to 29-year-olds globally than any other cause, according to the World Health Organization.
By BARBARA EHRENREICH

My atheism is hard-core, rooted in family tradition rather than adolescent rebellion. According to family legend, one of my 19th-century ancestors, a dirt-poor Irish-American woman in Montana, expressed her disgust with the church by vehemently refusing last rites when she lay dying in childbirth. From then on, we were atheists and rationalists, a stance I perpetuated by opting, initially, for a career in science. How else to understand the world except as the interaction of tiny bits of matter and mathematically predictable forces? There were no gods or spirits, just our own minds pressing up against the unknown.

But something happened when I was 17 that shook my safely rationalist worldview and left me with a lifelong puzzle. Years later, I learned that this sort of event is usually called a mystical experience, and I can see in retrospect that the circumstances had been propitious: Thanks to a severely underfunded and poorly planned skiing trip, I was sleep-deprived and probably hypoglycemic that morning in 1959 when I stepped out alone, walked into the streets of Lone Pine, Calif., and saw the world — the mountains, the sky, the low scattered buildings — suddenly flame into life.

There were no visions, no prophetic voices or visits by totemic animals, just this blazing everywhere. Something poured into me and I poured out into it. This was not the passive beatific merger with “the All,” as promised by the Eastern mystics. It was a furious encounter with a living substance that was coming at me through all things at once, too vast and violent to hold on to, too heartbreakingly beautiful to let go of. It seemed to me that whether you start as a twig or a gorgeous tapestry, you will be recruited into the flame and made indistinguishable from the rest of the blaze. I felt ecstatic and somehow completed, but also shattered. © 2014 The New York Times Company
By Deborah Serani

Sometimes I work with children and adults who can’t put words to their feelings and thoughts. It’s not that they don’t want to – it’s more that they don’t know how. The clinical term for this experience is alexithymia, defined as the inability to recognize emotions and their subtleties and textures. Alexithymia throws a monkey wrench into a person’s ability to know their own self-experience or understand the intricacies of what others feel and think. Here are a few examples of what those with alexithymia experience:

- Difficulty identifying different types of feelings
- Limited understanding of what causes feelings
- Difficulty expressing feelings
- Difficulty recognizing facial cues in others
- Limited or rigid imagination
- Constricted style of thinking
- Hypersensitivity to physical sensations
- Detached or tentative connection to others

Alexithymia was first mentioned as a psychological construct in 1976 and was viewed as a deficit in emotional awareness. Research suggests that approximately 8% of males and 2% of females experience alexithymia, and that it can come in mild, moderate and severe intensities. Studies also show that alexithymia has two dimensions: a cognitive dimension, where a child or adult struggles to identify, interpret and verbalize feelings (the “thinking” part of our emotional experience), and an affective dimension, where difficulties arise in reacting, expressing, feeling and imagining (the “experiencing” part of our emotional experience). © 2014 Scientific American
David Adam

The day the Brazilian racing driver Ayrton Senna died in a crash, I was stuck in the toilet of a Manchester swimming pool. The door was open, but my thoughts blocked the way out.

It was May 1994. I was 22 and hungry. After swimming a few lengths of the pool, I had lifted myself from the water and headed for the locker rooms. Going down the steps, I had scraped the back of my heel on the sharp edge of the final step. It left a small graze through which blood bulged into a blob that hung from my broken skin. I transferred the drop to my finger and a second swelled to take its place. I pulled a paper towel from above the sink to press to my wet heel. The blood on my finger ran with the water as it dripped down my arm. My eyes followed the blood. And the anxiety, of course, rushed back, ahead even of the memory. My shoulders sagged. My stomach tightened.

Four weeks earlier, I had pricked my finger on a screw that stuck out from a bus shelter's corrugated metal. It was a busy Saturday afternoon and there had been lots of people around. Any one of them, I thought, could easily have injured themselves in the way I had. What if one had been HIV positive? They could have left infected blood on the screw, which then pierced my skin. That would put the virus into my bloodstream. I knew the official line was that transmission was impossible this way – the virus couldn't survive outside the body – but I also knew that, when pressed for long enough, those in the know would weaken the odds to virtually impossible. They couldn't be absolutely sure. In fact, several had admitted to me there was a theoretical risk. © 2014 Guardian News and Media Limited
By NATALIE ANGIER

The “Iliad” may be a giant of Western literature, yet its plot hinges on a human impulse normally thought petty: spite. Achilles holds a festering grudge against Agamemnon (“He cheated me, wronged me ... He can go to hell...”), turning down gifts, homage, even the return of his stolen consort Briseis just to prolong the king’s suffering.

Now, after decades of focusing on such staples of bad behavior as aggressiveness, selfishness, narcissism and greed, scientists have turned their attention to the subtler and often unsettling theme of spite — the urge to punish, hurt, humiliate or harass another, even when one gains no obvious benefit and may well pay a cost. Psychologists are exploring spitefulness in its customary role as a negative trait, a lapse that should be embarrassing but is often sublimated as righteousness, as when you take your own sour time pulling out of a parking space because you notice another car is waiting for it and you’ll show that vulture who’s boss here, even though you’re wasting your own time, too. Evolutionary theorists, by contrast, are studying what might be viewed as the brighter side of spite, and the role it may have played in the origin of admirable traits like a cooperative spirit and a sense of fair play.

The new research on spite transcends older notions that we are savage, selfish brutes at heart, as well as more recent suggestions that humans are inherently affiliative creatures yearning to love and connect. Instead, it concludes that vice and virtue, like the two sides of a V, may be inextricably linked.

“Spitefulness is such an intrinsically interesting subject, and it fits with so many people’s everyday experience, that I was surprised to see how little mention there was of it in the psychology literature,” said David K. Marcus, a psychologist at Washington State University.
At the same time, he said, “I was thrilled to find something that people haven’t researched to exhaustion.” © 2014 The New York Times Company
by Meghan Rosen

Human faces just got a lot more emotional. People can broadcast more than three times as many different feelings on their faces as scientists once suspected. For years, scientists have thought that people could convey only happiness, surprise, sadness, anger, fear and disgust. “I thought it was very odd to have only one positive emotion,” says cognitive scientist Aleix Martinez of Ohio State University in Columbus. So he and colleagues came up with 16 combined ones, such as “happily disgusted” and “happily surprised.” Then the researchers asked volunteers to imagine situations that would provoke these emotions, such as listening to a gross joke, or getting unexpected good news. When the team compared pictures of the volunteers making different faces and analyzed every eyebrow wrinkle, mouth stretch and tightened chin, “what we found was beyond belief,” Martinez says. For each compound emotion, almost everyone used the same facial muscles, the team reports March 31 in the Proceedings of the National Academy of Sciences. Martinez’s team’s findings could one day help computer engineers improve facial recognition software and help scientists better understand emotion-perception disorders such as schizophrenia.

Citation: S. Du, Y. Tao and A. M. Martinez. Compound facial expressions of emotion. Proceedings of the National Academy of Sciences. Published online March 30, 2014. doi: 10.1073/pnas.1322355111. © Society for Science & the Public 2000 - 2013
By Helen Briggs BBC News When it comes to detecting lies, you should trust your instinct, research suggests. We are better at identifying liars when we rely on initial responses rather than thinking about it, say psychologists. Generally we are poor at spotting liars - managing only slightly better than flipping a coin. But our success rate rises when we harness the unconscious mind, according to a report in Psychological Science. "What interested us about the unconscious mind is that it just might really be the seat of where accurate lie detection lives," said Dr Leanne ten Brinke of the University of California, Berkeley. "So if our ability to detect lies is not conscious - we simply can't do this when we're thinking hard about it - then maybe it lives somewhere else, and so we thought one possible explanation was the unconscious mind." When trying to find out if someone is lying, most people rely on cues like someone averting their gaze or appearing nervous. However, research suggests this is not accurate - people perform at only about 50% accuracy in traditional lie detection tasks. Psychologists at the University of California were puzzled by this, as some primates, such as chimps, are able to detect deceit - and evolutionary theory supposes that it maximises survival and reproductive success. Dr Ten Brinke and colleagues devised experiments to test the ability of the unconscious mind to spot a liar, to see if they could do better than the conscious mind. BBC © 2014
By NICHOLAS BAKALAR A large study has linked several common anti-anxiety drugs and sleeping pills to an increased risk of death, although it’s not certain the drugs were the cause. For more than seven years, researchers followed 34,727 people who filled prescriptions for anti-anxiety medications like Valium and Xanax, or sleep aids like Ambien, Sonata and Lunesta, comparing them with 69,418 controls who did not. After adjusting for a wide variety of factors, the researchers found that people who took the drugs had more than double the risk of death. The study appears online in BMJ. The researchers tried to account for the use of other prescribed drugs, age, smoking, alcohol use, socioeconomic status, and other health and behavioral characteristics. Most important, the investigators also controlled for sleep disorders, anxiety disorders and other psychiatric illnesses, all of which are risk factors for mortality. The lead author, Dr. Scott Weich, a professor of psychiatry at the University of Warwick, said that while he and his colleagues were careful to account for as many potential risks as possible, they were not able to control for the severity of the illnesses suffered by the study participants. Still, he said, the research “adds to an accumulating body of evidence that these drugs are dangerous.” He added: “I prescribe these drugs, and they are difficult to come off. The less time you spend on them the better.” © 2014 The New York Times Company
by Erika Engelhaupt What gets us hot can be so revealing. For me, the slightest anxiety or excitement can trigger a warm spread across my face. I can feel the blood rushing up my neck and into the thousands of tiny capillaries across my cheeks. I’ve worn scarves or turtlenecks to job interviews, weather be damned, to keep my burning red neck from betraying my nerves. And the opposite can be true. Have you ever seen someone truly blanch? Given a real fright, the blood can literally drain from a person’s face, leaving a white mask. This all happens thanks to the autonomic nervous system, the fight-or-flight control system. Faced with danger, it tells blood vessels to pinch off the flow to the face and extremities, sending more blood to the muscles and body core so you’ll be pumped up for either the flight or the fight. Heat-sensing cameras can pick all this up, and in way more detail than my scarf could hide. Our nervous systems are constantly chugging away, largely out of our conscious control, tweaking our blood flow for every emotion. Just think of all the tiny wafts of heat flowing across your face as you negotiate with your boss, or talk to your lover. Feeling a bit anxious? Guilty? Stressed? Sexually aroused, perhaps? There’s a researcher out there with a thermal camera that can detect each of those. Even post-traumatic stress disorder may show up in your face’s heat map. In a pilot study of bank tellers who have been robbed, a team of researchers in Italy reports in the April 25 Neuroscience that tellers with mild PTSD have amped-up fear responses that show up in their facial heat signature. © Society for Science & the Public 2000 - 2013.
By RICHARD A. FRIEDMAN FEELING down? Smile. Cheer up. Put on a happy face. No doubt you’ve dismissed these bromides from friends and loved ones because everyone knows that you can’t feel better just by aping a happy look. Or perhaps you can. New research suggests that it is possible to treat depression by paralyzing key facial muscles with Botox, which prevents patients from frowning and having unhappy-looking faces. In a study forthcoming in the Journal of Psychiatric Research, Eric Finzi, a cosmetic dermatologist, and Norman Rosenthal, a professor of psychiatry at Georgetown Medical School, randomly assigned a group of 74 patients with major depression to receive either Botox or saline injections in the forehead muscles whose contraction makes it possible to frown. Six weeks after the injection, 52 percent of the subjects who got Botox showed relief from depression, compared with only 15 percent of those who received the saline placebo. (You might think that patients would easily be able to tell whether they got the placebo or Botox. Actually, it wasn’t so obvious: Only about half of the subjects getting Botox guessed correctly. More important, knowing which treatment was received had no significant effect on treatment response.) Other studies over the past several years have found similar effects of Botox on mood. Michael Lewis at Cardiff University reported that nondepressed patients at a cosmetic dermatology clinic receiving Botox injection above the eyes frowned less and felt better than those who did not receive this injection. And M. Axel Wollmer at the University of Basel found that Botox injection was superior to a placebo in a group of depressed patients. © 2014 The New York Times Company
by Clare Wilson It's a vicious circle of the cruellest kind. Stress might be causing infertility in women, according to new research. This could explain some cases in which couples are diagnosed as infertile with no apparent cause. Taking longer than usual to conceive can lead to stress, so the problem could become self-perpetuating. A link between everyday life stresses and infertility has long been suspected, but there has been little hard evidence connecting the two. Women receiving fertility treatment are generally advised to avoid stress, but not so the average person trying to conceive. An estimated one in seven couples in the UK have fertility problems and, in about a quarter of those, there is no known medical explanation, and they are given a diagnosis of "unexplained infertility". To explore the role of stress, Courtney Lynch at Ohio State University in Columbus and her colleagues collected saliva samples from 373 women in the US who had just started trying to conceive naturally and measured levels of an enzyme called alpha-amylase, a marker of stress. After one year of regular unprotected sex, about 13 per cent of the couples had failed to get pregnant, the standard definition of infertility. The third of women who had the highest alpha-amylase levels were twice as likely to be in the infertile group as the third with the lowest levels. In a previous study, Lynch's team found that those with higher levels of the stress enzyme were slightly less likely to conceive in their first month of trying. But this is the first time that alpha-amylase has been linked to clinical infertility. © Copyright Reed Business Information Ltd.
Want to live a long, dementia-free life? Stress your cells out. That’s the conclusion of a new study, which finds that heightened cellular stress causes brain cells to produce a protein that staves off Alzheimer’s disease and other forms of dementia. The work could lead to new ways to diagnose or treat such diseases. “This paper is very impressive,” says neuroscientist Li-Huei Tsai of the Massachusetts Institute of Technology in Cambridge, who was not involved in the new work. “It puts a finger on a particular pathway that can provide some explanation as to why some people are more susceptible to Alzheimer’s.” Alzheimer’s disease, characterized by a progressive loss of memory and cognition, affects an estimated 44.4 million people worldwide, mostly over the age of 65. The illness has been linked to the accumulation of certain proteins in the brain, but what causes symptoms has been unclear. That’s because the brains of some elderly people without dementia have the same clumps of so-called amyloid β and τ proteins typically associated with Alzheimer’s. The new study deals with a protein called repressor element 1-silencing transcription factor (REST), which turns genes on and off. Scientists knew that REST played a key role in fetal brain development by controlling the activity of certain genes, but they thought it was absent in adult brains. However, when Bruce Yankner, a neurologist at Harvard Medical School in Boston, looked at all the genes and proteins that change in brains as people age, he found that REST levels begin increasing again when a person hits their 30s. Stumped as to why, he and his colleagues isolated human and mouse brain cells and probed what factors altered REST levels and what consequences those levels had. © 2014 American Association for the Advancement of Science
By Michelle Roberts Health editor, BBC News online Statins may be useful in treating advanced multiple sclerosis (MS), say UK researchers. Early trial results in The Lancet show the cholesterol-lowering pills slow brain shrinkage in people with MS. The University College London (UCL) scientists say large trials can now begin. These will check whether statins benefit MS patients by slowing progression of the disease and easing their symptoms. MS is a major cause of disability, affecting nerves in the brain and spinal cord, which causes problems with muscle movement, balance and vision. Currently there is no cure, although there are treatments that can help in the early stages of the disease. Usually, after around 10 years, around half of people with MS will go on to develop more advanced disease - known as secondary progressive MS. It is this later stage disease that Dr Jeremy Chataway and colleagues at UCL hope to treat with low cost statins. To date, no licensed drugs have shown a convincing impact on this later stage of the disease. For their phase two trial, which is published in the Lancet, Dr Chataway's team randomly assigned 140 people with secondary progressive MS to receive either 80mg of a statin called simvastatin or a placebo for two years. The high, daily dose of simvastatin was well tolerated and slowed brain shrinkage by 43% over two years compared with the placebo. Dr Chataway said: "Caution should be taken regarding over-interpretation of our brain imaging findings, because these might not necessarily translate into clinical benefit. However, our promising results warrant further investigation in larger phase three disability-driven trials." BBC © 2014
By FLORENCE WILLIAMS

So there’s this baby who has swallowed a .22-caliber bullet. The mother rushes into a drugstore, crying, “What shall I do?” “Give him a bottle of castor oil,” replies the druggist, “but don’t point him at anybody.”

Whether you find this joke amusing depends on many more variables than you probably ever realized. It depends on a common cultural understanding of the technical properties of castor oil. It depends, as many funny jokes do and as any fourth grader can attest, on our own squeamishness about bodily functions. Getting less obvious, your sense of humor can also depend on your age, your gender, your I.Q., your political inclinations, how extroverted you are and the health of your dopamine reward circuit. If you think all this analysis sounds a bit, well, unfunny, E. B. White would back you up. He once wrote that picking apart jokes is like dissecting frogs: Few people are interested, and the subject always dies in the end.

Fortunately, the cognitive neuroscientist Scott Weems isn’t afraid of being unfunny. Humor is worthy of serious academic study, he argues in his book, “Ha! The Science of When We Laugh and Why,” because it yields insights into how our brains process a complex world and how that, in turn, makes us who we are. Though animals laugh, humans spend more time laughing than exhibiting any other emotion. But what gives some people a better sense of humor than others? Not surprisingly, extroverts tend to laugh more and produce more jokes; yet in tests measuring the ability to write cartoon captions, people who were more neurotic, assertive, manipulative and dogmatic were actually funnier. As the old saw holds, many of the best comics really are miserable. © 2014 The New York Times Company