Chapter 11. Emotions, Aggression, and Stress
By CARL ZIMMER Imagine a wolf catching a Frisbee a dozen times in a row, or leading police officers to a stash of cocaine, or just sleeping peacefully next to you on your couch. It’s a stretch, to say the least. Dogs may have evolved from wolves, but the minds of the two canines are profoundly different. Dog brains, as I wrote last month in The New York Times, have become exquisitely tuned to our own. Scientists are now zeroing in on some of the genes that were crucial to the rewiring of dog brains. Their results are fascinating, and not only because they can help us understand how dogs turned into man’s best friend. They may also teach us something about the evolution of our own brains: Some of the genes that evolved in dogs are the same ones that evolved in us. To trace the change in dog brains, scientists have first had to work out how dog breeds are related to one another, and how they’re all related to wolves. Ya-Ping Zhang, a geneticist at the Chinese Academy of Sciences, has led an international network of scientists who have compared pieces of DNA from different canines. They’ve come to the conclusion that wolves started their transformation into dogs in East Asia. Those early dogs then spread to other parts of the world. Many of the breeds we’re most familiar with, like German shepherds and golden retrievers, emerged only in the past few centuries. © 2013 The New York Times Company
by Sara Reardon As suicide rates climb steeply in the US, a growing number of psychiatrists are arguing that suicidal behaviour should be considered a disease in its own right, rather than a behaviour resulting from a mood disorder. They base their argument on mounting evidence showing that the brains of people who have committed suicide have striking similarities, quite distinct from what is seen in the brains of people who have similar mood disorders but who died of natural causes. Suicide also tends to be more common in some families, suggesting there may be genetic and other biological factors in play. What's more, most people with mood disorders never attempt to kill themselves, and about 10 per cent of suicides have no history of mental disease. The idea of classifying suicidal tendencies as a disease is being taken seriously. The team behind the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) – the newest version of psychiatry's "bible", released at the American Psychiatric Association's meeting in San Francisco this week – considered a proposal to have "suicidal behaviour disorder" listed as a distinct diagnosis. It was ultimately placed on probation: added to a list of topics deemed to require further research for possible inclusion in future DSM revisions. Another argument for grouping suicidal people under a single diagnosis is that it could spur research into the neurological and genetic factors they have in common. This could allow psychiatrists to better predict someone's suicide risk, and even lead to treatments that stop suicidal feelings. © Copyright Reed Business Information Ltd.
By ANAHAD O'CONNOR The nation’s largest cardiovascular health organization has a new message for Americans: Owning a dog may protect you from heart disease. The unusual message was contained in a scientific statement published on Thursday by the American Heart Association, which convened a panel of experts to review years of data on the cardiovascular benefits of owning a pet. The group concluded that owning a dog, in particular, was “probably associated” with a reduced risk of heart disease. People who own dogs certainly have more reason to get outside and take walks, and studies show that most owners form such close bonds with their pets that being in their presence blunts the owners’ reactions to stress and lowers their heart rate, said Dr. Glenn N. Levine, the head of the committee that wrote the statement. But most of the evidence is observational, which makes it impossible to rule out the prospect that people who are healthier and more active in the first place are simply more likely to bring a dog or cat into their home. “We didn’t want to make this too strong of a statement,” said Dr. Levine, a professor at the Baylor College of Medicine. “But there are plausible psychological, sociological and physiological reasons to believe that pet ownership might actually have a causal role in decreasing cardiovascular risk.” Nationwide, Americans keep roughly 70 million dogs and 74 million cats as pets. Copyright 2013 The New York Times Company
By NICHOLAS BAKALAR Two studies have found that depression and the use of certain antidepressants are both associated with increased risk for Clostridium difficile infection, an increasingly common cause of diarrhea that in the worst cases can be fatal. Researchers studied 16,781 men and women, average age 68, using hospital records and interviews to record cases of the infection, often called C. diff, and diagnoses of depression. The interviews were conducted biennially from 1991 to 2007 to gather self-reports of feelings of sadness and other emotional problems. There were 404 cases of C. difficile infection. After adjusting for other variables, the researchers found that the risk of C. diff infection among people with a history of depression or depressive symptoms was 36 to 47 percent greater than among people without depression. A second study, involving 4,047 hospitalized patients, average age 58, found a similar association of infection with depression. In addition, it found an association of some antidepressants — Remeron, Prozac and trazodone — with C. diff infection. There was no association with other antidepressants. “We have known for a long time that depression is associated with changes in the gastrointestinal system,” said the lead author, Mary A.M. Rogers, a research assistant professor at the University of Michigan, “and this interaction between the brain and the gut deserves more study.” Both reports appeared in the journal BMC Medicine. Copyright 2013 The New York Times Company
By Ben Thomas Horror isn’t the only film genre that specializes in dread. War movies like Apocalypse Now, sci-fi mysteries like Brazil and Blade Runner, and dramas like Melancholia and Requiem for a Dream all masterfully evoke a less violent, more subtle and pervasive sense that something is unwell with the world – that somewhere along the line, something went deeply wrong and now normality itself is unraveling before our eyes. The director David Lynch has arguably built his entire career on directing these kinds of films. In Lynch’s universe, even the most banal moments are still somehow suffused with unnerving suspense. In films like Blue Velvet and Mulholland Drive, disturbing surprises erupt into scene after scene of buried tension, until every ordinary conversation feels like a trap waiting to spring. And then there’s the infamous Eraserhead, where family life itself is transformed into an onslaught of surreal and nauseating images. It’s hard to come away from these movies without feeling that a little of Lynch’s unease has rubbed off on you. So when a team of researchers at the University of British Columbia set out to describe and treat an ancient biological alarm system buried deep within the human brain, they turned to Lynch’s films as an analogy for – and a set of examples of – the feeling of omnipresent yet maddeningly vague “wrongness” that seems to underlie many anxiety disorders. © 2013 Scientific American
By Nathan Seppa Multiple sclerosis, long considered a disease of white females, has affected more black women in recent years, a new study finds. Hispanic and Asian women, who have previously seemed to be at less risk of MS, remain so, researchers report May 7 in Neurology. The findings bolster a theory that vitamin D deficiency, which is common in people with dark skin in northern latitudes, contributes to MS. MS is a debilitating condition in which the protective coatings on nerves in the central nervous system get damaged, resulting in a loss of motor control, muscle weakness, vision complications and other problems. The National Multiple Sclerosis Society estimates that 2.1 million people worldwide have the condition. The researchers scanned medical information from 3.5 million people who were members of the health maintenance organization Kaiser Permanente Southern California and found that 496 people received diagnoses of MS from 2008 through 2010. Of these patients, women comprised 70 percent, not an unusual fraction for people with MS. Surprisingly, the patients included 84 black women. That means the annual incidence of MS in black women was 10.2 cases per 100,000 people. That’s not a great risk for an individual, but it was higher than the annual rates for white, Hispanic and Asian women, which were 6.9, 2.9 and 1.4 per 100,000 people, respectively. Among blacks, women had three times the incidence of men; in the other racial and ethnic groups, the MS rate in women was roughly double that of men. © Society for Science & the Public 2000 - 2013
By TARA PARKER-POPE Suicide rates among middle-aged Americans have risen sharply in the past decade, prompting concern that a generation of baby boomers who have faced years of economic worry and easy access to prescription painkillers may be particularly vulnerable to self-inflicted harm. More people now die of suicide than in car accidents, according to the Centers for Disease Control and Prevention, which published the findings in Friday’s issue of its Morbidity and Mortality Weekly Report. In 2010 there were 33,687 deaths from motor vehicle crashes and 38,364 suicides. Suicide has typically been viewed as a problem of teenagers and the elderly, and the surge in suicide rates among middle-aged Americans is surprising. From 1999 to 2010, the suicide rate among Americans ages 35 to 64 rose by nearly 30 percent, to 17.6 deaths per 100,000 people, up from 13.7. Although suicide rates are growing among both middle-aged men and women, far more men take their own lives. The suicide rate for middle-aged men was 27.3 deaths per 100,000, while for women it was 8.1 deaths per 100,000. The most pronounced increases were seen among men in their 50s, a group in which suicide rates jumped by nearly 50 percent, to about 30 per 100,000. For women, the largest increase was seen in those ages 60 to 64, among whom rates increased by nearly 60 percent, to 7.0 per 100,000. Suicide rates can be difficult to interpret because of variations in the way local officials report causes of death. But C.D.C. and academic researchers said they were confident that the data documented an actual increase in deaths by suicide and not a statistical anomaly. While reporting of suicides is not always consistent around the country, the current numbers are, if anything, too low. © 2013 The New York Times Company
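The percentage increases in the report follow directly from the per-100,000 rates quoted above; a minimal sketch checks the arithmetic (the function name is ours, not the CDC's):

```python
def pct_change(old_rate, new_rate):
    """Percentage change between two suicide rates (deaths per 100,000 people)."""
    return (new_rate - old_rate) / old_rate * 100

# Ages 35-64, 1999 -> 2010: rate rose from 13.7 to 17.6 per 100,000
print(round(pct_change(13.7, 17.6), 1))  # 28.5, i.e. "nearly 30 percent"
```

The same formula applied to the middle-aged men's figures (about 30 per 100,000, up "nearly 50 percent") implies a 1999 rate of roughly 20 per 100,000, consistent with the overall male rate of 27.3 reported for 2010.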
By ADRIAN RAINE In studying brain scans of criminals, researchers are discovering tell-tale signs of violent tendencies. WSJ's Jason Bellini speaks with Professor Adrian Raine about his latest discoveries. The scientific study of crime got its start on a cold, gray November morning in 1871, on the east coast of Italy. Cesare Lombroso, a psychiatrist and prison doctor at an asylum for the criminally insane, was performing a routine autopsy on an infamous Calabrian brigand named Giuseppe Villella. Lombroso found an unusual indentation at the base of Villella's skull. From this singular observation, he would go on to become the founding father of modern criminology. Lombroso's controversial theory had two key points: that crime originated in large measure from deformities of the brain and that criminals were an evolutionary throwback to more primitive species. Criminals, he believed, could be identified on the basis of physical characteristics, such as a large jaw and a sloping forehead. Based on his measurements of such traits, Lombroso created an evolutionary hierarchy, with Northern Italians and Jews at the top and Southern Italians (like Villella), along with Bolivians and Peruvians, at the bottom. These beliefs, based partly on pseudoscientific phrenological theories about the shape and size of the human head, flourished throughout Europe in the late 19th and early 20th centuries. Lombroso was Jewish and a celebrated intellectual in his day, but the theory he spawned turned out to be socially and scientifically disastrous, not least by encouraging early-20th-century ideas about which human beings were and were not fit to reproduce—or to live at all. ©2013 Dow Jones & Company, Inc.
Alison Abbott Thinking about a professor just before you take an intelligence test makes you perform better than if you think about football hooligans. Or does it? An influential theory that certain behaviour can be modified by unconscious cues is under serious attack. A paper published in PLoS ONE last week reports that nine different experiments failed to replicate this example of ‘intelligence priming’, first described in 1998 by Ap Dijksterhuis, a social psychologist at Radboud University Nijmegen in the Netherlands, and now included in textbooks. David Shanks, a cognitive psychologist at University College London, UK, and first author of the paper in PLoS ONE, is among sceptical scientists calling for Dijksterhuis to design a detailed experimental protocol to be carried out in different laboratories to pin down the effect. Dijksterhuis has rejected the request, saying that he “stands by the general effect” and blames the failure to replicate on “poor experiments”. An acrimonious e-mail debate on the subject has been dividing psychologists, who are already jittery about other recent exposures of irreproducible results (see Nature 485, 298–300; 2012). “It’s about more than just replicating results from one paper,” says Shanks, who circulated a draft of his study in October; the failed replications call into question the underpinnings of ‘unconscious-thought theory’. © 2013 Nature Publishing Group
By Breanna Draxler The ruse is common in spy movies—an attractive female saunters in at a critical moment and seduces the otherwise infallible protagonist, duping him into giving up the goods. It works in Hollywood and it works in real life, too. Men tend to say yes to attractive women without really scrutinizing whether or not they are trustworthy. But scientists have shown, for the first time, that a drug may be able to overcome this “honey trap,” and help men make more rational decisions. Nearly 100 men participated in the study; half were given minocycline, an antibiotic normally used to treat acne, and half were given a placebo. After four days of this drug regimen, participants played a computerized one-on-one trust game with eight different women, based only on pictures of the female players. In each round, the male player was given $13 and shown a picture of one of the female players. The male player would choose how much money he wanted to keep and how much he wanted to give to the female player. The amount given away was then tripled, and the female player would decide whether to split the money with the man or keep it all for herself. Unbeknownst to the men, however, the women kept the money every time. The researchers also asked the men to evaluate the photos of the females to determine how trustworthy and attractive they appeared, on a scale of 0 to 10.
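The payoff structure of the trust game described above can be sketched in a few lines. One assumption is ours: the article says the woman could "split the money with the man," which we take to mean a 50/50 split of the tripled pot.

```python
def trust_round(sent, splits, endowment=13, multiplier=3):
    """Payoffs (male, female) for one round of the trust game described above.

    sent:   dollars the male player transfers (0 to endowment)
    splits: True if the female player shares the tripled pot
            (assumed here to mean a 50/50 split)
    """
    pot = sent * multiplier          # the transferred amount is tripled
    kept = endowment - sent          # what the male player holds back
    if splits:
        return kept + pot / 2, pot / 2
    return kept, pot

# In the study, the women kept everything every time:
print(trust_round(5, splits=False))  # (8, 15)
```

The sketch makes the stakes clear: a man who sends $5 to a defecting partner walks away with $8 of his $13, while full trust (sending everything) against a defector leaves him with nothing.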
By TARA PARKER-POPE Are doctors nicer to patients who aren’t fat? A provocative new study suggests that they are — that thin patients are treated with more warmth and empathy than those who are overweight or obese. For the study, published in the medical journal Obesity, researchers at Johns Hopkins obtained permission to record discussions between 39 primary care doctors and more than 200 patients who had high blood pressure. Although patients were there to talk about blood pressure, not weight, most fell into the overweight or obese category. Only 28 were of normal weight, meaning they had a body mass index below 25. Of the remaining patients, 120 were obese (B.M.I. of 30 or greater) and 60 were classified as overweight (index of 25 to 30). For the most part, all of the patients were treated about the same; there were no meaningful differences in the amount of time doctors spent with them or the topics discussed. But when researchers analyzed transcripts of the visits, there was one striking difference. Doctors seemed just a bit nicer to their normal-weight patients, showing more empathy and warmth in their conversations. Although the study was relatively small, the findings are statistically significant. “It’s not like the physicians were being overtly negative or harsh,” said the lead author, Dr. Kimberly A. Gudzune, an assistant professor of general internal medicine at the Johns Hopkins School of Medicine. “They were just not engaging patients in that rapport-building or making that emotional connection with the patient.” Copyright 2013 The New York Times Company
By Scicurious Say you are out on a camping trip with some friends. You’re in the woods, the tents are up, the beer is out, the sun is down, the campfire is starting up. As you sit there, you hear the campfire crackling loudly. To most people, the crackling of the campfire is just that: a campfire. Nothing threatening at all. But for someone with a severe anxiety disorder such as post-traumatic stress disorder (PTSD), the crackling of the campfire may be associated with terrible memories, a huge conflagration during house-to-house fighting or a house fire that destroyed all they loved, causing them horrible distress and anxiety. A campfire during a camping trip and the horrible things they endured are entirely dissimilar, but in severe anxiety disorders, that makes no difference at all. No, this post is not about whether or not anxiety disorders are being overdiagnosed. Rather, it’s about how over-generalization within the brain might influence the development of anxiety disorders. What is the difference between a house fire and a campfire? How does your brain know? It’s the idea of pattern separation, an idea that the authors of this review believe could be incredibly important in treating some types of anxiety disorders. Pattern separation is one of the many actions of the hippocampus, the large, curved area in the interior of the brain which is thought to play a role in things like memory and in disorders such as anxiety and depression. Pattern separation was originally observed related to memory, but the authors of this review propose that it may also relate to things like anxiety. © 2013 Scientific American
By Scicurious Generally, I don’t think of being tickled as a particularly pleasurable or calming activity. Most people who are ticklish go immediately on the defensive and tense up, and I always got the impression that most people prefer NOT to be tickled rather than otherwise. However, that’s just us. And we’re not rats. And it turns out, you can calm a rat with tickling. Life is stressful. Whether it’s running from predators, meeting tight deadlines, or trying to keep fed, there’s a lot that seems to bring us down. What saves us from tearing our hair out? Well, the happy things in life. Tasty food, friends, hugs, puppies. You know, the good stuff. These things elicit positive feelings, and positive feelings have been linked to protecting us from stress. Of course, in humans, it’s easy to say that a positive outlook on life makes someone resistant to stress…but is it really true? They may co-occur, but do positive feelings really decrease stress? If you want to get at causes, one of the best ways is to use an animal model. But how do you come up with an animal model for…happiness? Well, you can tickle rats. As you can see in the video above, rats like to be tickled. They even respond with “laughter”! Of course, it’s not laughter as we know it, or even something we can hear. Instead, these are ultrasonic vocalizations at a specific frequency (50 kilohertz). Scientists figured they must be pleasure-sounds because rats make them when they play with other rats. And it turns out that rats make the same noise at the same frequency when they get tickled! © 2013 Scientific American
By Shaunacy Ferro When David Nichols earned a Ph.D. in medicinal chemistry from the University of Iowa in 1973 by studying psychedelics, he thought he would continue studying hallucinogens indefinitely. "I thought I would work on it for the rest of my life," he says. His timing was less than fortuitous. In 1970, the year after Nichols started grad school, Richard Nixon signed into law the Controlled Substances Act, designed to clamp down on the manufacture and distribution of drugs in the U.S. The act classified hallucinogenic substances like LSD, DMT, psilocybin (the psychedelic alkaloid in mushrooms) and mescaline as Schedule I substances--the most restrictive use category, reserved for drugs with high potential for abuse and no accepted medical use. Marijuana was also placed in this category, and 15 years later when ecstasy came onto the scene, MDMA was emergency-classified as a Schedule I substance as well. By contrast, cocaine, opium and morphine are Schedule II substances, meaning they can be prescribed by a doctor. Despite some promising results from trials of psychedelics in treating alcoholism, psychiatric conditions and modeling mental illness, by the early '70s, the government had tightened control of Schedule I substances, even for research. It's only now that we're starting to return to the notion that these drugs could be medicine. © 2012 Popular Science
By ELIZABETH WEIL According to Sonja Lyubomirsky, you have a happiness set point. It’s partly encoded in your genes. If something good happens, your sense of happiness rises; if something bad happens, it falls. But either way, before too long, your mood will creep back to its set point because of a really powerful and perverse phenomenon referred to in science as “hedonic adaptation.” You know, people get used to things. With her 2007 book, “The How of Happiness,” and this year’s follow-up, “The Myths of Happiness,” Dr. Lyubomirsky, a psychology professor at the University of California, Riverside, caused ripples in her field but also drew a wider audience, cementing her place in a long chain of happiness-industry stalwarts, from M. Scott Peck with “The Road Less Traveled” to Martin E. P. Seligman and “Learned Optimism” to Daniel Gilbert and his best-selling “Stumbling on Happiness.” Dr. Lyubomirsky’s findings can be provocative and, at times, counterintuitive. Renters are happier than homeowners, she says. Interrupting positive experiences makes them more enjoyable. Acts of kindness make people feel happier, but not if you are compelled to perform the same act too frequently. (Bring your lover breakfast in bed one day, and it feels great. Bring it every day, and it feels like a chore.) Dr. Lyubomirsky — 46, Russian and expecting to give birth to her fourth child this weekend — is an unlikely mood guru. “I really hate all the smiley faces and rainbows and kittens,” she said in her office. She doesn’t often count her blessings or write gratitude letters, both of which she thinks sound hokey even though her research suggests they make people happier. © 2013 The New York Times Company
By Emily Chung, CBC News Having a stressed-out mom may give baby squirrels a competitive edge, a new study suggests. Red squirrels who were stressed out during pregnancy had babies that out-competed their peers by growing significantly faster without any extra food, reported the study, published online in Science Express. "What that suggests is that they're first able to predict what sort of environment their offspring will encounter… and they're preparing them for what their offspring are going to face," said Ben Dantzer, lead author of the study, which he worked on while a Ph.D. student at Michigan State University under the supervision of University of Guelph biologist Andrew McAdam. Further investigation uncovered a link between faster growth among the baby squirrels and higher levels of stress hormones in their mothers during the pregnancies. That link may explain how environmental conditions cue the animals to respond and adapt. Canadian researchers, including Stan Boutin at the University of Alberta, Murray Humphreys at McGill University in Montreal and McAdam at the University of Guelph, had been studying red squirrels near Kluane Lake, Yukon, for 22 years to find out how they are affected by changes in resources such as food over time. © CBC 2013
By GRETCHEN REYNOLDS If you give a rat a running wheel and it decides not to use it, are genes to blame? And if so, what does that tell us about why many people skip exercise? To examine those questions, scientists at the University of Missouri in Columbia recently interbred rats to create two very distinct groups of animals, one of which loves to run. Those in the other group turn up their collective little noses at exercise, slouching idly in their cages instead. Then the scientists closely scrutinized and compared the animals’ bodies, brains and DNA. For some time, exercise scientists have suspected that the motivation to exercise — or not — must have a genetic component. When researchers have compared physical activity patterns among family members, and particularly among twins, they have found that close relations tend to work out similarly, exercising about as much or as little as their parents or siblings do, even if they grew up in different environments. These findings suggest that the desire to be active or indolent is, to some extent, inherited. But to what extent someone’s motivation to exercise is affected by genes — and what specific genes may be involved — has been hard to determine. There are only so many human twins around for study purposes, after all. And even more daunting, it’s difficult to separate the role of upbringing from that of genetics in determining whether and why some people want to exercise and others don’t. So the University of Missouri researchers decided to create their own innately avid runners or couch potatoes, provide them with similar upbringings, and see what happened next. Copyright 2013 The New York Times Company
By Sandra G. Boodman, For someone who had been such a healthy child, Nancy Kennedy couldn’t figure out how she had become the kind of sickly adult whose life revolved around visits to a seemingly endless series of doctors. Beginning in 2005, shortly after a job transfer took her from Northern Virginia to St. Louis, Kennedy, then 47, developed a string of vexing medical problems. Her white blood cell count was inexplicably elevated. Her sinuses were chronically infected, although her respiratory tract seemed unusually dry. She often felt fatigued, and her joints hurt. “It felt as though an alien had invaded my body,” said Kennedy, formerly a manager at the National Geospatial-Intelligence Agency. “I felt like I was in doctors’ offices all the time.” Tests for possible ailments — including blood disorders, cancer, multiple sclerosis and rheumatoid arthritis — were negative. For seven years, Kennedy and her primary-care physician, who said she felt as though she sent Kennedy to “every specialist that walked,” had no clear idea what might be wrong. But during a physical in January 2012, her doctor, Melissa Johnson, struck by Kennedy’s trouble walking and her accelerating deterioration, decided to check for a condition not previously considered. © 1996-2013 The Washington Post
By Bill Andrews In a paper sure to please lazy stand-up comics and beleaguered husbands everywhere, scientists say that men do indeed have a hard time understanding women. Recent results show that men have a significantly harder time recognizing women’s emotions than they do men’s, and that men seem to use different parts of their brain when ascribing intentions and feelings to women versus men. Previous experiments had suggested that men are naturally wired to be more intuitive toward other men’s mental states and emotions. Eager to figure out why and how this could be, the researchers studied the brains of 22 male participants as they took a version of a well-known empathy test called the “Reading the Mind in the Eyes Test.” As the name suggests, the test consists of snapshots of pairs of eyes. Pairs of eyes were shown in succession to each participant, who had to determine either the gender or the emotional state of the person pictured. This all took place within an MRI machine, allowing the researchers to see which parts of the brain were active while participants made their determinations. Participants were about equally good at guessing the gender of male and female eyes, but the men did significantly worse at recognizing the emotions of the female eyes. They correctly interpreted about 87 percent of men’s eyes but only about 76 percent of women’s eyes. Participants also took longer to judge women’s emotions—about 40 milliseconds longer on average. Thus, in effect, men can “read” other men’s eyes faster and better, the researchers report in PLOS ONE.
by Elizabeth Norton A loving gaze helps firm up the bond between parent and child, building social skills that last a lifetime. But what happens when mom is blind? A new study shows that the children of sightless mothers develop healthy communication skills and can even outstrip the children of parents with normal vision. Eye contact is one of the most important aspects of communication, according to Atsushi Senju, a developmental cognitive neuroscientist at Birkbeck, University of London. Autistic people don't naturally make eye contact, however, and they can become anxious when urged to do so. Children for whom face-to-face contact is drastically reduced—babies severely neglected in orphanages or children who are born blind—are more likely to have traits of autism, such as the inability to form attachments, hyperactivity, and cognitive impairment. To determine whether eye contact is essential for developing normal communication skills, Senju and colleagues chose a less extreme example: babies whose primary caregivers (their mothers) were blind. These children had other forms of loving interaction, such as touching and talking. But the mothers were unable to follow the babies' gaze or teach the babies to follow theirs, which normally helps children learn the importance of the eyes in communication. Apparently, the children don't need the help. Senju and colleagues studied five babies born to blind mothers, checking the children's proficiency at 6 to 10 months, 12 to 15 months, and 24 to 47 months on several measures of age-appropriate communications skills. At the first two visits, babies watched videos in which a woman shifted her gaze or moved different parts of her face while corresponding changes in the baby's face were recorded. Babies also followed the gaze of a woman sitting at a table and looking at various objects. © 2010 American Association for the Advancement of Science