Chapter 15. Emotions, Aggression, and Stress
Heidi Ledford Dutch celebrity daredevil Wim Hof has endured lengthy ice-water baths, hiked to the top of Mount Kilimanjaro in shorts and made his mark in Guinness World Records with his ability to withstand cold. Now he has made a mark on science as well. Researchers have used Hof’s methods of mental and physical conditioning to train 12 volunteers to fend off inflammation. The results, published today in the Proceedings of the National Academy of Sciences, suggest that people can learn to modulate their immune responses — a finding that has raised hopes for patients who have chronic inflammatory disorders such as rheumatoid arthritis and inflammatory bowel disease. The results are only preliminary, warns study first author Matthijs Kox, who investigates immune responses at Radboud University Medical Center in Nijmegen, the Netherlands. Kox says that people with inflammatory disorders sometimes hear about his experiments and call to ask whether the training would enable them to reduce their medication. “We simply do not yet know that,” he says. Still, the work stands out as an illustration of the interactions between the nervous system and the immune system, says Giuseppe Matarese, an immunologist at the University of Salerno in Italy, who was not involved with the study. “This study is a nice way to show that link,” he says. “Orthodox neurobiologists and orthodox immunologists have been sceptical.” They think the study of the interactions between the nervous and immune systems is a “field in the shadows,” he says. © 2014 Nature Publishing Group.
By Christian Jarrett I must have been about seven years old, a junior in my prep school. I was standing in the dining hall surrounded by over a hundred senior boys and schoolmasters, all looking at me, some with pity, others with disdain. It was unheard of for a junior boy to be present in the dining room by the time the seniors had filed in. “What on earth do you think you’re doing Jarrett?” asked the headmaster with mock outrage. I was there because, by refusing to finish my rhubarb crumble, I’d broken a cardinal school rule. All pupils were to eat all they were given. But after vomiting up some of my rhubarb – a flesh-like fruit that still disgusts me to this day – I simply refused to eat on. Keeping me behind in the dining room as the seniors arrived was my punishment. I wanted to explain this to the assembled crowd. Yet speech completely failed me and I began to sob openly and uncontrollably, my humiliation sealed. This was an intense emotional experience for me, and as you can probably tell, the memory remains sore to this day. But is humiliation any more intense than the other negative emotions, such as anger or shame? If it were, how would psychologists and neuroscientists demonstrate that this was the case? You might imagine that the most effective method would be to ask people to rate and describe different emotional experiences – after all, to say that an emotion is intense is really to say something about how it feels, and how it affects you. Yet in a paper published earlier this year, a pair of psychologists – Marte Otten and Kai Jonas – have taken a different approach. Inspired by claims that humiliation is an unusually intense emotion, responsible even for war and strife in the world, the researchers have turned to brain-based evidence. They claim to have provided the “first empirical, neurocognitive evidence for long-standing claims in the humiliation literature that humiliation is a particularly intense emotion.” WIRED.com © 2014 Condé Nast.
By SAM KEAN UNTIL the past few decades, neuroscientists really had only one way to study the human brain: Wait for strokes or some other disaster to strike people, and if the victims pulled through, determine how their minds worked differently afterward. Depending on what part of the brain suffered, strange things might happen. Parents couldn’t recognize their children. Normal people became pathological liars. Some people lost the ability to speak — but could sing just fine. These incidents have become classic case studies, fodder for innumerable textbooks and bull sessions around the lab. The names of these patients — H. M., Tan, Phineas Gage — are deeply woven into the lore of neuroscience. When recounting these cases today, neuroscientists naturally focus on these patients’ deficits, emphasizing the changes that took place in their thinking and behavior. After all, there’s no better way to learn what some structure in the brain does than to see what happens when it shorts out or otherwise gets destroyed. But these case snippets overlook something crucial about people with brain damage. However glaring their deficits are, their brains still work like ours to a large extent. Most can still read and reason. They can still talk, walk and emote. And they still have the same joys and fears — facts that the psychological caricatures passed down from generation to generation generally omit. The famous amnesiac H. M., for instance, underwent radical brain surgery in 1953 and had most of the hippocampus removed on both sides of his brain; afterward, he seemed to lose the ability to form new long-term memories. Names, dates, directions to the bathroom all escaped him now. He’d eat two breakfasts if no one stopped him. Careful testing, however, revealed that H. M. could form new motor memories — memories of things like how to ride a bicycle — because they rely on different structures in the brain. 
This work established that memory isn’t a single, monolithic thing, but a collection of different faculties. © 2014 The New York Times Company
by Bethany Brookshire When I was a lab scientist working with mice, I spent hours controlling variables. I stood on precarious chairs to tape tarps over lights to get the light level perfectly right. I made one undergraduate who wore perfume to the lab for animal training wear the same perfume for a whole semester. I was so worried about the mice “recognizing” me over long, overlapping experiments that I did not change the scents of any of my personal care products for nine years. Many of these variables got reported in the methods sections of my papers. “All experiments conducted between 5:00 and 7:00 a.m. Maze dimensions: 4 inches wide, with walls 6 inches tall. Lighting held constant at 10 lux.” All of these variables are reported to allow other people to repeat my experiments, and hopefully get the same result. Now, a new study suggests that maybe I should have included another element in my methods section: “All mice exposed to the scent of a woman.” Jeffrey Mogil’s lab at McGill University in Montreal, Canada, reports April 28 in Nature Methods that mice respond differently to men and women, and that men in fact are a stressful influence. The results show that there’s yet another variable to control when doing sensitive mouse behavioral studies, a variable that could impact fields from pain to depression and beyond. Every department that does animal research has stories about particular experimenters. I recall hearing a story of a lab technician who could get results no one else could, because mice just loved her strawberry-scented hair conditioner. Another colleague told of one experimenter who was so good at handling rats that no one believed her anxiety results. Her rats were just so relaxed. And Mogil’s lab had its own story. In their lab, the presence of human experimenters seemed to stop mice from showing pain. © Society for Science & the Public 2000 - 2013
Jeffrey Mogil’s students suspected there was something fishy going on with their experiments. They were injecting an irritant into the feet of mice to test their pain response, but the rodents didn’t seem to feel anything. “We thought there was something wrong with the injection,” says Mogil, a neuroscientist at McGill University in Montreal, Canada. The real culprit was far more surprising: The mice that didn’t feel pain had been handled by male students. Mogil’s group discovered that this gender distinction alone was enough to throw off their whole experiment—and likely influences the work of other researchers as well. “This is very important work with wide-ranging implications,” says M. Catherine Bushnell, a neuroscientist and the scientific director of the Division of Intramural Research at the National Center for Complementary and Alternative Medicine (NCCAM) in Bethesda, Maryland, who was not involved in the study. “Many people doing research have never thought of this.” Mogil has studied pain for 25 years. He’s long suspected that lab animals respond differently to the sensation when researchers are present. In 2007, his lab observed that mice spend less time licking a painful injection—a sign that they’re hurting—when a person is nearby, even if that “person” is a cardboard cutout of Paris Hilton. Other scientists began to wonder if their own data were biased by the same effect. “There were whisperings at meetings that this was confounding research results,” Mogil says. So he decided to take a closer look. In the new study, Mogil told the researchers in his lab to inject an inflammatory agent into the foot of a rat or mouse and then take a seat nearby and read a book. A video camera trained on the rodent’s face assessed the animal’s pain level, based on a 0- to 2-point “grimace scale” developed by the team. The results were mixed. Sometimes the animals showed pain when an experimenter was present, and sometimes they seemed just fine. 
So, on a hunch, Mogil and colleagues recrunched the data, this time controlling for whether a male or a female experimenter was present. “We were stunned by the results,” he says. © 2014 American Association for the Advancement of Science.
By JAN HOFFMAN How well can computers interact with humans? Certainly computers play a mean game of chess, which requires strategy and logic, and “Jeopardy!,” in which they must process language to understand the clues read by Alex Trebek (and buzz in with the correct question). But in recent years, scientists have striven for an even more complex goal: programming computers to read human facial expressions. We all know what it’s like to experience pain that makes our faces twist into a grimace. But can you tell if someone else’s face of pain is real or feigned? The practical applications could be profound. Computers could supplement or even replace lie detectors. They could be installed at border crossings and airport security checks. They could serve as diagnostic aids for doctors. Researchers at the University of California, San Diego, have written software that not only detected whether a person’s face revealed genuine or faked pain, but did so far more accurately than human observers. While other scientists have already refined a computer’s ability to identify nuances of smiles and grimaces, this may be the first time a computer has triumphed over humans at reading their own species. “A particular success like this has been elusive,” said Matthew A. Turk, a professor of computer science at the University of California, Santa Barbara. “It’s one of several recent examples of how the field is now producing useful technologies rather than research that only stays in the lab. We’re affecting the real world.” People generally excel at using nonverbal cues, including facial expressions, to deceive others (hence the poker face). They are good at mimicking pain, instinctively knowing how to contort their features to convey physical discomfort. © 2014 The New York Times Company
By LAURENCE STEINBERG I’M not sure whether it’s a badge of honor or a mark of shame, but a paper I published a few years ago is now ranked No. 8 on a list of studies that other psychologists would most like to see replicated. Good news: People find the research interesting. Bad news: They don’t believe it. The paper in question, written with my former student Margo Gardner, appeared in the journal Developmental Psychology in July 2005. It described a study in which we randomly assigned subjects to play a video driving game, either alone or with two same-age friends watching them. The mere presence of peers made teenagers take more risks and crash more often, but no such effect was observed among adults. I find my colleagues’ skepticism surprising. Most people recall that as teenagers, they did far more reckless things when with their friends than when alone. Data from the Federal Bureau of Investigation indicate that many more juvenile crimes than adult crimes are committed in groups. And driving statistics conclusively show that having same-age passengers in the car substantially increases the risk of a teen driver’s crashing but has no similar impact when an adult is behind the wheel. Then again, I’m aware that our study challenged many psychologists’ beliefs about the nature of peer pressure, for it showed that the influence of peers on adolescent risk taking doesn’t rely solely on explicit encouragement to behave recklessly. Our findings also undercut the popular idea that the higher rate of real-world risk taking in adolescent peer groups is a result of reckless teenagers’ being more likely to surround themselves with like-minded others. My colleagues and I have replicated our original study of peer influences on adolescent risk taking several times since 2005. We have also shown that the reason teenagers take more chances when their peers are around is partly because of the impact of peers on the adolescent brain’s sensitivity to rewards. 
In a study of people playing our driving game, my colleague Jason Chein and I found that when teens were with people their own age, their brains’ reward centers became hyperactivated, which made them more easily aroused by the prospect of a potentially pleasurable experience. This, in turn, inclined teenagers to pay more attention to the possible benefits of a risky choice than to the likely costs, and to make risky decisions rather than play it safe. Peers had no such effect on adults’ reward centers, though. © 2014 The New York Times Company
The negative social, physical and mental health effects of childhood bullying are still evident nearly 40 years later, according to research by British psychiatrists. In the first study of its kind to look at the effects of childhood bullying beyond early adulthood, the researchers said its impact is "persistent and pervasive", with people who were bullied when young more likely to have poorer physical and psychological health and poorer cognitive functioning at age 50. "The effects of bullying are still visible nearly four decades later ... with health, social and economic consequences lasting well into adulthood," said Ryu Takizawa, who led the study at the Institute of Psychiatry at King's College London. The findings, published in the American Journal of Psychiatry on Friday, come from the British National Child Development Study, which includes data on all children born in England, Scotland and Wales during one week in 1958. It included 7,771 children whose parents gave information on their child's exposure to bullying when they were aged 7 and 11. The children were then followed up until they reached 50. Bullying is characterized by repeated hurtful actions by children of a similar age, where the victim finds it difficult to defend themselves. More than a quarter of children in the study — 28 per cent — had been bullied occasionally, and 15 per cent were bullied frequently, rates that the researchers said were similar to the situation in Britain today. The study, which adjusted for other factors such as childhood IQ, emotional and behavioural problems and low parental involvement, found people who were frequently bullied in childhood were at an increased risk of mental disorders such as depression, anxiety and suicidal thoughts. © CBC 2014
By Melissa Healy The nature of psychological resilience has, in recent years, been a subject of enormous interest to researchers, who have wondered how some people endure and even thrive under a certain amount of stress, and others crumble and fall prey to depression. The resulting research has underscored the importance of feeling socially connected and the value of psychotherapy to identify and exercise patterns of thought that protect against hopelessness and defeat. But what does psychological resilience look like inside our brains, at the cellular level? Such knowledge might help bolster people's immunity to depression and even treat people under chronic stress. And a new study published Thursday in Science magazine has made some progress in the effort to see the brain struggling with, and ultimately triumphing over, stress. A group of neuroscientists at Mount Sinai's Icahn School of Medicine in New York focused on the dopaminergic cells in the brain's ventral tegmentum, a key node in the brain's reward circuitry and therefore an important place to look at how social triumph and defeat play out in the brain. In mice under stress because they were either chronically isolated or rebuffed or attacked by fellow littermates, the group had observed that this group of neurons becomes overactive. It would logically follow, then, that if you don't want stressed mice (or people) to become depressed, you would want to avoid hyperactivity in that key group of neurons, right? Actually, wrong, the researchers found. In a series of experiments, they saw that the mice that were least prone to behave in socially defeated ways when under stress were actually the ones whose dopaminergic cells in the ventral tegmental area displayed the greatest levels of hyperactivity in response to stress. And that hyperactivity was most pronounced in the neurons that extended from the tegmentum into the nearby nucleus accumbens, also a key node in the brain's reward system.
Scientists have traced vulnerability to depression-like behaviors in mice to out-of-balance electrical activity inside neurons of the brain’s reward circuit and experimentally reversed it – but there’s a twist. Instead of suppressing it, researchers funded by the National Institutes of Health boosted runaway neuronal activity even further, eventually triggering a compensatory self-stabilizing response. Once electrical balance was restored, previously susceptible animals were no longer prone to becoming withdrawn, anxious, and listless following socially stressful experiences. “To our surprise, neurons in this circuit harbor their own self-tuning, homeostatic mechanism of natural resilience,” explained Ming-Hu Han, Ph.D., of the Icahn School of Medicine at Mount Sinai, New York City, a grantee of the NIH’s National Institute of Mental Health (NIMH) and leader of the research team. Han and colleagues report on their discovery April 18, 2014 in the journal Science. Prior to the new study, the researchers had turned resilience to social stress on and off by using pulses of light to manipulate reward circuit neuronal firing rates in genetically engineered mice – optogenetics. But they didn’t know how resilience worked at the cellular level. To find out, they focused on electrical events in reward circuit neurons of mice exposed to a social stressor. Some mice that experience repeated encounters with a dominant animal emerge behaviorally unscathed, while others develop depression-like behaviors.
By David Brown, At the very least, the new experiment reported in Science is going to make people think differently about what it means to be a “rat.” Eventually, though, it may tell us interesting things about what it means to be a human being. In a simple experiment, researchers at the University of Chicago sought to find out whether a rat would release a fellow rat from an unpleasantly restrictive cage if it could. The answer was yes. The free rat, occasionally hearing distress calls from its compatriot, learned to open the cage and did so with greater efficiency over time. It would release the other animal even if there wasn’t the payoff of a reunion with it. Astonishingly, if given access to a small hoard of chocolate chips, the free rat would usually save at least one treat for the captive — which is a lot to expect of a rat. The researchers came to the unavoidable conclusion that what they were seeing was empathy — and apparently selfless behavior driven by that mental state. “There is nothing in it for them except for whatever feeling they get from helping another individual,” said Peggy Mason, the neurobiologist who conducted the experiment along with graduate student Inbal Ben-Ami Bartal and fellow researcher Jean Decety. “There is a common misconception that sharing and helping is a cultural occurrence. But this is not a cultural event. It is part of our biological inheritance,” she added. The idea that animals have emotional lives and are capable of detecting emotions in others has been gaining ground for decades. Empathic behavior has been observed in apes and monkeys, and described by many pet owners (especially dog owners). Recently, scientists demonstrated “emotional contagion” in mice, a situation in which one animal’s stress worsens another’s. © 1996-2014 The Washington Post
By Melissa Hogenboom Artists have structurally different brains compared with non-artists, a study has found. Participants' brain scans revealed that artists had increased neural matter in areas relating to fine motor movements and visual imagery. The research, published in NeuroImage, suggests that an artist's talent could be innate. But training and environmental upbringing also play crucial roles in their ability, the authors report. As in many areas of science, the exact interplay of nature and nurture remains unclear. Lead author Rebecca Chamberlain from KU Leuven University, Belgium, said she was interested in finding out how artists saw the world differently. "The people who are better at drawing really seem to have more developed structures in regions of the brain that control for fine motor performance and what we call procedural memory," she explained. In their small study, researchers peered into the brains of 21 art students and compared them to 23 non-artists using a scanning method called voxel-based morphometry. These detailed scans revealed that the artist group had significantly more grey matter in an area of the brain called the precuneus in the parietal lobe. "This region is involved in a range of functions but potentially in things that could be linked to creativity, like visual imagery - being able to manipulate visual images in your brain, combine them and deconstruct them," Dr Chamberlain told the BBC's Inside Science programme. BBC © 2014
On Wednesday morning we woke to the news that a passenger ferry had sunk off the coast of South Korea, with at least four people confirmed dead and 280 unaccounted for. Meanwhile, though the search has continued for the missing Malaysia Airlines plane, relatives' hopes of a safe landing have long since been extinguished. Human tragedies like these are the stuff of daily news, but we rarely hear about the long-term psychological effects on survivors and the bereaved, who may experience the symptoms of post-traumatic stress disorder for years after their experience. Although most people have heard of PTSD, few will have a clear idea of what it entails. The American Psychiatric Association's Diagnostic and Statistical Manual (DSM) defines a traumatic event as one in which a person "experienced, witnessed, or was confronted with an event or events that involved actual or threatened death or serious injury, or a threat to the physical integrity of self or others". PTSD is marked by four types of responses to the trauma. First, patients repeatedly relive the event, either in the form of nightmares or flashbacks. Second, they seek to avoid any reminder of the traumatic event. Third, they feel constantly on edge. Fourth, they are plagued with negative thoughts and low mood. According to one estimate, almost 8% of people will develop PTSD during their lifetime. Clearly trauma (and PTSD) can strike anyone, but the risks of developing the condition are not equally distributed. Rates are higher in socially disadvantaged areas, for instance. Women may be twice as likely to develop PTSD as men. This is partly because women are at greater risk of the kinds of trauma that commonly produce PTSD (rape, for example). Nevertheless – and for unknown reasons – when exposed to the same type of trauma, women are more susceptible to PTSD than men. © 2014 Guardian News and Media Limited
Virginia Hughes Trauma is insidious. It not only increases a person’s risk for psychiatric disorders, but can also spill over into the next generation. People who were traumatized during the Khmer Rouge genocide in Cambodia tended to have children with depression and anxiety, for example, and children of Australian veterans of the Vietnam War have higher rates of suicide than the general population. Trauma’s impact comes partly from social factors, such as its influence on how parents interact with their children. But stress also leaves ‘epigenetic marks’ — chemical changes that affect how DNA is expressed without altering its sequence. A study published this week in Nature Neuroscience finds that stress in early life alters the production of small RNAs, called microRNAs, in the sperm of mice (K. Gapp et al. Nature Neurosci. http://dx.doi.org/10.1038/nn.3695; 2014). The mice show depressive behaviours that persist in their progeny, which also show glitches in metabolism. The study is notable for showing that sperm responds to the environment, says Stephen Krawetz, a geneticist at Wayne State University School of Medicine in Detroit, Michigan, who studies microRNAs in human sperm. (He was not involved in the latest study.) “Dad is having a much larger role in the whole process, rather than just delivering his genome and being done with it,” he says. He adds that this is one of a growing number of studies to show that subtle changes in sperm microRNAs “set the stage for a huge plethora of other effects”. In the new study, Isabelle Mansuy, a neuroscientist at the University of Zurich, Switzerland, and her colleagues periodically separated mother mice from their young pups and exposed the mothers to stressful situations — either by placing them in cold water or physically restraining them. These separations occurred every day but at erratic times, so that the mothers could not comfort their pups (termed the F1 generation) with extra cuddling before separation. 
© 2014 Nature Publishing Group.
The two marmosets—small, New World monkeys—had been a closely bonded couple for more than 3 years. Then, one fateful day, the female had a terrible accident. She fell out of a tree and hit her head on a ceramic vase that happened to be underneath on the forest floor. Her partner left two of their infants alone in the tree and jumped down to apparently comfort her, until she died an agonizing death a couple of hours later. According to the researchers who recorded the events with a video camera (see video above), this is the first time such compassionate mourning behavior has been observed outside of humans and chimpanzees, and it could indicate that mourning is more widespread among primates than previously thought. Humans mourn their dead, of course, and some recent studies have strongly suggested that chimpanzees do as well. Scientists have recorded cases of adult chimps apparently caring for fellow animals before they die, and chimp mothers have been observed carrying around the bodies of infants for days after their death—although scientists have debated whether the latter behavior represents true grieving or if the mothers didn’t realize their infants were really dead. But there has been little or no evidence that other primates engage in these kinds of behaviors. Indeed, a recent review of the evidence led by anthropologist Peter Fashing of California State University, Fullerton, concluded that there were no convincing observations of “compassionate caretaking” of dying individuals among other nonhuman primates, such as monkeys. © 2014 American Association for the Advancement of Science.
Feeling peeved at your partner? You may want to check your blood sugar. A new study suggests that low levels of glucose in the blood may increase anger and aggression between spouses. The researchers say their findings suggest a connection between glucose and self-control, but other experts disagree about the study’s implications. Glucose is a source of fuel for the body, and its levels in the blood rise and fall throughout the day, as the body metabolizes meals that include carbohydrates. Researchers have suspected since the 1960s that low glucose or swings in glucose may play a role in human aggression. In two 2010 studies, psychologist Brad Bushman of Ohio State University, Columbus, attempted to figure out just what that role is, first by measuring vengefulness among people with symptoms of type 2 diabetes (a disease in which the body can’t regulate glucose levels properly), and then by providing sweetened drinks to strangers competing on a computerized task. Both studies suggested that higher glucose levels can make strangers less likely to treat each other aggressively. Bushman wondered about the relationship between glucose levels and aggression among romantic couples. So he and colleagues at the University of Kentucky and the University of North Carolina recruited 107 married couples and equipped them with blood glucose meters, voodoo dolls, and 51 pins to record their glucose and anger levels over time. For 21 days, the couples used the meters to measure their glucose levels each morning before breakfast and each evening before bed. They also assessed how angry they were at their spouse at the end of each day, by recording how many of the 51 pins they stuck into their voodoo dolls just before bed when their partner wasn’t looking. After 21 days, the couples were invited into the lab. 
There, they played a computer game that allowed them to blast their spouse with an unpleasant noise—a mixture of fingernails scratching a chalkboard, ambulance sirens, and dentist drills—as loudly and for as long as he or she wanted, as a proxy for their willingness to act aggressively and make their partner suffer. © 2014 American Association for the Advancement of Science.
In an op-ed in the Sunday edition of this newspaper, Barbara Ehrenreich, card-carrying liberal rationalist, writes about her own mystical experiences (the subject of her new book), and argues that the numinous deserves more cutting-edge scientific study: I appreciate the spirit (if you will) of this argument, but I am very doubtful as to its application. The trouble is that in its current state, cognitive science has a great deal of difficulty explaining “what happens” when “those wires connect” for non-numinous experience, which is why mysterian views of consciousness remain so potent even among thinkers whose fundamental commitments are atheistic and materialistic. (I’m going to link to the internet’s sharpest far-left scold for a good recent polemic on this front.) That is to say, even in contexts where it’s very easy to identify the physical correlative to a given mental state, and to get the kind of basic repeatability that the scientific method requires — show someone an apple, ask them to describe it; tell them to bite into it, ask them to describe the taste; etc. — there is no kind of scientific or philosophical agreement on what is actually happening to produce the conscious experience of the color “red,” the conscious experience of the crisp McIntosh taste, etc. So if we can’t say how this “normal” conscious experience works, even when we can easily identify the physical stimuli that produce it, it seems exponentially harder to scientifically investigate the invisible, maybe-they-exist and maybe-they-don’t stimuli — be they divine, alien, or panpsychic — that Ehrenreich hypothesizes might produce more exotic forms of conscious experience. © 2014 The New York Times Company
By Jyoti Madhusoodanan Growing up in a stressful social environment leaves lasting marks on young chromosomes, a study of African American boys has revealed. Telomeres, repetitive DNA sequences that protect the ends of chromosomes from fraying over time, are shorter in children from poor and unstable homes than in children from more nurturing families. When researchers examined the DNA of 40 boys from major US cities at age 9, they found that the telomeres of children from harsh home environments were 19% shorter than those of children from advantaged backgrounds. The length of telomeres is often considered to be a biomarker of chronic stress. The study, published today in the Proceedings of the National Academy of Sciences1, brings researchers closer to understanding how social conditions in childhood can influence long-term health, says Elissa Epel, a health psychologist at the University of California, San Francisco, who was not involved in the research. Participants’ DNA samples and socio-economic data were collected as part of the Fragile Families and Child Wellbeing Study, an effort funded by the US National Institutes of Health to track nearly 5,000 children, the majority of whom were born to unmarried parents in large US cities in 1998–2000. Children's environments were rated on the basis of their mother's level of education; the ratio of a family’s income to needs; harsh parenting; and whether family structure was stable, says lead author Daniel Notterman, a molecular biologist at Pennsylvania State University in Hershey. © 2014 Nature Publishing Group
By Stephanie Pappas A little stress may be a good thing for teenagers learning to drive. In a new study, teens whose levels of the stress hormone cortisol increased more during times of stress got into fewer car crashes or near crashes in their first months of driving than their less-stress-responsive peers did. The study suggests that biological differences may affect how teens learn to respond to crises on the road, the researchers reported today (April 7) in the journal JAMA Pediatrics. Efforts to reduce teen car accidents include graduated driver licensing programs, safety messages and increased parental management, but these efforts seem to work better for some teens than others, the researchers said. Alternatives, such as in-vehicle technologies aimed at reducing accidents, may be especially useful for teens with a "neurological basis" for their increased risk of getting into an accident, they said. Automobile accidents are the No. 1 cause of death of teenagers in the United States, according to the Centers for Disease Control and Prevention. Car crashes also kill more 15- to 29-year-olds globally than any other cause, according to the World Health Organization.
By BARBARA EHRENREICH MY atheism is hard-core, rooted in family tradition rather than adolescent rebellion. According to family legend, one of my 19th-century ancestors, a dirt-poor Irish-American woman in Montana, expressed her disgust with the church by vehemently refusing last rites when she lay dying in childbirth. From then on, we were atheists and rationalists, a stance I perpetuated by opting, initially, for a career in science. How else to understand the world except as the interaction of tiny bits of matter and mathematically predictable forces? There were no gods or spirits, just our own minds pressing up against the unknown. But something happened when I was 17 that shook my safely rationalist worldview and left me with a lifelong puzzle. Years later, I learned that this sort of event is usually called a mystical experience, and I can see in retrospect that the circumstances had been propitious: Thanks to a severely underfunded and poorly planned skiing trip, I was sleep-deprived and probably hypoglycemic that morning in 1959 when I stepped out alone, walked into the streets of Lone Pine, Calif., and saw the world — the mountains, the sky, the low scattered buildings — suddenly flame into life. There were no visions, no prophetic voices or visits by totemic animals, just this blazing everywhere. Something poured into me and I poured out into it. This was not the passive beatific merger with “the All,” as promised by the Eastern mystics. It was a furious encounter with a living substance that was coming at me through all things at once, too vast and violent to hold on to, too heartbreakingly beautiful to let go of. It seemed to me that whether you start as a twig or a gorgeous tapestry, you will be recruited into the flame and made indistinguishable from the rest of the blaze. I felt ecstatic and somehow completed, but also shattered. © 2014 The New York Times Company