Chapter 15. Emotions, Aggression, and Stress
By Roni Caryn Rabin Sixty-five million Americans suffer from chronic lower back pain, and many feel they have tried it all: physical therapy, painkillers, shots. Now a new study reports many people may find relief with a form of meditation that harnesses the power of the mind to manage pain. The technique, called mindfulness-based stress reduction, involves a combination of meditation, body awareness and yoga, and focuses on increasing awareness and acceptance of one’s experiences, whether they involve physical discomfort or emotional pain. People with lower back pain who learned the meditation technique showed greater improvements in function compared to those who had cognitive behavioral therapy, which has been shown to help ease pain, or standard back care. Participants assigned to meditation or cognitive behavioral therapy received eight weekly two-hour sessions of group training in the techniques. After six months, those learning meditation had an easier time doing things like getting up out of a chair, going up the stairs and putting on their socks, and were less irritable and less likely to stay at home or in bed because of pain. They were still doing better a year later. The findings come amid growing concerns about opioid painkillers and a surge of overdose deaths involving the drugs. At the beginning of the trial, 11 percent of the participants said they had used an opioid within the last week to treat their pain, and they were allowed to continue with their usual care throughout the trial. “This new study is exciting, because here’s a technique that doesn’t involve taking any pharmaceutical agents, and doesn’t involve the side effects of pharmaceutical agents,” said Dr. Madhav Goyal of Johns Hopkins University School of Medicine, who co-wrote an editorial accompanying the paper. © 2016 The New York Times Company
Angus Chen You've probably heard that a little booze a day is good for you. I've even said it at parties. "Look at the French," I've said gleefully over my own cup. "Wine all the time and they still live to be not a day younger than 82." I'm sorry to say we're probably wrong. The evidence that alcohol has any benefit on longevity or heart health is thin, says Dr. Timothy Naimi, a physician and epidemiologist at Boston Medical Center. He and his colleagues published an analysis of 87 of the best research studies on alcohol's effect on death from any cause in the Journal of Studies on Alcohol and Drugs on Tuesday. "[Our] findings here cast a great deal of skepticism on this long, cherished belief that moderate drinking has a survival advantage," he says. In these studies, the participants get sorted into categories based on how much alcohol they think they drink. Researchers typically size up occasional, moderate and heavy drinkers against non-drinkers. When you do this, the moderates, one to three drinks a day, usually come out on top. They're less likely to die early from health problems like heart disease or cancer and injury. But then it gets very tricky, "because moderate drinkers tend to be very socially advantaged," Naimi says. Moderate drinkers tend to be healthier on average because they're well-educated and more affluent, not because they're drinking a bottle of wine a week on average. "[Their] alcohol consumption ends up looking good from a health perspective because they're already healthy to begin with." © 2016 npr
By Anahad O'Connor What does it take to live a good life? Surveys show that most young adults believe that obtaining wealth and fame are keys to a happy life. But a long-running study out of Harvard suggests that one of the most important predictors of whether you age well and live a long and happy life is not the amount of money you amass or notoriety you receive. A much more important barometer of long-term health and well-being is the strength of your relationships with family, friends and spouses. These are some of the findings from the Harvard Study of Adult Development, a research project that since 1938 has closely tracked and examined the lives of more than 700 men and in some cases their spouses. The study has revealed some surprising – and some not so surprising – factors that determine whether people are likely to age happily and healthily, or descend into loneliness, sickness and mental decline. The study’s current director, Dr. Robert Waldinger, outlined some of the more striking findings from the long-running project in a recent TED Talk that has garnered more than seven million views. “We publish our findings in academic journals that most people don’t read,” Dr. Waldinger, a clinical professor of psychiatry at Harvard Medical School, said in a recent interview. “And so we really wanted people to know that this study exists and that it has for 75 years. We’ve been funded by the government for so many years, and it’s important that more people know about this besides academics.” The study began in Boston in the 1930s with two very different groups of young men. © 2016 The New York Times Company
By John Elder Robison What happens to your relationships when your emotional perception changes overnight? Because I’m autistic, I have always been oblivious to unspoken cues from other people. My wife, my son and my friends liked my unflappable demeanor and my predictable behavior. They told me I was great the way I was, but I never really agreed. For 50 years I made the best of how I was, because there was nothing else I could do. Then I was offered a chance to participate in a study at Beth Israel Deaconess Medical Center, a teaching hospital of Harvard Medical School. Investigators at the Berenson-Allen Center there were studying transcranial magnetic stimulation, or T.M.S., a noninvasive procedure that applies magnetic pulses to stimulate the brain. It offers promise for many brain disorders. Several T.M.S. devices have been approved by the Food and Drug Administration for the treatment of severe depression, and others are under study for different conditions. (It’s still in the experimental phase for autism.) The doctors wondered if changing activity in a particular part of the autistic brain could change the way we sense emotions. That sounded exciting. I hoped it would help me read people a little better. They say, be careful what you wish for. The intervention succeeded beyond my wildest dreams — and it turned my life upside down. After one of my first T.M.S. sessions, in 2008, I thought nothing had happened. But when I got home and closed my eyes, I felt as if I were on a ship at sea. And there were dreams — so real they felt like hallucinations. It sounds like a fairy tale, but the next morning when I went to work, everything was different. Emotions came at me from all directions, so fast that I didn’t have a moment to process them. © 2016 The New York Times Company
By Gretchen Reynolds Meditating before running could change the brain in ways that are more beneficial for mental health than practicing either of those activities alone, according to an interesting study of a new treatment program for people with depression. As many people know from experience, depression is characterized in part by an inability to stop dwelling on gloomy thoughts and unhappy memories from the past. Researchers suspect that this thinking pattern, known as rumination, may involve two areas of the brain in particular: the prefrontal cortex, a part of the brain that helps to control attention and focus, and the hippocampus, which is critical for learning and memory. In some studies, people with severe depression have been found to have a smaller hippocampus than people who are not depressed. Interestingly, meditation and exercise affect those same portions of the brain, although in varying ways. In brain-scan studies, people who are long-term meditators, for instance, generally display different patterns of brain-cell communication in their prefrontal cortex during cognitive tests than people who don’t meditate. Those differences are believed to indicate that the meditators possess a more honed ability to focus and concentrate. Meanwhile, according to animal studies, aerobic exercise substantially increases the production of new brain cells in the hippocampus. Both meditation and exercise also have proven beneficial in the treatment of anxiety, depression and other mood disorders. These various findings about exercise and meditation intrigued researchers at Rutgers University in New Brunswick, N.J., who began to wonder whether, since meditation and exercise on their own improve moods, combining the two might intensify the impacts of each. So, for the new study, which was published last month in Translational Psychiatry, the scientists recruited 52 men and women, 22 of whom had been given diagnoses of depression. 
The researchers confirmed that diagnosis with their own tests and then asked all of the volunteers to complete a computerized test of their ability to focus while sensors measured electrical signals in their brains. © 2016 The New York Times Company
Nicola Davis Suppressing bad memories from the past can block memory formation in the here and now, research suggests. The study could help to explain why those suffering from post-traumatic stress disorder (PTSD) and other psychological conditions often experience difficulty in remembering recent events, scientists say. Writing in Nature Communications, the authors describe how trying to forget past incidents by suppressing our recollections can create a “virtual lesion” in the brain that casts an “amnesiac shadow” over the formation of new memories. “If you are motivated to try to prevent yourself from reliving a flashback of that initial trauma, anything that you experience around the period of time of suppression tends to get sucked up into this black hole as well,” Dr Justin Hulbert, one of the study’s authors, told the Guardian. “I think it makes perfect sense because we know that people with a wide range of psychological problems have difficulties with their everyday memories for ordinary events,” said Professor Chris Brewin, an expert in PTSD from University College London, who was not involved in the study. “Potentially this could account for the memory deficits we find in depression and other disorders too.” The phenomenon came to the attention of the scientists during a lecture when a student admitted to having suffered bouts of amnesia after witnessing the 1999 Columbine High School massacre. When the student returned to the school for classes after the incident she found she could not remember anything from the lessons she was in. “Here she was surrounded by all these reminders of these terrible things that she preferred not to think about,” said Hulbert. © 2016 Guardian News and Media Limited
Rich Stanton In 1976, the driving simulation Death Race was removed from an Illinois amusement park. There had, according to a news story at the time, been complaints that it encouraged players to run over pedestrians to score points. Through a series of subsequent newspaper reports, the US National Safety Council labelled the game “gross” and motoring groups demanded its removal from distribution. The first moral panic over video game violence had begun. This January, a group of four scholars published a paper analysing the links between playing violent video games at a young age and aggressive behaviour in later life. The titles mentioned in the report are around 15 years old – one of several troubling ambiguities to be found in the research. Nevertheless, the quality and quantity of the data make this an uncommonly valuable study. Given that game violence remains a favoured bogeyman for politicians, press and pressure groups, it should be shocking that such a robust study of the phenomenon is rare. But it is, and it’s important to ask why.

A history of violence

With the arrival of Pong in 1972, video games became a commercial reality, but now, in 2016, they are still on the rocky path to mass acceptance that all new media must traverse. The truth is that the big targets of moral concern – Doom, Grand Theft Auto, Call of Duty – are undeniably about killing and they are undeniably popular among male teenagers. An industry report estimates that 80% of the audience for the Call of Duty series is male, and 21% is aged 10-14. Going by the 18 rating on the last three entries, that means at least a fifth of the game’s vast audience shouldn’t be playing. © 2016 Guardian News and Media Limited
By Diana Kwon All of us have snapped at some point. A stranger cuts in line or a distracted driver nearly hits us and we lose our cool in a sudden fit of rage. As mass shootings continue to make headlines, it is becoming increasingly important to understand the brain circuits that underlie these flashes of emotion. R. Douglas Fields, a neuroscientist at the University of Maryland, College Park, and the National Institutes of Health, explores this very issue in his new book Why We Snap: Understanding the Rage Circuit in Your Brain (Dutton, 2016; 408 pages). After his own experience of sudden rage Fields began studying the topic and uncovered nine specific triggers, which he summarizes using the mnemonic “LIFEMORTS”: Life or limb (defending yourself against attackers); Insult; Family (protecting loved ones); Environment (protecting your territory); Mate; Order in society (responding to social injustice); Resources (gaining and safeguarding possessions); Tribe (defending your group); Stopped (escaping restraint or imprisonment). You mention in the introduction that you were compelled to write Why We Snap after a personal “snapping” incident in Barcelona: When a pickpocket snatched your wallet, you pinned him to the ground and grabbed it back. What was it like to try to understand that moment while writing and researching this book? Being pickpocketed was the inspiration for the book, but I also had the realization that this is an enormous problem that seems to be overlooked. © 2016 Scientific American
Anxious people perceive the world differently. An anxious brain appears to process sounds in an altered way, ramping up the expectation that something bad – or good – might happen. There’s no doubt that some degree of anxiety is vital for survival. When we learn that something is dangerous, we generalise that memory to apply the same warning signal to other, similar situations to avoid getting into trouble. If you’re bitten by a large, aggressive dog, for instance, it makes sense to feel slightly anxious around similar dogs. “It’s better to be safe than sorry,” says Rony Paz at the Weizmann Institute of Science in Rehovot, Israel. The trouble begins when this process becomes exaggerated. In the dog bite example, a person who went on to become anxious around all dogs, even small ones, would be described as overgeneralising. Overgeneralisation is thought to play a role in post-traumatic stress disorder and general anxiety disorder, a condition characterised by anxiety about many situations, leaving people in a state of near-constant restlessness. A study carried out by Paz suggests that overgeneralisation is not limited to anxious thoughts and memories – for such people the same process seems to affect their perception of the world. © Copyright Reed Business Information Ltd.
By Amy Ellis Nutt Surgeons snaked the electrodes under the 65-year-old woman’s scalp. Thirty years of Parkinson’s disease had almost frozen her limbs. The wires, connected to a kind of pacemaker under the skin, were aimed at decreasing the woman’s rigidity and allowing for more fluid movement. But five seconds after the first electrical pulse was fired into her brain, something else happened. Although awake and fully alert, she seemed to plunge into sadness, bowing her head and sobbing. One of the doctors asked what was wrong. “I no longer wish to live, to see anything, to hear anything, feel anything,” she said. Was she in some kind of pain? “No, I’m fed up with life. I’ve had enough,” she replied. “Everything is useless.” The operating team turned off the current. Less than 90 seconds later, the woman was smiling and joking, even acting slightly manic. Five minutes more, and her normal mood returned. The patient had no history of depression. Yet in those few minutes after the electrical pulse was fired, the despair she expressed met nine of the 11 criteria for severe major depressive disorder in the Diagnostic and Statistical Manual of Mental Disorders. Fascinated by the anomaly, the French physicians wrote up the episode for the New England Journal of Medicine. The year was 1999, and hers was one of the first documented cases of an electrically induced, instantaneous, yet reversible depression. © 1996-2016 The Washington Post
By Christian Jarrett Most of us like to think that we’re independent-minded — we tell ourselves we like Adele’s latest album because it suits our taste, not because millions of other people bought it, or that we vote Democrat because we’re so enlightened, not because all our friends vote that way. The reality, of course, is that humans are swayed in all sorts of different ways — some of them quite subtle — by other people’s beliefs and expectations. Our preferences don’t form in a vacuum, but rather in something of a social pressure-cooker. This has been demonstrated over and over, perhaps most famously in the classic Asch conformity studies from the ‘50s. In those experiments, many participants went along with a blatantly wrong majority judgment about the lengths of different lines — simply, it seems, to fit in. (Although the finding is frequently exaggerated, the basic point about the power of social influence holds true.) But that doesn’t mean all humans are susceptible to peer pressure in the same way. You only have to look at your own friends and family to know that some people always seem to roll with the crowd, while others are much more independent-minded. What accounts for these differences? A new study in Frontiers in Human Neuroscience led by Dr. Juan Dominguez of Monash University in Melbourne, Australia, offers the first hint that part of the answer may come down to certain neural mechanisms. In short, the study suggests that people have a network in their brains that is attuned to disagreement with other people. When this network is activated, it makes us feel uncomfortable (we experience “cognitive dissonance,” to use the psychological jargon) and it’s avoiding this state that motivates us to switch our views as much as possible. It appears the network is more sensitive in some people than in others, and that this might account for varying degrees of pushover-ness. © 2016, New York Media LLC.
Greg Miller The crime was brutal. On November 4, 1989, after a night of heavy drinking, David Scott Detrich and a male coworker picked up a woman walking along the side of the road in Tucson, Arizona. After scoring some cocaine, the trio went back to her place, where, according to court documents, Detrich slit the woman’s throat and stabbed her 40 times. Later, the two men dumped her body in the desert. A jury convicted Detrich of kidnapping and first-degree murder in 1995, and a judge sentenced him to death. Detrich is still on death row today as the appeals process drags on, but in 2010, his lawyers achieved a victory of sorts. They claimed that Detrich had received “ineffective assistance of counsel” at his trial, because his original legal team had failed to present evidence of neuropsychological abnormalities and brain damage that might have swayed the court to give him a lesser sentence. A federal appeals court agreed. The ruling said, in effect, that Detrich had been denied his Constitutional right to a fair trial because his lawyers hadn’t called an expert witness to talk about his brain. That judicial opinion is just one of nearly 1,600 examined in a recent study documenting the expanding use of brain science in the criminal-justice system. The study, by Nita Farahany at Duke University, found that the number of judicial opinions that mention neuroscientific evidence more than doubled between 2005 and 2012. “There are good reasons to believe that the increase in published opinions involving neurobiology are just the tip of the iceberg,” says Owen Jones, a law professor at Vanderbilt who directs the MacArthur Foundation Research Network on Law and Neuroscience.
By Nala Rogers Treatments that zap the brain with magnets or electricity are rising in popularity, and some evidence suggests they can help lift depression. But scientists are starting to wonder whether they could be hitting the wrong place in left-handed patients. Now, two small studies suggest this could very well be the case. “This is the kind of question that’s been desperately needed for many years,” says Jim Coan, a clinical psychologist at the University of Virginia in Charlottesville who was not involved in the project. “Most researchers in this area, including myself, have selected samples that are strongly right-handed, just in order to avoid mess in the data.” Past studies have suggested that the spots targeted by both kinds of stimulation—located in the left hemisphere—are likely to process “approach” emotions such as happiness, curiosity, and anger, which drive people to reach out and engage with the world. Some studies have also hinted that the brain’s right hemisphere is more involved in so-called “avoidance” emotions such as sorrow and fear. But the studies that support this separation of emotion into the two halves of the brain have relied almost exclusively on right-handed individuals. To figure out whether something else was happening with lefties, Daniel Casasanto, a neuroscientist at the University of Chicago in Illinois, designed two studies: one to link personality to patterns of brain activity and another to measure the outcome of common brain stimulation treatments in right-handed and left-handed individuals. The brain stimulation treatments were originally designed to treat depression by boosting feelings of happiness and engagement, which motivate “approach” behaviors such as exploring the world and interacting with friends. © 2016 American Association for the Advancement of Science.
Interview by Tim Adams Professor John Cacioppo has been studying the effects and causes of loneliness for 21 years. He is the director of the University of Chicago’s Center for Cognitive and Social Neuroscience. His book Loneliness: Human Nature and the Need for Social Connection examines the pathology and public health implications of the subject. You have been studying social connection and loneliness for more than two decades. How did you come to it as a subject? It was not biographical, I don’t think. Back in the early 90s I had outlined the new field called social neuroscience, the study of the neural mechanisms within a defined social species. Social species are those that create stable bonds, which have societies and cultures. And neuroscience hadn’t really studied those things. Was it something that neuroscientists, with their emphasis on individual brains and cells, resisted? When I proposed it in 1992, I anticipated some kickback from colleagues, so in the original papers I proposed that “social neuroscience isn’t an oxymoron”, and I explained why. That was all well and good, but I quickly realised that theoretical arguments were not going to be enough on their own. I needed to have a convincing demonstration of social neuroscience. And you chose loneliness for that? Well, I was originally interested in social connections. I argued we are defined by social connections, so what happens in the brain when you absent those? I took one other step. I said that the brain is the organ for creating, monitoring, nurturing and retaining these social connections, so it didn’t matter whether you actually had these connections, what was important was whether you felt that you had them. © 2016 Guardian News and Media Limited
Laura Sanders For some adults, Zika virus is a rashy, flulike nuisance. But in a handful of people, the virus may trigger a severe neurological disease. About one in 4,000 people infected by Zika in French Polynesia in 2013 and 2014 got a rare autoimmune disease called Guillain-Barré syndrome, researchers estimate in a study published online February 29 in the Lancet. Of 42 people diagnosed with Guillain-Barré in that outbreak, all had antibodies that signaled a Zika infection. Most also had recent symptoms of the infection. In a control group of hospital patients who did not have Guillain-Barré, researchers saw signs of Zika less frequently: Just 54 out of 98 patients tested showed signs of the virus. The message from this earlier Zika outbreak is that countries in the throes of Zika today “need to be prepared to have adequate intensive care beds capacity to manage patients with Guillain-Barré syndrome,” writes study coauthor Arnaud Fontanet of the Pasteur Institute in Paris and colleagues, some of whom are from French Polynesia. The study, says public health researcher Ernesto Marques of the University of Pittsburgh, “tells us what I think a lot of people already thought: that Zika can cause Guillain-Barré syndrome.” As with Zika and the birth defect microcephaly (SN: 2/20/16, p. 16), though, more work needs to be done to definitively prove the link. Several countries currently hard-hit by Zika have reported upticks in Guillain-Barré syndrome. Colombia, for instance, usually sees about 220 cases of the syndrome a year. But in just five weeks between mid-December 2015 and late January 2016, doctors diagnosed 86 cases, the World Health Organization reports. Other Zika-affected countries, including Brazil, El Salvador and Venezuela, have also reported unusually high numbers of cases. © Society for Science & the Public 2000 - 2016. All rights reserved.
By Roberto A. Ferdman Poverty has a way of rearing its ugly head, slipping into the cracks in people's lives when they're young and then re-emerging later in life. Sometimes it happens in ways that are easily observable—what poor babies are fed, for instance, has been shown to alter what they crave as adults, creating life-long affinities for foods that might be better left uneaten. But sometimes the influences are hidden, and all the more insidious as a result. A team of researchers, led by Sarah Hill, who teaches psychology at Texas Christian University, believe they have uncovered evidence of one such lingering effect. Specifically, Hill and her colleagues found that people who grow up poor seem to have a significantly harder time regulating their food intake, even when they aren't hungry. "We found that they eat comparably high amounts regardless of their need," said Hill. The researchers, interested in exploring why obesity is more prevalent in poorer populations, devised three separate experiments, which tested how people from different socioeconomic backgrounds behaved around food. In the first, they invited 31 female participants into their lab and asked them how long it had been since they had eaten, and how hungry they were. The participants were then given snacks (cookies and pretzels), which they were free to eat or leave be, as they pleased. When they were finished, Hill and her team measured the number of calories each consumed. The discrepancy between how the participants ate was alarming.
By Ariana Eunjung Cha The Centers for Disease Control and Prevention just published their first national survey of sleep for all 50 states and the District of Columbia. In many respects, it's consistent with our image of ourselves as bleary-eyed insomniacs downing triple espresso shots and melatonin pills as we stare at our iPhones like zombies. The CDC found that more than a third of American adults are not getting the recommended amount of seven-plus hours of sleep on a regular basis. Here's a look at what sleep looks like across the United States, as broken down by marital status, geography, race/ethnicity and employment. The results aren't always what you might expect. 1. First, here's a breakdown of how much sleep Americans are getting overall. This is based on a random telephone survey of 444,306 respondents. Overall, about 65 percent reported a "healthy sleep duration" (seven or more hours of sleep on a regular basis) and about 35 percent reported they were getting less than that. 2. Being unable to work or being unemployed appears to affect sleep in a negative way. That's consistent with previous research on sleep quality and mental health issues like depression that can be related to unemployment. 3. People with college degrees or higher were more likely to get enough sleep. Maybe it's because they are more likely to know how important good sleep is to your health or maybe because they have jobs or income that allow them to get more sleep?
Alison Abbott More than 50 years after a controversial psychologist shocked the world with studies that revealed people’s willingness to harm others on command, a team of cognitive scientists has carried out an updated version of the iconic ‘Milgram experiments’. Their findings may offer some explanation for Stanley Milgram's uncomfortable revelations: when following commands, they say, people genuinely feel less responsibility for their actions — whether they are told to do something evil or benign. “If others can replicate this, then it is giving us a big message,” says neuroethicist Walter Sinnott-Armstrong of Duke University in Durham, North Carolina, who was not involved in the work. “It may be the beginning of an insight into why people can harm others if coerced: they don’t see it as their own action.” The study may feed into a long-running legal debate about the balance of personal responsibility between someone acting under instruction and their instructor, says Patrick Haggard, a cognitive neuroscientist at University College London, who led the work, published on 18 February in Current Biology. Milgram’s original experiments were motivated by the trial of Nazi Adolf Eichmann, who famously argued that he was ‘just following orders’ when he sent Jews to their deaths. The new findings don’t legitimize harmful actions, Haggard emphasizes, but they do suggest that the ‘only obeying orders’ excuse betrays a deeper truth about how a person feels when acting under command. © 2016 Nature Publishing Group
Leo Benedictus It seems so obvious when you hear it, yet it could have shaped society for centuries without our knowing. According to research presented by Dr Daniel Casasanto to the American Association for the Advancement of Science annual conference in Washington DC, people just prefer things that are in front of their favourite hand. It could be products on a shelf, or applicants for a job. “Righties would on average choose the person or product on the right; lefties, on average, the person or product on the left,” Dr Casasanto explained. And, from his research conducted at the University of Chicago, it is easy to see how this could have serious political implications. “We found in a large simulated election, that compared to lefties, righties will choose the candidate they see on the right of the ballot paper about 15% more,” Dr Casasanto said. His theory, in simple terms, is that because people go through life with a “fluent side” and a “clumsy side”, they develop a kind of unconscious favouritism, even for things that don’t require them to use their hands. “It seems blindingly obvious that you will have a preference for that bit of space where you operate more frequently,” says Professor Philip Corr, a psychologist at City University, London. “You’ll feel more comfortable operating in that part of the world. Intuitively it makes sense to me.” Many papers have been published on the subject, but we still don’t really know why people don’t all use the same hand, or an even balance of the two, as most primates do.
By Gretchen Reynolds The benefits of mindfulness meditation, increasingly popular in recent years, are supposed to be many: reduced stress and risk for various diseases, improved well-being, a rewired brain. But the experimental bases to support these claims have been few. Supporters of the practice have relied on very small samples of unrepresentative subjects, like isolated Buddhist monks who spend hours meditating every day, or on studies that generally were not randomized and did not include placebo control groups. This month, however, a study published in Biological Psychiatry brings scientific thoroughness to mindfulness meditation and for the first time shows that, unlike a placebo, it can change the brains of ordinary people and potentially improve their health. To meditate mindfully demands ‘‘an open and receptive, nonjudgmental awareness of your present-moment experience,’’ says J. David Creswell, who led the study and is an associate professor of psychology and the director of the Health and Human Performance Laboratory at Carnegie Mellon University. One difficulty of investigating meditation has been the placebo problem. In rigorous studies, some participants receive treatment while others get a placebo: They believe they are getting the same treatment when they are not. But people can usually tell if they are meditating. Dr. Creswell, working with scientists from a number of other universities, managed to fake mindfulness. First they recruited 35 unemployed men and women who were seeking work and experiencing considerable stress. Blood was drawn and brain scans were given. Half the subjects were then taught formal mindfulness meditation at a residential retreat center; the rest completed a kind of sham mindfulness meditation that was focused on relaxation and distracting oneself from worries and stress. © 2016 The New York Times Company