Chapter 11. Emotions, Aggression, and Stress
By JAMES GORMAN Any dog owner would testify that dogs are just as prone to jealousy as humans. But can one really compare Othello’s agony to Roscoe’s pique? The answer, according to Christine Harris, a psychologist at the University of California, San Diego, is that if you are petting another dog, Roscoe is going to show something that Dr. Harris thinks is a form of jealousy, even if not as complex and twisted as the adult human form. Other scientists agree there is something going on, but not all are convinced it is jealousy. And Roscoe and the rest of his tribe were, without exception, unavailable for comment. Dr. Harris had been studying human jealousy for years when she took this question on, inspired partly by the antics of her parents’ Border collies. When she petted them, “one would take his head and knock the other’s head away,” she said. It certainly looked like jealousy. But having studied humans, she was aware of different schools of thought about jealousy. Some scientists argue that jealousy requires complex thinking about self and others, which seems beyond dogs’ abilities. Others think that although our descriptions of jealousy are complex, the emotion itself may not be that complex. Dog emotions, as owners perceive them, have been studied before. In one case, Alexandra Horowitz, a cognitive scientist who is an adjunct associate professor at Barnard College and the author of “Inside of a Dog,” found that the so-called guilty look that dogs exhibit seemed to be more related to fear of punishment. Dr. Harris ventured into the tricky turf of dog emotion by devising a test based on work done with infants. © 2014 The New York Times Company
by Bethany Brookshire Even when we love our jobs, we all look forward to some time away. During the week, as stress builds up and deadlines accumulate, Friday looks better and better. Then, with a sigh of relief, the weekend arrives. But come Monday, it seems like the whole weight of responsibility just comes crashing down again. It’s not just you. Rats feel it, too. Rats given a two-day break from a stressful procedure show more signs of strain on “Monday” than rats who never got the weekend, researchers report July 11 in PLOS ONE. The results show that in some cases, an unpredictable getaway can cause more stress than just working through the pressure. Wei Zhang, J. Amiel Rosenkranz and colleagues at the Rosalind Franklin University of Medicine and Science in Chicago wanted to understand how changes to a stressful situation alter an animal’s response to stress. Normally, when rats are exposed over and over to a stress such as a restraint (in which a rat is placed in a small tube where it can’t turn around or get out), they begin to get used to the stress. Over a few days, rats stop avoiding the tube and stay calmly in the restraint without struggling, until they are set free. Hormones like corticosterone — which spikes in response to stress — go down. This phenomenon is called habituation. Zhang and colleagues wanted to see what happens when this pattern of stress is interrupted. They restrained rats for 20 minutes each day for five days. By day five, the animals were hanging out comfortably in the tubes. Then, the scientists introduced an interruption: They gave half of the rats two days off, a science-induced weekend. The scientists continued to restrain the other group of rats daily. © Society for Science & the Public 2000 - 2013
Link ID: 19872 - Posted: 07.23.2014
Sarah C. P. Williams The wheezing, coughing, and gasping for breath that come with a sudden asthma attack aren’t just the fault of an overactive immune system. A particularly sensitive bundle of neurons stretching from the brain to the lungs might be to blame as well, researchers have found. Drugs that alter these neurons could provide a new way to treat some types of asthma. “This is an exciting confirmation of an idea that’s been around for decades,” says Allison Fryer, a pulmonary pharmacology researcher at Oregon Health & Science University in Portland, who was not involved in the new study. An asthma attack can be brought on by a variety of triggers, including exercise, cold temperatures, pollen, and dust. During an attack, a person’s airways become inflamed, mucus clogs their lungs, and the muscles surrounding their airways tighten. Asthma is often considered a disease of the immune system because immune cells go into overdrive when they sense a trigger and cause inflammation. But a bundle of nerves that snakes through the neck and chest, the vagus nerve, has long been suspected to play a role; the cells it contains, after all, control the airway muscles. Studying which cell types and molecular pathways within the thick nerve bundle are involved, though, has been tough—the vagus contains a multitude of different cells that are physically intertwined. Working together at the Howard Hughes Medical Institute’s Janelia Farm Research Campus in Ashburn, Virginia, neurobiologists Dimitri Tränkner, now at the University of Utah in Salt Lake City, and Charles Zuker of Columbia University turned to genetics to work out the players. They selectively shut off different sets of the neurons in mice based on which genes each neuron expressed, rather than their physical location. Then, through a series of injections, they gave the animals an egg white allergy that causes asthmalike symptoms. © 2014 American Association for the Advancement of Science
Link ID: 19866 - Posted: 07.22.2014
By ANNA ALTMAN NPR conducted a study about how stressed out we are as a country, and the results, released last week, show that one in four Americans reported feeling stressed in the last month and one in two has experienced a major stressful event in the last year. Smithsonian Magazine, recommending the study, reports that this likely underestimates the actual stress load on Americans: “The survey only measures stress that people are conscious of, NPR explains, but research shows that people can suffer unaware from other forms of stress.” In short, according to Smithsonian, “stress is becoming the national psyche.” So we are barraged with new studies and ideas about stress and how it may be harming us — but many of them are contradictory. Stress can hurt your health, but stressing too much about stress is even worse for your health. Stress can make you sleep badly or it can make you fall asleep. People are most stressed out on Wednesday at 3:30 p.m. And BuzzFeed made a cute video asking whether stress can actually kill you. (“Those under significant stress can have more clogged arteries,” which “can ultimately lead to heart attack.”) Nevertheless, longstanding medical studies do show that chronic stress can lead to anxiety, depression, digestive problems, trouble sleeping, heart disease, weight gain and memory or concentration impairment. Alexandra Drane, a health care consultant, told NPR that those experiencing “toxic stress” were “2.6 times as likely to have diabetes, 2.9 times as likely to have back pain. They were 5 times as likely to be having mental health issues.” Our economy is contributing to the strain, as elevated stress levels often correlate with downturns. In the United States, the Great Recession brought a spike in stress and anxiety. The Gallup-Healthways Well-Being Index polls more than a thousand people each day and in 2008, the study’s first year, showed the definitive effects of economic hardship on stress and mental well-being.
© 2014 The New York Times Company
Link ID: 19847 - Posted: 07.17.2014
Thomas B. Edsall It’s been a key question of American politics since at least 1968: Why do so many poor, working-class and lower-middle-class whites — many of them dependent for survival on government programs — vote for Republicans? The debate over the motives of conservative low-income white voters remains unresolved, but two recent research papers suggest that the hurdles facing Democrats in carrying this segment of the electorate may prove difficult to overcome. In “Obedience to Traditional Authority: A heritable factor underlying authoritarianism, conservatism and religiousness,” published by the journal Personality and Individual Differences in 2013, three psychologists write that “authoritarianism, religiousness and conservatism,” which they call the “traditional moral values triad,” are “substantially influenced by genetic factors.” According to the authors — Steven Ludeke of Colgate, Thomas J. Bouchard of the University of Minnesota, and Wendy Johnson of the University of Edinburgh — all three traits are reflections of “a single, underlying tendency,” previously described in one word by Bouchard in a 2006 paper as “traditionalism.” Traditionalists in this sense are defined as “having strict moral standards and child-rearing practices, valuing conventional propriety and reputation, opposing rebelliousness and selfish disregard of others, and valuing religious institutions and practices.” Working along a parallel path, Amanda Friesen, a political scientist at Indiana University, and Aleksander Ksiazkiewicz, a graduate student in political science at Rice University, concluded from their study comparing identical and fraternal twins that “the correlation between religious importance and conservatism” is “driven primarily, but usually not exclusively, by genetic factors.” The substantial “genetic component in these relationships suggests that there may be a common underlying predisposition that leads individuals to adopt conservative bedrock social principles 
and political ideologies while simultaneously feeling the need for religious experiences.” © 2014 The New York Times Company
The modern idea of stress began on a rooftop in Canada, with a handful of rats freezing in the winter wind. This was 1936, and by that point the owner of the rats, an endocrinologist named Hans Selye, had become expert at making rats suffer for science. "He would subject them to extreme temperatures, make them go hungry for long periods, or make them exercise a lot," says the medical historian Mark Jackson. "Then what he would do is kill the rats and look at their organs." What was interesting to Selye was that no matter how different the tortures he devised for the rats were — from icy winds to painful injections — when he cut them open to examine their guts it appeared that the physical effects of his different tortures were always the same. "Almost universally these rats showed a particular set of signs," Jackson says. "There would be changes particularly in the adrenal gland. So Selye began to suggest that subjecting an animal to prolonged stress led to tissue changes and physiological changes with the release of certain hormones, that would then cause disease and ultimately the death of the animal." And so the idea of stress — and its potential costs to the body — was born. But here's the thing: The idea of stress wasn't born to just any parent. It was born to Selye, a scientist absolutely determined to make the concept of stress an international sensation. © 2014 NPR
Link ID: 19809 - Posted: 07.09.2014
By EMILY ANTHES It was love at first pet when Laurel Braitman and her husband adopted a 4-year-old Bernese mountain dog, a 120-pound bundle of fur named Oliver. The first few months were blissful. But over time, Oliver’s troubled mind slowly began to reveal itself. He snapped at invisible flies. He licked his tail until it was wounded and raw. He fell to pieces when he spied a suitcase. And once, while home alone, he ripped a hole in a screen and jumped out of a fourth-floor window. To everyone’s astonishment, he survived. Oliver’s anguish devastated Dr. Braitman, a historian of science, but it also awakened her curiosity and sent her on an investigation deep into the minds of animals. The result is the lovely, big-hearted book “Animal Madness,” in which Dr. Braitman makes a compelling case that nonhuman creatures can also be afflicted with mental illness and that their suffering is not so different from our own. In the 17th century, Descartes described animals as automatons, a view that held sway for centuries. Today, however, a large and growing body of research makes it clear that animals have never been unthinking machines. We now know that species from magpies to elephants can recognize themselves in the mirror, which some scientists consider a sign of self-awareness. Rats emit a form of laughter when they’re tickled. And dolphins, parrots and dogs show clear signs of distress when their companions die. Together, these and many other findings demonstrate what any devoted pet owner has probably already concluded: that animals have complex minds and rich emotional lives. Unfortunately, as Dr. Braitman notes, “every animal with a mind has the capacity to lose hold of it from time to time.” © 2014 The New York Times Company
By Chris Mooney The United States has a voting problem. In the 2012 presidential election, only about 57 percent of eligible American voters turned out, a far lower participation rate than in comparable democracies. That means about 93 million people who were eligible to vote didn't bother. Clearly, figuring out why people vote (and why they don't) is of premium importance to those who care about the health of democracy, as well as to campaigns that are becoming ever more sophisticated in targeting individual voters. To that end, much research has shown that demographic factors such as age and poverty affect one's likelihood of voting. But are there individual-level biological factors that also influence whether a person votes? The idea has long been heretical in political science, and yet the logic behind it is unavoidable. People vary in all sorts of ways—ranging from personalities to genetics—that affect their behavior. Political participation can be an emotional, and even stressful, activity, and in an era of GOP-led efforts to make voting more difficult, voting in certain locales can be a major hassle. To vote, you need to be motivated and not so intimidated that you stay away from the polls. So are there biological factors that can shape these perceptions? "Our study is unique in that it is the first to examine whether differences in physiology may be causally related to differences in political activity," says lead study author Jeffrey French. ©2014 Mother Jones
Link ID: 19790 - Posted: 07.04.2014
From David Beckham’s infamous kick at France '98 to Luis Suárez chomping Giorgio Chiellini's shoulder in Brazil last week, the history of the World Cup is littered with moments of impulsive aggression that appear to defy all rational explanation. The story of human impulsivity stretches back deep into our evolutionary past. By nature, we are all prone to making quick, rash decisions that may lead to regret, and in some cases a four-month ban from international football. Impulsivity is actually a survival mechanism and was essential in the African savanna where our species evolved around a million and a half years ago. For our ancestors, the ability to make split-second decisions could make the difference between life and death. All of us have deep primal instincts but over the several hundred million years of evolution separating our reptilian ancestors from the first mammals, and eventually primates, the cognitive ability to exercise self-restraint has increased. While most living things make this decision purely as a trade-off between risk and reward, only humans can decide to exercise self-restraint on the basis of how they think they will be perceived by others – an ability that emerged some time in the past 100,000 years or so. “We evolved to be very social animals, living in large groups, and so we have developed inhibitory mechanisms in the more recently evolved parts of the prefrontal cortex,” explains Michael Price of the School of Social Sciences at Brunel University. “This is the social centre of the brain. Our big reason not to be impulsive is because of your reputation and how other people are going to judge you and perhaps ostracise you as we saw with Beckham in the aftermath of France ’98.” © 2014 Guardian News and Media Limited
By Tanya Lewis and Live Science They say laughter is the best medicine. But what if laughter is the disease? For a 6-year-old girl in Bolivia who suffered from uncontrollable and inappropriate bouts of giggles, laughter was a symptom of a serious brain problem. But doctors initially diagnosed the child with “misbehavior.” “She was considered spoiled, crazy — even devil-possessed,” José Liders Burgos Zuleta of the Advanced Medical Image Centre in La Paz said in a statement. But Burgos Zuleta discovered that the true cause of the girl’s laughing seizures, medically called gelastic seizures, was a brain tumor. After the girl underwent a brain scan, the doctors discovered a hamartoma, a small, benign tumor that was pressing against her brain’s temporal lobe. Surgeons removed the tumor, the doctors said. She stopped having the uncontrollable attacks of laughter and now laughs only normally, they said. Gelastic seizures are a relatively rare form of epilepsy, said Solomon Moshé, a pediatric neurologist at Albert Einstein College of Medicine in New York. “It’s not necessarily ‘ha-ha-ha’ laughing,” Moshé said. “There’s no happiness in this. Some of the kids may be very scared,” he added. The seizures are most often caused by tumors in the hypothalamus, although they can also come from tumors in other parts of brain, Moshé said. Although laughter is the main symptom, patients may also have outbursts of crying.
Sarah C. P. Williams There’s a reason people say “Calm down or you’re going to have a heart attack.” Chronic stress—such as that brought on by job, money, or relationship troubles—is suspected to increase the risk of a heart attack. Now, researchers studying harried medical residents and harassed rodents have offered an explanation for how, at a physiological level, long-term stress can endanger the cardiovascular system. It revolves around immune cells that circulate in the blood, they propose. The new finding is “surprising,” says physician and atherosclerosis researcher Alan Tall of Columbia University, who was not involved in the new study. “The idea has been out there that chronic psychosocial stress is associated with increased cardiovascular disease in humans, but what’s been lacking is a mechanism,” he notes. Epidemiological studies have shown that people who face many stressors—from those who survive natural disasters to those who work long hours—are more likely to develop atherosclerosis, the accumulation of fatty plaques inside blood vessels. In addition to fats and cholesterols, the plaques contain monocytes and neutrophils, immune cells that cause inflammation in the walls of blood vessels. And when the plaques break loose from the walls where they’re lodged, they can cause more extreme blockages elsewhere—leading to a stroke or heart attack. Studying the effect of stressful intensive care unit (ICU) shifts on medical residents, biologist Matthias Nahrendorf of Harvard Medical School in Boston recently found that blood samples taken when the doctors were most stressed out had the highest levels of neutrophils and monocytes. To probe whether these white blood cells, or leukocytes, are the missing link between stress and atherosclerosis, he and his colleagues turned to experiments on mice. © 2014 American Association for the Advancement of Science
Link ID: 19761 - Posted: 06.23.2014
by Laura Sanders Some brain cells need a jolt of stress to snap to attention. Cells called astroglia help regulate blood flow, provide energy to nearby cells and even influence the movement of messages between nerve cells. Now, scientists report June 18 in Neuron that astroglia can be roused by the stress molecule norepinephrine, an awakening that may help the entire brain jump into action. As mice were forced to walk on a treadmill, an activity that makes them alert, astroglia in several parts of their brains underwent changes in calcium levels, a sign of activity, neuroscientist Dwight Bergles of Johns Hopkins University School of Medicine and colleagues found. Norepinephrine, which acts as a fight-or-flight hormone in the body and a neural messenger in the brain, seemed to cause the cell activity boost. When researchers depleted norepinephrine, treadmill walking no longer activated astroglia. It’s not clear whether astroglia in all parts of the brain heed this wake-up call, nor is it clear whether this activation influences behavior. Norepinephrine might help shift brain cells, both neurons and astroglia, into a state of heightened vigilance, the authors write. © Society for Science & the Public 2000 - 2013.
By MARIA KONNIKOVA THE absurdity of having had to ask for an extension to write this article isn’t lost on me: It is, after all, a piece on time and poverty, or, rather, time poverty — about what happens when we find ourselves working against the clock to finish something. In the case of someone who isn’t otherwise poor, poverty of time is an unpleasant inconvenience. But for someone whose lack of time is just one of many pressing concerns, the effects compound quickly. We make a mistake when we look at poverty as simply a question of financial constraint. Take what happened with my request for an extension. It was granted, and the immediate time pressure was relieved. But even though I met the new deadline (barely), I’m still struggling to dig myself out from the rest of the work that accumulated in the meantime. New deadlines that are about to whoosh by, a growing list of ignored errands, a rent check and insurance payment that I just realized I haven’t mailed. And no sign of that promised light at the end of the tunnel. My experience is the time equivalent of a high-interest loan cycle, except instead of money, I borrow time. But this kind of borrowing comes with an interest rate of its own: By focusing on one immediate deadline, I neglect not only future deadlines but the mundane tasks of daily life that would normally take up next to no time or mental energy. It’s the same type of problem poor people encounter every day, multiple times: The demands of the moment override the demands of the future, making that future harder to reach. When we think of poverty, we tend to think about money in isolation: How much does she earn? Is that above or below the poverty line? But the financial part of the equation may not be the single most important factor. 
“The biggest mistake we make about scarcity,” Sendhil Mullainathan, an economist at Harvard who is a co-author of the book “Scarcity: Why Having Too Little Means So Much,” tells me, “is we view it as a physical phenomenon. It’s not.” © 2014 The New York Times Company
By JAMES GORMAN Crazed commuters, fretful parents and overwrought executives are not the only ones to suffer from anxiety — or to benefit from medication for it. The simple crayfish has officially entered the age of anxiety, too. This presumably was already clear to crayfish, which have been around for more than 200 million years and, what with predatory fish — and more recently, étouffée — have long had reasons to worry. But now scientists from France have documented behavior in crayfish that fits the pattern of a certain type of anxiety in human beings and other animals. Although the internal life of crayfish is still unknown, the findings, reported on Thursday in the journal Science, suggest that the external hallmarks of anxiety have been around for a very long time — and far down the food chain. Beyond that, a precursor of Valium changed the behavior back to normal. That does not mean that the crayfish are ready for the therapist’s couch, but it does reinforce the sometimes surprising connections humans have with other living things. Humans share genes with yeast as well as apes, the brains of flies can yield insights into the brains of humans, and even a tiny roundworm has mating behaviors that depend on a molecule very similar to a human hormone. The response to a threat or danger that the scientists found in crayfish had been documented before in other animals, like mice, but not in invertebrates like insects and crustaceans. Researchers including Pascal Fossat and Daniel Cattaert at the University of Bordeaux reported that after crayfish were exposed to electric shocks, they would not venture out of comfortable dark areas in an elaborate aquarium into scarier (for a crayfish) brightly lit areas. © 2014 The New York Times Company
The financial crisis has been linked to a 4.5 per cent increase in Canada’s suicide rate, according to a study that estimates at least 10,000 extra suicides could be connected to economic hardship in EU countries and North America. Researchers compared suicide data from the World Health Organization before and after the onset of the recession in 2007. "A crucial question for policy and psychiatric practice is whether these suicide rises are inevitable," Aaron Reeves of Oxford University’s sociology department and his co-authors said in Wednesday’s issue of the British Journal of Psychiatry. Given that the rise in suicides exceeded what would be expected and the large variations in suicide rates across countries, the researchers suspect some of the suicides were "potentially avoidable." In Canada, suicides rose by 4.5 per cent, or about 240 suicides more than expected, between 2007 and 2010. In the U.S., the rate increased by 4.8 per cent over the same period. Before 2007 in Europe, suicide rates had been falling, but the trend reversed, rising by 6.5 per cent by 2009 and staying elevated through 2011. Two countries, Sweden and Finland, bucked the trend in the early 1990s. Job loss, home repossession and debt are the main risk factors leading to suicide during economic downturns, previous studies suggest. © CBC 2014
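The "more than expected" figures above come from comparing observed deaths with a baseline projected from pre-recession trends. A minimal sketch of that arithmetic (the numbers here are illustrative back-of-envelope values chosen to match the reported Canadian figures, not data from the study itself):

```python
def excess_deaths(observed, expected):
    """Return the excess over a trend-projected baseline and the
    percentage rise that excess represents."""
    excess = observed - expected
    pct_rise = 100.0 * excess / expected
    return excess, pct_rise

# Hypothetical illustration: if trend data projected ~5,330 suicides
# for a period and ~5,570 were observed, the excess is 240,
# a rise of roughly 4.5 per cent -- the figures reported for Canada.
excess, pct = excess_deaths(observed=5570, expected=5330)
print(excess, round(pct, 1))
```

The study's actual baselines were fitted from WHO time-series data per country, so this is only the shape of the calculation, not its inputs.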
Jane J. Lee Could've, should've, would've. Everyone has made the wrong choice at some point in life and suffered regret because of it. Now a new study shows we're not alone in our reaction to incorrect decisions. Rats too can feel regret. Regret is thinking about what you should have done, says David Redish, a neuroscientist at the University of Minnesota in Minneapolis. It differs from disappointment, which you feel when you don't get what you expected. And it affects how you make decisions in the future. (See "Hand Washing Wipes Away Regrets?") If you really want to study emotions or feelings like regret, says Redish, you can't just ask people how they feel. So when psychologists and economists study regret, they look for behavioral and neural manifestations of it. Using rats is one way to get down into the feeling's neural mechanics. Redish and colleague Adam Steiner, also at the University of Minnesota, found that rats expressed regret through both their behavior and their neural activity. Those signals, researchers report today in the journal Nature Neuroscience, were specific to situations the researchers set up to induce regret, which led to specific neural patterns in the brain and in behavior. When Redish and Steiner looked for neural activity, they focused on two areas known in people—and in some animals—to be involved in decision-making and the evaluation of expected outcomes: the orbitofrontal cortex and the ventral striatum. Brain scans have revealed that people with a damaged orbitofrontal cortex, for instance, don't express regret. To record nerve-cell activity, the researchers implanted electrodes in the brains of four rats—a typical sample size in this kind of experiment—then trained them to run a "choice" maze. © 1996-2014 National Geographic Society
By JOHN COATES SIX years after the financial meltdown there is once again talk about market bubbles. Are stocks succumbing to exuberance? Is real estate? We thought we had exorcised these demons. It is therefore with something close to despair that we ask: What is it about risk taking that so eludes our understanding, and our control? Part of the problem is that we tend to view financial risk taking as a purely intellectual activity. But this view is incomplete. Risk is more than an intellectual puzzle — it is a profoundly physical experience, and it involves your body. Risk by its very nature threatens to hurt you, so when confronted by it your body and brain, under the influence of the stress response, unite as a single functioning unit. This occurs in athletes and soldiers, and it occurs as well in traders and people investing from home. The state of your body predicts your appetite for financial risk just as it predicts an athlete’s performance. If we understand how a person’s body influences risk taking, we can learn how to better manage risk takers. We can also recognize that mistakes governments have made have contributed to excessive risk taking. Consider the most important risk manager of them all — the Federal Reserve. Over the past 20 years, the Fed has pioneered a new technique of influencing Wall Street. Where before the Fed shrouded its activities in secrecy, it now informs the street in as clear terms as possible of what it intends to do with short-term interest rates, and when. Janet L. Yellen, the chairwoman of the Fed, declared this new transparency, called forward guidance, a revolution; Ben S. Bernanke, her predecessor, claimed it reduced uncertainty and calmed the markets. But does it really calm the markets? Or has eliminating uncertainty in policy spread complacency among the financial community and actually helped inflate market bubbles? 
We get a fascinating answer to these questions if we turn from economics and look into the biology of risk taking. © 2014 The New York Times Company
By Jonathan Webb Science reporter, BBC News A new theory suggests that our male ancestors evolved beefy facial features as a defence against fist fights. The bones most commonly broken in human punch-ups also gained the most strength in early "hominin" evolution. They are also the bones that show most divergence between males and females. The paper, in the journal Biological Reviews, argues that the reinforcements evolved amid fighting over females and resources, suggesting that violence drove key evolutionary changes. For many years, this extra strength was seen as an adaptation to a tough diet including nuts, seeds and grasses. But more recent findings, examining the wear pattern and carbon isotopes in australopith teeth, have cast some doubt on this "feeding hypothesis". "In fact, [the australopith] boisei, the 'nutcracker man', was probably eating fruit," said Prof David Carrier, the new theory's lead author and an evolutionary biologist at the University of Utah. Instead of diet, Prof Carrier and his co-author, physician Dr Michael Morgan, propose that violent competition demanded the development of these facial fortifications: what they call the "protective buttressing hypothesis". In support of their proposal, Carrier and Morgan offer data from modern humans fighting. Several studies from hospital emergency wards, including one from the Bristol Royal Infirmary, show that faces are particularly vulnerable to violent injuries. BBC © 2014
Neil Levy Can human beings still be held responsible in the age of neuroscience? Some people say no: they say once we understand how the brain processes information and thereby causes behaviour, there’s nothing left over for the person to do. This argument has not impressed philosophers, who say there doesn’t need to be anything left for the person to do in order to be responsible. People are not anything over and above the causal systems involved in information processing; we are our brains (plus some other, equally physical stuff). We are responsible if our information processing systems are suitably attuned to reasons, most philosophers think. There are big philosophical debates concerning what it takes to be suitably attuned to reasons, and whether this is really enough for responsibility. But I want to set those debates aside here. It’s more interesting to ask what we can learn from neuroscience about the nature of responsibility and about when we’re responsible. Even if neuroscience doesn’t tell us that no one is ever responsible, it might be able to tell us if particular people are responsible for particular actions. A worthy case study Consider a case like this: early one morning in 1987, a Canadian man named Ken Parks got up from the sofa where he had fallen asleep and drove to his parents-in-law’s house. There he stabbed them both before driving to the police station, where he told police he thought he had killed someone. He had: his mother-in-law died from her injuries. © 2010–2014, The Conversation Trust (UK)
By Denali Tietjen Meditation has long been known for its mental health benefits, but new research shows that just a few minutes of mindfulness can improve physical health and personal life as well. A recent study conducted by researchers at INSEAD and The Wharton School found that 15 minutes of mindful meditation can help you make better decisions. The research, published in the Association for Psychological Science’s journal Psychological Science, comes from four studies (varying in sample size from 69 to 178 adults) in which participants responded to sunk-cost scenarios at different degrees of mindful awareness. The results consistently showed that increased mindfulness decreases the sunk-cost bias. WOAH, hold the phone. What’s a sunk cost and what’s a sunk-cost bias?? Sunk cost is an economics term that psychologists have adopted. In economics, sunk costs are defined as non-recoverable investment costs like the cost of employee training or a lease on office space. In psychology, sunk costs are basically the same thing: The time and energy we put into our personal lives. Though we might not sit down with a calculator at the kitchen table when deciding who to take as our plus one to our second cousin’s wedding next weekend, we do a cost-benefit analysis every time we make a decision. And we take these sunk costs into account. The sunk-cost bias, then, is the tendency to allow sunk costs to overly influence current decisions. Mindfulness meditation can provide improved clarity, which helps you stay present and make better decisions, the study says. This protects you from that manipulative sunk-cost bias.
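The sunk-cost logic described above can be made concrete with a small sketch (the function names and dollar amounts are my own illustration, not from the study): a rational chooser weighs only future costs against future benefits, while a biased chooser lets unrecoverable past spending tip the scale.

```python
def should_continue(future_benefit, future_cost, sunk_cost=0):
    """Rational rule: continue only if what remains to be gained
    exceeds what remains to be spent. sunk_cost is accepted but
    deliberately ignored, because it is unrecoverable either way."""
    return future_benefit > future_cost

def biased_should_continue(future_benefit, future_cost, sunk_cost):
    """Sunk-cost bias: past investment is counted as a reason to
    keep going ("we've put too much in to quit now")."""
    return future_benefit + sunk_cost > future_cost

# A project needs $60 more to finish and will return $50.
# Rationally you stop; the $100 already spent is gone either way.
print(should_continue(50, 60, sunk_cost=100))         # False
print(biased_should_continue(50, 60, sunk_cost=100))  # True
```

The study's claim, in these terms, is that a few minutes of mindfulness nudges people from the second rule toward the first.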
Link ID: 19693 - Posted: 06.05.2014