Chapter 15. Emotions, Aggression, and Stress
Nicola Davis People with a larger circle of friends are better able to tolerate pain, according to research into the pain thresholds and social networks of volunteers. The link is thought to be down to a system in the brain that involves endorphins: potent pain-killing chemicals produced by the body that also trigger a sense of wellbeing. “At an equivalent dose, endorphins have been shown to be stronger than morphine,” said Katerina Johnson, a doctoral student at the University of Oxford, who co-authored the research. Writing in the journal Scientific Reports, Johnson and Robin Dunbar, professor of evolutionary psychology at the University of Oxford, sought to probe the theory that the brain’s endorphin system might have evolved not only to handle our response to physical discomfort, but also to influence our experience of pleasure from social interactions. “Social behaviour and being attached to other individuals is really important for our survival - whether that is staying close to our parents, or our offspring or cooperating with others to find food or to help defend ourselves,” said Johnson. To test the link, the authors examined both the social networks and pain thresholds of 101 adults aged between 18 and 34. Each participant was asked to complete a questionnaire designed to quiz them on friends they contacted once a week and those they got in touch with once a month. The personality of each participant was probed, looking at traits such as “agreeableness”; they were also asked to rate their fitness and stress levels. © 2016 Guardian News and Media Limited
Keyword: Pain & Touch
Link ID: 22156 - Posted: 04.28.2016
People who've recovered from depression stave off relapses with mindfulness therapy as well as with antidepressants, a new review finds. Mindfulness-based cognitive therapy (MBCT) is an eight-week group program that helps people become better observers of their own thoughts and emotions and learn to distance themselves before ruminations spiral downwards. An international team of psychiatry researchers combined data from nine randomized trials, totalling 1,258 patients with recurrent depression, to compare the mindfulness therapy to placebo, treatment as usual and other active treatments including antidepressants. People suffering from depression who received the mindfulness therapy were 31 per cent less likely to suffer a relapse during the next 60 weeks compared with those who did not receive it, Willem Kuyken of the University of Oxford, in England, and his co-authors reported in a meta-analysis review in Wednesday's issue of the journal JAMA Psychiatry. "If you compare MBCT against antidepressant medication it basically holds its own, which means it provides protection on par with what people would get from continuing to take medications for one, two or three years after they've recovered from depression," said co-author Dr. Zindel Segal, a professor of psychology at the University of Toronto Scarborough. No one reported side-effects associated with participating in the therapy. ©2016 CBC/Radio-Canada.
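The “31 per cent less likely to relapse” figure above is a relative risk reduction, the standard way such trial results are summarized. A minimal sketch of the arithmetic, using hypothetical counts chosen only to illustrate the calculation (not the actual trial data):

```python
def relative_risk(events_treated, n_treated, events_control, n_control):
    # Risk (proportion relapsing) in the treated group
    # divided by the risk in the control group.
    return (events_treated / n_treated) / (events_control / n_control)

# Hypothetical: 69 of 200 MBCT patients relapse vs 100 of 200 controls.
rr = relative_risk(69, 200, 100, 200)
print(round(1 - rr, 2))  # 0.31, i.e. a 31% relative risk reduction
```

The same counts can of course be read the other way: the absolute risk reduction here would be 0.5 − 0.345 = 15.5 percentage points, a smaller-sounding number describing the same hypothetical result.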
by Bethany Brookshire Interviewing for a new job is filled with uncertainty, and that uncertainty fuels stress. There’s the uncertainty associated with preparing for the interview — what questions will they ask me? What should I put in my portfolio? And then there’s the ambiguity when you’re left to stew. Did I get the job? Or did someone else? Scientists have recently shown that these two types of uncertainty — the kind we can prepare for, and the kind we’re just stuck with — are not created equal. The uncertainty we can’t do anything about is more stressful than the kind we can prepare for. The results help show exactly what in our lives freaks us out — and why. But the findings also show a positive side to the stress we feel when not knowing what’s ahead — the more closely our stress levels reflect the real ambiguity in the world, the better we perform in it. “There is a bias in the public perception” against stress, says Claus Lamm, a cognitive neuroscientist at the University of Vienna in Austria. But stress “prepares us to deal with environmental challenges,” he notes, readying us to fight or flee and keeping us attentive to our surroundings. For decades, scientists have been trying to figure out just what makes us stressed and why. It turns out that unpredictability is a great stressor. Studies in the 1960s and 1970s showed that rats and humans who can’t predict a negative event (such as a small shock) end up more frazzled than those who can predict when a zap is coming. In a 2006 study, people zapped with unpredictable electric shocks to the hand rated the pain as more unpleasant than when they knew what to expect. © Society for Science & the Public 2000 - 2016.
Link ID: 22151 - Posted: 04.27.2016
By Leonard Sax, M.D., Ph.D. Why is it that girls tend to be more anxious than boys? It may start with how they feel about how they look. Some research has shown that in adolescence, girls tend to become more dissatisfied with their bodies, whereas boys tend to become more satisfied with their bodies. Another factor has to do with differences in how girls and boys use social media. A girl is much more likely than a boy to post a photo of herself wearing a swimsuit, while the boy is more likely to post a photo where the emphasis is on something he has done rather than on how he looks. If you don’t like Jake’s selfie showing off his big trophy, he may not care. But if you don’t like Sonya’s photo of herself wearing her bikini, she’s more likely to take it personally. Imagine another girl sitting in her bedroom, alone. She’s scrolling through other girls’ Instagram and Snapchat feeds. She sees Sonya showing off her new bikini; Sonya looks awesome. She sees Madison at a party, having a blast. She sees Vanessa with her adorable new puppy. And she thinks: I’m just sitting here in my bedroom, not doing anything. My life sucks. Boys are at lower risk for the toxic effects of social media than girls are, for at least three reasons. First, boys are less likely to be heavily invested in what you think of their selfies. “Does this swimsuit make me look fat?” is a question asked by girls more often than by boys. Second, boys tend to overestimate how interesting their own life is. Third, the average boy is likely to spend more time playing video games than Photoshopping his selfie for Instagram. And in video games, unlike social media, everybody truly can be a winner, eventually. If you play Grand Theft Auto or Call of Duty long enough, you will, sooner or later, complete all the missions, if you just keep at it. © 2016 The New York Times Company
By SABRINA TAVERNISE WASHINGTON — Suicide in the United States has surged to the highest levels in nearly 30 years, a federal data analysis has found, with increases in every age group except older adults. The rise was particularly steep for women. It was also substantial among middle-aged Americans, sending a signal of deep anguish from a group whose suicide rates had been stable or falling since the 1950s. The suicide rate for middle-aged women, ages 45 to 64, jumped by 63 percent over the period of the study, while it rose by 43 percent for men in that age range, the sharpest increase for males of any age. The overall suicide rate rose by 24 percent from 1999 to 2014, according to the National Center for Health Statistics, which released the study on Friday. The increases were so widespread that they lifted the nation’s suicide rate to 13 per 100,000 people, the highest since 1986. The rate rose by 2 percent a year starting in 2006, double the annual rise in the earlier period of the study. In all, 42,773 people died from suicide in 2014, compared with 29,199 in 1999. From 1999 to 2014, suicide rates in the United States rose among most age groups. Men and women from 45 to 64 had a sharp increase. Rates fell among those age 75 and older. “It’s really stunning to see such a large increase in suicide rates affecting virtually every age group,” said Katherine Hempstead, senior adviser for health care at the Robert Wood Johnson Foundation, who has identified a link between suicides in middle age and rising rates of distress about jobs and personal finances. Researchers also found an alarming increase among girls 10 to 14, whose suicide rate, while still very low, had tripled. The number of girls who killed themselves rose to 150 in 2014 from 50 in 1999. “This one certainly jumped out,” said Sally Curtin, a statistician at the center and an author of the report. © 2016 The New York Times Company
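The count and rate figures quoted above can be reconciled with a line of arithmetic: the raw death count rose faster than the rate because the US population also grew over the period. A quick sketch using only the numbers in the article:

```python
def pct_change(old, new):
    """Percentage change from old to new."""
    return (new - old) / old * 100

deaths_1999, deaths_2014 = 29_199, 42_773
count_rise = pct_change(deaths_1999, deaths_2014)
print(round(count_rise, 1))  # 46.5: raw deaths rose about 46%
# while the rate per 100,000 rose only 24%; the gap reflects population growth.
```

The same helper applied to the quoted rates (a 63% rise for middle-aged women versus 43% for men in that age range) is why the article singles out that group as the steepest increase.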
Anna Nowogrodzki There’s a little too much wishful thinking about mindfulness, and it is skewing how researchers report their studies of the technique. Researchers at McGill University in Montreal, Canada, analysed 124 published trials of mindfulness as a mental-health treatment, and found that scientists reported positive findings 60% more often than is statistically likely. The team also examined another 21 trials that were registered with databases such as ClinicalTrials.gov; of these, 62% were unpublished 30 months after they finished. The findings — reported in PLoS ONE on 8 April — hint that negative results are going unpublished. Mindfulness is the practice of being aware of thoughts and feelings without judging them good or bad. Mental-health treatments that focus on this method include mindfulness-based stress reduction — an 8-week group-based programme that includes yoga and daily meditation — and mindfulness-based cognitive therapy. A bias toward publishing studies that find the technique to be effective withholds important information from mental-health clinicians and patients, says Christopher Ferguson, a psychologist at Stetson University in Florida, who was not involved in the study. “I think this is a very important finding,” he adds. “We’ll invest a lot of social and financial capital in these issues, and a lot of that can be misplaced unless we have good data.” © 2016 Nature Publishing Group
Link ID: 22129 - Posted: 04.23.2016
By Esther Landhuis Peer inside the brain of someone with Alzheimer’s disease, and you’ll see some striking features: shriveled nerve cells and strange protein clumps. According to a leading theory, proteins called amyloid beta and tau build up in the brain and choke nerve cell communication, setting the disease in motion years before people suspect anything is wrong with their recall. Yet the Alzheimer’s brain has another curious aspect. Some of the clusters of toxic amyloid proteins are entangled with octopus-like immune cells called microglia, cells that live in the brain to clear unwanted clutter. By munching on amyloid plaques, microglia are thought to help keep the disease at bay. But these housekeeping cells have an additional role—they switch on inflammatory pathways. Inflammation is critically important when the immune system encounters infection or needs to repair tissue. If left unchecked, however, the inflammatory process churns out toxic substances that can kill surrounding cells, whose death triggers more inflammation and creates a vicious cycle. For years scientists have probed how neuroinflammation contributes to Alzheimer’s disease and other neurodegenerative ailments. Researchers face a number of immediate questions: Is neuroinflammation a driving force? Does it kick in when the disease is already underway and worsen the process? Could it be harnessed for good in the early stages? Those questions are far from settled, but research is starting to reveal a clearer picture. “It may not be the amyloid plaques themselves that directly damage neurons and the connections between them. Rather, it may be the immune reaction to the plaques that does the damage,” says Cynthia Lemere, a neuroscientist at Brigham and Women’s Hospital. Still, it is hard to say if microglia are good guys or bad, making it challenging to create therapeutics that target these cells. © 2016 Scientific American
By Mitch Leslie The worst part of being sick isn’t always the muscle aches and coughing. It’s the foggy head, the crankiness, the apathy, and the fatigue—in short, what researchers call sickness behavior. A new study uncovers a molecular mechanism that explains why we feel so crummy when we’re under the weather. “It’s a nice study that’s covered a lot of ground,” says neuroimmunologist Colm Cunningham of Trinity College in Dublin who wasn’t connected to the research. “What they’ve found is very plausible.” Although sickness behavior is unpleasant, researchers think the symptoms we suffer during a viral or bacterial infection are beneficial, enabling us to divert our energy to fighting the pathogens that have invaded our bodies. For cancer patients and people with autoimmune diseases, however, sickness behavior can be an unwanted side effect of treatment with immune molecules known as interferons, which our cells naturally release when we have an infection. The condition has posed a puzzle for researchers because they assumed the blood-brain barrier, a protective system that excludes most pathogens and immune molecules from the brain, would block signals from the immune system. Although scientists have identified several mechanisms that allow such messages to cross the barrier and influence behavior, the question of how the immune system and brain communicate “has been only partially answered,” says immunophysiologist Keith Kelley of the University of Illinois, Urbana-Champaign, who wasn’t connected to the new study. © 2016 American Association for the Advancement of Science.
Link ID: 22121 - Posted: 04.20.2016
By FRANS de WAAL TICKLING a juvenile chimpanzee is a lot like tickling a child. The ape has the same sensitive spots: under the armpits, on the side, in the belly. He opens his mouth wide, lips relaxed, panting audibly in the same “huh-huh-huh” rhythm of inhalation and exhalation as human laughter. The similarity makes it hard not to giggle yourself. The ape also shows the same ambivalence as a child. He pushes your tickling fingers away and tries to escape, but as soon as you stop he comes back for more, putting his belly right in front of you. At this point, you need only to point to a tickling spot, not even touching it, and he will throw another fit of laughter. Laughter? Now wait a minute! A real scientist should avoid any and all anthropomorphism, which is why hard-nosed colleagues often ask us to change our terminology. Why not call the ape’s reaction something neutral, like, say, vocalized panting? That way we avoid confusion between the human and the animal. The term anthropomorphism, which means “human form,” comes from the Greek philosopher Xenophanes, who protested in the fifth century B.C. against Homer’s poetry because it described the gods as though they looked human. Xenophanes mocked this assumption, reportedly saying that if horses had hands they would “draw their gods like horses.” Nowadays the term has a broader meaning. It is typically used to censure the attribution of humanlike traits and experiences to other species. Animals don’t have “sex,” but engage in breeding behavior. They don’t have “friends,” but favorite affiliation partners. Given how partial our species is to intellectual distinctions, we apply such linguistic castrations even more vigorously in the cognitive domain. By explaining the smartness of animals either as a product of instinct or simple learning, we have kept human cognition on its pedestal under the guise of being scientific. Everything boiled down to genes and reinforcement. 
To think otherwise opened you up to ridicule, which is what happened to Wolfgang Köhler, the German psychologist who, a century ago, was the first to demonstrate flashes of insight in chimpanzees. © 2016 The New York Times Company
By JOANNA KLEIN Misconception: Migraines are psychological manifestations of women’s inability to manage stress and emotions Actually: Neurologists are very clear that migraines are a real, debilitating medical condition related to temporary abnormal brain activity. The fact that they may be more common for some women during “that time of the month” has nothing to do with emotions. For centuries, doctors explained migraines as a woman’s problem caused by emotional disturbances like hysteria, depression or stress. “Bizarrely, the recommended cure was marriage!” said Dr. Anne MacGregor, the lead author of the British Association for the Study of Headache’s guidelines for diagnosing and managing migraines. While that prescription may be far behind us, the misconception that migraines are fueled by a woman’s inability to cope persists. “It was considered psychological, or that I was a nervous overachiever, so I would never tell people that I have them,” said Lorie Novak, an artist in her sixties who has suffered from migraines since she was 8. After reading Joan Didion’s 1968 essay “In Bed,” about the writer’s struggle with migraines, Ms. Novak decided to tackle the representation of these debilitating headaches. Starting in 2009, Ms. Novak photographed herself every time she got a migraine. Under the hashtag #notjustaheadache, hundreds of others on Twitter and Instagram have demonstrated their own frustration with a widespread lack of understanding of the reality of migraines. © 2016 The New York Times Company
Philip Ball James Frazer’s classic anthropological study The Golden Bough contains a harrowing chapter on human sacrifice in rituals of crop fertility and harvest among historical cultures around the world. Frazer describes sacrificial victims being crushed under huge toppling stones, slow-roasted over fires and dismembered alive. Frazer’s methods of analysis wouldn’t all pass muster among anthropologists today (his work was first published in 1890), but it is hard not to conclude from his descriptions that what industrialized societies today would regard as the most extreme psychopathy has in the past been seen as normal — and indeed sacred — behaviour. In almost all societies, killing within a tribe or clan has been strongly taboo; exemption is granted only to those with great authority. Anthropologists have suspected that ritual human sacrifice serves to cement power structures — that is, it signifies who sits at the top of the social hierarchy. The idea makes intuitive sense, but until now there has been no clear evidence to support it. In a study published in Nature, Joseph Watts, a specialist in cultural evolution at the University of Auckland in New Zealand, and his colleagues have analysed 93 traditional cultures in Austronesia (the region that loosely embraces the many small and island states in the Pacific and Indonesia) as they were before they were influenced by colonization and major world religions (generally in the late 19th and early 20th centuries). © 2016 Nature Publishing Group
By Emily Underwood More than 99% of clinical trials for Alzheimer’s drugs have failed, leading many to wonder whether pharmaceutical companies have gone after the wrong targets. Now, research in mice points to a potential new target: a developmental process gone awry, which causes some immune cells to feast on the connections between neurons. “It is beautiful new work,” which “brings into light what’s happening in the early stage of the disease,” says Jonathan Kipnis, a neuroscientist at the University of Virginia School of Medicine in Charlottesville. Most new Alzheimer’s drugs aim to eliminate β amyloid, a protein that forms telltale sticky plaques around neurons in people with the disease. Those with Alzheimer’s tend to have more of these deposits in their brains than do healthy people, yet more plaques don’t always mean more severe symptoms such as memory loss or poor attention, says Beth Stevens of Boston Children’s Hospital, who led the new work. What does track well with the cognitive decline seen in Alzheimer’s disease—at least in mice that carry genes that confer high risk for the condition in people—is a marked loss of synapses, particularly in brain regions key to memory, Stevens says. These junctions between nerve cells are where neurotransmitters are released to spark the brain’s electrical activity. Stevens has spent much of her career studying a normal immune mechanism that prunes weak or unnecessary synapses as the brain matures from the womb through adolescence, allowing more important connections to become stronger. In this process, a protein called C1q sets off a series of chemical reactions that ultimately mark a synapse for destruction. After a synapse has been “tagged,” immune cells called microglia—the brain’s trash disposal service—know to “eat” it, Stevens says. © 2016 American Association for the Advancement of Science
Noah Smith How do human beings behave in response to risk? That is one of the most fundamental unanswered questions of our time. A general theory of decision-making amid uncertainty would be the kind of scientific advance that comes only a few times a century. Risk is central to financial and insurance markets. It affects the consumption, saving and business investment that moves the global economy. Understanding human behavior in the face of risk would let us reduce accidents, retire more comfortably, get cheaper health insurance and maybe even avoid recessions. A number of our smartest scientists have tried to develop a general theory of risk behavior. John von Neumann, the pioneering mathematician and physicist, took a crack at it back in 1944, when he developed the theory of expected utility along with Oskar Morgenstern. According to this simple theory, people value a possible outcome by multiplying the probability that it happens by how much they would like it to happen. This beautiful idea underlies much of modern economic theory, but unfortunately it doesn’t work well in most situations. Alternative theories have been developed for specific applications. The psychologist Daniel Kahneman won a Nobel Prize for the creation of prospect theory, which says — among other things — that people measure outcomes relative to a reference point. That theory does a great job of explaining the behavior of subjects in certain lab experiments, and can help account for the actions of certain inexperienced consumers. But it is very difficult to apply generally, because the reference points are hard to predict in advance and may shift in unpredictable ways.
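The expected-utility rule described above is simple to state in code: each possible outcome's value is weighted by its probability and the products are summed. A minimal sketch, with illustrative numbers that are not from the article:

```python
def expected_utility(outcomes):
    # outcomes: list of (probability, utility) pairs; probabilities sum to 1.
    return sum(p * u for p, u in outcomes)

# A hypothetical gamble: a 50% chance of winning 100 and a 50% chance
# of nothing, versus a sure payoff of 40.
gamble = [(0.5, 100.0), (0.5, 0.0)]
sure_thing = [(1.0, 40.0)]
print(expected_utility(gamble))      # 50.0
print(expected_utility(sure_thing))  # 40.0
```

On these numbers the gamble has the higher expected value (50 versus 40), yet many people still prefer the sure 40, which is exactly the kind of behavior that motivated alternatives such as prospect theory.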
By Nicholas Bakalar Stress in childhood may be linked to hardening of the arteries in adulthood, new research suggests. Finnish researchers studied 311 children 12 to 18 years old, scoring their levels of stress according to a variety of components, including the family’s economic circumstances, the emotional environment in the home, whether parents engaged in healthy behaviors, stressful events (such as divorce, moves or death of a family member) and parental concerns about the child’s social adjustment. Using these criteria, they calculated a stress score. When the members of the group were 40 to 46 years old, the researchers used computed tomography to measure coronary artery calcification, a marker of atherosclerosis and a risk factor for cardiovascular disease. The study, in JAMA Pediatrics, controlled for sex, cholesterol, body mass index and other factors, but still found that the higher the childhood stress score, the greater the risk for coronary artery calcification. The study is observational, and the data is based largely on parental reports, which can be biased. Still, its long follow-up time and careful control of other variables give it considerable strength. There are plausible mechanisms for the connection, including stress-induced increases in inflammation, which in animal models have been linked to a variety of ailments. “I think that economic conditions are important here,” said the lead author, Dr. Markus Juonala, a professor of internal medicine at the University of Turku in Finland. “Public health interventions should focus on how to intervene in better ways with people with higher stress and lower socioeconomic status.” © 2016 The New York Times Company
By Ariana Eunjung Cha LAS VEGAS — Jamie Tyler was stressed. He had just endured a half-hour slog through airport security and needed some relief. Many travelers in this situation might have headed for the nearest bar or popped an aspirin. But Tyler grabbed a triangular piece of gadgetry from his bag and held it to his forehead. As he closed his eyes, the device zapped him with low-voltage electrical currents. Within minutes, Tyler said, he was feeling serene enough to face the crowds once again. This is no science fiction. The Harvard-trained neurobiologist was taking advantage of one of his own inventions, a device called Thync, which promises to help users activate their body's “natural state of energy or calm” — for a retail price of a mere $199. Americans’ obsession with wellness is fueling a new category of consumer electronics, one that goes far beyond the ubiquitous Fitbits and UP activity wristbands that only passively monitor users' physical activity. The latest wearable tech, to put it in the simplest terms, is about hacking your brain. These gadgets claim to be able to make you have more willpower, think more creatively and even jump higher. One day, their makers say, the technology may even succeed in delivering on the holy grail of emotions: happiness. There’s real, peer-reviewed science behind the theory driving these devices. It involves stimulating key regions of the brain — with currents or magnetic fields — to affect emotions and physical well-being.
Link ID: 22053 - Posted: 03.31.2016
Brendan Maher It took less than a minute of playing League of Legends for a homophobic slur to pop up on my screen. Actually, I hadn't even started playing. It was my first attempt to join what many agree to be the world's leading online game, and I was slow to pick a character. The messages started to pour in. “Pick one, kidd,” one nudged. Then, “Choose FA GO TT.” It was an unusual spelling, and the spaces may have been added to ease the word past the game's default vulgarity filter, but the message was clear. Online gamers have a reputation for hostility. In a largely consequence-free environment inhabited mostly by anonymous and competitive young men, the antics can be downright nasty. Players harass one another for not performing well and can cheat, sabotage games and do any number of things to intentionally ruin the experience for others — a practice that gamers refer to as griefing. Racist, sexist and homophobic language is rampant; aggressors often threaten violence or urge a player to commit suicide; and from time to time, the vitriol spills beyond the confines of the game. In the notorious 'gamergate' controversy that erupted in late 2014, several women involved in the gaming industry were subjected to a campaign of harassment, including invasions of privacy and threats of death and rape. League of Legends has 67 million players and grossed an estimated US$1.25 billion in revenue last year. But it also has a reputation for toxic in-game behaviour, which its parent company, Riot Games in Los Angeles, California, sees as an obstacle to attracting and retaining players. © 2016 Nature Publishing Group
By DAVID FRANK and JAMES GORMAN Social life is good for you, even when your friends have lice — if you’re a Japanese macaque. Whether the same is true for humans hasn’t been tested directly, at least not the way researchers in Japan conducted their experiments with networks of female macaques. Julie Duboscq, a researcher at Kyoto University’s Primate Research Institute in Japan, tracked louse infestation and grooming interactions in about 20 adult female macaques. As she, Andrew J.J. MacIntosh and their colleagues noted in describing their research in Scientific Reports, grooming is known to reduce lice, but such close physical contact can also make it easy for lice to pass from one animal to another. Dr. Duboscq is interested in the costs and benefits of social behavior. For animals that live in social groups, as macaques and people do, the benefits of social life are many, from defense against predators (for wild monkeys, and no doubt for humans at some point in their history) to emotional health and well-being (for humans, and probably monkeys, too). But there are negatives associated with sociality, like the transmission of parasites and diseases. “We don’t fully understand the costs and benefits,” Dr. Duboscq said. In this study, she and her colleagues estimated the degree of louse infestation by the number of nits picked. The more nits, they calculated, the more lice producing them. © 2016 The New York Times Company
Link ID: 22038 - Posted: 03.28.2016
Laura Sanders The 22 men took the same pill for four weeks. When interviewed, they said they felt less daily stress and their memories were sharper. The brain benefits were subtle, but the results, reported at last year’s annual meeting of the Society for Neuroscience, got attention. That’s because the pills were not a precise chemical formula synthesized by the pharmaceutical industry. The capsules were brimming with bacteria. In the ultimate PR turnaround, once-dreaded bacteria are being welcomed as health heroes. People gobble them up in probiotic yogurts, swallow pills packed with billions of bugs and recoil from hand sanitizers. Helping us nurture the microbial gardens in and on our bodies has become big business, judging by grocery store shelves. These bacteria are possibly working at more than just keeping our bodies healthy: They may be changing our minds. Recent studies have begun turning up tantalizing hints about how the bacteria living in the gut can alter the way the brain works. These findings raise a question with profound implications for mental health: Can we soothe our brains by cultivating our bacteria? By tinkering with the gut’s bacterial residents, scientists have changed the behavior of lab animals and small numbers of people. Microbial meddling has turned anxious mice bold and shy mice social. Rats inoculated with bacteria from depressed people develop signs of depression themselves. And small studies of people suggest that eating specific kinds of bacteria may change brain activity and ease anxiety. Because gut bacteria can make the very chemicals that brain cells use to communicate, the idea makes a certain amount of sense. © Society for Science & the Public 2000 - 2016
Nicola Davis If you get hot under the collar behind the wheel, it could be down to a brain parasite. According to new research, adults who have intermittent explosive disorder (IED) - a psychiatric condition in which violent outbursts of anger and cursing erupt in response to apparently trivial irritations - are more likely to have been infected with toxoplasma gondii. “The kind of triggers are usually social provocations,” said Dr Royce Lee, an author of the study from the University of Chicago. “In the workplace it could be some kind of interpersonal frustration, on the road it could be getting cut up.” A common parasite, toxoplasma gondii reproduces within cats and is spread in their faeces. It can enter humans through the food chain in raw or undercooked meat, contaminated water or unwashed vegetables that have come into contact with the parasite. It is thought that up to a third of the British population have been infected with toxoplasma gondii - a parasite that lurks in the tissues of the brain. While generally considered to be harmless, toxoplasmosis in pregnant women has been linked to miscarriages, stillbirths and congenital defects in babies, and can cause serious problems in those with weakened immune systems. While infection with the parasite in humans is often symptomless, its effects have attracted much attention - studies in humans have suggested that infection could be linked to schizophrenia and even increase the likelihood of road traffic accidents, while research in rats has found that infection with the parasite can remove their fear of cats. © 2016 Guardian News and Media Limited
Link ID: 22026 - Posted: 03.24.2016
By PAM BELLUCK When people make risky decisions, like doubling down in blackjack or investing in volatile stocks, what happens in the brain? Scientists have long tried to understand what makes some people risk-averse and others risk-taking. Answers could have implications for how to treat, curb or prevent destructively risky behavior, like pathological gambling or drug addiction. Now, a study by Dr. Karl Deisseroth, a prominent Stanford neuroscientist and psychiatrist, and his colleagues gives some clues. The study, published Wednesday in the journal Nature, reports that a specific type of neuron, or nerve cell, in a certain brain region helps determine whether or not a risky choice is made. The study was conducted in rats, but experts said it built on research suggesting the findings could be similar in humans. If so, they said, it could inform approaches to addiction, which involves some of the same neurons and brain areas, as well as treatments for Parkinson’s disease, because one class of Parkinson’s medications turns some patients into problem gamblers. In a series of experiments led by Kelly Zalocusky, a doctoral student, researchers found that a risk-averse rat made decisions based on whether its previous choice involved a loss (in this case, of food). Rats whose previous decision netted them less food were prompted to behave conservatively next time by signals from certain receptors in a brain region called the nucleus accumbens, the scientists discovered. These receptors, which are proteins attached to neurons, are part of the dopamine system, a neurochemical important to emotion, movement and thinking. In risk-taking rats, however, those receptors sent a much fainter signal, so the rats kept making high-stakes choices even if they lost out. But by employing optogenetics, a technique that uses light to manipulate neurons, the scientists stimulated brain cells with those receptors, heightening the “loss” signal and turning risky rats into safer rats.
© 2016 The New York Times Company