Chapter 11. Emotions, Aggression, and Stress
By Frans de Waal

Tickling a juvenile chimpanzee is a lot like tickling a child. The ape has the same sensitive spots: under the armpits, on the side, in the belly. He opens his mouth wide, lips relaxed, panting audibly in the same “huh-huh-huh” rhythm of inhalation and exhalation as human laughter. The similarity makes it hard not to giggle yourself. The ape also shows the same ambivalence as a child. He pushes your tickling fingers away and tries to escape, but as soon as you stop he comes back for more, putting his belly right in front of you. At this point, you need only to point to a tickling spot, not even touching it, and he will throw another fit of laughter.

Laughter? Now wait a minute! A real scientist should avoid any and all anthropomorphism, which is why hard-nosed colleagues often ask us to change our terminology. Why not call the ape’s reaction something neutral, like, say, vocalized panting? That way we avoid confusion between the human and the animal.

The term anthropomorphism, which means “human form,” comes from the Greek philosopher Xenophanes, who protested in the fifth century B.C. against Homer’s poetry because it described the gods as though they looked human. Xenophanes mocked this assumption, reportedly saying that if horses had hands they would “draw their gods like horses.” Nowadays the term has a broader meaning. It is typically used to censure the attribution of humanlike traits and experiences to other species. Animals don’t have “sex,” but engage in breeding behavior. They don’t have “friends,” but favorite affiliation partners.

Given how partial our species is to intellectual distinctions, we apply such linguistic castrations even more vigorously in the cognitive domain. By explaining the smartness of animals either as a product of instinct or simple learning, we have kept human cognition on its pedestal under the guise of being scientific. Everything boiled down to genes and reinforcement.
To think otherwise opened you up to ridicule, which is what happened to Wolfgang Köhler, the German psychologist who, a century ago, was the first to demonstrate flashes of insight in chimpanzees. © 2016 The New York Times Company
By Joanna Klein

Misconception: Migraines are psychological manifestations of women’s inability to manage stress and emotions.

Actually: Neurologists are very clear that migraines are a real, debilitating medical condition related to temporary abnormal brain activity. The fact that they may be more common for some women during “that time of the month” has nothing to do with emotions.

For centuries, doctors explained migraines as a woman’s problem caused by emotional disturbances like hysteria, depression or stress. “Bizarrely, the recommended cure was marriage!” said Dr. Anne MacGregor, the lead author of the British Association for the Study of Headache’s guidelines for diagnosing and managing migraines. While that prescription may be far behind us, the misconception that migraines are fueled by a woman’s inability to cope persists.

“It was considered psychological, or that I was a nervous overachiever, so I would never tell people that I have them,” said Lorie Novak, an artist in her sixties who has suffered from migraines since she was 8. After reading Joan Didion’s 1968 essay “In Bed,” about the writer’s struggle with migraines, Ms. Novak decided to tackle the representation of these debilitating headaches. Starting in 2009, Ms. Novak photographed herself every time she got a migraine. Under the hashtag #notjustaheadache, hundreds of others on Twitter and Instagram have demonstrated their own frustration with a widespread lack of understanding of the reality of migraines. © 2016 The New York Times Company
Philip Ball

James Frazer’s classic anthropological study The Golden Bough contains a harrowing chapter on human sacrifice in rituals of crop fertility and harvest among historical cultures around the world. Frazer describes sacrificial victims being crushed under huge toppling stones, slow-roasted over fires and dismembered alive. Frazer’s methods of analysis wouldn't all pass muster among anthropologists today (his work was first published in 1890), but it is hard not to conclude from his descriptions that what industrialized societies today would regard as the most extreme psychopathy has in the past been seen as normal — and indeed sacred — behaviour.

In almost all societies, killing within a tribe or clan has been strongly taboo; exemption is granted only to those with great authority. Anthropologists have suspected that ritual human sacrifice serves to cement power structures — that is, it signifies who sits at the top of the social hierarchy. The idea makes intuitive sense, but until now there has been no clear evidence to support it. In a study published in Nature, Joseph Watts, a specialist in cultural evolution at the University of Auckland in New Zealand, and his colleagues have analysed 93 traditional cultures in Austronesia (the region that loosely embraces the many small and island states in the Pacific and Indonesia) as they were before they were influenced by colonization and major world religions (generally in the late 19th and early 20th centuries). © 2016 Nature Publishing Group
By Emily Underwood

More than 99% of clinical trials for Alzheimer’s drugs have failed, leading many to wonder whether pharmaceutical companies have gone after the wrong targets. Now, research in mice points to a potential new target: a developmental process gone awry, which causes some immune cells to feast on the connections between neurons. “It is beautiful new work,” which “brings into light what’s happening in the early stage of the disease,” says Jonathan Kipnis, a neuroscientist at the University of Virginia School of Medicine in Charlottesville.

Most new Alzheimer’s drugs aim to eliminate β amyloid, a protein that forms telltale sticky plaques around neurons in people with the disease. Those with Alzheimer’s tend to have more of these deposits in their brains than do healthy people, yet more plaques don’t always mean more severe symptoms such as memory loss or poor attention, says Beth Stevens of Boston Children’s Hospital, who led the new work. What does track well with the cognitive decline seen in Alzheimer’s disease—at least in mice that carry genes that confer high risk for the condition in people—is a marked loss of synapses, particularly in brain regions key to memory, Stevens says. These junctions between nerve cells are where neurotransmitters are released to spark the brain’s electrical activity.

Stevens has spent much of her career studying a normal immune mechanism that prunes weak or unnecessary synapses as the brain matures from the womb through adolescence, allowing more important connections to become stronger. In this process, a protein called C1q sets off a series of chemical reactions that ultimately mark a synapse for destruction. After a synapse has been “tagged,” immune cells called microglia—the brain’s trash disposal service—know to “eat” it, Stevens says. © 2016 American Association for the Advancement of Science
Noah Smith

How do human beings behave in response to risk? That is one of the most fundamental unanswered questions of our time. A general theory of decision-making amid uncertainty would be the kind of scientific advance that comes only a few times a century. Risk is central to financial and insurance markets. It affects the consumption, saving and business investment that moves the global economy. Understanding human behavior in the face of risk would let us reduce accidents, retire more comfortably, get cheaper health insurance and maybe even avoid recessions.

A number of our smartest scientists have tried to develop a general theory of risk behavior. John von Neumann, the pioneering mathematician and physicist, took a crack at it back in 1944, when he developed the theory of expected utility along with Oskar Morgenstern. According to this simple theory, people value a possible outcome by multiplying the probability that it happens by the utility they would derive if it did. This beautiful idea underlies much of modern economic theory, but unfortunately it doesn't work well in most situations.

Alternative theories have been developed for specific applications. The psychologist Daniel Kahneman won a Nobel Prize for the creation of prospect theory, which says -- among other things -- that people measure outcomes relative to a reference point. That theory does a great job of explaining the behavior of subjects in certain lab experiments, and can help account for the actions of certain inexperienced consumers. But it is very difficult to apply generally, because the reference points are hard to predict in advance and may shift in unpredictable ways.
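The contrast between the two theories can be made concrete with a few lines of arithmetic. Below is a minimal sketch: the expected-utility calculation is just probability-weighted utility, while the prospect-theory value function uses the parameters Tversky and Kahneman estimated (diminishing sensitivity α = 0.88, loss aversion λ = 2.25); the specific gamble and function names are invented for illustration.

```python
# Sketch: expected utility vs. a prospect-theory value function,
# applied to the same gamble (a fair coin flip winning or losing $100).

def expected_utility(outcomes, utility=lambda x: x):
    """Sum of probability * utility(outcome) over all possible outcomes."""
    return sum(p * utility(x) for p, x in outcomes)

def prospect_value(x, alpha=0.88, lam=2.25):
    """Tversky-Kahneman value function: outcomes are coded as gains or
    losses relative to a reference point, and losses loom larger than
    equal-sized gains (lam > 1)."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

gamble = [(0.5, 100), (0.5, -100)]

# With linear utility, expected-utility theory calls the fair flip a wash.
eu = expected_utility(gamble)
print(eu)  # 0.0

# Prospect theory weighs the $100 loss more heavily than the $100 gain,
# so the very same flip looks unattractive.
pv = expected_utility(gamble, utility=prospect_value)
print(pv)  # negative: loss aversion makes a fair coin flip look bad
```

The catch the article describes falls out of the same sketch: `prospect_value` only makes sense once you know the reference point that separates "gain" from "loss," and that point is exactly what is hard to pin down in advance.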
By Nicholas Bakalar

Stress in childhood may be linked to hardening of the arteries in adulthood, new research suggests. Finnish researchers studied 311 children 12 to 18 years old, scoring their levels of stress according to a variety of components, including the family’s economic circumstances, the emotional environment in the home, whether parents engaged in healthy behaviors, stressful events (such as divorce, moves or death of a family member) and parental concerns about the child’s social adjustment. Using these criteria, they calculated a stress score. When the members of the group were 40 to 46 years old, the researchers used computed tomography to measure coronary artery calcification, a marker of atherosclerosis and a risk factor for cardiovascular disease.

The study, in JAMA Pediatrics, controlled for sex, cholesterol, body mass index and other factors, but still found that the higher the childhood stress score, the greater the risk for coronary artery calcification. The study is observational, and the data are based largely on parental reports, which can be biased. Still, its long follow-up time and careful control of other variables give it considerable strength. There are plausible mechanisms for the connection, including stress-induced increases in inflammation, which in animal models have been linked to a variety of ailments.

“I think that economic conditions are important here,” said the lead author, Dr. Markus Juonala, a professor of internal medicine at the University of Turku in Finland. “Public health interventions should focus on how to intervene in better ways with people with higher stress and lower socioeconomic status.” © 2016 The New York Times Company
By Ariana Eunjung Cha

LAS VEGAS — Jamie Tyler was stressed. He had just endured a half-hour slog through airport security and needed some relief. Many travelers in this situation might have headed for the nearest bar or popped an aspirin. But Tyler grabbed a triangular piece of gadgetry from his bag and held it to his forehead. As he closed his eyes, the device zapped him with low-voltage electrical currents. Within minutes, Tyler said, he was feeling serene enough to face the crowds once again. This is no science fiction. The Harvard-trained neurobiologist was taking advantage of one of his own inventions, a device called Thync, which promises to help users activate their body's “natural state of energy or calm” — for a retail price of a mere $199.

Americans’ obsession with wellness is fueling a new category of consumer electronics, one that goes far beyond the ubiquitous Fitbits and UP activity wristbands that only passively monitor users' physical activity. The latest wearable tech, to put it in the simplest terms, is about hacking your brain. These gadgets claim to be able to make you have more willpower, think more creatively and even jump higher. One day, their makers say, the technology may even succeed in delivering on the holy grail of emotions: happiness.

There’s real, peer-reviewed science behind the theory driving these devices. It involves stimulating key regions of the brain — with currents or magnetic fields — to affect emotions and physical well-being.
Brendan Maher

It took less than a minute of playing League of Legends for a homophobic slur to pop up on my screen. Actually, I hadn't even started playing. It was my first attempt to join what many agree to be the world's leading online game, and I was slow to pick a character. The messages started to pour in. “Pick one, kidd,” one nudged. Then, “Choose FA GO TT.” It was an unusual spelling, and the spaces may have been added to ease the word past the game's default vulgarity filter, but the message was clear.

Online gamers have a reputation for hostility. In a largely consequence-free environment inhabited mostly by anonymous and competitive young men, the antics can be downright nasty. Players harass one another for not performing well and can cheat, sabotage games and do any number of things to intentionally ruin the experience for others — a practice that gamers refer to as griefing. Racist, sexist and homophobic language is rampant; aggressors often threaten violence or urge a player to commit suicide; and from time to time, the vitriol spills beyond the confines of the game. In the notorious 'gamergate' controversy that erupted in late 2014, several women involved in the gaming industry were subjected to a campaign of harassment, including invasions of privacy and threats of death and rape.

League of Legends has 67 million players and grossed an estimated US$1.25 billion in revenue last year. But it also has a reputation for toxic in-game behaviour, which its parent company, Riot Games in Los Angeles, California, sees as an obstacle to attracting and retaining players. © 2016 Nature Publishing Group
By David Frank and James Gorman

Social life is good for you, even when your friends have lice — if you’re a Japanese macaque. Whether the same is true for humans hasn’t been tested directly, at least not the way researchers in Japan conducted their experiments with networks of female macaques. Julie Duboscq, a researcher at Kyoto University’s Primate Research Institute in Japan, tracked louse infestation and grooming interactions in about 20 adult female macaques. As she, Andrew J.J. MacIntosh and their colleagues noted in describing their research in Scientific Reports, grooming is known to reduce lice, but such close physical contact can also make it easy for lice to pass from one animal to another.

Dr. Duboscq is interested in the costs and benefits of social behavior. For animals that live in social groups, as macaques and people do, the benefits of social life are many, from defense against predators (for wild monkeys, and no doubt for humans at some point in their history) to emotional health and well-being (for humans, and probably monkeys, too). But there are negatives associated with sociality, like the transmission of parasites and diseases. “We don’t fully understand the costs and benefits,” Dr. Duboscq said. In this study, she and her colleagues estimated the degree of louse infestation by the number of nits picked. The more nits, they calculated, the more nit-producing lice. © 2016 The New York Times Company
Laura Sanders

The 22 men took the same pill for four weeks. When interviewed, they said they felt less daily stress and their memories were sharper. The brain benefits were subtle, but the results, reported at last year’s annual meeting of the Society for Neuroscience, got attention. That’s because the pills were not a precise chemical formula synthesized by the pharmaceutical industry. The capsules were brimming with bacteria.

In the ultimate PR turnaround, once-dreaded bacteria are being welcomed as health heroes. People gobble them up in probiotic yogurts, swallow pills packed with billions of bugs and recoil from hand sanitizers. Helping us nurture the microbial gardens in and on our bodies has become big business, judging by grocery store shelves. These bacteria are possibly working at more than just keeping our bodies healthy: They may be changing our minds. Recent studies have begun turning up tantalizing hints about how the bacteria living in the gut can alter the way the brain works. These findings raise a question with profound implications for mental health: Can we soothe our brains by cultivating our bacteria?

By tinkering with the gut’s bacterial residents, scientists have changed the behavior of lab animals and small numbers of people. Microbial meddling has turned anxious mice bold and shy mice social. Rats inoculated with bacteria from depressed people develop signs of depression themselves. And small studies of people suggest that eating specific kinds of bacteria may change brain activity and ease anxiety. Because gut bacteria can make the very chemicals that brain cells use to communicate, the idea makes a certain amount of sense. © Society for Science & the Public 2000 - 2016
Nicola Davis

If you get hot under the collar behind the wheel, it could be down to a brain parasite. According to new research, adults who have intermittent explosive disorder (IED) - a psychiatric condition in which violent outbursts of anger and cursing erupt in response to apparently trivial irritations - are more likely to have been infected with Toxoplasma gondii. “The kind of triggers are usually social provocations,” said Dr Royce Lee, an author of the study from the University of Chicago. “In the workplace it could be some kind of interpersonal frustration, on the road it could be getting cut up.”

A common parasite, Toxoplasma gondii reproduces within cats and is spread in their faeces. It can enter humans through the food chain in raw or undercooked meat, contaminated water or unwashed vegetables that have come into contact with the parasite. It is thought that up to a third of the British population have been infected with Toxoplasma gondii - a parasite that lurks in the tissues of the brain. While generally considered to be harmless, toxoplasmosis in pregnant women has been linked to miscarriages, stillbirths and congenital defects in babies, and can cause serious problems in those with weakened immune systems.

While infection with the parasite in humans is often symptomless, its effects have attracted much attention - studies in humans have suggested that infection could be linked to schizophrenia and even increase the likelihood of road traffic accidents, while research in rats has found that infection with the parasite can remove their fear of cats. © 2016 Guardian News and Media Limited
By Pam Belluck

When people make risky decisions, like doubling down in blackjack or investing in volatile stocks, what happens in the brain? Scientists have long tried to understand what makes some people risk-averse and others risk-taking. Answers could have implications for how to treat, curb or prevent destructively risky behavior, like pathological gambling or drug addiction.

Now, a study by Dr. Karl Deisseroth, a prominent Stanford neuroscientist and psychiatrist, and his colleagues gives some clues. The study, published Wednesday in the journal Nature, reports that a specific type of neuron, or nerve cell, in a certain brain region helps determine whether or not a risky choice is made. The study was conducted in rats, but experts said it built on research suggesting the findings could be similar in humans. If so, they said, it could inform approaches to addiction, which involves some of the same neurons and brain areas, as well as treatments for Parkinson’s disease, because one class of Parkinson’s medications turns some patients into problem gamblers.

In a series of experiments led by Kelly Zalocusky, a doctoral student, researchers found that a risk-averse rat made decisions based on whether its previous choice involved a loss (in this case, of food). Rats whose previous decision netted them less food were prompted to behave conservatively next time by signals from certain receptors in a brain region called the nucleus accumbens, the scientists discovered. These receptors, which are proteins attached to neurons, are part of the dopamine system, a neurochemical important to emotion, movement and thinking. In risk-taking rats, however, those receptors sent a much fainter signal, so the rats kept making high-stakes choices even if they lost out. But by employing optogenetics, a technique that uses light to manipulate neurons, the scientists stimulated brain cells with those receptors, heightening the “loss” signal and turning risky rats into safer rats.
© 2016 The New York Times Company
Laura Sanders

In a pair of twin sisters, a rare disease had damaged the brain structures believed necessary to feel fear. But an injection of a drug could nevertheless make them anxious. The results of that experiment, described in the March 23 Journal of Neuroscience, add to evidence that the amygdalae, small, almond-shaped brain structures tucked deep in the brain, aren’t the only bits of the brain that make a person feel afraid. “Overall, this suggests multiple different routes in the brain to a common endpoint of the experience of fear,” says cognitive neuroscientist Stephan Hamann of Emory University in Atlanta.

The twins, called B.G. and A.M., have Urbach-Wiethe disease, a genetic disorder that destroyed most of their amygdalae in late childhood. Despite this, the twins showed fear after inhaling air laden with extra carbon dioxide (an experience that can create the sensation of suffocating), an earlier study showed (SN: 3/23/13, p. 12).

Because carbon dioxide affects a wide swath of the body and brain, scientists turned to a more specific cause of fear that stems from inside the body: a drug called isoproterenol, which can set the heart racing and make breathing hard. Sensing these bodily changes provoked by the drug can cause anxiety. “If you know what adrenaline feels like, you know what isoproterenol feels like,” says study coauthor Sahib Khalsa, a psychiatrist and neuroscientist at the Laureate Institute for Brain Research in Tulsa, Okla. © Society for Science & the Public 2000 - 2016.
By Roni Caryn Rabin

Sixty-five million Americans suffer from chronic lower back pain, and many feel they have tried it all: physical therapy, painkillers, shots. Now a new study reports many people may find relief with a form of meditation that harnesses the power of the mind to manage pain. The technique, called mindfulness-based stress reduction, involves a combination of meditation, body awareness and yoga, and focuses on increasing awareness and acceptance of one’s experiences, whether they involve physical discomfort or emotional pain.

People with lower back pain who learned the meditation technique showed greater improvements in function compared to those who had cognitive behavioral therapy, which has been shown to help ease pain, or standard back care. Participants assigned to meditation or cognitive behavior therapy received eight weekly two-hour sessions of group training in the techniques. After six months, those learning meditation had an easier time doing things like getting up out of a chair, going up the stairs and putting on their socks, and were less irritable and less likely to stay at home or in bed because of pain. They were still doing better a year later.

The findings come amid growing concerns about opioid painkillers and a surge of overdose deaths involving the drugs. At the beginning of the trial, 11 percent of the participants said they had used an opioid within the last week to treat their pain, and they were allowed to continue with their usual care throughout the trial. “This new study is exciting, because here’s a technique that doesn’t involve taking any pharmaceutical agents, and doesn’t involve the side effects of pharmaceutical agents,” said Dr. Madhav Goyal of Johns Hopkins University School of Medicine, who co-wrote an editorial accompanying the paper. © 2016 The New York Times Company
Angus Chen

You've probably heard that a little booze a day is good for you. I've even said it at parties. "Look at the French," I've said gleefully over my own cup. "Wine all the time and they still live to be not a day younger than 82." I'm sorry to say we're probably wrong.

The evidence that alcohol has any benefit on longevity or heart health is thin, says Dr. Timothy Naimi, a physician and epidemiologist at Boston Medical Center. He and his colleagues published an analysis of 87 of the best research studies on alcohol's effect on death from any cause in the Journal of Studies on Alcohol and Drugs on Tuesday. "[Our] findings here cast a great deal of skepticism on this long, cherished belief that moderate drinking has a survival advantage," he says.

In these studies, the participants get sorted into categories based on how much alcohol they think they drink. Researchers typically size up occasional, moderate and heavy drinkers against non-drinkers. When you do this, the moderates, one to three drinks a day, usually come out on top. They're less likely to die early from health problems like heart disease or cancer and injury. But then it gets very tricky, "because moderate drinkers tend to be very socially advantaged," Naimi says. Moderate drinkers tend to be healthier on average because they're well-educated and more affluent, not because they're drinking a bottle of wine a week on average. "[Their] alcohol consumption ends up looking good from a health perspective because they're already healthy to begin with." © 2016 npr
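The confounding Naimi describes is easy to reproduce in a toy simulation. In the sketch below (every number is invented purely for illustration), social advantage drives both moderate drinking and survival, while drinking itself has no effect on death risk at all; the naive drinker-versus-non-drinker comparison still makes moderate drinking look protective, and stratifying by the confounder makes the "benefit" vanish.

```python
import random

random.seed(1)

# Toy population: advantage raises the odds of being a moderate drinker
# AND lowers death risk; drinking has NO causal effect in this model.
people = []
for _ in range(100_000):
    advantaged = random.random() < 0.5
    moderate_drinker = random.random() < (0.7 if advantaged else 0.3)
    death_risk = 0.05 if advantaged else 0.15   # depends only on advantage
    died = random.random() < death_risk
    people.append((advantaged, moderate_drinker, died))

def death_rate(group):
    return sum(died for _, _, died in group) / len(group)

drinkers    = [p for p in people if p[1]]
nondrinkers = [p for p in people if not p[1]]

# Naive comparison: moderate drinkers appear protected...
print(death_rate(drinkers) < death_rate(nondrinkers))

# ...but within the advantaged stratum, drinkers and non-drinkers
# die at essentially the same rate: the advantage did the work.
adv_drink   = [p for p in people if p[0] and p[1]]
adv_nodrink = [p for p in people if p[0] and not p[1]]
print(abs(death_rate(adv_drink) - death_rate(adv_nodrink)) < 0.01)
```

Both comparisons print `True`: the raw gap between drinkers and non-drinkers is entirely an artifact of who chooses to drink moderately, which is exactly the "already healthy to begin with" problem the article raises.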
By Anahad O'Connor

What does it take to live a good life? Surveys show that most young adults believe that obtaining wealth and fame are keys to a happy life. But a long-running study out of Harvard suggests that one of the most important predictors of whether you age well and live a long and happy life is not the amount of money you amass or notoriety you receive. A much more important barometer of long-term health and well-being is the strength of your relationships with family, friends and spouses.

These are some of the findings from the Harvard Study of Adult Development, a research project that since 1938 has closely tracked and examined the lives of more than 700 men and in some cases their spouses. The study has revealed some surprising – and some not so surprising – factors that determine whether people are likely to age happily and healthily, or descend into loneliness, sickness and mental decline. The study’s current director, Dr. Robert Waldinger, outlined some of the more striking findings from the long-running project in a recent TED Talk that has garnered more than seven million views.

“We publish our findings in academic journals that most people don’t read,” Dr. Waldinger, a clinical professor of psychiatry at Harvard Medical School, said in a recent interview. “And so we really wanted people to know that this study exists and that it has for 75 years. We’ve been funded by the government for so many years, and it’s important that more people know about this besides academics.” The study began in Boston in the 1930s with two very different groups of young men. © 2016 The New York Times Company
By John Elder Robison

What happens to your relationships when your emotional perception changes overnight? Because I’m autistic, I have always been oblivious to unspoken cues from other people. My wife, my son and my friends liked my unflappable demeanor and my predictable behavior. They told me I was great the way I was, but I never really agreed. For 50 years I made the best of how I was, because there was nothing else I could do.

Then I was offered a chance to participate in a study at Beth Israel Deaconess Medical Center, a teaching hospital of Harvard Medical School. Investigators at the Berenson-Allen Center there were studying transcranial magnetic stimulation, or T.M.S., a noninvasive procedure that applies magnetic pulses to stimulate the brain. It offers promise for many brain disorders. Several T.M.S. devices have been approved by the Food and Drug Administration for the treatment of severe depression, and others are under study for different conditions. (It’s still in the experimental phase for autism.) The doctors wondered if changing activity in a particular part of the autistic brain could change the way we sense emotions. That sounded exciting. I hoped it would help me read people a little better.

They say, be careful what you wish for. The intervention succeeded beyond my wildest dreams — and it turned my life upside down. After one of my first T.M.S. sessions, in 2008, I thought nothing had happened. But when I got home and closed my eyes, I felt as if I were on a ship at sea. And there were dreams — so real they felt like hallucinations. It sounds like a fairy tale, but the next morning when I went to work, everything was different. Emotions came at me from all directions, so fast that I didn’t have a moment to process them. © 2016 The New York Times Company
By Gretchen Reynolds

Meditating before running could change the brain in ways that are more beneficial for mental health than practicing either of those activities alone, according to an interesting study of a new treatment program for people with depression.

As many people know from experience, depression is characterized in part by an inability to stop dwelling on gloomy thoughts and unhappy memories from the past. Researchers suspect that this thinking pattern, known as rumination, may involve two areas of the brain in particular: the prefrontal cortex, a part of the brain that helps to control attention and focus, and the hippocampus, which is critical for learning and memory. In some studies, people with severe depression have been found to have a smaller hippocampus than people who are not depressed.

Interestingly, meditation and exercise affect those same portions of the brain, although in varying ways. In brain-scan studies, people who are long-term meditators, for instance, generally display different patterns of brain-cell communication in their prefrontal cortex during cognitive tests than people who don’t meditate. Those differences are believed to indicate that the meditators possess a more honed ability to focus and concentrate. Meanwhile, according to animal studies, aerobic exercise substantially increases the production of new brain cells in the hippocampus. Both meditation and exercise also have proven beneficial in the treatment of anxiety, depression and other mood disorders.

These various findings about exercise and meditation intrigued researchers at Rutgers University in New Brunswick, N.J., who began to wonder whether, since meditation and exercise on their own improve moods, combining the two might intensify the impacts of each. So, for the new study, which was published last month in Translational Psychiatry, the scientists recruited 52 men and women, 22 of whom had been given diagnoses of depression.
The researchers confirmed that diagnosis with their own tests and then asked all of the volunteers to complete a computerized test of their ability to focus while sensors measured electrical signals in their brains. © 2016 The New York Times Company
Nicola Davis

Suppressing bad memories from the past can block memory formation in the here and now, research suggests. The study could help to explain why those suffering from post-traumatic stress disorder (PTSD) and other psychological conditions often experience difficulty in remembering recent events, scientists say.

Writing in Nature Communications, the authors describe how trying to forget past incidents by suppressing our recollections can create a “virtual lesion” in the brain that casts an “amnesiac shadow” over the formation of new memories. “If you are motivated to try to prevent yourself from reliving a flashback of that initial trauma, anything that you experience around the period of time of suppression tends to get sucked up into this black hole as well,” Dr Justin Hulbert, one of the study’s authors, told the Guardian.

“I think it makes perfect sense because we know that people with a wide range of psychological problems have difficulties with their everyday memories for ordinary events,” said Professor Chris Brewin, an expert in PTSD from University College London, who was not involved in the study. “Potentially this could account for the memory deficits we find in depression and other disorders too.”

The phenomenon came to the attention of the scientists during a lecture, when a student admitted to having suffered bouts of amnesia after witnessing the 1999 Columbine high school massacre. When the student returned to the school for classes after the incident, she found she could not remember anything from the lessons she was in. “Here she was surrounded by all these reminders of these terrible things that she preferred not to think about,” said Hulbert. © 2016 Guardian News and Media Limited
Rich Stanton

In 1976, the driving simulation Death Race was removed from an Illinois amusement park. There had, according to a news story at the time, been complaints that it encouraged players to run over pedestrians to score points. Through a series of subsequent newspaper reports, the US National Safety Council labelled the game “gross” and motoring groups demanded its removal from distribution. The first moral panic over video game violence had begun.

This January, a group of four scholars published a paper analysing the links between playing violent video games at a young age and aggressive behaviour in later life. The titles mentioned in the report are around 15 years old – one of several troubling ambiguities to be found in the research. Nevertheless, the quality and quantity of the data make this an uncommonly valuable study. Given that game violence remains a favoured bogeyman for politicians, press and pressure groups, it should be shocking that such a robust study of the phenomenon is rare. But it is, and it’s important to ask why.

A history of violence

With the arrival of Pong in 1973, video games became a commercial reality, but now, in 2016, they are still on the rocky path to mass acceptance that all new media must traverse. The truth is that the big targets of moral concern – Doom, Grand Theft Auto, Call of Duty – are undeniably about killing and they are undeniably popular among male teenagers. An industry report estimates that 80% of the audience for the Call of Duty series is male, and 21% is aged 10-14. Going by the 18 rating on the last three entries, that means at least a fifth of the game’s vast audience shouldn’t be playing. © 2016 Guardian News and Media Limited