Chapter 15. Emotions, Aggression, and Stress


By Melissa Hogenboom Science reporter, BBC News A genetic analysis of almost 900 offenders in Finland has revealed two genes associated with violent crime. Those with the genes were 13 times more likely to have a history of repeated violent behaviour. The authors of the study, published in the journal Molecular Psychiatry, said at least 4-10% of all violent crime in Finland could be attributed to individuals with these genotypes. But they stressed the genes could not be used to screen criminals. Many more genes may be involved in violent behaviour, and environmental factors are also known to have a fundamental role. Even among individuals with a "high-risk combination" of these genes, the majority will never commit a crime, said the study's lead author, Jari Tiihonen of the Karolinska Institutet in Sweden. "Committing a severe, violent crime is extremely rare in the general population. So even though the relative risk would be increased, the absolute risk is very low," he told the BBC. The study is the first to have looked at the genetic make-up of so many violent criminals in this way. Each criminal was given a profile based on their offences, categorising them into violent or non-violent. The association between genes and previous behaviour was strongest for the 78 who fitted the "extremely violent offender" profile. This group had committed a total of 1,154 murders, manslaughters, attempted homicides or batteries. A replication group of 114 criminals had all committed at least one murder. BBC © 2014

Keyword: Aggression
Link ID: 20252 - Posted: 10.28.2014

By Gabriele Oettingen Many people think that the key to success is to cultivate and doggedly maintain an optimistic outlook. This belief in the power of positive thinking, expressed with varying degrees of sophistication, informs everything from affirmative pop anthems like Katy Perry’s “Roar” to the Mayo Clinic’s suggestion that you may be able to improve your health by eliminating “negative self-talk.” But the truth is that positive thinking often hinders us. More than two decades ago, I conducted a study in which I presented women enrolled in a weight-reduction program with several short, open-ended scenarios about future events — and asked them to imagine how they would fare in each one. Some of these scenarios asked the women to imagine that they had successfully completed the program; others asked them to imagine situations in which they were tempted to cheat on their diets. I then asked the women to rate how positive or negative their resulting thoughts and images were. A year later, I checked in on these women. The results were striking: The more positively women had imagined themselves in these scenarios, the fewer pounds they had lost. My colleagues and I have since performed many follow-up studies, observing a range of people, including children and adults; residents of different countries (the United States and Germany); and people with various kinds of wishes — college students wanting a date, hip-replacement patients hoping to get back on their feet, graduate students looking for a job, schoolchildren wishing to get good grades. In each of these studies, the results have been clear: Fantasizing about happy outcomes — about smoothly attaining your wishes — didn’t help. Indeed, it hindered people from realizing their dreams. © 2014 The New York Times Company

Keyword: Attention; Aggression
Link ID: 20244 - Posted: 10.27.2014

By J. Peder Zane Striking it rich is the American dream, a magnetic myth that has drawn millions to this nation. And yet, a countervailing message has always percolated through the culture: Money can’t buy happiness. From Jay Gatsby and Charles Foster Kane to Tony Soprano and Walter White, the woefully wealthy are among the seminal figures of literature, film and television. A thriving industry of gossipy, star-studded magazines and websites combines these two ideas, extolling the lifestyles of the rich and famous while exposing the sadness of celebrity. All of which raises the question: Is the golden road paved with misery? Yes, in a lot of cases, according to a growing body of research exploring the connection between wealth and happiness. Studies in behavioral economics, cognitive psychology and neuroscience are providing new insights into how a changing American economy and the wiring of the human brain can make life on easy street feel like a slog. Make no mistake, it is better to be rich than poor — psychologically as well as materially. Levels of depression, anxiety and stress diminish as incomes rise. What has puzzled researchers is that the psychological benefits of wealth seem to stop accruing once people reach an income of about $75,000 a year. “The question is, What are the factors that dampen the rewards of income?” said Scott Schieman, a professor of sociology at the University of Toronto. “Why doesn’t earning even more money — beyond a certain level — make us feel even happier and more satisfied?” The main culprit, he said, is the growing demands of work. For millenniums, leisure was wealth’s bedfellow. The rich were different because they worked less. The tables began to turn in America during the 1960s, when inherited privilege gave way to educational credentials and advancement became more closely tied to merit. © 2014 The New York Times Company

Keyword: Emotions
Link ID: 20236 - Posted: 10.23.2014

James Hamblin People whose faces are perceived to look more "competent" are more likely to be CEOs of large, successful companies. Having a face that people deem "dominant" is a predictor of rank advancement in the military. People are more likely to invest money with people who look "trustworthy." These sorts of findings go on and on in recent studies that claim people can accurately guess a variety of personality traits and behavioral tendencies from portraits alone. The findings seem to elucidate either canny human intuition or absurd, misguided bias. There has been a recent boom in research on how people attribute social characteristics to others based on the appearance of faces—independent of cues about age, gender, race, or ethnicity. (At least, as independent as possible.) The results seem to offer some intriguing insight, claiming that people are generally pretty good at predicting who is, for example, trustworthy, competent, introverted or extroverted, based entirely on facial structure. There is strong agreement across studies as to which facial attributes signal which traits to people. But it's, predictably, not at all so simple. Christopher Olivola, an assistant professor at Carnegie Mellon University, makes the case against face-ism today in the journal Trends in Cognitive Sciences. In light of many recent articles touting people's judgmental abilities, Olivola and Princeton University's Friederike Funk and Alexander Todorov say that a careful look at the data really doesn't support these claims. And "instead of applauding our ability to make inferences about social characteristics from facial appearances," Olivola said, "the focus should be on the dangers."

Keyword: Emotions; Aggression
Link ID: 20234 - Posted: 10.23.2014

By Bethany Brookshire Stress is our coping response. Whether emotional or physical, stress is how organisms react to upheaval in their lives. And in many cases, that response requires tradeoffs. An animal will make it through now, but may come out with fewer fat stores or a shorter life span. But a new study shows that under certain conditions, developmental stress in male zebra finches might have a positive effect, in the form of more offspring to carry on their genes. Ondi Crino, a biologist now at Macquarie University in Sydney, examined how stress during development might affect reproductive success in male zebra finches. She purchased 10 male and 10 female zebra finches from pet shops near the University of Montana. The birds were allowed to pair off and nest. When the first batch of chicks was 12 days old, Crino fed half of the male offspring peanut oil, and half peanut oil with the hormone corticosterone mixed in. Both humans and finches produce stress-related hormones. Humans produce cortisol, while finches produce corticosterone. These two hormones increase during times of stress and cause many of the negative effects we associate with worry and pressure. So administering corticosterone is one method of “stressing” an animal without changing anything else in its environment. The dose was in the range of what a young bird might experience in the midst of a natural upheaval such as a cold snap or famine. After 16 days of the peanut oil supplement, the young male birds receiving corticosterone were smaller than their relaxed counterparts. They also had a larger spike in their own corticosterone levels when they were stressed. But over time, the chicks that received corticosterone appeared to grow out of their stressful upbringing. By adulthood they were the same size as controls, and they did not show frazzled feathers or pale colors that might indicate a rough chickhood. © Society for Science & the Public 2000 - 2014

Keyword: Stress; Aggression
Link ID: 20209 - Posted: 10.16.2014

By Jane E. Brody In the 1997 film “As Good As It Gets,” Jack Nicholson portrays Melvin Udall, a middle-aged man with obsessive-compulsive disorder who avoids stepping on cracks, locks doors and flips light switches exactly five times, and washes his hands repeatedly, each time tossing out the new bar of soap he used. He brings wrapped plastic utensils to the diner where he eats breakfast at the same table every day. Though the film is billed as a romantic comedy, Melvin’s disorder is nothing to laugh about. O.C.D. is often socially, emotionally and vocationally crippling. It can even be fatal. Four years ago, John C. Kelly, 24, killed himself in Irvington, N.Y., after a long battle with a severe form of obsessive-compulsive disorder. Mr. Kelly was a devoted baseball player, and now friends hold an annual softball tournament to raise money for the foundation established in his honor to increase awareness of the disorder. Obsessive thoughts and compulsive behaviors occur in almost every life from time to time. I have a fair share of compulsive patterns: seasonings arranged in strict alphabetical order; kitchen equipment always put back the same way in the same place; two large freezers packed with foods just in case I need them. I hold onto a huge collection of plastic containers, neatly stacked with their covers, and my closets bulge with clothes and shoes I haven’t worn in years, and probably never will again — yet cannot bring myself to give away. But these common habits fall far short of the distressing obsessions and compulsions that are the hallmarks of O.C.D.: intrusive, disturbing thoughts or fears that cannot be ignored and compel the sufferer to engage in ritualistic, irrational behaviors to relieve the resulting anxiety.

Keyword: OCD - Obsessive Compulsive Disorder
Link ID: 20208 - Posted: 10.16.2014

Daniel Cressey Mirrors are often used to elicit aggression in animal behavioural studies, with the assumption being that creatures unable to recognize themselves will react as if encountering a rival. But research suggests that such work may simply reflect what scientists expect to see, and not actual aggression. For most people, looking in a mirror does not trigger a bout of snarling hostility at the face staring back. But many animals do seem to react aggressively to their mirror image, and for years mirrors have been used to trigger such responses for behavioural research on species ranging from birds to fish. “There’s been a very long history of using a mirror as it’s just so handy,” says Robert Elwood, an animal-behaviour researcher at Queen’s University in Belfast, UK. Using a mirror radically simplifies aggression experiments, cutting down the number of animals required and providing the animal being observed with an ‘opponent’ perfectly matched in terms of size and weight. But in a study just published in Animal Behaviour, Elwood and his team add to evidence that many mirror studies are flawed. The researchers looked at how convict cichlid fish (Amatitlania nigrofasciata) reacted both to mirrors and to real fish of their own species. Fish of this species prefer to display their right side in aggressive encounters, which means that two rivals end up alongside each other in a head-to-tail configuration. It is impossible for a fish to achieve this with its own reflection, but Elwood reasoned that fish faced with a mirror would attempt it, flipping from side to side as they tried to present an aggressive display. On the other hand, if the reflection did not trigger an aggressive reaction, the fish would not display such behaviour as much or as frequently. © 2014 Nature Publishing Group

Keyword: Consciousness; Aggression
Link ID: 20202 - Posted: 10.13.2014

By Moises Velasquez-Manoff When Andre H. Lagrange, a neurologist at Vanderbilt University in Nashville, saw the ominous white spots on the patient’s brain scan, he considered infection or lymphoma, a type of cancer. But tests ruled out both. Meanwhile, anti-epilepsy drugs failed to halt the man’s seizures. Stumped, Dr. Lagrange turned to something the mother of the 30-year-old man kept repeating. The fits coincided, she insisted, with spells of constipation and diarrhea. That, along with an odd rash, prompted Dr. Lagrange to think beyond the brain. Antibody tests, followed by an intestinal biopsy, indicated celiac disease, an autoimmune disorder of the gut triggered by the gluten proteins in wheat and other grains. Once on a gluten-free diet, the man’s seizures stopped; those brain lesions gradually disappeared. He made a “nearly complete recovery,” Dr. Lagrange told me. I began encountering case descriptions like this some years ago as I researched autoimmune disease. The first few seemed like random noise in an already nebulous field. But as I amassed more — describing seizures, hallucinations, psychotic breaks and even, in one published case, what looked like regressive autism, all ultimately associated with celiac disease — they began to seem less like anomalies, and more like a frontier in celiac research. They tended to follow a similar plot. What looked like neurological or psychiatric symptoms appeared suddenly. The physician ran through a diagnostic checklist without success. Drugs directed at the brain failed. Some clue suggestive of celiac disease was observed. The diagnosis was made. And the patient recovered on a gluten-free diet. The cases highlighted, in an unusually concrete fashion, the so-called gut-brain axis. The supposed link between the intestinal tract and the central nervous system is much discussed in science journals, often in the context of the microbial community inhabiting the gut. But it’s unclear how, really, we can leverage the link to improve health. © 2014 The New York Times Company

Keyword: Neuroimmunology
Link ID: 20200 - Posted: 10.13.2014

by Mallory Locklear Do you have an annoying friend who loves bungee jumping or hang-gliding, and is always blathering on about how it never scares them? Rather than being a macho front, their bravado may have a biological basis. Research from Stony Brook University in New York shows that not all risk-takers are cut from the same cloth. Some actually seem to feel no fear – or at least their bodies and brains don't respond to danger in the usual way. The study is the first to attempt to tease apart the differences in the risk-taking population. In order to ensure every participant was a card-carrying risk-taker, the team, led by Lilianne Mujica-Parodi, recruited 30 first-time skydivers. "Most studies on sensation-seeking compare people who take risks and people who don't. We were interested in something more subtle – those who take risks adaptively and those who do so maladaptively." In other words, do all risk-takers process potential danger in the same way, or do some ignore the risks more than others? To find out, the researchers got their participants to complete several personality questionnaires, including one that asked them to rank how well statements such as, "The greater the risk the more fun the activity," described them. Next, the team used fMRI to observe whether the participants' corticolimbic brain circuit – which is involved in risk assessment – was well-regulated. A well-regulated circuit is one that reacts to a threat and then returns to a normal state afterwards. © Copyright Reed Business Information Ltd

Keyword: Emotions; Aggression
Link ID: 20194 - Posted: 10.11.2014

By Alina Tugend Many workers now feel as if they’re doing the job of three people. They are on call 24 hours a day. They rush their children from tests to tournaments to tutoring. The stress is draining, both mentally and physically. At least that is the standard story about stress. It turns out, though, that many of the common beliefs about stress don’t necessarily give the complete picture. Misconception No. 1: Stress is usually caused by having too much work. While being overworked can be overwhelming, research increasingly shows that being underworked can be just as challenging. In essence, boredom is stressful. “We tend to think of stress in the original engineering way, that too much pressure or too much weight on a bridge causes it to collapse,” said Paul E. Spector, a professor of psychology at the University of South Florida. “It’s more complicated than that.” Professor Spector and others say too little to do — or underload, as he calls it — can cause many of the physical discomforts we associate with being overloaded, like muscle tension, stomachaches and headaches. A study published this year in the journal Experimental Brain Research found that people watching a boring movie — men hanging laundry — showed greater signs of stress, as measured by heart rate, hormone levels and other factors, than people watching a sad movie. “We tend to think of boredom as someone lazy, as a couch potato,” said James Danckert, a professor of neuroscience at the University of Waterloo in Ontario, Canada, and a co-author of the paper. “It’s actually when someone is motivated to engage with their environment and all attempts to do so fail. It’s aggressively dissatisfying.” © 2014 The New York Times Company

Keyword: Stress; Aggression
Link ID: 20161 - Posted: 10.04.2014

By Daisy Yuhas Do we live in a holographic universe? How green is your coffee? And could drinking too much water actually kill you? Before you click those links you might consider how your knowledge-hungry brain is preparing for the answers. A new study from the University of California, Davis, suggests that when our curiosity is piqued, changes in the brain ready us to learn not only about the subject at hand, but incidental information, too. Neuroscientist Charan Ranganath and his fellow researchers asked 19 participants to review more than 100 questions, rating each in terms of how curious they were about the answer. Next, each subject revisited 112 of the questions—half of which strongly intrigued them whereas the rest they found uninteresting—while the researchers scanned their brain activity using functional magnetic resonance imaging (fMRI). During the scanning session participants would view a question, then wait 14 seconds and view a photograph of a face totally unrelated to the trivia before seeing the answer. Afterward the researchers tested participants to see how well they could recall and retain both the trivia answers and the faces they had seen. Ranganath and his colleagues discovered that greater interest in a question would predict not only better memory for the answer but also for the unrelated face that had preceded it. A follow-up test one day later found the same results—people could better remember a face if it had been preceded by an intriguing question. Somehow curiosity could prepare the brain for learning and long-term memory more broadly. The findings are somewhat reminiscent of the work of U.C. Irvine neuroscientist James McGaugh, who has found that emotional arousal can bolster certain memories. But, as the researchers reveal in the October 2 Neuron, curiosity involves very different pathways. © 2014 Scientific American

Keyword: Learning & Memory; Aggression
Link ID: 20159 - Posted: 10.04.2014

Helen Thomson You'll have heard of Pavlov's dogs, conditioned to expect food at the sound of a bell. You might not have heard that a scarier experiment – arguably one of psychology's most unethical – was once performed on a baby. In it, a 9-month-old, at first unfazed by the presence of animals, was conditioned to feel fear at the sight of a rat. The infant was presented with the animal as someone struck a metal pole with a hammer above his head. This was repeated until he cried at merely the sight of any furry object – animate or inanimate. The "Little Albert" experiment, performed in 1919 by John Watson of Johns Hopkins University Hospital in Baltimore, Maryland, was the first to show that a human could be classically conditioned. The fate of Albert B has intrigued researchers ever since. Hall Beck at Appalachian State University in Boone, North Carolina, has been one of the most tenacious researchers on the case. Watson's papers stated that Albert B was the son of a wet nurse who worked at the hospital. Beck spent seven years exploring potential candidates and used facial analysis to conclude in 2009 that Little Albert was Douglas Merritte, son of hospital employee Arvilla. Douglas was born on the same day as Albert and several other points tallied with Watson's notes. Tragically, medical records showed that Douglas had severe neurological problems and died at an early age of hydrocephalus, or water on the brain. According to his records, this seems to have resulted in vision problems, so much so that at times he was considered blind. © Copyright Reed Business Information Ltd.

Keyword: Emotions; Aggression
Link ID: 20156 - Posted: 10.04.2014

By Fredrick Kunkle Here’s something to worry about: A recent study suggests that middle-age women whose personalities tend toward the neurotic run a higher risk of developing Alzheimer’s disease later in life. The study by researchers at the University of Gothenburg in Sweden followed a group of women in their 40s, whose disposition made them prone to anxiety, moodiness and psychological distress, to see how many developed dementia over the next 38 years. In line with other research, the study suggested that women who were the most easily upset by stress — as determined by a commonly used personality test — were two times more likely to develop Alzheimer’s disease than women who were least prone to neuroticism. In other words, personality really is — in some ways — destiny. “Most Alzheimer’s research has been devoted to factors such as education, heart and blood risk factors, head trauma, family history and genetics,” study author Lena Johansson said in a written statement. “Personality may influence the individual’s risk for dementia through its effect on behavior, lifestyle or reactions to stress.” The researchers cautioned that the results cannot be extrapolated to men because they were not included in the study and that further work is needed to determine possible causes for the link. The study, which appeared Wednesday in the American Academy of Neurology’s journal, Neurology, examined 800 women whose average age in 1968 was 46 years to see whether neuroticism — which involves being easily distressed and subject to excessive worry, jealousy or moodiness — might have a bearing on the risk of dementia.

Keyword: Alzheimers; Aggression
Link ID: 20148 - Posted: 10.02.2014

By Linda Carroll The debate over whether violent movies contribute to real-world mayhem may have just developed a wrinkle: New research suggests they might enhance aggression only in those already prone to it. Using PET scanners to peer into the brains of volunteers watching especially bloody movie scenes, researchers determined that the way a viewer’s brain circuitry responds to violent video depends upon whether the individual is aggressive by nature. The study was published Wednesday in PLOS One. “Just as beauty is in the eye of the beholder, environmental stimuli are in the brain of the beholder,” said Nelly Alia-Klein, the study’s lead author and an associate professor at the Friedman Brain Institute and the Icahn School of Medicine at Mount Sinai Hospital in New York City. To test the importance of personality, Alia-Klein and her colleagues rounded up 54 healthy men, some of whom had a history of getting into physical fights, while the others had no history of aggression. The researchers scanned the volunteers three times: doing nothing, watching emotionally charged video and viewing a violent movie. “It wasn’t the whole [violent] movie,” Alia-Klein said, “just the violent scenes, one after another after another.” Along with the brain scans, the researchers monitored blood pressure and asked about the viewers’ moods every 15 minutes.

Keyword: Aggression
Link ID: 20132 - Posted: 09.30.2014

By Erik Parens Will advances in neuroscience move reasonable people to abandon the idea that criminals deserve to be punished? Some researchers working at the intersection of psychology, neuroscience and philosophy think the answer is yes. Their reasoning is straightforward: if the idea of deserving punishment depends upon the idea that criminals freely choose their actions, and if neuroscience reveals that free choice is an illusion, then we can see that the idea of deserving punishment is nonsense. As Joshua Greene and Jonathan Cohen speculated in a 2004 essay: “new neuroscience will undermine people’s common sense, libertarian conception of free will and the retributivist thinking that depends on it, both of which have heretofore been shielded by the inaccessibility of sophisticated thinking about the mind and its neural basis.” This past summer, Greene and several other colleagues did empirical work that appears to confirm that 2004 speculation. The new work finds that when university students learn about “the neural basis of behavior” — quite simply, the brain activity underlying human actions — they become less supportive of the idea that criminals deserve to be punished. According to the study’s authors, once students are led to question the concept of free will — understood as the idea that humans “can generate spontaneous choices and actions not determined by prior events” — they begin to find the idea of “just deserts” untenable. “When genuine choice is deemed impossible, condemnation is less justified,” the authors write. © 2014 The New York Times Company

Keyword: Consciousness; Aggression
Link ID: 20131 - Posted: 09.29.2014

By Alyssa Abkowitz If you’re wary of investing in a certain stock or exchange-traded fund, it could be because of your brain’s physical composition. In a recent study, 61 participants from the Northeastern U.S. were asked to choose between monetary options that differed in the level of risk. Questions included: “Would you prefer a 50 percent chance of receiving $5 or would you rather take a 13 percent chance of winning $50?” and “Would you prefer $10 for sure or a 50 percent chance of receiving $50?” Researchers found that individuals with more gray matter in a specific part of their brains tend to tolerate more financial risks, says Agnieszka Tymula, an economist at the University of Sydney and co-author of the findings. Most of the participants answered questions while their brains were being scanned, while others received MRIs afterward (the timing doesn’t make a difference because the researchers were looking at brain structure, not brain function). The study involved measuring the volume of gray matter, or the outer layer of the brain, in the right posterior parietal region of the cortex. Thicker gray matter corresponded to riskier responses. Tymula worked with researchers from Yale University, University College London, New York University, and the University of Pennsylvania. Their findings, published in the Journal of Neuroscience this month, dovetail with previous work in which Tymula found that adults become more risk-averse as they age. Other neuroscience research shows that people’s cortexes become thinner as they get older, meaning there could be a link between a thinning cortex and risk aversion. ©2014 Bloomberg L.P

Keyword: Emotions; Aggression
Link ID: 20117 - Posted: 09.25.2014

by Laura Sanders Earlier this month, a star running back for the Minnesota Vikings was indicted for whipping his young son bloody with a switch. Leaked photographs allegedly showed Adrian Peterson’s 4-year-old son with cuts and bruises on his legs, back, buttocks and scrotum. As details about the incident emerged, Peterson took to Twitter to say that he’s not a perfect parent but what he did was not abuse. It was discipline. “My goal is always to teach my son right from wrong and that’s what I tried to do that day,” he wrote. Many people, and I’m one of them, think Peterson’s actions were disgusting. There’s no way that hitting a 4-year-old with a switch until his body is cut and bruised is a good way to impart values and morals. Peterson’s extreme actions, done in the name of corporal punishment, ignited a ferocious, emotionally fraught debate over whether it’s OK to hit your kid. The debate reflects deep divides in our society, chasms that track along political, religious, regional and racial lines. Half of all U.S. parents say they’ve spanked their kid. Spanking doesn’t just happen in the privacy of homes, either. Nineteen states allow teachers or principals to hit children. Opponents often point to scientific studies as proof that spanking is bad. And I confess, I originally thought this post was going to describe those results that we’ve all heard: how children who have been spanked are more aggressive and have more behavioral problems. But despite the headlines, the science behind spanking is actually quite limited, says clinical psychologist Christopher Ferguson of Stetson University in DeLand, Fla. “Because it’s a culture war issue, I think a lot of what we hear has misrepresented what is very nuanced science,” he says. © Society for Science & the Public 2000 - 2014.

Keyword: Sexual Behavior; Aggression
Link ID: 20116 - Posted: 09.25.2014

By Corinne Iozzio Albert “Skip” Rizzo of the University of Southern California began studying virtual reality (VR) as psychological treatment in 1993. Since then, dozens of studies, his included, have shown the immersion technique to be effective for everything from post-traumatic stress disorder (PTSD) and anxiety to phobias and addiction. But a lack of practical hardware has kept VR out of reach for clinicians. The requirements for a VR headset seem simple—a high-resolution, fast-reacting screen, a field of vision that is wide enough to convince patients they are in another world and a reasonable price tag—yet such a product has proved elusive. Says Rizzo, “It’s been 20 frustrating years.” In 2013 VR stepped into the consumer spotlight in the form of a prototype head-mounted display called the Oculus Rift. Inventor Palmer Luckey’s goal was to create a platform for immersive video games, but developers from many fields—medicine, aviation, tourism—are running wild with possibilities. The Rift’s reach is so broad that Oculus, now owned by Facebook, hosted a conference for developers in September. The Rift, slated for public release in 2015, is built largely from off-the-shelf parts, such as the screens used in smartphones. A multi-axis motion sensor lets the headset refresh imagery in real time as the wearer’s head moves. The kicker is the price: $350. (Laboratory systems start at $20,000.) Rizzo has been among the first in line. His work focuses on combat PTSD. In a 2010 study, he placed patients into controlled traumatic scenarios, including a simulated battlefield, so they could confront and process emotions triggered in those situations. © 2014 Scientific American

Keyword: Stress; Aggression
Link ID: 20106 - Posted: 09.23.2014

By Maria Konnikova At the turn of the twentieth century, Ivan Pavlov conducted the experiments that turned his last name into an adjective. By playing a sound just before he presented dogs with a snack, he taught them to salivate upon hearing the tone alone, even when no food was offered. That type of learning is now called classical—or Pavlovian—conditioning. Less well known is an experiment that Pavlov was conducting at around the same time: when some unfortunate canines heard the same sound, they were given acid. Just as their luckier counterparts had learned to salivate at the noise, these animals would respond by doing everything in their power to get the imagined acid out of their mouths, each “shaking its head violently, opening its mouth and making movements with its tongue.” For many years, Pavlov’s classical conditioning findings overshadowed the darker version of the same discovery, but, in the nineteen-eighties, the New York University neuroscientist Joseph LeDoux revived the technique to study the fear reflex in rats. LeDoux first taught the rats to associate a certain tone with an electric shock so that they froze upon hearing the tone alone. In essence, the rat had formed a new memory—that the tone signifies pain. He then blunted that memory by playing the tone repeatedly without following it with a shock. After multiple shock-less tones, the animals ceased to be afraid. Now a new generation of researchers is trying to figure out the next logical step: re-creating the same effects within the brain, without deploying a single tone or shock. Is memory formation now understood well enough that memories can be implanted and then removed absent the environmental stimulus?

Keyword: Learning & Memory; Aggression
Link ID: 20097 - Posted: 09.19.2014

by Bob Holmes There's something primal in a mother's response to a crying infant. So primal, in fact, that mother deer will rush protectively to the distress calls of other infant mammals, such as fur seals, marmots and even humans. This suggests such calls might share common elements – and perhaps that these animals experience similar emotions. Researchers – and, indeed, all pet owners – know that humans respond emotionally to the distress cries of their domestic animals, and there is some evidence that dogs also respond to human cries. However, most people have assumed this is a by-product of domestication. But Susan Lingle, a biologist at the University of Winnipeg, Canada, noticed that the infants of many mammal species have similar distress calls: simple sounds with few changes in pitch. She decided to test whether cross-species responses occur more widely across the evolutionary tree. So, Lingle and her colleague Tobias Riede, now at Midwestern University in Glendale, Arizona, recorded the calls made by infants from a variety of mammal species when separated from their mother or otherwise threatened. They then played the recordings through hidden speakers to wild mule deer (Odocoileus hemionus) out on the Canadian prairies. They found that deer mothers quickly moved towards the recordings of infant deer, but also towards those of infant fur seals, dogs, cats and humans, all of which call at roughly the same pitch. Even the ultrasonic calls of infant bats attracted the deer mothers if Lingle used software to lower their pitch to match that of deer calls. In contrast, they found the deer did not respond to non-infant calls such as birdsong or the bark of a coyote (American Naturalist, DOI: 10.1086/677677). © Copyright Reed Business Information Ltd.

Keyword: Emotions; Aggression
Link ID: 20095 - Posted: 09.19.2014