Chapter 11. Emotions, Aggression, and Stress
Emily Underwood Since swine flu swept the globe in 2009, scientists have scrambled to determine why a small percentage of children in Europe who received the flu vaccine Pandemrix developed narcolepsy, an incurable brain disorder that causes irresistible sleepiness. This week, a promising explanation was dealt a setback when prominent sleep scientist Emmanuel Mignot of Stanford University in Palo Alto, California, and colleagues retracted their influential study reporting a potential link between the H1N1 virus used to make the vaccine and narcolepsy. Some researchers were taken aback. “This was one of the most important pieces of work on narcolepsy that has come out,” says neuroimmunologist Lawrence Steinman, a close friend and colleague of Mignot’s, who is also at Stanford. The retraction, announced in Science Translational Medicine (STM), “really caught me by surprise,” he says. Others say that journal editors should have detected problems with the study’s methodology. The work provided the first substantiation of an autoimmune mechanism for narcolepsy, which could explain the Pandemrix side effect, researchers say. The vaccine, used only in Europe, seems to have triggered the disease in roughly one out of 15,000 children who received it. The affected children carried a gene variant for a particular human leukocyte antigen (HLA) type—a molecule that presents foreign proteins to immune cells—considered necessary for developing narcolepsy. In the 18 December 2013 issue of STM, Mignot and colleagues reported that T cells from people with narcolepsy, but not from healthy controls, are primed to attack hypocretin, a hormone that regulates wakefulness. They also showed molecular similarities between fragments of the H1N1 virus and the hypocretin molecule and suggested that these fragments might fool the immune system into attacking hypocretin-producing cells. © 2014 American Association for the Advancement of Science
By Caelainn Hogan A simple blood test could determine a person’s risk of suicide and could one day serve as a prevention tool to help stem suicide rates. In a study published online Wednesday in the American Journal of Psychiatry, researchers say they have discovered a genetic indicator of a person’s vulnerability to the effects of stress and anxiety and, therefore, the risk of suicidal thoughts or attempts. The Johns Hopkins researchers looked at how chemical tags known as methyl groups affect the gene SKA2, which modifies how the brain reacts to stress hormones. If the gene’s function is impaired by this chemical change, someone who is stressed won’t be able to shut down the effect of the stress hormone, leaving the fear center of the brain like a car with a faulty brake pad and worsening the impact of even everyday stresses. Researchers studied about 150 postmortem brain samples from healthy people and from those with mental illness, including some who had died by suicide. They found that those who died by suicide had significantly higher levels of the chemical modification to the SKA2 gene. As a result of the gene’s modification, it was not able to “switch off” the effect of the stress hormone. The researchers then tested sets of blood samples from more than 325 participants in the Johns Hopkins Center for Prevention Research study to see whether the same biomarker could identify those at greater risk of suicide. Looking at the single gene, while accounting for age, gender and levels of stress or anxiety, they were able to predict with 80 to 90 percent accuracy whether a person had thought about or attempted suicide.
By ANNA NORTH What does it mean to be lonely? It’s tempting to equate the feeling with a dearth of social interaction, but some people are now saying that it’s more complicated than that — and that true loneliness might be dangerous. In a story at Medium, Robin Marantz Henig busts some common loneliness myths. Lonely people aren’t necessarily weird or uncool: Ms. Henig cites a study of Ohio State undergrads showing that “those who called themselves lonely had just as much ‘social capital’ — defined by physical attractiveness, height, weight, socioeconomic status, and academic achievement — as their non-lonely peers.” And they may not be actually alone: “The students at Ohio State who were lonely belonged to as many clubs and had as many roommates as those who were ‘socially embedded.’ And while some studies indicate that living alone puts people at greater risk for loneliness, living with a spouse is not necessarily any protection.” Rather, loneliness may be psychological. The lonely, writes Ms. Henig, are more likely than others “to feel put upon and misunderstood” in social situations, to see “social danger even where none might exist.” She writes: “People grow lonely because of the gloomy stories they tell themselves. And, in a cruel twist, the loneliness itself can further distort their thinking, making them misread other people’s good intentions, which in turn causes them to withdraw to protect themselves from further rejection — and causes other people to keep them at arm’s length.” This distancing can have a physical impact; Ms. Henig argues that loneliness deserves further study, in part because it may increase the risk of high blood pressure, sleep problems and Alzheimer’s disease. © 2014 The New York Times Company
By Fikri Birey What’s the difference between you and a rat? The list is unsurprisingly long, but now we can cross a universal human experience — feelings of regret — off it. A new study shows for the first time that rats regret bad decisions and learn from them. In addition to existentialist suggestions of a rat’s regret — and what that takes away from, or adds to, being “human” — the study is highly relevant to basic brain research. Researchers demonstrated that we can tap into complex internal states of rodents if we home in on the right behavior and the right neurons. There is a significant literature on which brain regions represent certain states, like reward predictions and value calculations, but the study, powered by a novel behavioral test, is able to put together such discrete behavioral correlates into a “rat” definition of regret. Finding better animal models of human behavior constitutes a long-standing challenge in neuroscience: it has been difficult to authentically recapitulate mental states in animal models of neuropsychiatric disorders. For example, an attempt to model depression in rodents can often go no further than relatively coarse approximations of core symptoms like guilt or sadness, which often translate to behaviors like social avoidance or anhedonia in rodents. The inability to efficiently approach questions of mental abnormality is a major problem: depression is currently ranked as the leading cause of disability globally, and it’s estimated that by 2020, depression will lead 1.5 million people to end their lives by suicide. Now, thanks to a simple yet well-conceived series of experiments by Steiner and Redish, a compound behavior like regret is fully open to investigation. The investigators use a spatial decision-making setup called “Restaurant Row”: an arena with four zones where four different flavors of food (banana, cherry, chocolate or unflavored) are introduced in sequence. © 2014 Scientific American
By STEPHANIE FAIRYINGTON A few months ago, I was on a Manhattan-bound D train heading to work when a man with a chunky, noisy newspaper got on and sat next to me. As I watched him softly turn the pages of his paper, a chill spread like carbonated bubbles through the back of my head, instantly relaxing me and bringing me to the verge of sweet slumber. It wasn’t the first time I’d felt this sensation at the sound of rustling paper — I’ve experienced it as far back as I can remember. But it suddenly occurred to me that, as a lifelong insomniac, I might be able to put it to use by reproducing the experience digitally whenever sleep refused to come. Under the sheets of my bed that night, I plugged in some earphones, opened the YouTube app on my phone and searched for “Sound of pages.” What I discovered stunned me. There were nearly 2.6 million videos depicting a phenomenon called autonomous sensory meridian response, or A.S.M.R., designed to evoke a tingling sensation that travels over the scalp or other parts of the body in response to auditory, olfactory or visual forms of stimulation. The sound of rustling pages, it turns out, is just one of many A.S.M.R. triggers. The most popular stimuli include whispering; tapping or scratching; performing repetitive, mundane tasks like folding towels or sorting baseball cards; and role-playing, where the videographer, usually a breathy woman, softly talks into the camera and pretends to give a haircut, for example, or an eye examination. The videos span 30 minutes on average, but some last more than an hour. For those not wired for A.S.M.R. — and even for those who, like me, apparently are — the videos and the cast of characters who produce them — sometimes called “ASMRtists” or “tingle-smiths” — can seem weird, creepy or just plain boring. (Try pitching the pleasures of watching a nerdy German guy slowly and silently assemble a computer for 30 minutes.) © 2014 The New York Times Company
By CATHERINE SAINT LOUIS “This has happened before,” she tells herself. “It’s nowhere near as bad as before, and it will pass.” Robbie Pinter’s 21-year-old son, Nicholas, is upset again. He yells. He obsesses about something that can’t be changed. Even good news may throw him off. So Dr. Pinter breathes deeply, as she was taught, focusing on each intake and release. She talks herself through the crisis, reminding herself that this is how Nicholas copes with his autism and bipolar disorder. With these simple techniques, Dr. Pinter, who teaches English at Belmont University in Nashville, blunts the stress of parenting a child with severe developmental disabilities. Dr. Pinter, who said she descends from “a long line of the most nervous women,” credits her mindfulness practice with giving her the tools to cope with whatever might come her way. “It is very powerful,” she said. All parents endure stress, but studies show that parents of children with developmental disabilities, like autism, experience depression and anxiety far more often. Struggling to obtain crucial support services, the financial strain of paying for various therapies, the relentless worry over everything from wandering to the future — all of it can be overwhelming. “The toll stress-wise is just enormous, and we know that we don’t do a really great job of helping parents cope with it,” said Dr. Fred R. Volkmar, the director of the Child Study Center at the Yale University School of Medicine. “Having a child that has a disability, it’s all-encompassing,” he added. “You could see how people would lose themselves.” But a study published last week in the journal Pediatrics offers hope. It found that just six weeks of training in simple techniques led to significant reductions in stress, depression and anxiety among these parents. © 2014 The New York Times Company
By KATE MURPHY ONE of the biggest complaints in modern society is being overscheduled, overcommitted and overextended. Ask people at a social gathering how they are and the stock answer is “super busy,” “crazy busy” or “insanely busy.” Nobody is just “fine” anymore. When people aren’t super busy at work, they are crazy busy exercising, entertaining or taking their kids to Chinese lessons. Or maybe they are insanely busy playing fantasy football, tracing their genealogy or churning their own butter. And if there is ever a still moment for reflective thought — say, while waiting in line at the grocery store or sitting in traffic — out comes the mobile device. So it’s worth noting a study published last month in the journal Science, which shows how far people will go to avoid introspection. “We had noted how wedded to our devices we all seem to be and that people seem to find any excuse they can to keep busy,” said Timothy Wilson, a psychology professor at the University of Virginia and lead author of the study. “No one had done a simple study letting people go off on their own and think.” The results surprised him and have created a stir in the psychology and neuroscience communities. In 11 experiments involving more than 700 people, the majority of participants reported that they found it unpleasant to be alone in a room with their thoughts for just 6 to 15 minutes. Moreover, in one experiment, 64 percent of men and 15 percent of women began self-administering electric shocks when left alone to think. These same people, by the way, had previously said they would pay money to avoid receiving the painful jolt. It didn’t matter if the subjects engaged in the contemplative exercise at home or in the laboratory, or if they were given suggestions of what to think about, like a coming vacation; they just didn’t like being in their own heads. © 2014 The New York Times Company
By MICHAEL INZLICHT and SUKHVINDER OBHI I FEEL your pain. These words are famously associated with Bill Clinton, who as a politician seemed to ooze empathy. A skeptic might wonder, though, whether he truly was personally distressed by the suffering of average Americans. Can people in high positions of power — presidents, bosses, celebrities, even dominant spouses — easily empathize with those beneath them? Psychological research suggests the answer is no. Studies have repeatedly shown that participants who are in high positions of power (or who are temporarily induced to feel powerful) are less able to adopt the visual, cognitive or emotional perspective of other people, compared to participants who are powerless (or are made to feel so). For example, Michael Kraus, a psychologist now at the University of Illinois at Urbana-Champaign, and two colleagues found that among full-time employees of a public university, those who were higher in social class (as determined by level of education) were less able to accurately identify emotions in photographs of human faces than were co-workers who were lower in social class. (While social class and social power are admittedly not the same, they are strongly related.) Why does power leave people seemingly coldhearted? Some, like the Princeton psychologist Susan Fiske, have suggested that powerful people don’t attend well to others around them because they don’t need them in order to access important resources; as powerful people, they already have plentiful access to those. We suggest a different, albeit complementary, reason from cognitive neuroscience. On the basis of a study we recently published with the researcher Jeremy Hogeveen, in the Journal of Experimental Psychology: General, we contend that when people experience power, their brains fundamentally change how sensitive they are to the actions of others. © 2014 The New York Times Company
By JAMES GORMAN Any dog owner would testify that dogs are just as prone to jealousy as humans. But can one really compare Othello’s agony to Roscoe’s pique? The answer, according to Christine Harris, a psychologist at the University of California, San Diego, is that if you are petting another dog, Roscoe is going to show something that Dr. Harris thinks is a form of jealousy, even if not as complex and twisted as the adult human form. Other scientists agree there is something going on, but not all are convinced it is jealousy. And Roscoe and the rest of his tribe were, without exception, unavailable for comment. Dr. Harris had been studying human jealousy for years when she took this question on, inspired partly by the antics of her parents’ Border collies. When she petted them, “one would take his head and knock the other’s head away,” she said. It certainly looked like jealousy. But having studied humans, she was aware of different schools of thought about jealousy. Some scientists argue that jealousy requires complex thinking about self and others, which seems beyond dogs’ abilities. Others think that although our descriptions of jealousy are complex, the emotion itself may not be that complex. Dog emotions, as owners perceive them, have been studied before. In one case, Alexandra Horowitz, a cognitive scientist who is an adjunct associate professor at Barnard College and the author of “Inside of a Dog,” found that the so-called guilty look that dogs exhibit seemed to be more related to fear of punishment. Dr. Harris ventured into the tricky turf of dog emotion by devising a test based on work done with infants. © 2014 The New York Times Company
by Bethany Brookshire Even when we love our jobs, we all look forward to some time away. During the week, as stress builds up and deadlines accumulate, Friday looks better and better. Then, with a sigh of relief, the weekend arrives. But come Monday, it seems like the whole weight of responsibility just comes crashing down again. It’s not just you. Rats feel it, too. Rats given a two-day break from a stressful procedure show more signs of strain on “Monday” than rats who never got the weekend, researchers report July 11 in PLOS ONE. The results show that in some cases, an unpredictable getaway can cause more stress than just working through the pressure. Wei Zhang, J. Amiel Rosenkranz and colleagues at the Rosalind Franklin University of Medicine and Science in Chicago wanted to understand how changes to a stressful situation alter an animal’s response to stress. Normally, when rats are exposed over and over to a stress such as a restraint (in which a rat is placed in a small tube where it can’t turn around or get out), they begin to get used to the stress. Over a few days, rats stop avoiding the tube and stay calmly in the restraint without struggling, until they are set free. Hormones like corticosterone — which spikes in response to stress — go down. This phenomenon is called habituation. Zhang and colleagues wanted to see what happens when this pattern of stress is interrupted. They restrained rats for 20 minutes each day for five days. By day five, the animals were hanging out comfortably in the tubes. Then, the scientists introduced an interruption: They gave half of the rats two days off, a science-induced weekend. The scientists continued to restrain the other group of rats daily. © Society for Science & the Public 2000 - 2013
Sarah C. P. Williams The wheezing, coughing, and gasping for breath that come with a sudden asthma attack aren’t just the fault of an overactive immune system. A particularly sensitive bundle of neurons stretching from the brain to the lungs might be to blame as well, researchers have found. Drugs that alter these neurons could provide a new way to treat some types of asthma. “This is an exciting confirmation of an idea that’s been around for decades,” says Allison Fryer, a pulmonary pharmacology researcher at Oregon Health & Science University in Portland, who was not involved in the new study. An asthma attack can be brought on by a variety of triggers, including exercise, cold temperatures, pollen, and dust. During an attack, a person’s airways become inflamed, mucus clogs their lungs, and the muscles surrounding their airways tighten. Asthma is often considered a disease of the immune system because immune cells go into overdrive when they sense a trigger and cause inflammation. But a bundle of nerves that snakes through the neck and chest, the vagus nerve, has long been suspected to play a role; the cells it contains, after all, control the airway muscles. Studying which cell types and molecular pathways within the thick nerve bundle are involved, though, has been tough—the vagus contains a multitude of different cells that are physically intertwined. Working together at the Howard Hughes Medical Institute’s Janelia Farm Research Campus in Ashburn, Virginia, neurobiologists Dimitri Tränkner, now at the University of Utah in Salt Lake City, and Charles Zuker of Columbia University turned to genetics to work out the players. They selectively shut off different sets of the neurons in mice based on which genes each neuron expressed, rather than their physical location. Then, through a series of injections, they gave the animals an egg white allergy that causes asthmalike symptoms. © 2014 American Association for the Advancement of Science
By ANNA ALTMAN NPR conducted a study about how stressed out we are as a country, and the results, released last week, show that one in four Americans reported feeling stressed in the last month and one in two has experienced a major stressful event in the last year. Smithsonian Magazine, recommending the study, reports that this likely underestimates the actual stress load on Americans: “The survey only measures stress that people are conscious of, NPR explains, but research shows that people can suffer unaware from other forms of stress.” In short, according to Smithsonian, “stress is becoming the national psyche.” So we are barraged with new studies and ideas about stress and how it may be harming us — but many of them are contradictory. Stress can hurt your health, but stressing too much about stress is even worse for your health. Stress can make you sleep badly or it can make you fall asleep. People are most stressed out on Wednesday at 3:30 p.m. And BuzzFeed made a cute video asking whether stress can actually kill you. (“Those under significant stress can have more clogged arteries” and that “can ultimately lead to heart attack.”) Nevertheless, longstanding medical studies do show that chronic stress can lead to anxiety, depression, digestive problems, trouble sleeping, heart disease, weight gain and memory or concentration impairment. Alexandra Drane, a health care consultant, told NPR that those experiencing “toxic stress” were “2.6 times as likely to have diabetes, 2.9 times as likely to have back pain. They were 5 times as likely to be having mental health issues.” Our economy is contributing to the strain, as elevated stress levels often correlate with downturns. In the United States, the Great Recession brought a spike in stress and anxiety. The Gallup-Healthways Well-Being Index polls more than a thousand people each day and in 2008, the study’s first year, showed the definitive effects of economic hardship on stress and mental well-being. 
© 2014 The New York Times Company
Thomas B. Edsall It’s been a key question of American politics since at least 1968: Why do so many poor, working-class and lower-middle-class whites — many of them dependent for survival on government programs — vote for Republicans? The debate over the motives of conservative low-income white voters remains unresolved, but two recent research papers suggest that the hurdles facing Democrats in carrying this segment of the electorate may prove difficult to overcome. In “Obedience to Traditional Authority: A heritable factor underlying authoritarianism, conservatism and religiousness,” published by the journal Personality and Individual Differences in 2013, three psychologists write that “authoritarianism, religiousness and conservatism,” which they call the “traditional moral values triad,” are “substantially influenced by genetic factors.” According to the authors — Steven Ludeke of Colgate, Thomas J. Bouchard of the University of Minnesota, and Wendy Johnson of the University of Edinburgh — all three traits are reflections of “a single, underlying tendency,” previously described in one word by Bouchard in a 2006 paper as “traditionalism.” Traditionalists in this sense are defined as “having strict moral standards and child-rearing practices, valuing conventional propriety and reputation, opposing rebelliousness and selfish disregard of others, and valuing religious institutions and practices.” Working along a parallel path, Amanda Friesen, a political scientist at Indiana University, and Aleksander Ksiazkiewicz, a graduate student in political science at Rice University, concluded from their study comparing identical and fraternal twins that “the correlation between religious importance and conservatism” is “driven primarily, but usually not exclusively, by genetic factors.” The substantial “genetic component in these relationships suggests that there may be a common underlying predisposition that leads individuals to adopt conservative bedrock social principles 
and political ideologies while simultaneously feeling the need for religious experiences.” © 2014 The New York Times Company
The modern idea of stress began on a rooftop in Canada, with a handful of rats freezing in the winter wind. This was 1936, and by that point the owner of the rats, an endocrinologist named Hans Selye, had become expert at making rats suffer for science. "He would subject them to extreme temperatures, make them go hungry for long periods, or make them exercise a lot," says the medical historian Mark Jackson. "Then what he would do is kill the rats and look at their organs." What was interesting to Selye was that no matter how different the tortures he devised for the rats were — from icy winds to painful injections — when he cut them open to examine their guts it appeared that the physical effects of his different tortures were always the same. "Almost universally these rats showed a particular set of signs," Jackson says. "There would be changes particularly in the adrenal gland. So Selye began to suggest that subjecting an animal to prolonged stress led to tissue changes and physiological changes with the release of certain hormones, that would then cause disease and ultimately the death of the animal." And so the idea of stress — and its potential costs to the body — was born. But here's the thing: The idea of stress wasn't born to just any parent. It was born to Selye, a scientist absolutely determined to make the concept of stress an international sensation. © 2014 NPR
By EMILY ANTHES It was love at first pet when Laurel Braitman and her husband adopted a 4-year-old Bernese mountain dog, a 120-pound bundle of fur named Oliver. The first few months were blissful. But over time, Oliver’s troubled mind slowly began to reveal itself. He snapped at invisible flies. He licked his tail until it was wounded and raw. He fell to pieces when he spied a suitcase. And once, while home alone, he ripped a hole in a screen and jumped out of a fourth-floor window. To everyone’s astonishment, he survived. Oliver’s anguish devastated Dr. Braitman, a historian of science, but it also awakened her curiosity and sent her on an investigation deep into the minds of animals. The result is the lovely, big-hearted book “Animal Madness,” in which Dr. Braitman makes a compelling case that nonhuman creatures can also be afflicted with mental illness and that their suffering is not so different from our own. In the 17th century, Descartes described animals as automatons, a view that held sway for centuries. Today, however, a large and growing body of research makes it clear that animals have never been unthinking machines. We now know that species from magpies to elephants can recognize themselves in the mirror, which some scientists consider a sign of self-awareness. Rats emit a form of laughter when they’re tickled. And dolphins, parrots and dogs show clear signs of distress when their companions die. Together, these and many other findings demonstrate what any devoted pet owner has probably already concluded: that animals have complex minds and rich emotional lives. Unfortunately, as Dr. Braitman notes, “every animal with a mind has the capacity to lose hold of it from time to time.” © 2014 The New York Times Company
—By Chris Mooney The United States has a voting problem. In the 2012 presidential election, only about 57 percent of eligible American voters turned out, a far lower participation rate than in comparable democracies. That means about 93 million people who were eligible to vote didn't bother. Clearly, figuring out why people vote (and why they don't) is of premium importance to those who care about the health of democracy, as well as to campaigns that are becoming ever more sophisticated in targeting individual voters. To that end, much research has shown that demographic factors such as age and poverty affect one's likelihood of voting. But are there individual-level biological factors that also influence whether a person votes? The idea has long been heretical in political science, and yet the logic behind it is unavoidable. People vary in all sorts of ways—ranging from personalities to genetics—that affect their behavior. Political participation can be an emotional, and even a stressful activity, and in an era of GOP-led efforts to make voting more difficult, voting in certain locales can be a major hassle. To vote, you need both to be motivated and also not so intimidated you stay away from the polls. So are there biological factors that can shape these perceptions? "Our study is unique in that it is the first to examine whether differences in physiology may be causally related to differences in political activity," says lead study author Jeffrey French. ©2014 Mother Jones
From David Beckham’s infamous kick at France '98 to Luis Suárez chomping Giorgio Chiellini's shoulder in Brazil last week, the history of the World Cup is littered with moments of impulsive aggression that appear to defy all rational explanation. The story of human impulsivity stretches back deep into our evolutionary past. By nature, we are all prone to making quick, rash decisions that may lead to regret, and in some cases a four-month ban from international football. Impulsivity is actually a survival mechanism and was essential in the African savanna where our species evolved around a million and a half years ago. For our ancestors, the ability to make split-second decisions could make the difference between life and death. All of us have deep primal instincts, but over the several hundred million years of evolution separating our reptilian ancestors from the first mammals, and eventually primates, the cognitive ability to exercise self-restraint has increased. While most living things make this decision purely as a trade-off between risk and reward, only humans can decide to exercise self-restraint on the basis of how they think they will be perceived by others – an ability that emerged some time in the past 100,000 years or so. “We evolved to be very social animals, living in large groups, and so we have developed inhibitory mechanisms in the more recently evolved parts of the prefrontal cortex,” explains Michael Price of the School of Social Sciences at Brunel University. “This is the social centre of the brain. Our big reason not to be impulsive is because of your reputation and how other people are going to judge you and perhaps ostracise you as we saw with Beckham in the aftermath of France ’98.” © 2014 Guardian News and Media Limited
By Tanya Lewis and Live Science They say laughter is the best medicine. But what if laughter is the disease? For a 6-year-old girl in Bolivia who suffered from uncontrollable and inappropriate bouts of giggles, laughter was a symptom of a serious brain problem. But doctors initially diagnosed the child with “misbehavior.” “She was considered spoiled, crazy — even devil-possessed,” José Liders Burgos Zuleta of the Advanced Medical Image Centre in La Paz said in a statement. But Burgos Zuleta discovered that the true cause of the girl’s laughing seizures, medically called gelastic seizures, was a brain tumor. After the girl underwent a brain scan, the doctors discovered a hamartoma, a small, benign tumor that was pressing against her brain’s temporal lobe. Surgeons removed the tumor, the doctors said. She stopped having the uncontrollable attacks of laughter and now laughs only normally, they said. Gelastic seizures are a relatively rare form of epilepsy, said Solomon Moshé, a pediatric neurologist at Albert Einstein College of Medicine in New York. “It’s not necessarily ‘ha-ha-ha’ laughing,” Moshé said. “There’s no happiness in this. Some of the kids may be very scared,” he added. The seizures are most often caused by tumors in the hypothalamus, although they can also come from tumors in other parts of the brain, Moshé said. Although laughter is the main symptom, patients may also have outbursts of crying.
Sarah C. P. Williams There’s a reason people say “Calm down or you’re going to have a heart attack.” Chronic stress—such as that brought on by job, money, or relationship troubles—is suspected to increase the risk of a heart attack. Now, researchers studying harried medical residents and harassed rodents have offered an explanation for how, at a physiological level, long-term stress can endanger the cardiovascular system. It revolves around immune cells that circulate in the blood, they propose. The new finding is “surprising,” says physician and atherosclerosis researcher Alan Tall of Columbia University, who was not involved in the new study. “The idea has been out there that chronic psychosocial stress is associated with increased cardiovascular disease in humans, but what’s been lacking is a mechanism,” he notes. Epidemiological studies have shown that people who face many stressors—from those who survive natural disasters to those who work long hours—are more likely to develop atherosclerosis, the accumulation of fatty plaques inside blood vessels. In addition to fats and cholesterols, the plaques contain monocytes and neutrophils, immune cells that cause inflammation in the walls of blood vessels. And when the plaques break loose from the walls where they’re lodged, they can cause more extreme blockages elsewhere—leading to a stroke or heart attack. Studying the effect of stressful intensive care unit (ICU) shifts on medical residents, biologist Matthias Nahrendorf of Harvard Medical School in Boston recently found that blood samples taken when the doctors were most stressed out had the highest levels of neutrophils and monocytes. To probe whether these white blood cells, or leukocytes, are the missing link between stress and atherosclerosis, he and his colleagues turned to experiments on mice. © 2014 American Association for the Advancement of Science
by Laura Sanders Some brain cells need a jolt of stress to snap to attention. Cells called astroglia help regulate blood flow, provide energy to nearby cells and even influence the movement of messages between nerve cells. Now, scientists report June 18 in Neuron that astroglia can be roused by the stress molecule norepinephrine, an awakening that may help the entire brain jump into action. As mice were forced to walk on a treadmill, an activity that makes them alert, astroglia in several parts of their brains underwent changes in calcium levels, a sign of activity, neuroscientist Dwight Bergles of Johns Hopkins University School of Medicine and colleagues found. Norepinephrine, which acts as a fight-or-flight hormone in the body and a neural messenger in the brain, seemed to cause the cell activity boost. When researchers depleted norepinephrine, treadmill walking no longer activated astroglia. It’s not clear whether astroglia in all parts of the brain heed this wake-up call, nor is it clear whether this activation influences behavior. Norepinephrine might help shift brain cells, both neurons and astroglia, into a state of heightened vigilance, the authors write. © Society for Science & the Public 2000 - 2013.