Chapter 11. Emotions, Aggression, and Stress
Rae Ellen Bichell For Tim Goliver and Luther Glenn, the worst illness of their lives started in the same way — probably after having a stomach bug. Tim was 21 and a college student at the University of Michigan. He was majoring in English and biology and active in the Lutheran church. "I was a literature geek," says Tim. "I was really looking forward to my senior year and wherever life would take me." Luther was in his 50s. He'd spent most of his career as a U.S. military policeman and was working in security in Washington, D.C. He'd recently separated from his wife and had just moved into a new house with his two daughters, who were in their 20s. Both men recovered from their stomach bugs, but a few days later they started to feel sluggish. "Here we are trying to unpack, prepare ourselves for new life together and I'm flat out, dead tired," says Luther. He fell asleep in the car one morning and never made it out of the garage. Then he fell in the bathroom. For Tim, it started to feel like running a marathon just to lift a spoonful of soup. One morning, he tried to comb his hair and realized he couldn't lift his arm above his shoulder. "At that moment I started to freak out," he says. Both men got so weak that their families had to wheel them into the emergency room. They got the same diagnosis: Guillain-Barré syndrome, a neurological disorder that can leave people paralyzed for weeks. © 2016 NPR
By Geraldine Dawson There’s a popular saying in the autism community: “If you’ve met one person with autism, you’ve met one person with autism.” Although this phrase is meant to convey the remarkable variation in abilities and disabilities among people with autism spectrum disorder (ASD), we’re learning that it also applies to the extraordinary variability in how ASD develops. When I first began doing research on autism decades ago, we thought of it as one condition and aimed to discover its “cause.” Now we know ASD is actually a group of lifelong conditions that can arise from a complex combination of multiple genetic and environmental factors. In the same way that each person with ASD has a unique personality and profile of talents and disabilities, each also has a distinct developmental history shaped by a specific combination of genetic and environmental factors. More evidence of this extraordinary variety will be presented this week in Baltimore, where nearly 2,000 of the world’s leading autism researchers will gather for the International Meeting for Autism Research (IMFAR). As president of the International Society for Autism Research, which sponsors the conference, I am more impressed than ever with the progress we are making. New findings being presented at the conference will highlight the importance of the prenatal period in understanding how various environmental factors such as exposure to alcohol, smoking and certain chemical compounds can increase risk for ASD. The impact of many environmental factors depends, however, on an individual’s genetic background and the timing of the exposure. Other research links inflammation—detected in blood spot tests taken at birth—with a higher likelihood of an ASD diagnosis later on. Researchers suggest that certain factors such as maternal infection and other factors during pregnancy may influence an infant’s immune system and contribute to risk. 
As our knowledge of these risk factors grows, so do the opportunities for promoting healthy pregnancies and better outcomes. © 2016 Scientific American
By Marta Zaraska Scientists and laypeople alike have historically attributed political beliefs to upbringing and surroundings, yet recent research shows that our political inclinations have a large genetic component. The largest recent study of political beliefs, published in 2014 in Behavior Genetics, looked at a sample of more than 12,000 twin pairs from five countries, including the U.S. Some were identical and some fraternal; all were raised together. The study reveals that the development of political attitudes depends, on average, about 60 percent on the environment in which we grow up and live and 40 percent on our genes. “We inherit some part of how we process information, how we see the world and how we perceive threats—and these are expressed in a modern society as political attitudes,” explains Peter Hatemi, a genetic epidemiologist at the University of Sydney and lead author of the study. The genes involved in such complex traits are difficult to pinpoint because each tends to participate in a huge number of bodily and cognitive processes while playing only a minuscule role in shaping our political attitudes. Yet a study published in 2015 in the Proceedings of the Royal Society B managed to do just that, showing that genes encoding certain receptors for the neurotransmitter dopamine are associated with where we fall on the liberal-conservative axis. Among women who were highly liberal, 62 percent were carriers of certain receptor genotypes that have previously been associated with such traits as extroversion and novelty seeking. Meanwhile, among highly conservative women, the proportion was only 37.5 percent. © 2016 Scientific American
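The 60/40 environment-versus-genes split reported above comes from comparing identical (MZ) and fraternal (DZ) twins. A minimal sketch of that logic is Falconer's formula, which the article itself does not spell out; the twin correlations below are hypothetical values chosen only so that a roughly 40 percent heritability estimate falls out.

```python
def falconer_heritability(r_mz, r_dz):
    """Partition trait variance from twin correlations (Falconer's formula).

    h2: heritability, estimated as 2 * (r_MZ - r_DZ)
    c2: shared-environment component, r_MZ - h2
    e2: unique-environment component, 1 - r_MZ
    """
    h2 = 2 * (r_mz - r_dz)
    c2 = r_mz - h2
    e2 = 1 - r_mz
    return h2, c2, e2

# Hypothetical twin correlations (not from the study) that reproduce the
# article's roughly 40% genetic / 60% environmental split.
h2, c2, e2 = falconer_heritability(r_mz=0.45, r_dz=0.25)
```

The three components sum to 1 by construction, which is why a heritability figure and an "environment" figure can be quoted as complementary percentages.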
By Jessica Lahey Before she became a neuroscientist, Mary Helen Immordino-Yang was a seventh-grade science teacher at a school outside Boston. One year, during a period of significant racial and ethnic tension at the school, she struggled to engage her students in a unit on human evolution. After days of apathy and outright resistance to Ms. Immordino-Yang’s teaching, a student finally asked the question that altered her teaching — and her career path — forever: “Why are early hominids always shown with dark skin?” With that question, one that connected the abstract concepts of human evolution and the very concrete, personal experiences of racial tension in the school, her students’ resistance gave way to interest. As she explained the connection between the effects of equatorial sunlight, melanin and skin color and went on to explain how evolutionary change and geography result in various human characteristics, interest blossomed into engagement, and something magical happened: Her students began to learn. Dr. Immordino-Yang’s eyes light up as she recounts this story in her office at the Brain and Creativity Institute at the University of Southern California. Now an associate professor of education, psychology and neuroscience, she understands the reason behind her students’ shift from apathy to engagement and, finally, to deep, meaningful learning. Her students learned because they became emotionally engaged in material that had personal relevance to them. Emotion is essential to learning, Dr. Immordino-Yang said, and should not be underestimated or misunderstood as a trend, or as merely the “E” in “SEL,” or social-emotional learning. Emotion is where learning begins, or, as is often the case, where it ends. Put simply, “It is literally neurobiologically impossible to think deeply about things that you don’t care about,” she said. © 2016 The New York Times Company
By Sarah Kaplan Scientists have known for a while that stereotypes warp our perceptions of things. Implicit biases — those unconscious assumptions that worm their way into our brains, without our full awareness and sometimes against our better judgment — can influence grading choices from teachers, split-second decisions by police officers and outcomes in online dating. We can't even see the world without filtering it through the lens of our assumptions, scientists say. In a study published Monday in the journal Nature Neuroscience, psychologists report that the neurons that respond to things such as sex, race and emotion are linked by stereotypes, distorting the way we perceive people's faces before that visual information even reaches our conscious brains. "The moment we actually glimpse another person ... [stereotypes] are biasing that processing in a way that conforms to our already existing expectations," said Jonathan Freeman, a psychology professor at New York University and one of the authors of the report. Responsibility lies in two far-flung regions of the brain: the orbital frontal cortex, which rests just above the eyes and is responsible for rapid visual predictions and categorizations, and the fusiform cortex, which sits in the back of the brain and is involved in recognizing faces. When Freeman and his co-author, Ryan Stolier, had 43 participants look at images of faces in a brain scanner, they noticed that neurons seemed to be firing in similar patterns in both parts of the brain, suggesting that information from each part was influencing the other.
Nicola Davis People with a larger circle of friends are better able to tolerate pain, according to research into the pain thresholds and social networks of volunteers. The link is thought to be down to a system in the brain that involves endorphins: potent pain-killing chemicals produced by the body that also trigger a sense of wellbeing. “At an equivalent dose, endorphins have been shown to be stronger than morphine,” said Katerina Johnson, a doctoral student at the University of Oxford, who co-authored the research. Writing in the journal Scientific Reports, Johnson and Robin Dunbar, professor of evolutionary psychology at the University of Oxford, sought to probe the theory that the brain’s endorphin system might have evolved not only to handle our response to physical discomfort but also to influence our experience of pleasure from social interactions. “Social behaviour and being attached to other individuals is really important for our survival - whether that is staying close to our parents, or our offspring or cooperating with others to find food or to help defend ourselves,” said Johnson. To test the link, the authors examined both the social networks and pain thresholds of 101 adults aged between 18 and 34. Each participant was asked to complete a questionnaire designed to quiz them on friends they contacted once a week and those they got in touch with once a month. The personality of each participant was probed, looking at traits such as “agreeableness”; they were also asked to rate their fitness and stress levels. © 2016 Guardian News and Media Limited
People who've recovered from depression stave off relapses with mindfulness therapy as well as with antidepressants, a new review finds. Mindfulness-based cognitive therapy (MBCT) is an eight-week group program that helps people become better observers of their own thoughts and emotions and learn to distance themselves before ruminations spiral downwards. An international team of psychiatry researchers combined data from nine randomized trials of 1,258 patients total with recurrent depression to compare the mindfulness therapy to placebo, treatment as usual and other active treatments including antidepressants. People suffering from depression who received the mindfulness therapy were 31 per cent less likely to suffer a relapse during the next 60 weeks compared with those who did not receive it, Willem Kuyken of the University of Oxford in England and his co-authors reported in a meta-analysis in Wednesday's issue of the journal JAMA Psychiatry. "If you compare MBCT against antidepressant medication it basically holds its own, which means it provides protection on par with what people would get from continuing to take medications for one, two or three years after they've recovered from depression," said co-author Dr. Zindel Segal, a professor of psychology at the University of Toronto Scarborough. No one reported side-effects associated with participating in the therapy. ©2016 CBC/Radio-Canada.
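The "31 per cent less likely to relapse" figure is a relative risk reduction, not an absolute one. The sketch below shows the arithmetic; the absolute relapse rates used are hypothetical, since the review's pooled rates are not given in this summary.

```python
def relative_risk_reduction(risk_treated, risk_control):
    """Proportional drop in event risk under treatment: 1 - (RR),
    where RR = risk_treated / risk_control."""
    return 1 - risk_treated / risk_control

# Hypothetical 60-week relapse rates: if 50% of comparison patients
# relapse and 34.5% of MBCT patients do, the relative reduction is 31%.
rrr = relative_risk_reduction(risk_treated=0.345, risk_control=0.50)
```

The same 31 percent relative reduction would correspond to very different absolute benefits depending on how high the baseline relapse risk is, which is why meta-analyses report both kinds of figure when they can.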
by Bethany Brookshire Interviewing for a new job is filled with uncertainty, and that uncertainty fuels stress. There’s the uncertainty associated with preparing for the interview — what questions will they ask me? What should I put in my portfolio? And then there’s the ambiguity when you’re left to stew. Did I get the job? Or did someone else? Scientists have recently shown that these two types of uncertainty — the kind we can prepare for, and the kind we’re just stuck with — are not created equal. The uncertainty we can’t do anything about is more stressful than the kind we can prepare for. The results help show exactly what in our lives freaks us out — and why. But the findings also show a positive side to the stress we feel when not knowing what’s ahead — the more closely our stress levels reflect the real ambiguity in the world, the better we perform in it. “There is a bias in the public perception” against stress, says Claus Lamm, a cognitive neuroscientist at the University of Vienna in Austria. But stress “prepares us to deal with environmental challenges,” he notes, preparing us to fight or flee, and it keeps us paying attention to our surroundings. For decades, scientists have been trying to figure out just what makes us stressed and why. It turns out that unpredictability is a great stressor. Studies in the 1960s and 1970s showed that rats and humans who can’t predict a negative event (such as a small shock) end up more frazzled than those who can predict when a zap is coming. In a 2006 study, people zapped with unpredictable electric shocks to the hand rated the pain as more unpleasant than when they knew what to expect. © Society for Science & the Public 2000 - 2016.
By Leonard Sax, M.D., Ph.D. Why is it that girls tend to be more anxious than boys? It may start with how they feel about how they look. Some research has shown that in adolescence, girls tend to become more dissatisfied with their bodies, whereas boys tend to become more satisfied with their bodies. Another factor has to do with differences in how girls and boys use social media. A girl is much more likely than a boy to post a photo of herself wearing a swimsuit, while the boy is more likely to post a photo where the emphasis is on something he has done rather than on how he looks. If you don’t like Jake’s selfie showing off his big trophy, he may not care. But if you don’t like Sonya’s photo of herself wearing her bikini, she’s more likely to take it personally. Imagine another girl sitting in her bedroom, alone. She’s scrolling through other girls’ Instagram and Snapchat feeds. She sees Sonya showing off her new bikini; Sonya looks awesome. She sees Madison at a party, having a blast. She sees Vanessa with her adorable new puppy. And she thinks: I’m just sitting here in my bedroom, not doing anything. My life sucks. Boys are at lower risk for the toxic effects of social media than girls are, for at least three reasons. First, boys are less likely to be heavily invested in what you think of their selfies. “Does this swimsuit make me look fat?” is a question asked by girls more often than by boys. Second, boys tend to overestimate how interesting their own life is. Third, the average boy is likely to spend more time playing video games than Photoshopping his selfie for Instagram. And in video games, unlike social media, everybody truly can be a winner, eventually. If you play Grand Theft Auto or Call of Duty long enough, you will, sooner or later, complete all the missions, if you just keep at it. © 2016 The New York Times Company
By SABRINA TAVERNISE WASHINGTON — Suicide in the United States has surged to the highest levels in nearly 30 years, a federal data analysis has found, with increases in every age group except older adults. The rise was particularly steep for women. It was also substantial among middle-aged Americans, sending a signal of deep anguish from a group whose suicide rates had been stable or falling since the 1950s. The suicide rate for middle-aged women, ages 45 to 64, jumped by 63 percent over the period of the study, while it rose by 43 percent for men in that age range, the sharpest increase for males of any age. The overall suicide rate rose by 24 percent from 1999 to 2014, according to the National Center for Health Statistics, which released the study on Friday. The increases were so widespread that they lifted the nation’s suicide rate to 13 per 100,000 people, the highest since 1986. The rate rose by 2 percent a year starting in 2006, double the annual rise in the earlier period of the study. In all, 42,773 people died from suicide in 2014, compared with 29,199 in 1999. “It’s really stunning to see such a large increase in suicide rates affecting virtually every age group,” said Katherine Hempstead, senior adviser for health care at the Robert Wood Johnson Foundation, who has identified a link between suicides in middle age and rising rates of distress about jobs and personal finances. Researchers also found an alarming increase among girls 10 to 14, whose suicide rate, while still very low, had tripled. The number of girls who killed themselves rose to 150 in 2014 from 50 in 1999. “This one certainly jumped out,” said Sally Curtin, a statistician at the center and an author of the report. © 2016 The New York Times Company
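A detail worth noting in these numbers: the raw death count rose far more than 24 percent, because rates are deaths per 100,000 population and the U.S. population also grew over 1999-2014. The sketch below checks the arithmetic; the 2014 population figure of roughly 319 million is an assumption, not from the article.

```python
def rate_per_100k(deaths, population):
    """Crude rate: deaths per 100,000 population."""
    return deaths / population * 100_000

def percent_change(old, new):
    return (new - old) / old * 100

# Counts from the article: raw suicides rose about 46%, even though
# the population-adjusted rate rose only 24%.
count_rise = percent_change(29_199, 42_773)

# Rough check of the "13 per 100,000" figure using an assumed 2014
# U.S. population of ~319 million.
rate_2014 = rate_per_100k(42_773, 319_000_000)
```

This is the standard reason health statistics are reported as rates rather than counts: a rate isolates the change in risk from the change in population size.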
Anna Nowogrodzki There’s a little too much wishful thinking about mindfulness, and it is skewing how researchers report their studies of the technique. Researchers at McGill University in Montreal, Canada, analysed 124 published trials of mindfulness as a mental-health treatment, and found that scientists reported positive findings 60% more often than is statistically likely. The team also examined another 21 trials that were registered with databases such as ClinicalTrials.gov; of these, 62% were unpublished 30 months after they finished. The findings — reported in PLoS ONE on 8 April — hint that negative results are going unpublished. Mindfulness is the practice of being aware of thoughts and feelings without judging them good or bad. Mental-health treatments that focus on this method include mindfulness-based stress reduction — an 8-week group-based programme that includes yoga and daily meditation — and mindfulness-based cognitive therapy. A bias toward publishing studies that find the technique to be effective withholds important information from mental-health clinicians and patients, says Christopher Ferguson, a psychologist at Stetson University in Florida, who was not involved in the study. “I think this is a very important finding,” he adds. “We’ll invest a lot of social and financial capital in these issues, and a lot of that can be misplaced unless we have good data.” © 2016 Nature Publishing Group
By Esther Landhuis Peer inside the brain of someone with Alzheimer’s disease, and you’ll see some striking features: shriveled nerve cells and strange protein clumps. According to a leading theory, proteins called amyloid beta and tau build up in the brain and choke nerve cell communication, setting the disease in motion years before people suspect anything is wrong with their recall. Yet the Alzheimer’s brain has another curious aspect. Some of the clusters of toxic amyloid proteins are entangled with octopus-like immune cells called microglia, cells that live in the brain to clear unwanted clutter. By munching on amyloid plaques, microglia are thought to help keep the disease at bay. But these housekeeping cells have an additional role—they switch on inflammatory pathways. Inflammation is critically important when the immune system encounters infection or needs to repair tissue. If left unchecked, however, the inflammatory process churns out toxic substances that can kill surrounding cells, whose death triggers more inflammation and creates a vicious cycle. For years scientists have probed how neuroinflammation contributes to Alzheimer’s disease and other neurodegenerative ailments. Researchers face a number of immediate questions: Is neuroinflammation a driving force? Does it kick in when the disease is already underway and worsen the process? Could it be harnessed for good in the early stages? Those questions are far from settled, but research is starting to reveal a clearer picture. “It may not be the amyloid plaques themselves that directly damage neurons and the connections between them. Rather, it may be the immune reaction to the plaques that does the damage,” says Cynthia Lemere, a neuroscientist at Brigham and Women’s Hospital. Still, it is hard to say if microglia are good guys or bad, making it challenging to create therapeutics that target these cells. © 2016 Scientific American
By Mitch Leslie The worst part of being sick isn’t always the muscle aches and coughing. It’s the foggy head, the crankiness, the apathy, and the fatigue—in short, what researchers call sickness behavior. A new study uncovers a molecular mechanism that explains why we feel so crummy when we’re under the weather. “It’s a nice study that’s covered a lot of ground,” says neuroimmunologist Colm Cunningham of Trinity College in Dublin who wasn’t connected to the research. “What they’ve found is very plausible.” Although sickness behavior is unpleasant, researchers think the symptoms we suffer during a viral or bacterial infection are beneficial, enabling us to divert our energy to fighting the pathogens that have invaded our bodies. For cancer patients and people with autoimmune diseases, however, sickness behavior can be an unwanted side effect of treatment with immune molecules known as interferons, which our cells naturally release when we have an infection. The condition has posed a puzzle for researchers because they assumed the blood-brain barrier, a protective system that excludes most pathogens and immune molecules from the brain, would block signals from the immune system. Although scientists have identified several mechanisms that allow such messages to cross the barrier and influence behavior, the question of how the immune system and brain communicate “has been only partially answered,” says immunophysiologist Keith Kelley of the University of Illinois, Urbana-Champaign, who wasn’t connected to the new study. © 2016 American Association for the Advancement of Science.
By FRANS de WAAL TICKLING a juvenile chimpanzee is a lot like tickling a child. The ape has the same sensitive spots: under the armpits, on the side, in the belly. He opens his mouth wide, lips relaxed, panting audibly in the same “huh-huh-huh” rhythm of inhalation and exhalation as human laughter. The similarity makes it hard not to giggle yourself. The ape also shows the same ambivalence as a child. He pushes your tickling fingers away and tries to escape, but as soon as you stop he comes back for more, putting his belly right in front of you. At this point, you need only to point to a tickling spot, not even touching it, and he will throw another fit of laughter. Laughter? Now wait a minute! A real scientist should avoid any and all anthropomorphism, which is why hard-nosed colleagues often ask us to change our terminology. Why not call the ape’s reaction something neutral, like, say, vocalized panting? That way we avoid confusion between the human and the animal. The term anthropomorphism, which means “human form,” comes from the Greek philosopher Xenophanes, who protested in the fifth century B.C. against Homer’s poetry because it described the gods as though they looked human. Xenophanes mocked this assumption, reportedly saying that if horses had hands they would “draw their gods like horses.” Nowadays the term has a broader meaning. It is typically used to censure the attribution of humanlike traits and experiences to other species. Animals don’t have “sex,” but engage in breeding behavior. They don’t have “friends,” but favorite affiliation partners. Given how partial our species is to intellectual distinctions, we apply such linguistic castrations even more vigorously in the cognitive domain. By explaining the smartness of animals either as a product of instinct or simple learning, we have kept human cognition on its pedestal under the guise of being scientific. Everything boiled down to genes and reinforcement. 
To think otherwise opened you up to ridicule, which is what happened to Wolfgang Köhler, the German psychologist who, a century ago, was the first to demonstrate flashes of insight in chimpanzees. © 2016 The New York Times Company
By JOANNA KLEIN Misconception: Migraines are psychological manifestations of women’s inability to manage stress and emotions Actually: Neurologists are very clear that migraines are a real, debilitating medical condition related to temporary abnormal brain activity. The fact that they may be more common for some women during “that time of the month” has nothing to do with emotions. For centuries, doctors explained migraines as a woman’s problem caused by emotional disturbances like hysteria, depression or stress. “Bizarrely, the recommended cure was marriage!” said Dr. Anne MacGregor, the lead author of the British Association for the Study of Headache’s guidelines for diagnosing and managing migraines. While that prescription may be far behind us, the misconception that migraines are fueled by a woman’s inability to cope persists. “It was considered psychological, or that I was a nervous overachiever, so I would never tell people that I have them,” said Lorie Novak, an artist in her sixties who has suffered from migraines since she was 8. After reading Joan Didion’s 1968 essay “In Bed,” about the writer’s struggle with migraines, Ms. Novak decided to tackle the representation of these debilitating headaches. Starting in 2009, Ms. Novak photographed herself every time she got a migraine. Under the hashtag #notjustaheadache, hundreds of others on Twitter and Instagram have demonstrated their own frustration with a widespread lack of understanding of the reality of migraines. © 2016 The New York Times Company
Philip Ball James Frazer’s classic anthropological study The Golden Bough contains a harrowing chapter on human sacrifice in rituals of crop fertility and harvest among historical cultures around the world. Frazer describes sacrificial victims being crushed under huge toppling stones, slow-roasted over fires and dismembered alive. Frazer’s methods of analysis wouldn't all pass muster among anthropologists today (his work was first published in 1890), but it is hard not to conclude from his descriptions that what industrialized societies today would regard as the most extreme psychopathy has in the past been seen as normal — and indeed sacred — behaviour. In almost all societies, killing within a tribe or clan has been strongly taboo; exemption is granted only to those with great authority. Anthropologists have suspected that ritual human sacrifice serves to cement power structures — that is, it signifies who sits at the top of the social hierarchy. The idea makes intuitive sense, but until now there has been no clear evidence to support it. In a study published in Nature, Joseph Watts, a specialist in cultural evolution at the University of Auckland in New Zealand, and his colleagues have analysed 93 traditional cultures in Austronesia (the region that loosely embraces the many small and island states in the Pacific and Indonesia) as they were before they were influenced by colonization and major world religions (generally in the late 19th and early 20th centuries). © 2016 Nature Publishing Group
By Emily Underwood More than 99% of clinical trials for Alzheimer’s drugs have failed, leading many to wonder whether pharmaceutical companies have gone after the wrong targets. Now, research in mice points to a potential new target: a developmental process gone awry, which causes some immune cells to feast on the connections between neurons. “It is beautiful new work,” which “brings into light what’s happening in the early stage of the disease,” says Jonathan Kipnis, a neuroscientist at the University of Virginia School of Medicine in Charlottesville. Most new Alzheimer’s drugs aim to eliminate β amyloid, a protein that forms telltale sticky plaques around neurons in people with the disease. Those with Alzheimer’s tend to have more of these deposits in their brains than do healthy people, yet more plaques don’t always mean more severe symptoms such as memory loss or poor attention, says Beth Stevens of Boston Children’s Hospital, who led the new work. What does track well with the cognitive decline seen in Alzheimer’s disease—at least in mice that carry genes that confer high risk for the condition in people—is a marked loss of synapses, particularly in brain regions key to memory, Stevens says. These junctions between nerve cells are where neurotransmitters are released to spark the brain’s electrical activity. Stevens has spent much of her career studying a normal immune mechanism that prunes weak or unnecessary synapses as the brain matures from the womb through adolescence, allowing more important connections to become stronger. In this process, a protein called C1q sets off a series of chemical reactions that ultimately mark a synapse for destruction. After a synapse has been “tagged,” immune cells called microglia—the brain’s trash disposal service—know to “eat” it, Stevens says. © 2016 American Association for the Advancement of Science
Noah Smith How do human beings behave in response to risk? That is one of the most fundamental unanswered questions of our time. A general theory of decision-making amid uncertainty would be the kind of scientific advance that comes only a few times a century. Risk is central to financial and insurance markets. It affects the consumption, saving and business investment that moves the global economy. Understanding human behavior in the face of risk would let us reduce accidents, retire more comfortably, get cheaper health insurance and maybe even avoid recessions. A number of our smartest scientists have tried to develop a general theory of risk behavior. John von Neumann, the pioneering mathematician and physicist, took a crack at it back in 1944, when he developed the theory of expected utility along with Oskar Morgenstern. According to this simple theory, people value a risky prospect by multiplying the probability of each possible outcome by how much they value that outcome, then adding the results. This beautiful idea underlies much of modern economic theory, but unfortunately it doesn't work well in most situations. Alternative theories have been developed for specific applications. The psychologist Daniel Kahneman won a Nobel Prize for the creation of prospect theory, which says -- among other things -- that people measure outcomes relative to a reference point. That theory does a great job of explaining the behavior of subjects in certain lab experiments, and can help account for the actions of certain inexperienced consumers. But it is very difficult to apply generally, because the reference points are hard to predict in advance and may shift in unpredictable ways.
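Expected utility, as described above, is just a probability-weighted sum of utilities. A minimal sketch follows; the log utility function is a hypothetical choice for illustration (the theory itself does not dictate a particular utility function), and a concave choice like this is what produces risk-averse behavior.

```python
import math

def expected_utility(lottery, utility):
    """Expected utility of a lottery given as (probability, payoff) pairs:
    the sum over outcomes of p_i * u(x_i)."""
    return sum(p * utility(x) for p, x in lottery)

# A concave (risk-averse) utility function; log utility is one common choice.
u = math.log

# Hypothetical gamble: 50% chance of $100, 50% chance of $50,
# versus receiving its expected value of $75 for sure.
gamble = [(0.5, 100.0), (0.5, 50.0)]
sure_thing = [(1.0, 75.0)]

# With concave utility, the sure thing has higher expected utility:
# the agent is risk-averse and would pay to avoid the gamble.
prefers_sure = expected_utility(sure_thing, u) > expected_utility(gamble, u)
```

Kahneman and Tversky's prospect theory, mentioned next, modifies exactly this valuation step: payoffs are measured as gains and losses from a reference point and probabilities are reweighted, rather than entering the sum linearly.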
By Nicholas Bakalar Stress in childhood may be linked to hardening of the arteries in adulthood, new research suggests. Finnish researchers studied 311 children 12 to 18 years old, scoring their levels of stress according to a variety of components, including the family’s economic circumstances, the emotional environment in the home, whether parents engaged in healthy behaviors, stressful events (such as divorce, moves or death of a family member) and parental concerns about the child’s social adjustment. Using these criteria, they calculated a stress score. When the members of the group were 40 to 46 years old, they used computed tomography to measure coronary artery calcification, a marker of atherosclerosis and a risk factor for cardiovascular disease. The study, in JAMA Pediatrics, controlled for sex, cholesterol, body mass index and other factors, but still found that the higher the childhood stress score, the greater the risk for coronary artery calcification. The study is observational, and the data is based largely on parental reports, which can be biased. Still, its long follow-up time and careful control of other variables give it considerable strength. There are plausible mechanisms for the connection, including stress-induced increases in inflammation, which in animal models have been linked to a variety of ailments. “I think that economic conditions are important here,” said the lead author, Dr. Markus Juonala, a professor of internal medicine at the University of Turku in Finland. “Public health interventions should focus on how to intervene in better ways with people with higher stress and lower socioeconomic status.” © 2016 The New York Times Company
By Ariana Eunjung Cha LAS VEGAS — Jamie Tyler was stressed. He had just endured a half-hour slog through airport security and needed some relief. Many travelers in this situation might have headed for the nearest bar or popped an aspirin. But Tyler grabbed a triangular piece of gadgetry from his bag and held it to his forehead. As he closed his eyes, the device zapped him with low-voltage electrical currents. Within minutes, Tyler said, he was feeling serene enough to face the crowds once again. This is no science fiction. The Harvard-trained neurobiologist was taking advantage of one of his own inventions, a device called Thync, which promises to help users activate their body's “natural state of energy or calm” — for a retail price of a mere $199. Americans’ obsession with wellness is fueling a new category of consumer electronics, one that goes far beyond the ubiquitous Fitbits and UP activity wristbands that only passively monitor users' physical activity. The latest wearable tech, to put it in the simplest terms, is about hacking your brain. These gadgets claim to be able to make you have more willpower, think more creatively and even jump higher. One day, their makers say, the technology may even succeed in delivering on the holy grail of emotions: happiness. There’s real, peer-reviewed science behind the theory driving these devices. It involves stimulating key regions of the brain — with currents or magnetic fields — to affect emotions and physical well-being.