Chapter 15. Emotions, Aggression, and Stress
By Arielle Duhaime-Ross

Rats don't usually come out into daylight, especially not on a busy morning in New York City. But there it was, head awkwardly jutting out in front of its body, swinging from side to side. What had injured the creature, I have no idea, but its hind legs could no longer support its weight. The rat dragged them the way a kid drags a garbage bag that parents have asked to be taken out: reluctantly. The muscles in its front legs rippled as they propelled the body forward along the sidewalk. The rodent was surprisingly quick considering the injury, but its aimlessness suggested distress.

Two girls, no more than 15 years old, spotted the wounded rat from about 10 feet away. They held each other close, squealing and giggling, inching toward the animal theatrically. Staring them down, I scowled. How could they not appreciate this creature's suffering or be touched by its desperation? I looked on, saying nothing.

In Last Child in the Woods, journalist Richard Louv describes "nature-deficit disorder," something we urbanites have picked up over the last hundred years or so. He argues that city dwellers have become so disconnected from nature that they cannot process the harsh realities of the natural world, like the sight of an injured animal. But if those young women were suffering from urban disconnection, then why didn't I, a city slicker through and through, react that way as well? What made me respond with empathy instead of disgust?

Evolutionary theorists believe that many of our behaviors are adaptive in some way. "Empathy probably started out as a mechanism to improve maternal care," says Frans de Waal, a primatologist at Emory University and author of The Age of Empathy. "Mammalian mothers who were attentive to their young's needs were more likely to rear successful offspring."

© 2013 Scientific American
By Scicurious

When I am stressed (and I'm stressed a lot of the time, as I bet many of you are as well), I turn to coffee. Not just to keep me going when I need to get things done, but also for relaxation. For me, the smell and taste of coffee bring to mind relaxing conversations with friends and other fun times. But what if it wasn't just the memories doing the relaxing? What if chronic caffeine consumption itself was keeping my stressful life at bay? It's time to look at adenosine 2A receptors in the hippocampus. Don't worry, the coffee will be back.

First, let's talk about stress. Specifically, childhood stress. In small doses, stress exposure can actually be good for you, but in large or prolonged doses, it's definitely not. There are effects immediately after stress, as well as long-term ones. When you suffer strong stressors during development, you can end up with changes that persist all the way into adulthood, from cognitive deficits to a predisposition to psychiatric disorders.

Why is stress during development so important? During development, our brains are developing too, particularly the hippocampus. While the hippocampus is best known for its role in memory and spatial navigation, it is also extremely important in emotional responses. Neuronal growth in the hippocampus can come from enriched environments or chronic antidepressants, and death of those neurons can come from chronic stress. Chronic stress also disrupts the hypothalamic-pituitary-adrenal (HPA) axis. And that's just in adults! During development, animals are very susceptible to stress, and the hippocampus is still developing its connections. We're still figuring out what changes occur during early-life stress and how they relate to behaviors in adulthood.

© 2013 Scientific American
By James Gallagher, health and science reporter, BBC News

An experimental treatment to stop the body attacking its own nervous system in patients with multiple sclerosis (MS) appears safe in trials. The sheath around nerve cells, made of myelin, is destroyed in MS, leaving the nerves struggling to pass on messages. A study on nine patients, reported in Science Translational Medicine, tried to train the immune system to cease its assault on myelin. The MS Society said the idea had "exciting potential."

As nerves lose their ability to talk to each other, the disease causes problems with movement and balance and can affect vision. There are drugs that can reduce the number and severity of attacks, but there is no cure. The disease arises because the body's immune system treats myelin as a foreign body, like a flu virus.

Researchers at the Northwestern University Feinberg School of Medicine developed a technique to retrain the immune system. They took blood samples and coupled white blood cells, a part of the immune system, to fragments of myelin. These cells were injected back into the patients to make them tolerate myelin.

BBC © 2013
By Piercarlo Valdesolo

The posed stare-down is a staple of the pre-fight ritual. Two fighters, one day away from attempting to beat the memories from each other, stand impossibly close, raise their clenched fists and fix their gaze on each other's eyes as cameras click away. This has always seemed little more than a vehicle for media hype, but new research from psychologists at the University of Illinois suggests that there may be clues in this bit of theatre that predict the results of the fight to come. Specifically, the researchers hypothesized that something about the fighters' facial expressions in this standoff reveals the competitive dynamics between them: a subtle, and perhaps unintentional, communication of submission from one fighter to the other. A recognition of the opponent's power. The smile.

Facial expressions have long been thought to be reliable indicators of a person's true feelings. Indeed, in his book The Expression of the Emotions in Man and Animals, Darwin suggested that such expressions evolved precisely because they serve this important function. The smile has attracted much empirical attention and has generally been interpreted as a signal of an individual's immediate, as well as long-term, well-being. In one particularly interesting study, the frequency and "authenticity" of smiles in high school yearbook photos predicted higher levels of subjective well-being years later.

But smiles can mean different things in different contexts. The researchers here were particularly interested in what a smile might mean when displayed between competitors. Rather than merely communicating a fighter's good spirits, they hypothesized, it would be a submissive signal revealing reduced hostility and a lower willingness to aggress toward the opponent.

© 2013 Scientific American
By Felicity Muth

If you grew up with brothers or sisters, you will know that competition is a key part of childhood. Personally, I experienced competition for food resources (the last bar of chocolate), parental investment (attention) and other more unusual resources (the best colour of Lego pieces). As we age, we continue competing, although what we compete for changes. We compete in sports, for partners and for jobs. Like humans, pretty much all other animals compete in one way or another; even animals that live solitary lives may still be competing indirectly with others. But, like humans, animals need to choose which battles are worth fighting, and how much effort to put into them.

One obvious factor in deciding whether to bother competing with another is the absolute worth of the thing you're fighting over. If you and a stranger stumbled across some money in the street, you might fight vigorously for a $100 note, but more half-heartedly for $5 (this is, of course, an example using some very money-driven and aggressive individuals). However, how much value an individual puts on an item is going to be somewhat subjective. If you're poor and starving, you might invest more into fighting for $5 than someone who is not. Thus, most competitions contain both objective and subjective aspects: the intrinsic worth of an object (large food items are better than small ones), and the individual's state when assessing that item.

© 2013 Scientific American
By JENNA WORTHAM

On a recent family outing, my mother and sister got into a shouting match. But they weren't mad at each other; they were yelling at the iPhone's turn-by-turn navigation system. I interrupted to say that the phone didn't understand, or care, that they were upset. "Honey, we know," my mom replied. "But it should!"

She had a point. After all, computers and technology are only becoming smarter, faster and more intuitive. Artificial intelligence is creeping into our lives at a steady pace. Devices and apps can anticipate what we need, sometimes even before we realize it ourselves. So why shouldn't they understand our feelings? If emotional reactions were measured, they could be valuable data points for better design and development. Emotional artificial intelligence, also called affective computing, may be on its way.

But should it be? We're already struggling to cope with the always-on nature of the devices in our lives. Yes, those gadgets would be more efficient if they could respond when we are frustrated, bored or too busy to be interrupted, yet they would also be intrusive in ways we can't even fathom today. It sounds like a science-fiction movie, and in some ways it is. Much of this technology is still in its early stages, but it's inching closer to reality. Companies like Affectiva, a start-up spun out of the M.I.T. Media Lab, are working on software that trains computers to recognize human emotions based on facial expressions and physiological responses. A company called Beyond Verbal, which has just raised close to $3 million in venture financing, is working on a software tool that can analyze speech and, based on the tone of a person's voice, determine whether it indicates qualities like arrogance or annoyance, or both.

© 2013 The New York Times Company
by Paul Gabrielsen

Take a whiff, men. A chemical component of other guys' sweat makes men more cooperative and generous, new research says. The study is the first to show that this pheromone, called androstadienone, influences other men's behavior, and it reinforces the developing finding that humans are susceptible and responsive to these chemical signals.

Pheromones are everywhere in the animal world. Bugs in particular give off these chemicals to sound an alarm, identify a food source, or attract a mate. And smitten animals may indeed have "chemistry" together; pheromone signals are a subconscious part of their communication. Scientists didn't know whether humans played that game as well, but in the last 30 years they have identified both male and female putative pheromones that are linked to mood and reproductive cycles. Some fragrance makers have even incorporated them into their products, hoping to add an extra emotional punch to colognes and perfumes. Real-life pheromones don't smell so nice, however: the specialized glands that produce these chemical compounds are located near the armpit, where they mix with sweat.

Previous investigations focused on the chemicals as sexual attractants, studying a male pheromone's effect on female mood and behavior, for example. It turns out that women aren't the only ones susceptible to the power of male pheromones. Evolutionary biologist Markus Rantala of the University of Turku in Finland crafted an experiment in which 40 men in their mid-20s played a computer game in which two players decided how to share €10. One player offers a possible split, and the other decides whether to accept or reject it. Each participant took a turn making and deciding on offers.

© 2010 American Association for the Advancement of Science
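The sharing game described here is the classic ultimatum game from behavioral economics. As a rough sketch of its structure, not the study's actual software, one round might look like the following; the 10-euro pot comes from the article, while the responder's acceptance threshold is an invented placeholder:

```python
# Minimal sketch of one round of the ultimatum game described above.
# The 10-euro pot is from the article; the acceptance rule below
# (responder takes any offer of at least 3 euros) is a hypothetical
# placeholder, not the behavior of the study's participants.

POT = 10  # euros to be divided each round

def play_round(offer, min_acceptable=3):
    """Proposer offers `offer` euros out of POT; responder accepts
    if the offer meets their threshold, otherwise both get nothing."""
    if not 0 <= offer <= POT:
        raise ValueError("offer must be between 0 and the pot size")
    if offer >= min_acceptable:
        return {"proposer": POT - offer, "responder": offer, "accepted": True}
    return {"proposer": 0, "responder": 0, "accepted": False}

if __name__ == "__main__":
    for offer in (1, 3, 5):
        print(offer, play_round(offer))
```

Generosity in this setup is simply the size of the proposer's offer, which is presumably the quantity compared between men exposed to androstadienone and controls.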
By DANIEL BERGNER

Linneah sat at a desk at the Center for Sexual Medicine at Sheppard Pratt in the suburbs of Baltimore and filled out a questionnaire. She read briskly, making swift checks beside her selected answers, and when she was finished, she handed the pages across the desk to Martina Miller, who gave her a round of pills. The pills were either a placebo or a new drug called Lybrido, created to stoke sexual desire in women.

Checking her computer, Miller pointed out gently that Linneah hadn't been doing her duty as a study participant. Over the past eight weeks she had taken the tablets before she planned to have sex, and each time she put a pill on her tongue she was supposed to make an entry in her online diary about her level of lust. "I know, I know," Linneah said. She is a 44-year-old part-time elementary-school teacher, and that day she wore red pants and a canary yellow scarf. (She asked that only a nickname be used to protect her privacy.) "It's a mess. I keep forgetting."

Miller, a study coordinator, began a short interview, typing Linneah's replies into a database that the medication's Dutch inventor, Adriaan Tuiten, will present to the Food and Drug Administration this summer or fall as part of his campaign to win the agency's approval and begin marketing what might become the first female-desire drug in America. "Thinking about your desire now," Miller said, "would you say it is absent, very low, low, reasonable or present?"

"Low." This was no different from Linneah's reply at the trial's outset two months before.

© 2013 The New York Times Company
By CARL ZIMMER

Imagine a wolf catching a Frisbee a dozen times in a row, or leading police officers to a stash of cocaine, or just sleeping peacefully next to you on your couch. It's a stretch, to say the least. Dogs may have evolved from wolves, but the minds of the two canines are profoundly different. Dog brains, as I wrote last month in The New York Times, have become exquisitely tuned to our own.

Scientists are now zeroing in on some of the genes that were crucial to the rewiring of dog brains. Their results are fascinating, and not only because they can help us understand how dogs turned into man's best friend. They may also teach us something about the evolution of our own brains: some of the genes that evolved in dogs are the same ones that evolved in us.

To trace the change in dog brains, scientists have first had to work out how dog breeds are related to one another, and how they're all related to wolves. Ya-Ping Zhang, a geneticist at the Chinese Academy of Sciences, has led an international network of scientists who have compared pieces of DNA from different canines. They've come to the conclusion that wolves started their transformation into dogs in East Asia. Those early dogs then spread to other parts of the world. Many of the breeds we're most familiar with, like German shepherds and golden retrievers, emerged only in the past few centuries.

© 2013 The New York Times Company
by Sara Reardon

As suicide rates climb steeply in the US, a growing number of psychiatrists are arguing that suicidal behaviour should be considered a disease in its own right, rather than a behaviour resulting from a mood disorder. They base their argument on mounting evidence that the brains of people who have committed suicide show striking similarities, quite distinct from what is seen in the brains of people who had similar mood disorders but died of natural causes. Suicide also tends to run in families, suggesting that genetic and other biological factors may be in play. What's more, most people with mood disorders never attempt to kill themselves, and about 10 per cent of suicides have no history of mental disease.

The idea of classifying suicidal tendencies as a disease is being taken seriously. The team behind the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5), the newest version of psychiatry's "bible", released at the American Psychiatric Association's meeting in San Francisco this week, considered a proposal to list "suicide behaviour disorder" as a distinct diagnosis. It was ultimately put on probation: placed on a list of topics deemed to require further research for possible inclusion in future DSM revisions.

Another argument for grouping suicidal people under a single diagnosis is that it could spur research into the neurological and genetic factors they have in common. This could allow psychiatrists to better predict a person's suicide risk, and even lead to treatments that stop suicidal feelings.

© Copyright Reed Business Information Ltd.
By ANAHAD O'CONNOR

The nation's largest cardiovascular health organization has a new message for Americans: owning a dog may protect you from heart disease.

The unusual message was contained in a scientific statement published on Thursday by the American Heart Association, which convened a panel of experts to review years of data on the cardiovascular benefits of owning a pet. The group concluded that owning a dog, in particular, was "probably associated" with a reduced risk of heart disease. People who own dogs certainly have more reason to get outside and take walks, and studies show that most owners form such close bonds with their pets that being in their presence blunts the owners' reactions to stress and lowers their heart rate, said Dr. Glenn N. Levine, the head of the committee that wrote the statement.

But most of the evidence is observational, which makes it impossible to rule out the possibility that people who are healthier and more active in the first place are simply more likely to bring a dog or cat into their home. "We didn't want to make this too strong of a statement," said Dr. Levine, a professor at the Baylor College of Medicine. "But there are plausible psychological, sociological and physiological reasons to believe that pet ownership might actually have a causal role in decreasing cardiovascular risk."

Nationwide, Americans keep roughly 70 million dogs and 74 million cats as pets.

Copyright 2013 The New York Times Company
By NICHOLAS BAKALAR

Two studies have found that depression and the use of certain antidepressants are both associated with an increased risk of Clostridium difficile infection, an increasingly common cause of diarrhea that in the worst cases can be fatal.

Researchers studied 16,781 men and women, average age 68, using hospital records and interviews to record cases of the infection, often called C. diff, and diagnoses of depression. The interviews were conducted every two years from 1991 to 2007 to gather self-reports of feelings of sadness and other emotional problems. There were 404 cases of C. difficile infection. After adjusting for other variables, the researchers found that the risk of C. diff infection among people with a history of depression or depressive symptoms was 36 to 47 percent greater than among people without depression.

A second study, involving 4,047 hospitalized patients, average age 58, found a similar association between infection and depression. In addition, it found an association between some antidepressants (Remeron, Prozac and trazodone) and C. diff infection. There was no association with other antidepressants.

"We have known for a long time that depression is associated with changes in the gastrointestinal system," said the lead author, Mary A.M. Rogers, a research assistant professor at the University of Michigan, "and this interaction between the brain and the gut deserves more study."

Both reports appeared in the journal BMC Medicine.

Copyright 2013 The New York Times Company
By Ben Thomas

Horror isn't the only film genre that specializes in dread. War movies like Apocalypse Now, sci-fi mysteries like Brazil and Blade Runner, and dramas like Melancholia and Requiem for a Dream all masterfully evoke a less violent, more subtle and pervasive sense that something is unwell with the world: that somewhere along the line, something went deeply wrong and now normality itself is unraveling before our eyes.

The director David Lynch has arguably built his entire career on directing these kinds of films. In Lynch's universe, even the most banal moments are still somehow suffused with unnerving suspense. In films like Blue Velvet and Mulholland Drive, disturbing surprises erupt into scene after scene of buried tension, until every ordinary conversation feels like a trap waiting to spring. And then there's the infamous Eraserhead, where family life itself is transformed into an onslaught of surreal and nauseating images. It's hard to come away from these movies without feeling that a little of Lynch's unease has rubbed off on you.

So when a team of researchers at the University of British Columbia set out to describe and treat an ancient biological alarm system buried deep within the human brain, they turned to Lynch's films as an analogy for, and a set of examples of, the feeling of omnipresent yet maddeningly vague "wrongness" that seems to underlie many anxiety disorders.

© 2013 Scientific American
By Nathan Seppa

Multiple sclerosis, long considered a disease of white females, has affected more black women in recent years, a new study finds. Hispanic and Asian women, who have previously seemed to be at lower risk of MS, remain so, researchers report May 7 in Neurology. The findings bolster a theory that vitamin D deficiency, which is common in people with dark skin living at northern latitudes, contributes to MS.

MS is a debilitating condition in which the protective coatings on nerves in the central nervous system get damaged, resulting in a loss of motor control, muscle weakness, vision complications and other problems. The National Multiple Sclerosis Society estimates that 2.1 million people worldwide have the condition.

The researchers scanned medical information from 3.5 million people who were members of the health maintenance organization Kaiser Permanente Southern California and found that 496 people received diagnoses of MS from 2008 through 2010. Of these patients, women made up 70 percent, not an unusual fraction for people with MS. Surprisingly, the patients included 84 black women. That means the annual incidence of MS in black women was 10.2 cases per 100,000 people. That's not a great risk for an individual, but it was higher than the annual rates for white, Hispanic and Asian women, which were 6.9, 2.9 and 1.4 per 100,000 people, respectively. Among blacks, women had three times the incidence of men; in the other racial and ethnic groups, the MS rate in women was roughly double that of men.

© Society for Science & the Public 2000 - 2013
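For readers who want to see how an annual incidence figure like "10.2 cases per 100,000" is constructed, here is a minimal sketch of the standard calculation. The 84 cases and the three-year window come from the article; the membership denominator is an assumed figure for illustration only, since the article does not report how many black women were in the cohort:

```python
# Annual incidence per 100,000 = cases / (population at risk * years) * 100,000

def annual_incidence_per_100k(cases, population_at_risk, years):
    """Average yearly incidence rate per 100,000 people."""
    return cases / (population_at_risk * years) * 100_000

# 84 MS diagnoses among black women over 2008-2010 (from the article).
# The denominator below is a hypothetical illustration; it is simply
# back-calculated to reproduce the reported 10.2 per 100,000 rate.
assumed_black_female_members = 274_500
rate = annual_incidence_per_100k(84, assumed_black_female_members, 3)
print(f"{rate:.1f} cases per 100,000 per year")  # prints 10.2
```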
By TARA PARKER-POPE

Suicide rates among middle-aged Americans have risen sharply in the past decade, prompting concern that a generation of baby boomers who have faced years of economic worry and easy access to prescription painkillers may be particularly vulnerable to self-inflicted harm.

More people now die of suicide than in car accidents, according to the Centers for Disease Control and Prevention, which published the findings in Friday's issue of its Morbidity and Mortality Weekly Report. In 2010 there were 33,687 deaths from motor vehicle crashes and 38,364 suicides.

Suicide has typically been viewed as a problem of teenagers and the elderly, so the surge in suicide rates among middle-aged Americans is surprising. From 1999 to 2010, the suicide rate among Americans ages 35 to 64 rose by nearly 30 percent, to 17.6 deaths per 100,000 people, up from 13.7. Although suicide rates are rising among both middle-aged men and women, far more men take their own lives. The suicide rate for middle-aged men was 27.3 deaths per 100,000, while for women it was 8.1 deaths per 100,000. The most pronounced increases were seen among men in their 50s, a group in which suicide rates jumped by nearly 50 percent, to about 30 per 100,000. For women, the largest increase was among those ages 60 to 64, whose rate rose by nearly 60 percent, to 7.0 per 100,000.

Suicide rates can be difficult to interpret because of variations in the way local officials report causes of death. But C.D.C. and academic researchers said they were confident that the data documented an actual increase in deaths by suicide and not a statistical anomaly. And while reporting of suicides is not always consistent around the country, the current numbers are, if anything, too low.

© 2013 The New York Times Company
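As a quick arithmetic check, the percentage increases quoted above follow directly from the paired rates; a minimal sketch, using only numbers from the article:

```python
def percent_change(old, new):
    """Relative change between two rates, in percent."""
    return (new - old) / old * 100

# Suicide rate per 100,000 among Americans ages 35 to 64, 1999 vs. 2010
print(f"{percent_change(13.7, 17.6):.1f}%")  # 28.5%, the "nearly 30 percent" rise
```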
By ADRIAN RAINE

The scientific study of crime got its start on a cold, gray November morning in 1871, on the east coast of Italy. Cesare Lombroso, a psychiatrist and prison doctor at an asylum for the criminally insane, was performing a routine autopsy on an infamous Calabrian brigand named Giuseppe Villella. Lombroso found an unusual indentation at the base of Villella's skull. From this singular observation, he would go on to become the founding father of modern criminology.

Lombroso's controversial theory had two key points: that crime originated in large measure from deformities of the brain, and that criminals were an evolutionary throwback to more primitive species. Criminals, he believed, could be identified on the basis of physical characteristics, such as a large jaw and a sloping forehead. Based on his measurements of such traits, Lombroso created an evolutionary hierarchy, with Northern Italians and Jews at the top and Southern Italians (like Villella), along with Bolivians and Peruvians, at the bottom.

These beliefs, based partly on pseudoscientific phrenological theories about the shape and size of the human head, flourished throughout Europe in the late 19th and early 20th centuries. Lombroso was Jewish and a celebrated intellectual in his day, but the theory he spawned turned out to be socially and scientifically disastrous, not least by encouraging early-20th-century ideas about which human beings were and were not fit to reproduce, or to live at all.

©2013 Dow Jones & Company, Inc.
Alison Abbott

Thinking about a professor just before you take an intelligence test makes you perform better than if you think about football hooligans. Or does it? An influential theory that certain behaviour can be modified by unconscious cues is under serious attack.

A paper published in PLoS ONE last week [1] reports that nine different experiments failed to replicate this example of 'intelligence priming', first described in 1998 [2] by Ap Dijksterhuis, a social psychologist at Radboud University Nijmegen in the Netherlands, and now included in textbooks. David Shanks, a cognitive psychologist at University College London, UK, and first author of the PLoS ONE paper, is among sceptical scientists calling for Dijksterhuis to design a detailed experimental protocol to be carried out in different laboratories to pin down the effect. Dijksterhuis has rejected the request, saying that he "stands by the general effect" and blames the failures to replicate on "poor experiments".

An acrimonious e-mail debate on the subject has been dividing psychologists, who are already jittery about other recent exposures of irreproducible results (see Nature 485, 298–300; 2012). "It's about more than just replicating results from one paper," says Shanks, who circulated a draft of his study in October; the failed replications call into question the underpinnings of 'unconscious-thought theory'.

© 2013 Nature Publishing Group
By Breanna Draxler

The ruse is common in spy movies: an attractive female saunters in at a critical moment and seduces the otherwise infallible protagonist, duping him into giving up the goods. It works in Hollywood, and it works in real life, too. Men tend to say yes to attractive women without really scrutinizing whether or not they are trustworthy. But scientists have now shown, for the first time, that a drug may be able to overcome this "honey trap" and help men make more rational decisions.

Nearly 100 men participated in the study; half were given minocycline, an antibiotic normally used to treat acne, and half were given a placebo. After four days of this drug regimen, participants played a computerized one-on-one trust game with eight different women, based only on pictures of the female players. In each round, the male player was given $13 and shown a picture of one of the female players. The male player would choose how much money he wanted to keep and how much he wanted to give to the female player. The amount given away was then tripled, and the female player would decide whether to split the money with the man or keep it all for herself. Unbeknownst to the men, however, the women kept the money every time. The researchers also asked the men to evaluate the photos of the women, rating how trustworthy and attractive each appeared on a scale of 0 to 10.
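The payoff structure described here is a standard trust (investment) game. A minimal sketch of one round follows; the $13 endowment and the tripling rule come from the article, while the function and parameter names are invented, and the "keep everything" choice mirrors how the female players behaved in the study:

```python
# One round of the trust game as described above: the male player starts
# with $13, gives some amount away, the transfer is tripled, and the
# female player decides whether to split the resulting pot.

ENDOWMENT = 13  # dollars given to the male player each round

def play_round(amount_given, partner_splits):
    """Return (male_payoff, female_payoff) for one round."""
    if not 0 <= amount_given <= ENDOWMENT:
        raise ValueError("cannot give more than the endowment")
    pot = amount_given * 3  # the amount given away is tripled
    if partner_splits:
        return ENDOWMENT - amount_given + pot / 2, pot / 2
    return ENDOWMENT - amount_given, pot

# In the study, the female players kept the money every time:
print(play_round(10, partner_splits=False))  # (3, 30): the man keeps only $3
```

The researchers' question was how much men on minocycline chose to give, relative to placebo, when the photo looked attractive or trustworthy.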
By TARA PARKER-POPE

Are doctors nicer to patients who aren't fat? A provocative new study suggests that they are: thin patients are treated with more warmth and empathy than those who are overweight or obese.

For the study, published in the medical journal Obesity, researchers at Johns Hopkins obtained permission to record discussions between 39 primary care doctors and more than 200 patients who had high blood pressure. Although patients were there to talk about blood pressure, not weight, most fell into the overweight or obese category. Only 28 were of normal weight, meaning they had a body mass index below 25. Of the remaining patients, 120 were obese (B.M.I. of 30 or greater) and 60 were classified as overweight (B.M.I. of at least 25 but less than 30).

For the most part, all of the patients were treated about the same; there were no meaningful differences in the amount of time doctors spent with them or the topics discussed. But when researchers analyzed transcripts of the visits, there was one striking difference: doctors seemed just a bit nicer to their normal-weight patients, showing more empathy and warmth in their conversations. Although the study was relatively small, the findings are statistically significant.

"It's not like the physicians were being overtly negative or harsh," said the lead author, Dr. Kimberly A. Gudzune, an assistant professor of general internal medicine at the Johns Hopkins School of Medicine. "They were just not engaging patients in that rapport-building or making that emotional connection with the patient."

Copyright 2013 The New York Times Company
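The weight categories in this study are the standard body-mass-index cutoffs; B.M.I. itself is weight in kilograms divided by height in meters squared. A minimal sketch of the classification used above (the function names are mine):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def weight_category(bmi_value):
    """The three groups used in the study described above."""
    if bmi_value < 25:
        return "normal weight"
    if bmi_value < 30:
        return "overweight"
    return "obese"

print(weight_category(bmi(70, 1.75)))  # BMI 22.9 -> "normal weight"
```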
By Scicurious

Say you are out on a camping trip with some friends. You're in the woods, the tents are up, the beer is out, the sun is down, the campfire is starting up. As you sit there, you hear the campfire crackling loudly. To most people, the crackling of the campfire is just that: a campfire, nothing threatening at all. But for someone with a severe anxiety disorder such as post-traumatic stress disorder (PTSD), the crackling of the campfire may be associated with terrible memories, a huge conflagration during house-to-house fighting or a house fire that destroyed all they loved, causing them horrible distress and terrible anxiety. A campfire on a camping trip and the horrible things they endured are entirely dissimilar, but in severe anxiety disorders, that makes no difference at all.

No, this post is not about whether anxiety disorders are being overdiagnosed. Rather, it's about how over-generalization within the brain might influence the development of anxiety disorders. What is the difference between a house fire and a campfire? How does your brain know? This is the idea of pattern separation, an idea that the authors of this review believe could be incredibly important in treating some types of anxiety disorders.

Pattern separation is one of the many functions of the hippocampus, the large, curved structure in the interior of the brain that is thought to play a role in memory and in disorders such as anxiety and depression. Pattern separation was originally described in relation to memory, but the authors of this review propose that it may also bear on anxiety.

© 2013 Scientific American
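Pattern separation is often illustrated computationally as recoding overlapping inputs so they become less similar before being stored. Here is a toy sketch of that idea; the binary "fire" patterns, the fixed weights and the winner-take-all recoding are all invented for illustration and are not a model from the review the post discusses:

```python
# Toy illustration of pattern separation: two overlapping binary inputs
# ("campfire" and "house fire") share most of their active units, but a
# sparse winner-take-all recoding, loosely analogous to what the dentate
# gyrus is thought to do, makes their stored representations more distinct.

def overlap(a, b):
    """Fraction of pattern a's active units that are also active in b."""
    shared = sum(1 for x, y in zip(a, b) if x and y)
    return shared / sum(a)

def sparsify(pattern, weights, k=2):
    """Keep only the k most strongly driven units (winner-take-all)."""
    drive = [p * w for p, w in zip(pattern, weights)]
    threshold = sorted(drive, reverse=True)[k - 1]
    return [1 if d >= threshold and d > 0 else 0 for d in drive]

campfire   = [1, 1, 1, 1, 0, 0, 0, 0]
house_fire = [1, 1, 1, 0, 1, 1, 0, 0]   # shares 3 of campfire's 4 active units
weights    = [0.9, 0.2, 0.4, 0.8, 0.7, 0.95, 0.1, 0.3]  # arbitrary fixed projection

print(overlap(campfire, house_fire))          # 0.75: highly overlapping inputs
print(overlap(sparsify(campfire, weights),
              sparsify(house_fire, weights)))  # 0.5: less overlap after recoding
```

On this view, a hippocampus that fails at this kind of separation would store "campfire" and "house fire" as nearly the same pattern, which is one way the over-generalization seen in anxiety disorders has been framed.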