Chapter 16




by Laura Beil The obesity crisis has given prehistoric dining a stardom not known since Fred Flintstone introduced the Bronto Burger. Last year, “Paleo diet” topped the list of most-Googled weight loss searches, as modern Stone Age dieters sought the advice of bestsellers like The Paleo Solution or The Primal Blueprint, which encourages followers to “honor your primal genes.” The assumption is that America has a weight problem because human metabolism runs on ancient genes that are ill equipped for contemporary eating habits. In this line of thinking, a diet true to the hunter-gatherers we once were — heavy on protein, light on carbs — will make us skinny again. While the fad has attracted skepticism from those who don’t buy the idea whole hog, there’s still plenty of acceptance for one common premise about the evolution of obesity: Our bodies want to stockpile fat. For most of human history, the theory goes, hunter-gatherers ate heartily when they managed to slay a fleeing mastodon. Otherwise, prehistoric life meant prolonged stretches of near starvation, surviving only on inner reserves of adipose. Today, modern humans mostly hunt and gather at the drive-thru, but our Pleistocene genes haven’t stopped fretting over the coming famine. The idea that evolution favored calorie-hoarding genes has long shaped popular and scientific thinking. Called the “thrifty gene” hypothesis, it has arguably been the dominant theory for evolutionary origins of obesity, and by extension diabetes. (Insulin resistance and diabetes so commonly accompany obesity that doctors have coined the term “diabesity.”) However, it’s not that difficult to find scientists who call the rise of the thrifty gene theory a feat of enthusiasm over evidence. Greg Gibson, director of the Center for Integrative Genomics at Georgia Tech in Atlanta, calls the data “somewhere between scant and nonexistent — a great example of crowd mentality in science.” © Society for Science & the Public 2000 - 2014

Keyword: Obesity
Link ID: 20042 - Posted: 09.06.2014

By Jeffrey Mervis Embattled U.K. biomedical researchers are drawing some comfort from a new survey showing that a sizable majority of the public continues to support the use of animals in research. But there’s another twist that should interest social scientists as well: The government’s decision this year to field two almost identical surveys on the topic offers fresh evidence that the way you ask a question affects how people answer it. Since 1999, the U.K. Department for Business, Innovation & Skills (BIS) has been funding a survey of 1000 adults about their attitudes toward animal experimentation. But this year the government asked the London-based pollsters, Ipsos MORI, to carry out a new survey, changing the wording of several questions. (The company also collected additional information, including public attitudes toward different animal species and current rules regarding their use.) For example, the phrase “animal experimentation” was replaced by “animal research” because the latter is “less inflammatory,” notes Ipsos MORI Research Manager Jerry Latter. In addition, says Emma Brown, a BIS spokeswoman, the word research “more accurately reflects the range of procedures that animals may be involved in, including the breeding of genetically modified animals.” But government officials also value the information about long-term trends in public attitudes that can be gleaned from the current survey. So they told the company to conduct one last round—the 10th in the series—at the same time they deployed the new survey. Each survey went to a representative, but different, sample of U.K. adults. © 2014 American Association for the Advancement of Science

Keyword: Animal Rights
Link ID: 20041 - Posted: 09.06.2014

Ewen Callaway Caffeine's buzz is so nice it evolved twice. The coffee genome has now been published, and it reveals that the coffee plant makes caffeine using a different set of genes from those found in tea, cacao and other perk-you-up plants. Coffee plants are grown across some 11 million hectares of land, with more than two billion cups of the beverage drunk every day. It is brewed from the fermented, roasted and ground berries of Coffea canephora and Coffea arabica, known as robusta and arabica, respectively. An international team of scientists has now identified more than 25,000 protein-making genes in the robusta coffee genome. The species accounts for about one-third of the coffee produced, much of it for instant-coffee brands such as Nescafe. Arabica contains less caffeine, but its lower acidity and bitterness make it more flavourful to many coffee drinkers. However, the robusta species was selected for sequencing because its genome is simpler than arabica’s. Caffeine evolved long before sleep-deprived humans became addicted to it, probably to defend the coffee plant against predators and for other benefits. For example, coffee leaves contain the highest levels of caffeine of any part of the plant, and when they fall on the soil they stop other plants from growing nearby. “Caffeine also habituates pollinators and makes them want to come back for more, which is what it does to us, too,” says Victor Albert, a genome scientist at the University of Buffalo in New York, who co-led the sequencing effort. The results were published on 4 September in Science. © 2014 Nature Publishing Group

Keyword: Drug Abuse; Aggression
Link ID: 20040 - Posted: 09.06.2014

By LISA SANDERS, M.D. On Thursday, we challenged Well readers to take on the case of a 19-year-old man who suddenly collapsed at work after months of weakness and fatigue dotted with episodes of nausea and vomiting. More than 500 of you wrote in with suggested diagnoses. And more than 60 of you nailed it. The cause of this man’s collapse, weakness, nausea and vomiting was… an Addisonian crisis because of Addison’s disease. Addison’s disease, named after Dr. Thomas Addison, the 19th-century physician who first described the disorder, occurs when the adrenal glands stop producing the fight-or-flight hormones, particularly cortisol and adrenaline, and a less well known but equally important hormone called aldosterone that helps the body manage salt. In Addison’s, the immune system mistakenly attacks the adrenal glands as if they were foreign invaders. Why this happens is not well understood, but without these glands and the essential hormones they make, the body cannot respond to biological stress. The symptoms of Addison’s are vague. That’s one reason it’s so hard to diagnose. Patients complain of weakness and fatigue. They often crave salt. And when confronted with any stress — an infection or an injury — patients with Addison’s may go into adrenal crisis, characterized by nausea and vomiting, low blood pressure and, sometimes, physical collapse. Their blood pressure may drop so low that oxygen-carrying blood cannot reach the extremities, causing skin to turn blue; if blood fails to reach even more essential organs, it can lead to death. © 2014 The New York Times Company

Keyword: Hormones & Behavior
Link ID: 20037 - Posted: 09.06.2014

by Sandrine Ceurstemont Screening an instructional monkey movie in a forest reveals that marmosets do not only learn from family members: they also copy on-screen strangers. It is the first time such a video has been used for investigations in the wild. Tina Gunhold at the University of Vienna, Austria, and her colleagues filmed a common marmoset retrieving a treat from a plastic device. They then took the device to the Atlantic Forest near Aldeia in Pernambuco, Brazil, and showed the movie to wild marmosets there. Although monkeys are known to learn from others in their social group, especially when they are young, little is known about their ability to learn from monkeys that do not belong to the same group. Marmosets are territorial, so the presence of an outsider – even a virtual one on a screen – could provoke an attack. "We didn't know if wild marmosets would be frightened of the video box but actually they were all attracted to it," says Gunhold. Compared to monkeys shown a static image of the stranger, video-watching marmosets were more likely to manipulate the device, typically copying the technique shown. Young monkeys spent more time near the video box than older family members, suggesting that they found the movie more engaging – although as soon as one monkey mastered the task, it was impossible to tell whether the others were learning from the video or from their relative. "We think it's a combination of both," says Gunhold. © Copyright Reed Business Information Ltd.

Keyword: Learning & Memory; Aggression
Link ID: 20035 - Posted: 09.04.2014

Yves Frégnac & Gilles Laurent Launched in October 2013, the Human Brain Project (HBP) was sold by charismatic neurobiologist Henry Markram as a bold new path towards understanding the brain, treating neurological diseases and building information technology. It is one of two 'flagship' proposals funded by the European Commission's Future and Emerging Technologies programme (see go.nature.com/icotmi). Selected after a multiyear competition, the project seemed like an exciting opportunity to bring together neuroscience and IT to generate practical applications for health and medicine (see go.nature.com/2eocv8). Contrary to public assumptions that the HBP would generate knowledge about how the brain works, the project is turning into an expensive database-management project with a hunt for new computing architectures. In recent months, the HBP executive board revealed plans to drastically reduce its experimental and cognitive neuroscience arm, provoking wrath in the European neuroscience community. The crisis culminated with an open letter from neuroscientists (including one of us, G.L.) to the European Commission on 7 July 2014 (see www.neurofuture.eu), which has now gathered more than 750 signatures. Many signatories are scientists in experimental and theoretical fields, and the list includes former HBP participants. The letter incorporates a pledge of non-participation in a planned call for 'partnering projects' that must raise about half of the HBP's total funding. This pledge could seriously lower the quality of the project's final output and leave the planned databases empty. © 2014 Nature Publishing Group

Keyword: Brain imaging
Link ID: 20033 - Posted: 09.04.2014

By GRETCHEN REYNOLDS Amyotrophic lateral sclerosis has been all over the news lately because of the ubiquitous A.L.S. ice bucket challenge. That attention has also reinvigorated a long-simmering scientific debate about whether participating in contact sports or even vigorous exercise might somehow contribute to the development of the fatal neurodegenerative disease, an issue that two important new studies attempt to answer. Ever since the great Yankees first baseman Lou Gehrig died of A.L.S. in 1941 at age 37, many Americans have vaguely connected A.L.S. with athletes and sports. In Europe, the possible linkage has been more overtly discussed. In the past decade, several widely publicized studies indicated that professional Italian soccer players were disproportionately prone to A.L.S., with about a sixfold higher incidence than would have been expected numerically. Players were often diagnosed while in their 30s; the normal onset is after 60. These findings prompted some small, follow-up epidemiological studies of A.L.S. patients in Europe. To the surprise and likely consternation of the researchers, they found weak but measurable associations between playing contact sports and a heightened risk for A.L.S. The data even showed links between being physically active — meaning exercising regularly — and contracting the disease, raising concerns among scientists that exercise might somehow be inducing A.L.S. in susceptible people, perhaps by affecting brain neurons or increasing bodily stress. But these studies were extremely small and had methodological problems. So to better determine what role sports and exercise might play in the risk for A.L.S., researchers from across Europe recently combined their efforts into two major new studies. The results should reassure those of us who exercise. The numbers showed that physical activity — whether at work, in sports or during exercise — did not increase people’s risk of developing A.L.S. © 2014 The New York Times Company

Keyword: ALS-Lou Gehrig's Disease
Link ID: 20031 - Posted: 09.03.2014

By Kate Wong In 1871 Charles Darwin surmised that humans were evolutionarily closer to the African apes than to any other species alive. The recent sequencing of the gorilla, chimpanzee and bonobo genomes confirms that supposition and provides a clearer view of how we are connected: chimps and bonobos in particular take pride of place as our nearest living relatives, sharing approximately 99 percent of our DNA, with gorillas trailing at 98 percent. Yet that tiny portion of unshared DNA makes a world of difference: it gives us, for instance, our bipedal stance and the ability to plan missions to Mars. Scientists do not yet know how most of the DNA that is uniquely ours affects gene function. But they can conduct whole-genome analyses—with intriguing results. For example, comparing the 33 percent of our genome that codes for proteins with our relatives' genomes reveals that although the sum total of our genetic differences is small, the individual differences pervade the genome, affecting each of our chromosomes in numerous ways. © 2014 Scientific American

Keyword: Evolution; Aggression
Link ID: 20030 - Posted: 09.03.2014

By Jonathan Webb Science reporter, BBC News Monkeys at the top and bottom of the social pecking order have physically different brains, research has found. A particular network of brain areas was bigger in dominant animals, while other regions were bigger in subordinates. The study suggests that primate brains, including ours, can be specialised for life at either end of the hierarchy. The differences might reflect inherited tendencies toward leading or following, or the brain adapting to an animal's role in life - or a little of both. Neuroscientists made the discovery, which appears in the journal Plos Biology, by comparing brain scans from 25 macaque monkeys that were already "on file" as part of ongoing research at the University of Oxford. "We were also looking at learning and memory and decision-making, and the changes that are going on in your brain when you're doing those things," explained Dr MaryAnn Noonan, the study's first author. The decision to look at the animals' social status produced an unexpectedly clear result, Dr Noonan said. "It was surprising. All our monkeys were of different ages and different genders - but with fMRI (functional magnetic resonance imaging) you can control for all of that. And we were consistently seeing these same networks coming out." BBC © 2014

Keyword: Emotions; Aggression
Link ID: 20029 - Posted: 09.03.2014

By Madhuvanthi Kannan We humans assume we are the smartest of all creations. In a world with over 8.7 million species, only we have the ability to understand the inner workings of our body while also unraveling the mysteries of the universe. We are the geniuses, the philosophers, the artists, the poets and savants. We are amused by a dog playing ball, a dolphin jumping through rings, or a monkey imitating man because we think of these as remarkable acts for animals that, we presume, aren’t as smart as we are. But what is smart? Is it just about having ideas, or being good at language and math? Scientists have shown, time and again, that many animals have an extraordinary intellect. Unlike an average human brain that can barely recall a vivid scene from the last hour, chimps have a photographic memory and can memorize patterns they see in the blink of an eye. Sea lions and elephants can remember faces from decades ago. Animals also have a unique sense perception. Sniffer dogs can detect the first signs of colon cancer by the scents of patients, while doctors flounder in early diagnosis. So the point is animals are smart too. But that’s not the upsetting realization. What happens when, just for once, a chimp or a dog challenges man at one of our own feats? Well, for one, a precarious face-off – like the one Matt Reeves conceived in the Planet of the Apes – would seem a tad less unlikely than we thought. In a recent study by psychologists Colin Camerer and Tetsuro Matsuzawa, chimps and humans played a strategy game – and unexpectedly, the chimps outplayed the humans. Chimps are a scientist’s favorite model to understand the human brain and behavior. Chimp and human DNAs overlap by a whopping 99 percent, which makes us closer to chimps than horses to zebras. Yet at some point, we evolved differently. Our behavior and personalities, molded to some extent by our distinct societies, are strikingly different from that of our fellow primates. Chimps are aggressive and status-hungry within their hierarchical societies, knit around a dominant alpha male. We are, perhaps, a little less so. So the question arises whether competitive behavior is hard-wired in them. © 2014 Scientific American

Keyword: Intelligence; Aggression
Link ID: 20028 - Posted: 09.03.2014

By Virginia Morell Figaro, a Goffin’s cockatoo (Cacatua goffini) housed at a research lab in Austria, stunned scientists a few years ago when he began spontaneously making stick tools from the wooden beams of his aviary. The Indonesian parrots are not known to use tools in the wild, yet Figaro confidently employed his sticks to rake in nuts outside his wire enclosure. Wondering if Figaro’s fellow cockatoos could learn by watching his methods, scientists set up experiments for a dozen of them. One group watched as Figaro used a stick to reach a nut placed inside an acrylic box with a wire-mesh front panel; others saw “ghost demonstrators”—magnets that were hidden beneath a table and that the researchers controlled—displace the treats. Each bird was then placed in front of the box, with a stick just like Figaro’s lying nearby. The group of three males and three females that had watched Figaro also picked up the sticks, and made some efforts reminiscent of his actions. But only those three males became proficient with the tool and successfully retrieved the nuts, the scientists report online today in the Proceedings of the Royal Society B. None of the females did so; nor did any of the birds, male or female, in the ghost demonstrator group. Because the latter group failed entirely, the study shows that the birds need living teachers, the scientists say. Intriguingly, the clever observers developed a better technique than Figaro’s for getting the treat. Thus, the cockatoos weren’t copying his exact actions, but emulating them—a distinction that implies some degree of creativity. Two of the successful cockatoos were later given a chance to make a tool of their own. One did so immediately, and the other succeeded after watching Figaro. It may be that by learning to use a tool, the birds are stimulated to make tools of their own, the scientists say. © 2014 American Association for the Advancement of Science.

Keyword: Learning & Memory
Link ID: 20027 - Posted: 09.03.2014

Moheb Costandi Autism can be baffling, appearing in various forms and guises and thwarting our best attempts to understand the minds of people affected by it. Anything we know for sure about the disorder can probably be traced back to the pioneering research of the developmental psychologist Uta Frith. Frith was the first to propose that people with autism lack theory of mind, the ability to attribute beliefs, intentions and desires to others. She also recognized the superior perceptual abilities of many with the disorder — and their tendency to be unable to see the forest for the trees. Frith, now affiliated with the Institute of Cognitive Neuroscience at University College London (UCL), has shaped autism research for an entire generation of investigators. Meanwhile, her husband Chris Frith formulated a new view of schizophrenia, a mental illness marked by hallucinations, disordered thinking and apathy. His work explored how the disorder affects the experience of agency, the sense that we are in control of our bodies and responsible for our actions. And his innovations in brain imaging helped researchers examine the relationship between brain and mind. Independently, husband and wife explored the social and cognitive aspects of these psychiatric disorders. Together, they helped lay the foundations of cognitive neuroscience, the discipline that seeks to understand the biological basis of thought processes. Trevor Robbins, a cognitive neuroscientist at the University of Cambridge in the U.K., calls them “tremendously influential pioneers,” in particular because both brought a social perspective to cognitive neuroscience. © Copyright 2014 Simons Foundation

Keyword: Autism
Link ID: 20019 - Posted: 09.02.2014

By ANAHAD O’CONNOR People who avoid carbohydrates and eat more fat, even saturated fat, lose more body fat and have fewer cardiovascular risks than people who follow the low-fat diet that health authorities have favored for decades, a major new study shows. The findings are unlikely to be the final salvo in what has been a long and often contentious debate about what foods are best to eat for weight loss and overall health. The notion that dietary fat is harmful, particularly saturated fat, arose decades ago from comparisons of disease rates among large national populations. But more recent clinical studies in which individuals and their diets were assessed over time have produced a more complex picture. Some have provided strong evidence that people can sharply reduce their heart disease risk by eating fewer carbohydrates and more dietary fat, with the exception of trans fats. The new findings suggest that this strategy more effectively reduces body fat and also lowers overall weight. The new study was financed by the National Institutes of Health and published in the Annals of Internal Medicine. It included a racially diverse group of 150 men and women — a rarity in clinical nutrition studies — who were assigned to follow diets for one year that limited either the amount of carbs or fat that they could eat, but not overall calories. “To my knowledge, this is one of the first long-term trials that’s given these diets without calorie restrictions,” said Dariush Mozaffarian, the dean of the Friedman School of Nutrition Science and Policy at Tufts University, who was not involved in the new study. “It shows that in a free-living setting, cutting your carbs helps you lose weight without focusing on calories. And that’s really important because someone can change what they eat more easily than trying to cut down on their calories.” © 2014 The New York Times Company

Keyword: Obesity
Link ID: 20018 - Posted: 09.02.2014

Carl Zimmer An unassuming single-celled organism called Toxoplasma gondii is one of the most successful parasites on Earth, infecting an estimated 11 percent of Americans and perhaps half of all people worldwide. It’s just as prevalent in many other species of mammals and birds. In a recent study in Ohio, scientists found the parasite in three-quarters of the white-tailed deer they studied. One reason for Toxoplasma’s success is its ability to manipulate its hosts. The parasite can influence their behavior, so much so that hosts can put themselves at risk of death. Scientists first discovered this strange mind control in the 1990s, but it’s been hard to figure out how they manage it. Now a new study suggests that Toxoplasma can turn its host’s genes on and off — and it’s possible other parasites use this strategy, too. Toxoplasma manipulates its hosts to complete its life cycle. Although it can infect any mammal or bird, it can reproduce only inside of a cat. The parasites produce cysts that get passed out of the cat with its feces; once in the soil, the cysts infect new hosts. Toxoplasma returns to cats via their prey. But a host like a rat has evolved to avoid cats as much as possible, taking evasive action from the very moment it smells feline odor. Experiments on rats and mice have shown that Toxoplasma alters their response to cat smells. Many infected rodents lose their natural fear of the scent. Some even seem to be attracted to it. Manipulating the behavior of a host is a fairly common strategy among parasites, but it’s hard to fathom how they manage it. A rat’s response to cat odor, for example, emerges from complex networks of neurons that detect an odor, figure out its source and decide on the right response in a given moment. © 2014 The New York Times Company

Keyword: Emotions; Aggression
Link ID: 20017 - Posted: 08.30.2014

Memory can be boosted by using a magnetic field to stimulate part of the brain, a study has shown. The effect lasts at least 24 hours after the stimulation is given, improving the ability of volunteers to remember words linked to photos of faces. Scientists believe the discovery could lead to new treatments for loss of memory function caused by ageing, strokes, head injuries and early Alzheimer's disease. Dr Joel Voss, from Northwestern University in Chicago, said: "We show for the first time that you can specifically change memory functions of the brain in adults without surgery or drugs, which have not proven effective. "This non-invasive stimulation improves the ability to learn new things. It has tremendous potential for treating memory disorders." The scientists focused on associative memory, the ability to learn and remember relationships between unrelated items. An example of associative memory would be linking someone to a particular restaurant where you both once dined. It involves a network of different brain regions working in concert with a key memory structure called the hippocampus, which has been compared to an "orchestra conductor" directing brain activity. Stimulating the hippocampus caused the "musicians" – the brain regions – to "play" more in time, thereby tightening up their performance. A total of 16 volunteers aged 21-40 took part in the study, agreeing to undergo 20 minutes of transcranial magnetic stimulation (TMS) every day for five days. © 2014 Guardian News and Media Limited

Keyword: Learning & Memory
Link ID: 20015 - Posted: 08.30.2014

by Michael Slezak It's odourless, colourless, tasteless and mostly non-reactive – but it may help you forget. Xenon gas has been shown to erase fearful memories in mice, raising the possibility that it could be used to treat post-traumatic stress disorder (PTSD) if the results are replicated in a human trial next year. The method exploits a neurological process known as "reconsolidation". When memories are recalled, they seem to get re-encoded, almost like a new memory. When this process is taking place, the memories become malleable and can be subtly altered. This new research suggests that at least in mice, the reconsolidation process might be partially blocked by xenon, essentially erasing fearful memories. Among other things, xenon is used as an anaesthetic. Frozen in fear Edward Meloni and his colleagues at Harvard Medical School in Boston trained mice to be afraid of a sound by placing them in a cage and giving them an electric shock after the sound was played. Thereafter, if the mice heard the noise, they would become frightened and freeze. Later, the team played the sound and then gave the mice either a low dose of xenon gas for an hour or just exposed them to normal air. Mice that were exposed to xenon froze for less time in response to the sound than the other mice. © Copyright Reed Business Information Ltd.

Keyword: Learning & Memory
Link ID: 20014 - Posted: 08.30.2014

By GARY GREENBERG Joel Gold first observed the Truman Show delusion — in which people believe they are the involuntary subjects of a reality television show whose producers are scripting the vicissitudes of their lives — on Halloween night 2003 at Bellevue Hospital, where he was the chief attending psychiatrist. “Suspicious Minds,” which he wrote with his brother, Ian, an associate professor of philosophy and psychology at McGill University, is an attempt to use this delusion, which has been observed by many clinicians, to pose questions that have gone out of fashion in psychiatry over the last half-century: Why does a mentally ill person have the delusions he or she has? And, following the lead of the medical historian Roy Porter, who once wrote that “every age gets the lunatics it deserves,” what can we learn about ourselves and our times from examining the content of madness? The Golds’ answer is a dual broadside: against a psychiatric profession that has become infatuated with neuroscience as part of its longstanding attempt to establish itself as “real medicine,” and against a culture that has become too networked for its own good. Current psychiatric practice is to treat delusions as the random noise generated by a malfunctioning (and mindless) brain — a strategy that would be more convincing if doctors had a better idea of how the brain produced madness and how to cure it. According to the Golds, ignoring the content of delusions like T.S.D. can only make mentally ill people feel more misunderstood, even as it distracts the rest of us from the true significance of the delusion: that we live in a society that has put us all under surveillance. T.S.D. sufferers may be paranoid, but that does not mean they are wrong to think the whole world is watching. This is not to say they aren’t crazy. Mental illness may be “just a frayed, weakened version of mental health,” but what is in tatters for T.S.D. patients is something crucial to negotiating social life, and that, according to the Golds, is the primary purpose toward which our big brains have evolved: the ability to read other people’s intentions or, as cognitive scientists put it, to have a theory of mind. This capacity is double-edged. “The better you are at ToM,” they write, “the greater your capacity for friendship.” © 2014 The New York Times Company

Keyword: Schizophrenia
Link ID: 20013 - Posted: 08.30.2014

By Virginia Morell A dog’s bark may sound like nothing but noise, but it encodes important information. In 2005, scientists showed that people can tell whether a dog is lonely, happy, or aggressive just by listening to his bark. Now, the same group has shown that dogs themselves distinguish between the barks of pooches they’re familiar with and the barks of strangers and respond differently to each. The team tested pet dogs’ reactions to barks by playing back recorded barks of a familiar and unfamiliar dog. The recordings were made in two different settings: when the pooch was alone, and when he was barking at a stranger at his home’s fence. When the test dogs heard a strange dog barking, they stayed closer to and for a longer period of time at their home’s gate than when they heard the bark of a familiar dog. But when they heard an unknown and lonely dog barking, they stayed close to their house and away from the gate, the team reports this month in Applied Animal Behaviour Science. They also moved closer toward their house when they heard a familiar dog’s barks, and they barked more often in response to a strange dog barking. Dogs, the scientists conclude from this first study of pet dogs barking in their natural environment (their owners’ homes), do indeed pay attention to and glean detailed information from their fellows’ barks. © 2014 American Association for the Advancement of Science

Keyword: Animal Communication; Aggression
Link ID: 20012 - Posted: 08.30.2014

By ANNA NORTH “You can learn a lot from what you see on a screen,” said Yalda T. Uhls. However, she told Op-Talk, “It’s not going to give you context. It’s not going to give you the big picture.” Ms. Uhls, a researcher at the Children’s Digital Media Center in Los Angeles, was part of a team that looked at what happened when kids were separated from their screens — phones, iPads, laptops and the like — for several days. Their findings may have implications for adults’ relationship to technology, too. For a paper published in the journal Computers in Human Behavior, the researchers studied 51 sixth-graders who attended a five-day camp where no electronic devices were allowed. Before and after the camp, they tested the kids’ emotion-recognition skills using photos of facial expressions and sound-free video clips designed to measure their reading of nonverbal cues. The kids did significantly better on both tests after five screen-free days; a group of sixth-graders from the same school who didn’t go to camp showed less or no improvement. Ms. Uhls, who also works for the nonprofit Common Sense Media, told Op-Talk that a number of factors might have been at play in the campers’ improvement. For instance, their time in nature might have played a role. But to her, the most likely explanation was the sheer increase in face-to-face interaction: “The issue really is not that staring at screens is going to make you bad at recognizing emotions,” she said. “It’s more that if you’re looking at screens you’re not looking at the world, and you’re not looking at people.” Many adults have sought out the same Internet-free experience the kids had, though they usually don’t go to camp to get it. The novelist Neil Gaiman took a “sabbatical from social media” in 2013, “so I can concentrate on my day job: making things up.” © 2014 The New York Times Company

Keyword: Emotions
Link ID: 20006 - Posted: 08.28.2014

By Michael Balter Humans are generally highly cooperative and often impressively altruistic, quicker than any other animal species to help out strangers in need. A new study suggests that our lineage got that way by adopting so-called cooperative breeding: the caring for infants not just by the mother, but also by other members of the family and sometimes even unrelated adults. In addition to helping us get along with others, the advance led to the development of language and complex civilizations, the authors say. Cooperative breeding is not unique to humans. Up to 10% of birds are cooperative breeders, as are meerkats and New World monkeys such as tamarins and marmosets. But our closest primate relatives, great apes such as chimpanzees, are not cooperative breeders. Because the human and chimpanzee lineages split between 5 million and 7 million years ago, and humans are the only apes that engage in cooperative breeding, researchers have puzzled over how this helping behavior might have evolved all over again on the human line. In the late 1990s, Sarah Blaffer Hrdy, now an anthropologist emeritus at the University of California, Davis, proposed the cooperative breeding hypothesis. According to her model, early in their evolution humans added cooperative breeding behaviors to their already existing advanced ape cognition, leading to a powerful combination of smarts and sociality that fueled even bigger brains, the evolution of language, and unprecedented levels of cooperation. Soon after Hrdy’s proposal, anthropologists Carel van Schaik and Judith Burkart of the University of Zurich in Switzerland began to test some of these ideas, demonstrating that cooperatively breeding primates like marmosets engaged in seemingly altruistic behavior by helping other marmosets get food with no immediate reward to themselves. © 2014 American Association for the Advancement of Science.

Keyword: Evolution; Aggression
Link ID: 20001 - Posted: 08.27.2014