Chapter 16.
Obese people who get surgery to lose weight have half the risk of developing heart failure compared with patients who make lifestyle changes to shed excess pounds, a recent study suggests. “We were surprised by the large difference in heart failure incidence between the two groups,” said lead study author Johan Sundstrom of Uppsala University in Sweden. It’s possible that gastric bypass patients had a lower risk of heart failure because they lost more weight than the group trying to do so without surgery. Researchers also found that losing 22 pounds by any means was tied to a 23 percent drop in heart failure risk.

The study team examined data on 25,805 obese people who had gastric bypass surgery, which reduces the stomach to a small pouch, and 13,701 patients who were put on low-calorie diets. After following half of the patients for at least four years, the researchers found that people who had gastric bypass were 46 percent less likely to have developed heart failure. After one year, surgery patients had lost an average of 41.4 pounds more than those who relied on diet and exercise, the study found. After two years, surgery was associated with an average weight loss 49.8 pounds greater than that of those who undertook lifestyle changes.

Some previous research has linked obesity to heart failure, and a growing body of evidence suggests that obesity might directly cause the heart condition, Sundstrom said. While the new study wasn’t designed to prove a causal relationship, it adds more evidence in support of this possibility. © 1996-2017 The Washington Post
Link ID: 23372 - Posted: 03.19.2017
By Anna Azvolinsky Delivering a CRISPR/Cas9–based therapy directly to the eye via a viral vector can prevent retinal degeneration in a mouse model of retinitis pigmentosa, a team led by researchers at the National Eye Institute reported in Nature Communications today (March 14). Retinitis pigmentosa, which affects around one in 4,000 people, causes retinal degeneration that eventually leads to blindness. The inherited disorder has been mapped to more than 60 genes (and more than 3,000 mutations), presenting a challenge for researchers working toward a gene therapy. The results of this latest study suggest that a broader, gene-editing–based therapeutic approach could be used to target many of the genetic defects underlying retinitis pigmentosa. “Given the lack of effective therapies for retinal degeneration, particularly the lack of therapies applicable to a broad range of different genetic varieties of this disease, this study represents a very exciting and important advance in our field,” Joseph Corbo, a neuropathologist at the Washington University School of Medicine in St. Louis who was not involved in the work, wrote in an email to The Scientist. This combination of “CRISPR technology with an adeno-associated virus vector, a system tried and true for delivering genetic information to the retina, may represent the first step in a global treatment approach for rod-mediated degenerative disease,” Shannon Boye, whose University of Florida lab develops gene replacement strategies for eye disorders, wrote in an email to The Scientist. © 1986-2017 The Scientist
Link ID: 23364 - Posted: 03.16.2017
By Mitch Leslie It sounds like a crazy way to improve your health—spend some time on a platform that vibrates at about the same frequency as the lowest string on a double bass. But recent research indicates that the procedure, known as whole-body vibration, may be helpful in illnesses from cerebral palsy to chronic obstructive pulmonary disease. Now, a new study of obese mice reveals that whole-body vibration provides metabolic benefits similar to those of walking on a treadmill, suggesting it may be useful for treating obesity and type II diabetes. “I think it’s very promising,” says exercise physiologist Lee Brown of California State University, Fullerton, who wasn’t connected to the study. Although the effects are small, he says, researchers should follow up to determine whether they can duplicate them in humans. Plenty of gyms feature whole-body vibration machines, and many athletes swear the activity improves their performance. The jiggling does seem to spur muscles to work harder, possibly triggering some of the same effects as exercise. But researchers still don’t know how the two compare, especially when it comes to people who are ill. So biomedical engineer Meghan McGee-Lawrence of the Medical College of Georgia in Augusta and colleagues decided to perform a head-to-head comparison of exercise and whole-body vibration. The researchers tested mutant mice that are resistant to the appetite-controlling hormone leptin, a condition that results in obesity and diabetes. © 2017 American Association for the Advancement of Science.
Link ID: 23362 - Posted: 03.16.2017
By Christof Koch We moderns take it for granted that consciousness is intimately tied up with the brain. But this assumption did not always hold. For much of recorded history, the heart was considered the seat of reason, emotion, valor and mind. Indeed, the first step in mummification in ancient Egypt was to scoop out the brain through the nostrils and discard it, whereas the heart, the liver and other internal organs were carefully extracted and preserved. The pharaoh would then have access to everything he needed in his afterlife. Everything except for his brain! Several millennia later Aristotle, one of the greatest of all biologists, taxonomists, embryologists and the first evolutionist, had this to say: “And of course, the brain is not responsible for any of the sensations at all. The correct view [is] that the seat and source of sensation is the region of the heart.” He argued consistently that the primary function of the wet and cold brain is to cool the warm blood coming from the heart. Another set of historical texts is no more insightful on this question. The Old and the New Testaments are filled with references to the heart but entirely devoid of any mentions of the brain. Debate about what the brain does grew ever more intense over ensuing millennia. The modern embodiment of these arguments seeks to identify the precise areas within the three-pound cranial mass where consciousness arises. What follows is an attempt to size up the past and present of this transmillennial journey. The field has scored successes in delineating a brain region that keeps the neural engine humming. Switched on, you are awake and conscious. In another setting, your body is asleep, yet you still have experiences—you dream. In a third position, you are deeply asleep, effectively off-line. © 2017 Scientific American
Link ID: 23361 - Posted: 03.16.2017
Ian Sample, Science editor Researchers have overcome one of the major stumbling blocks in artificial intelligence with a program that can learn one task after another using skills it acquires along the way. Developed by Google’s AI company, DeepMind, the program has taken on a range of different tasks and performed almost as well as a human. Crucially, and uniquely, the AI does not forget how it solved past problems, and uses that knowledge to tackle new ones. The AI is not capable of the general intelligence that humans draw on when they are faced with new challenges; its use of past lessons is more limited. But the work shows a way around a problem that had to be solved if researchers are ever to build so-called artificial general intelligence (AGI) machines that match human intelligence. “If we’re going to have computer programs that are more intelligent and more useful, then they will have to have this ability to learn sequentially,” said James Kirkpatrick at DeepMind. The ability to remember old skills and apply them to new tasks comes naturally to humans. A regular rollerblader might find ice skating a breeze because one skill helps the other. But recreating this ability in computers has proved a huge challenge for AI researchers. AI programs are typically one-trick ponies that excel at one task, and one task only.
By MATT RICHTEL Amid an opioid epidemic, the rise of deadly synthetic drugs and the widening legalization of marijuana, a curious bright spot has emerged in the youth drug culture: American teenagers are growing less likely to try or regularly use drugs, including alcohol. With minor fits and starts, the trend has been building for a decade, with no clear understanding as to why. Some experts theorize that falling cigarette-smoking rates are cutting into a key gateway to drugs, or that antidrug education campaigns, long a largely failed enterprise, have finally taken hold. But researchers are starting to ponder an intriguing question: Are teenagers using drugs less in part because they are constantly stimulated and entertained by their computers and phones? The possibility is worth exploring, they say, because use of smartphones and tablets has exploded over the same period that drug use has declined. This correlation does not mean that one phenomenon is causing the other, but scientists say interactive media appears to play to similar impulses as drug experimentation, including sensation-seeking and the desire for independence. Or it might be that gadgets simply absorb a lot of time that could be used for other pursuits, including partying. Nora Volkow, director of the National Institute on Drug Abuse, says she plans to begin research on the topic in the next few months, and will convene a group of scholars in April to discuss it. The possibility that smartphones were contributing to a decline in drug use by teenagers, Dr. Volkow said, was the first question she asked when she saw the agency’s most recent survey results. The survey, “Monitoring the Future,” an annual government-funded report measuring drug use by teenagers, found that past-year use of illicit drugs other than marijuana was at the lowest level in the 40-year history of the project for eighth, 10th and 12th graders. © 2017 The New York Times Company
Keyword: Drug Abuse
Link ID: 23357 - Posted: 03.15.2017
Heidi Ledford Like a zombie that keeps on kicking, legal battles over mutant mice used for Alzheimer’s research are haunting the field once again — four years after the last round of lawsuits. In the latest case, the University of South Florida (USF) in Tampa has sued the US National Institutes of Health (NIH) for authorizing the distribution of a particular type of mouse used in the field. The first pre-trial hearing in the case is set to begin in a federal court on 21 March. The university holds a patent on the mouse, but the NIH has contracted the Jackson Laboratory, a non-profit organization in Bar Harbor, Maine, to supply the animals to researchers. The USF is now claiming that it deserves some of the money that went to the contractor. If the suit, filed in December 2015, is successful, it could set a precedent for other universities, cautions Robert Cook-Deegan, an intellectual-property scholar at the Washington DC centre of Arizona State University in Tempe. And that would threaten the affordability of, and access to, lab animals used to investigate disease. “It feels greedy to me,” Cook-Deegan says. “If other universities start doing this, all it does is push up the cost of research tools.” The mice, on which the USF filed a patent in 1997, express mutated forms of two genes. These modifications help researchers to study how amyloid plaques develop in the brain, and enable them to investigate behavioural changes that manifest before those plaques appear. © 2017 Macmillan Publishers Limited
Link ID: 23356 - Posted: 03.15.2017
Jon Hamilton An orangutan named Rocky is helping scientists figure out when early humans might have uttered the first word. Rocky, who is 12 and lives at the Indianapolis Zoo, has shown that he can control his vocal cords much the way people do. He can learn new vocal sounds and even match the pitch of sounds made by a person. "Rocky, and probably other great apes, can do things with their vocal apparatus that, for decades, people have asserted was impossible," says Rob Shumaker, the zoo's director, who has studied orangutans for more than 30 years. Rocky's abilities suggest that our human ancestors could have begun speaking 10 million years ago, about the time humans and great apes diverged, Shumaker says. Until now, many scientists thought that speech required changes in the brain and vocal apparatus that evolved more recently, during the past 2 million years. The vocal abilities of orangutans might have gone undetected had it not been for Rocky, an ape with an unusual past and a rare relationship with people. Rocky was separated from his mother soon after he was born, and spent his early years raised largely by people, and working in show business. "He was certainly the most visible orangutan in entertainment at the time," says Shumaker. "TV commercials, things like that."
There is widespread interest among teachers in the use of neuroscientific research findings in educational practice. However, misconceptions and myths supposedly based on sound neuroscience are also prevalent in our schools. We wish to draw attention to this problem by focusing on an educational practice that is supposedly based on neuroscience but lacks sufficient evidence, and so, we believe, should not be promoted or supported. Generally known as “learning styles”, it is the belief that individuals can benefit from receiving information in their preferred format, as identified by a self-report questionnaire. This belief has much intuitive appeal because individuals are better at some things than others and ultimately there may be a brain basis for these differences. Learning styles promises to optimise education by tailoring materials to match the individual’s preferred mode of sensory information processing.

There are, however, a number of problems with the learning styles approach. First, there is no coherent framework of preferred learning styles. Usually, individuals are categorised into one of three preferred styles (auditory, visual or kinesthetic learners) based on self-reports. One study found that there were more than 70 different models of learning styles, including, among others, “left v right brain,” “holistic v serialists,” “verbalisers v visualisers” and so on. The second problem is that categorising individuals can lead to the assumption of a fixed or rigid learning style, which can impair motivation to apply oneself or adapt. Finally, and most damning, systematic studies of the effectiveness of learning styles have consistently found either no evidence or very weak evidence to support the hypothesis that matching or “meshing” material in the appropriate format to an individual’s learning style is selectively more effective for educational attainment.
Students will improve if they think about how they learn but not because material is matched to their supposed learning style.
Keyword: Learning & Memory
Link ID: 23352 - Posted: 03.14.2017
An international team of researchers has conducted the first study of its kind to look at the genomic underpinnings of obesity in continental Africans and African-Americans. They discovered that approximately 1 percent of West Africans, African-Americans and others of African ancestry carry a genomic variant that increases their risk of obesity, a finding that provides insight into why obesity clusters in families. Researchers at the National Human Genome Research Institute (NHGRI), part of the National Institutes of Health, and their African collaborators published their findings March 13, 2017, in the journal Obesity. People with genomic differences in the semaphorin-4D (SEMA4D) gene were about six pounds heavier than those without the genomic variant, according to the study. Most of the genomic studies conducted on obesity to date have been in people of European ancestry, despite an increased risk of obesity in people of African ancestry. Obesity is a global health problem, contributing to premature death and morbidity by increasing a person’s risk of developing diabetes, hypertension, heart disease and some cancers. While obesity mostly results from lifestyle and cultural factors, including excess calorie intake and inadequate levels of physical activity, it has a strong genomic component. The burden of obesity is, however, not the same across U.S. ethnic groups, with African-Americans having the highest age-adjusted rates of obesity, said Charles N. Rotimi, Ph.D., chief of NHGRI’s Metabolic, Cardiovascular and Inflammatory Disease Genomics Branch and director of the Center for Research on Genomics and Global Health (CRGGH) at NIH. CRGGH examines the socio-cultural and genomic factors at work in health disparities — the negative health outcomes that impact certain groups of people — so they can be translated into policies that reduce or eliminate healthcare inequalities in the United States and globally.
Is there life after death for our brains? It depends. Loretta Norton, a doctoral student at Western University in Canada, was curious, so she and her collaborators asked critically ill patients and their families if they could record brain activity in the half hours before and after life support was removed. They ended up recording four patients with electroencephalography, better known as EEG, which uses small electrodes attached to a person’s head to measure electrical activity in the brain. In three patients, the EEG showed brain activity stopping up to 10 minutes before the person’s heart stopped beating. But in a fourth, the EEG picked up so-called delta wave bursts up to 10 minutes after the person’s heart stopped. Delta waves are associated with deep sleep, also known as slow-wave sleep. In living people, neuroscientists consider slow-wave sleep to be a key process in consolidating memories. The study also raises questions about the exact moment when death occurs. Here’s Neuroskeptic: Another interesting finding was that the actual moment at which the heart stopped was not associated with any abrupt change in the EEG. The authors found no evidence of the large “delta blip” (the so-called “death wave”), an electrical phenomenon which has been observed in rats following decapitation. With only four patients, it’s difficult to draw any sort of broad conclusion from this study. But it does suggest that death may be a gradual process as opposed to a distinct moment in time. © 1996-2017 WGBH Educational Foundation
Link ID: 23348 - Posted: 03.13.2017
By STEPH YIN Despite being just the size of a rice grain, robber flies, which live all over the world, are champion predators. In field experiments, they can detect targets the size of sand grains from nearly two feet away — 100 times the fly’s body length — and intercept them in under half a second. What’s more, they never miss their mark. A team led by scientists at the University of Cambridge has started to unveil the secrets to the robber fly’s prowess. In a study published Thursday in Current Biology, the team outlined the mechanics of the fly’s pursuit, from its impressive eye anatomy to how it makes a successful catch every time. Notably, the researchers observed a behavior never before described in a flying animal: About 30 centimeters from its prey, the insect slows, turns slightly and brings itself in for a close catch. “This ‘lock-on’ phase and change in behavior during a flight is quite remarkable,” said Sam Fabian, a graduate student at Cambridge and an author of the study. “We would actually expect them to do something very simple — just accelerate and hit the target.” The scientists surveyed robber flies in the field using a “fly teaser,” which consisted of beads on a rapidly moving fishing line controlled by a motor. As the flies charged at the bait, the researchers captured their movements using high-speed cameras. At the start of the robber fly’s conquest, it sits on a perch and scans the sky for passing prey. When it glimpses a potential meal, it takes flight, maintaining a steady angle between itself and its target. This proactive strategy, using a “constant bearing angle,” is also employed by fish, bats and sailors, Mr. Fabian said. © 2017 The New York Times Company
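The “constant bearing angle” strategy described above has a simple geometric consequence: a pursuer flying a straight collision course at constant speed sees the bearing to its target stay fixed until impact. A minimal simulation sketch of that idea (all numbers here are illustrative, not taken from the study):

```python
import math

def intercept_velocity(pursuer, target, target_vel, speed):
    """Return (velocity, time_to_impact) for a collision course.

    On such a course the line of sight to the target keeps a
    constant bearing -- the strategy the flies appear to use.
    """
    rx, ry = target[0] - pursuer[0], target[1] - pursuer[1]
    # Solve |r + target_vel * t| = speed * t, a quadratic in t:
    a = speed ** 2 - (target_vel[0] ** 2 + target_vel[1] ** 2)
    b = -2 * (rx * target_vel[0] + ry * target_vel[1])
    c = -(rx ** 2 + ry ** 2)
    t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)  # positive root
    return ((rx + target_vel[0] * t) / t, (ry + target_vel[1] * t) / t), t

# Target crosses at 1 unit/s; the pursuer flies at 2 units/s.
v, t_hit = intercept_velocity((0, 0), (10, 0), (0, 1), 2.0)

# Step the simulation and record the bearing to the target.
p, tgt, bearings = [0.0, 0.0], [10.0, 0.0], []
dt = t_hit / 100
for _ in range(100):
    bearings.append(math.atan2(tgt[1] - p[1], tgt[0] - p[0]))
    p[0] += v[0] * dt
    p[1] += v[1] * dt
    tgt[1] += 1.0 * dt

# The bearing never drifts, and the pursuer meets the target.
assert max(bearings) - min(bearings) < 1e-9
assert math.hypot(tgt[0] - p[0], tgt[1] - p[1]) < 1e-6
```

This captures only the plain collision-course geometry; the “lock-on” deceleration the researchers observed in the final 30 centimeters is a separate behavior not modeled here.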
Link ID: 23346 - Posted: 03.11.2017
If I were the late Andy Rooney, I’d say “You know what really bothers me? When science shows some facts about nature, and then someone rejects those facts because they’re inconvenient or uncomfortable for their ideology.” Indeed, when people ignore such inconvenient truths, it not only makes their cause look bad, but can produce palpable harm. Case in point: the damage that the Russian charlatan-agronomist Lysenko did to Soviet agriculture under Stalin. Rejecting both natural selection and modern genetics, Lysenko made all sorts of wild promises about improving Soviet agriculture based on bogus treatments of plants that would supposedly change their genetics. It not only didn’t work, failing to relieve Russia of its chronic famines, but Lysenko’s Stalin-supported resistance to modern (“Western”) genetics led to the imprisonment and even the execution of really good geneticists and agronomists like Nikolai Vavilov. The ideological embrace of an unevidenced but politically amenable view of science set back Russian genetics for decades. Other cases in point: the denial of evolution by creationists, and of anthropogenic global warming by conservatives. I needn’t belabor these. But the opposition to research on group and sex differences continues. One of its big exponents is the author Cordelia Fine, who has written two books with the explicit aim of showing that there are no reliably accepted evolved and biological differences in behavior between men and women. I read her first book, Delusions of Gender, and found it a mixed bag: some of her targets did indeed do bad science, and she properly called them out; but the book was also tendentious, and wasn’t objective about other studies. I’m now about to read her second book, Testosterone Rex: Myths of Sex, Science, and Society. Judging from the reviews, which have been positive, it’s just as much a polemic as the first book, and has an ideological aim.
By Meredith Wadman The U.S. Fish and Wildlife Service (FWS) is considering repealing a rule that exempts captive members of 11 threatened primate species from protection under the federal Endangered Species Act (ESA). If the agency approves a repeal, the captive animals would be designated as threatened, like their wild counterparts, and researchers would need to apply for permits for experiments. To be approved, studies would have to be aimed at species survival and recovery. A rule change would affect biomedical researchers who work with several hundred captive Japanese macaques housed in Oregon. People for the Ethical Treatment of Animals (PETA), a Norfolk, Virginia–based animal rights organization, petitioned FWS this past January, asking it to extend ESA protections to captive members of the 11 species housed in research labs, zoos, and held as pets. For obscure reasons, a “special rule” exempted these captive populations from ESA protection in 1976. Among the 11 species, the Japanese macaque (Macaca fuscata) appears to be the only one regularly used in U.S. research. A troop of roughly 300 resides at the Oregon National Primate Research Center in Hillsboro. That is where the main impact of a successful PETA petition would be felt by scientists. “The importance of protecting endangered animals can’t be minimized,” says Jared Goodman, the director of animal law at the PETA Foundation in Los Angeles, California. “These animals are not listed lightly [under the Endangered Species Act],” he adds. “And the agencies until now have unlawfully provided differential treatment to animals in captivity who are similarly threatened.” © 2017 American Association for the Advancement of Science.
Keyword: Animal Rights
Link ID: 23335 - Posted: 03.10.2017
There has been much gnashing of teeth in the science-journalism community this week, with the release of an infographic that claims to rate the best and worst sites for scientific news. According to the American Council on Science and Health, which helped to prepare the ranking, the field is in a shoddy state. “If journalism as a whole is bad (and it is),” says the council, “science journalism is even worse. Not only is it susceptible to the same sorts of biases that afflict regular journalism, but it is uniquely vulnerable to outrageous sensationalism”. News aggregator RealClearScience, which also worked on the analysis, goes further: “Much of science reporting is a morass of ideologically driven junk science, hyped research, or thick, technical jargon that almost no one can understand”. How — without bias or outrageous sensationalism, of course — do they judge the newspapers and magazines that emerge from this sludge? Simple: they rank each by how evidence-based and compelling they subjectively judge its content to be. Modesty (almost) prevents us from naming the publication graded highest on both (okay, it’s Nature), but some names are lower than they would like. Big hitters including The New York Times, The Washington Post and The Guardian score relatively poorly. It’s a curious exercise, and one that fails to satisfy on any level. It is, of course, flattering to be judged as producing compelling content. But one audience’s compelling is another’s snoozefest, so it seems strikingly unfair to directly compare publications that serve readers with such different interests as, say, The Economist and Chemistry World. It is equally unfair to damn all who work on a publication because of some stories that do not meet the grade. (This is especially pertinent now that online offerings spread the brand and the content so much thinner.) © 2017 Macmillan Publishers Limited
Link ID: 23334 - Posted: 03.09.2017
Mo Costandi To many of us, having to memorise a long list of items feels like a chore. But for others, it is more like a sport. Every year, hundreds of these ‘memory athletes’ compete with one another in the World Memory Championships, memorising hundreds of words, numbers, or other pieces of information within minutes. The current world champion is Alex Mullen, who beat his competitors by memorising a string of more than 550 digits in under 5 minutes. You may think that such prodigious mental feats are linked to having an unusual brain, or to being extraordinarily clever. But they are not. New research published in the journal Neuron shows that you, too, can be a super memoriser with just six weeks of intensive mnemonic training, and also reveals the long-lasting changes to brain structure and function that occur as a result of such training. Martin Dresler of Radboud University in the Netherlands and his colleagues recruited 23 memory athletes, all of whom are currently in the top 50 of the memory sports world rankings, and a group of control participants, who had no previous experience of memory training, and who were carefully selected to match the group of champions in age, sex, and IQ. © 2017 Guardian News and Media Limited
Keyword: Learning & Memory
Link ID: 23333 - Posted: 03.09.2017
By Catherine Offord Getting to Santa María, Bolivia, is no easy feat. Home to a farming and foraging society, the village is located deep in the Amazon rainforest and is accessible only by river. The area lacks electricity and running water, and the Tsimane’ people who live there make contact with the outside world only occasionally, during trips to neighboring towns. But for auditory researcher Josh McDermott, this remoteness was central to the community’s scientific appeal. In 2015, the MIT scientist loaded a laptop, headphones, and a gasoline generator into a canoe and pushed off from the Amazonian town of San Borja, some 50 kilometers downriver from Santa María. Together with collaborator Ricardo Godoy, an anthropologist at Brandeis University, McDermott planned to carry out experiments to test whether the Tsimane’ could discern certain combinations of musical tones, and whether they preferred some over others. The pair wanted to address a long-standing question in music research: Are the features of musical perception seen across cultures innate, or do similarities in preferences observed around the world mirror the spread of Western culture and its (much-better-studied) music? “Particular musical intervals are used in Western music and in other cultures,” McDermott says. “They don’t appear to be random—some are used more commonly than others. The question is: What’s the explanation for that?” © 1986-2017 The Scientist
Link ID: 23332 - Posted: 03.09.2017
By Colin Barras What a difference 1000 kilometres make. Neanderthals living in prehistoric Belgium enjoyed their meat – but the Neanderthals who lived in what is now northern Spain seem to have survived on an almost exclusively vegetarian diet. This is according to new DNA analysis that also suggests sick Neanderthals could self-medicate with naturally occurring painkillers and antibiotics, and that they shared mouth microbiomes with humans – perhaps exchanged by kissing. Neanderthals didn’t clean their teeth particularly well – which is lucky for scientific investigators. Over time, plaque built up into a hard substance called dental calculus, which still clings to the ancient teeth even after tens of thousands of years. Researchers have already identified tiny food fragments in ancient dental calculus to get an insight into the diets of prehistoric hominins. Now Laura Weyrich at the University of Adelaide, Australia, and her colleagues have shown that dental calculus also carries ancient DNA that can reveal both what Neanderthals ate and which bacteria lived in their mouths. The team focused on three Neanderthals – two 48,000-year-old specimens from a site called El Sidrón in Spain and a 39,000-year-old specimen from a site called Spy in Belgium. The results suggested that the Spy Neanderthal often dined on woolly rhinoceros, sheep and mushrooms – but no plants. The El Sidrón Neanderthals ate more meagre fare: moss, bark and mushrooms – and, apparently, no meat. © Copyright Reed Business Information Ltd.
By Joshua A. Krisch [Image: Alcian blue-stained skate. Credit: UCSF/Julius Lab] Sharks, rays, and skates can detect minute fluctuations in electric fields—signals as subtle as a small fish breathing in the vicinity—and rely on specialized electrosensory cells to navigate and hunt for prey hidden in the sand. But how these elasmobranch fish separate signal from noise has long baffled scientists. In an environment full of tiny electrical impulses, how does the skate home in on prey? In a study published this week (March 6) in Nature, researchers at the University of California, San Francisco (UCSF), analyzed the electrosensory cells of the little skate (Leucoraja erinacea). They found that voltage-gated calcium channels within these cells appear to work in concert with calcium-activated potassium channels, both specifically tuned in the little skate to pick up on weak electrical signals. “We have elucidated a molecular basis for electrosensation, at least in the little skate, which accounts for this unusual and highly sensitive mechanism for detecting electrical fields,” said coauthor Nicholas Bellono, a postdoc at UCSF. “How general it is, we don’t know. But this is really the first instance in which we’ve been able to drill down and ask what molecules could be involved in this kind of system.” © 1986-2017 The Scientist
Keyword: Pain & Touch
Link ID: 23330 - Posted: 03.09.2017
By Andy Coghlan In primates such as humans, living in cooperative societies usually means having bigger brains — with brainpower needed to navigate complex social situations. But surprisingly, in birds the opposite may be true. Group-living woodpecker species have been found to have smaller brains than solitary ones. Cooperative societies might in fact enable birds to jettison all that brainpower otherwise needed on their own to constantly out-think, outfox and outcompete wily rivals, say researchers. Socialism in birds may therefore mean the individuals can afford to get dumber. The results are based on a comparison of brain sizes in 61 woodpecker species. The eight group-living species identified typically had brains that were roughly 30 per cent smaller than solitary and pair-living ones. “It’s a pretty big effect,” says lead researcher Richard Byrne at the University of St Andrews in the UK. Byrne’s explanation is that a solitary life is more taxing on the woodpecker brain than for those in cooperative groups, in which a kind of group-wide “social brain” takes the strain off individuals when a challenge arises. Group-living acorn woodpeckers in North America, for example, are well known for creating collective “granaries” of acorns by jamming them into crevices accessible to the whole group during hard times. © Copyright Reed Business Information Ltd.
Link ID: 23328 - Posted: 03.08.2017