



By DAN BILEFSKY LONDON — The model in the Gucci ad is young and waiflike, her frail body draped in a geometric-pattern dress as she leans back in front of a wall painted with a tree branch that appears to mimic the angle of her silhouette. On Wednesday, the Advertising Standards Authority of Britain ruled that the ad was “irresponsible” and that the model looked “unhealthily thin,” fanning a perennial debate in the fashion industry over when thin is too thin. The regulator said that the way the woman in the image had posed elongated her torso and accentuated her waist, so that it appeared to be very small. It said her “somber facial expression and dark makeup, particularly around her eyes, made her face look gaunt.” It said the offending image — a still photograph of the model that appeared in an online video posted on the website of The Times of London in December — should not appear again in its current form. The specific image was removed from the video on Gucci’s YouTube channel, though the model still appears in the ad directed by Glen Luchford. The Italian fashion brand, for its part, had defended the ad, saying it was part of a video that portrayed a dance party and that was aimed at an older and sophisticated audience. Nowhere in the ads were any models’ bones visible, it said, and they were all “toned and slim.” It noted that “it was, to some extent, a subjective issue as to whether a model looked unhealthily thin,” according to the authority. The decision by the advertising authority, an independent industry regulatory group, barred Gucci from using the image in advertisements in Britain. The ruling comes amid a longstanding debate on both sides of the Atlantic about the perils of overly thin models projecting an unhealthy body image for women. As when critics lashed out against idealized images of “heroin chic” in the early 1990s, some have voiced concern that fashion houses are encouraging potentially hazardous behaviors by glamorizing models who are rail-thin. © 2016 The New York Times Company

Keyword: Anorexia & Bulimia
Link ID: 22080 - Posted: 04.07.2016

by Sarah Zielinski Spring has finally arrived, and birds’ nests all over the country will soon be filling up with eggs and then nestlings. Watch a nest long enough (the Science News staff is partial to the DC Eagle Cam) and you’ll see itty bitty baby birds begging for a meal. But mama birds don’t always reward that begging with food. In some species, like the tree swallow, birds that beg more will get more food. But in others, like the hoopoe, mom ignores who is begging and gives more food to the biggest chicks, researchers have found. This lack of an overall pattern has confounded ornithologists, but it seems that they may have been missing a key piece of the puzzle. A new study finds that the quality of the birds’ environment determines whether a mama bird can afford to feed all of her kids or if she has to ignore some to make sure the others survive. The study appears March 29 in Nature Communications. Stuart West of the University of Oxford and colleagues compiled data from 306 studies that looked at 143 bird species. When the birds were living in a good environment — one that had plenty of resources or a high amount of predictability — then mom would feed the chicks that beg the most, which were often the ones that needed the most help. But when the environment was poor in quality or unpredictable, then mama bird responded less to begging. © Society for Science & the Public 2000 - 2016.

Keyword: Sexual Behavior
Link ID: 22079 - Posted: 04.07.2016

Laura Sanders NEW YORK — Lip-readers’ minds seem to “hear” the words their eyes see being formed. And the better a person is at lipreading, the more neural activity there is in the brain’s auditory cortex, scientists reported April 4 at the annual meeting of the Cognitive Neuroscience Society. Earlier studies have found that auditory brain areas are active during lipreading. But most of those studies focused on small bits of language — simple sentences or even single words, said study coauthor Satu Saalasti of Aalto University in Finland. In contrast, Saalasti and colleagues studied lipreading in more natural situations. Twenty-nine people read the silent lips of a person who spoke Finnish for eight minutes in a video. “We can all lip-read to some extent,” Saalasti said, and the participants, who had no lipreading experience, varied widely in their comprehension of the eight-minute story. In the best lip-readers, activity in the auditory cortex was quite similar to that evoked when the story was read aloud, brain scans revealed. The results suggest that lipreading success depends on a person’s ability to “hear” the words formed by moving lips, Saalasti said. Citation: J. Alho et al. Similar brain responses to lip-read, read and listened narratives. Cognitive Neuroscience Society annual meeting, New York City, April 4, 2016. © Society for Science & the Public 2000 - 2016.

Keyword: Language
Link ID: 22077 - Posted: 04.07.2016

Emily Anthes Type 'depression' into the Apple App Store and a list of at least a hundred programs will pop up on the screen. There are apps that diagnose depression (Depression Test), track moods (Optimism) and help people to “think more positive” (Affirmations!). There's Depression Cure Hypnosis (“The #1 Depression Cure Hypnosis App in the App Store”), Gratitude Journal (“the easiest and most effective way to rewire your brain in just five minutes a day”), and dozens more. And that's just for depression. There are apps pitched at people struggling with anxiety, schizophrenia, post-traumatic stress disorder (PTSD), eating disorders and addiction. This burgeoning industry may meet an important need. Estimates suggest that about 29% of people will experience a mental disorder in their lifetime [1]. Data from the World Health Organization (WHO) show that many of those people — up to 55% in developed countries and 85% in developing ones — are not getting the treatment they need. Mobile health apps could help to fill the gap (see 'Mobilizing mental health'). Given the ubiquity of smartphones, apps might serve as a digital lifeline — particularly in rural and low-income regions — putting a portable therapist in every pocket. “We can now reach people that up until recently were completely unreachable to us,” says Dror Ben-Zeev, who directs the mHealth for Mental Health Program at the Dartmouth Psychiatric Research Center in Lebanon, New Hampshire. Public-health organizations have been buying into the concept. In its Mental Health Action Plan 2013–2020, the WHO recommended “the promotion of self-care, for instance, through the use of electronic and mobile health technologies.” And the UK National Health Service (NHS) website NHS Choices carries a short list of online mental-health resources, including a few apps, that it has formally endorsed. © 2016 Nature Publishing Group

Keyword: Depression; Schizophrenia
Link ID: 22075 - Posted: 04.06.2016

By KJ Dell’Antonia If you tell your child’s pediatrician that your child is having trouble sleeping, she might respond by asking you how well you sleep yourself. A team of Finnish researchers found that parents with poor sleep quality tended to report more sleep-related difficulties in their children than parents who slept well. But when the researchers looked at an objective monitor of the children’s sleep, using a bracelet similar to a commercial fitness tracker that monitored movement acceleration, a measure of sleep quality, they found that the parents were often reporting sleep problems in their children that didn’t seem to be there. “The only thing that was associated with sleeping problems, as reported by the parents, was their own reported sleeping problems,” said Marko Elovainio, a professor of psychology at the University of Helsinki and one of the authors of the study, which was published this month in the journal Pediatrics. The study was relatively small, involving 100 families with children aged 2 to 6. But the findings suggest that parents’ reports of sleep problems in their children are influenced by their own attitudes and behaviors surrounding sleep. The researchers were inspired to do their study, in part, by research showing that mothers with depression over-report behavioral problems in their children, seeing issues that teachers do not see. In pediatrics, the researchers noted, doctors rely heavily on parental reports for information — and if that information is biased by a parent’s own experience, diagnosis becomes more difficult. “Sleep is a good measure of stress,” said Dr. Elovainio, and it is one tool doctors use to evaluate how much stress a child is experiencing. But when making a diagnosis involving a child’s sleeping patterns, “we can’t rely on reports of parents. We need to use more objective measures.” One reason to look at sleep in this context, he said, is that unlike other possible markers of stress, it can be measured objectively. © 2016 The New York Times Company

Keyword: Sleep
Link ID: 22073 - Posted: 04.06.2016

by Daniel Galef Footage from a revolutionary behavioural experiment showed non-primates making and using tools just like humans. In the video, a crow is trying to get food out of a narrow vessel, but its beak is too short for it to reach through the container. Nearby, the researchers placed a straight wire, which the crow bent against a nearby surface into a hook. Then, holding the hook in its beak, it fished the food from the bottle. Corvids—the family of birds that includes crows, ravens, rooks, jackdaws, and jays—are pretty smart overall. Although not to the level of parrots and cockatoos, ravens can also mimic human speech. They also have a highly developed system of communication and are believed to be among the most intelligent non-primate animals in existence. McGill Professor Andrew Reisner recalls meeting a graduate student studying corvid intelligence at Oxford University when these results were first published in 2015. “I had read early in the year that some crows had been observed making tools, and I mentioned this to him,” Reisner explained. “He said that he knew about that, as it had been he who had first observed it happening. Evidently the graduate students took turns watching the ‘bird box,’ […] and the tool making first occurred there on his shift.”

Keyword: Evolution; Intelligence
Link ID: 22072 - Posted: 04.06.2016

By Roni Caryn Rabin Alzheimer’s disease is a progressive brain disorder that causes dementia, destroying memory, cognitive skills, the ability to care for oneself, speak and walk, said Ruth Drew, director of family and information services at the Alzheimer’s Association. “And since the brain affects everything, Alzheimer’s ultimately affects everything,” she said, “including the ability to swallow, cough and breathe.” Once patients reach the advanced stages of Alzheimer’s, they may stop eating and become weak and susceptible to infections, said Dr. Jason Karlawish, a professor of medicine at the University of Pennsylvania. Unable to swallow or cough, they are at high risk of choking, aspirating food particles or water into the lungs and developing pneumonia, which is often the immediate cause of death, he said. “You see a general decline in the contribution the brain makes, not just in thinking, but in maintaining the body’s homeostasis,” Dr. Karlawish said. Using a feeding tube to nourish patients and hospitalizing them for infections does not significantly extend life at the advanced stages of the disease and is discouraged because it can prolong suffering with no hope of recovery, he said. Alzheimer's is the sixth leading cause of death in the United States, according to the Centers for Disease Control and Prevention, but that figure may underestimate the actual number of cases, Dr. Karlawish said, since some deaths may be attributed to other causes like pneumonia. © 2016 The New York Times Company

Keyword: Alzheimers
Link ID: 22071 - Posted: 04.06.2016

Philip Ball James Frazer’s classic anthropological study The Golden Bough [1] contains a harrowing chapter on human sacrifice in rituals of crop fertility and harvest among historical cultures around the world. Frazer describes sacrificial victims being crushed under huge toppling stones, slow-roasted over fires and dismembered alive. Frazer’s methods of analysis wouldn't all pass muster among anthropologists today (his work was first published in 1890), but it is hard not to conclude from his descriptions that what industrialized societies today would regard as the most extreme psychopathy has in the past been seen as normal — and indeed sacred — behaviour. In almost all societies, killing within a tribe or clan has been strongly taboo; exemption is granted only to those with great authority. Anthropologists have suspected that ritual human sacrifice serves to cement power structures — that is, it signifies who sits at the top of the social hierarchy. The idea makes intuitive sense, but until now there has been no clear evidence to support it. In a study published in Nature [2], Joseph Watts, a specialist in cultural evolution at the University of Auckland in New Zealand, and his colleagues have analysed 93 traditional cultures in Austronesia (the region that loosely embraces the many small and island states in the Pacific and Indonesia) as they were before they were influenced by colonization and major world religions (generally in the late 19th and early 20th centuries). © 2016 Nature Publishing Group

Keyword: Aggression; Evolution
Link ID: 22070 - Posted: 04.05.2016

Feel like you haven’t slept in ages? If you’re one of the 5 per cent of the population who has severe insomnia – trouble sleeping for more than a month – then your brain’s white matter might be to blame. The cell bodies and synapses of our brain cells make up our brain’s grey matter, while bundles of their tails that connect one brain region to another make up the white matter. These nerve cell tails – axons – are cloaked in a fatty myelin sheath that helps transmit signals. Radiologist Shumei Li from Guangdong No. 2 Provincial People’s Hospital in Guangzhou, China, and her team scanned the brains of 30 healthy sleepers and 23 people with severe insomnia using diffusion tensor imaging MRI. This imaging technique lights up the white matter circuitry. They found that in the brains of the people with severe insomnia, the regions in the right hemisphere that support learning, memory, smell and emotion were less well connected compared with healthy sleepers. They attribute this breakdown in circuitry to the loss of the myelin sheath in the white matter. A study in November suggested that smoking could be one cause of myelin loss. The team also found that the insomniacs had poorer connections in the white matter of the thalamus, a brain region that regulates consciousness, alertness and sleep. The study proposes a potential mechanism for insomnia but there could be other factors, says Max Wintermark, a radiologist at Stanford. He says it’s not possible to say whether the poor connections are the cause or result of insomnia. © Copyright Reed Business Information Ltd.

Keyword: Sleep
Link ID: 22069 - Posted: 04.05.2016

Laura Sanders NEW YORK — Sometimes forgetting can be harder than remembering. When people forced themselves to forget a recently seen image, select brain activity was higher than when they tried to remember that image. Forgetting is often a passive process, one in which the memory slips out of the brain, Tracy Wang of the University of Texas at Austin said April 2 at the annual meeting of the Cognitive Neuroscience Society. But in some cases, forgetting can be deliberate. Twenty adults saw images of faces, scenes and objects while an fMRI scanner recorded their brains’ reactions to the images. If instructed to forget the preceding image, people were less likely to remember that image later. Researchers used the scan data to build a computer model that could infer how strongly the brain responds to each particular kind of image. In the ventral temporal cortex, a part of the brain above the ear, brain patterns elicited by a particular image were stronger when a participant was told to forget the sight than when instructed to remember it. Of course, everyone knows that it’s easy to forget something without even trying. But these results show that intentional forgetting isn’t a passive process — the brain has to actively work to wipe out a memory on purpose. Citation: T.H. Wang et al. Forgetting is more work than remembering. Annual meeting of the Cognitive Neuroscience Society, New York City, April 2, 2016. © Society for Science & the Public 2000 - 2016

Keyword: Learning & Memory
Link ID: 22068 - Posted: 04.05.2016

By DONALD G. McNEIL Jr The World Health Organization said on Thursday that there is “strong scientific consensus” that Zika virus is a cause of microcephaly, unusually small heads with brain damage in infants, as well as other neurological disorders. Yet a surge in microcephaly has been reported only in Brazil; a small increase was reported in French Polynesia, and a cluster of 32 cases is now under investigation in Colombia. For proof of the connection between infection with the virus and birth defects, scientists are waiting for the results of a large study of 5,000 pregnant women, most of them in Colombia. Women with past Zika infections will be compared with similar women without infections to see if they have more microcephalic children. The epidemic peaked in Colombia in early February, according to the W.H.O. Most of the women in the study are due to give birth in May and June. Virtually all public health agencies already believe the virus is to blame for these birth defects and are giving medical advice based on that assumption. Here are the lines of evidence they cite. As early as last August, hospitals in northeast Brazil realized that something unheard of was happening: Neonatal wards that normally saw one or two microcephalic babies a year were seeing five or more at the same time. Doctors learned from the mothers that many of them had had Zika symptoms months earlier. © 2016 The New York Times Company

Keyword: Development of the Brain
Link ID: 22065 - Posted: 04.04.2016

Quirin Schiermeier & Alison Abbott The ability to study brain processes in real time is one of the goals of the Human Brain Project's newly-released computing tools. Europe’s major brain-research project has unveiled a set of prototype computing tools and called on the global neuroscience community to start using them. The move marks the end of the 30-month ramp-up phase of the Human Brain Project (HBP), and the start of its operational phase. The release of the computing platforms — which include brain-simulation tools, visualization software and a pair of remotely accessible supercomputers to study brain processes in real time — could help to allay concerns about the €1-billion (US$1.1-billion) project’s benefits to the wider scientific community. “The new platforms open countless new possibilities to analyse the human brain,” said Katrin Amunts, a neuroscientist at the Jülich Research Centre in Germany and a member of the project’s board of directors, at a press conference on 30 March. “We are proud to offer the global brain community a chance to participate.” But it is not clear how the platforms — some freely accessible, others available only on the success of a peer-reviewed application — will resonate with brain researchers outside the project. “At this point, no one can say whether or not the research platforms will be a success,” says Andreas Herz, chair of computational neuroscience at the Ludwig Maximilian University of Munich in Germany. © 2016 Nature Publishing Group

Keyword: Brain imaging
Link ID: 22061 - Posted: 04.01.2016

By BENEDICT CAREY Some scientists studying the relationship between contact sports and memory or mood problems later in life argue that cumulative exposure to hits that cause a snap of the head — not an athlete’s number of concussions — is the most important risk factor. That possibility is particularly worrisome in football, in which frequent “subconcussive” blows are unavoidable. On Thursday, researchers based at Boston University reported the most rigorous evidence to date that overall exposure to contact in former high school and college football players could predict their likelihood of experiencing problems like depression, apathy or memory loss years later. The finding, appearing in The Journal of Neurotrauma, is not conclusive, the authors wrote. Such mental problems can stem from a variety of factors in any long life. Yet the paper represents researchers’ first attempt to precisely calculate cumulative lifetime exposure to contact in living players, experts said. Previous estimates had relied in part on former players’ memories of concussions, or number of years played. The new paper uses more objective measures, including data from helmet accelerometer studies, and provides a glimpse of where the debate over the risk of contact sports may next play out, the experts said. “They used a much more refined and quantitative approach to estimate exposure than I’ve seen in this area,” said John Meeker, a professor of environmental health sciences at the University of Michigan School of Public Health, who was not a part of the research team. But he added, “Their methods will have to be validated in much larger studies; this is very much a preliminary finding.” The study did not address the risk of chronic traumatic encephalopathy, or C.T.E., a degenerative scarring in the brain tied to head blows, which can be diagnosed only after death. © 2016 The New York Times Company

Keyword: Brain Injury/Concussion
Link ID: 22060 - Posted: 04.01.2016

Meghan Rosen Despite massive public health campaigns, the rise in worldwide obesity rates continues to hurtle along like a freight train on greased tracks. In 2014, more than 640 million men and women were obese (measured as a body mass index of 30 or higher). That’s up from 105 million in 1975, researchers estimate in the April 2 Lancet. The researchers analyzed four decades of height and weight data for more than 19 million adults, and then calculated global rates based on population data. On average, people worldwide are gaining about 1.5 kilograms per decade — roughly the weight of a half-gallon of ice cream. But the road isn’t entirely rocky. During the same time period, average life expectancy also jumped: from less than 59 years to more than 71 years, George Davey Smith points out in a comment accompanying the new study. Smith, an epidemiologist at the University of Bristol in England, boils the data down to a single, seemingly paradoxical sentence: “The world is at once fatter and healthier.” © Society for Science & the Public 2000 - 2016

Keyword: Obesity
Link ID: 22059 - Posted: 04.01.2016

Ewen Callaway Homo floresiensis, the mysterious and diminutive species found in Indonesia in 2003, is tens of thousands of years older than originally thought — and may have been driven to extinction by modern humans. After researchers discovered H. floresiensis, which they nicknamed the hobbit, in Liang Bua cave on the island of Flores, they concluded that its skeletal remains were as young as 11,000 years old. But later excavations that have dated more rock and sediment around the remains now suggest that hobbits were gone from the cave by 50,000 years ago, according to a study published in Nature on 30 March [1]. That is around the time that modern humans moved through southeast Asia and Australia. “I can’t believe that it is purely coincidence, based on what else we know happens when modern humans enter a new area,” says Richard Roberts, a geochronologist at the University of Wollongong, Australia. He notes that Neanderthals vanished soon after early modern humans arrived in Europe from Africa. Roberts co-led the study with archaeologist colleague Thomas Sutikna (who also helped coordinate the 2003 dig), and Matthew Tocheri, a paleoanthropologist at Lakehead University in Thunder Bay, Canada. The first hobbit fossil, known as LB1, was found in 2003 [2] beneath about 6 metres of dirt and rock. Its fragile bones were too precious for radiocarbon dating, so the team collected nearby charcoal, on the assumption that it had accrued at the same time as the bones. That charcoal was as young as 11,000 years old, researchers reported at the time [3, 4]. “Somehow these tiny people had survived on this island 30,000 years after modern humans arrived,” says Roberts. “We were scratching our heads. It couldn’t add up.” © 2016 Nature Publishing Group

Keyword: Evolution
Link ID: 22055 - Posted: 03.31.2016

Chris French The fallibility of human memory is one of the most well established findings in psychology. There have been thousands of demonstrations of the unreliability of eyewitness testimony under well-controlled conditions dating back to the very earliest years of the discipline. Relatively recently, it was discovered that some apparent memories are not just distorted memories of witnessed events: they are false memories for events that simply never took place at all. Psychologists have developed several reliable methods for implanting false memories in a sizeable proportion of experimental participants. It is only in the last few years, however, that scientists have begun to systematically investigate the phenomenon of non-believed memories. These are subjectively vivid memories of personal experiences that an individual once believed were accurate but now accepts are not based upon real events. Prior to this, there were occasional anecdotal reports of non-believed memories. One of the most famous was provided by the influential developmental psychologist Jean Piaget. He had a clear memory of almost being kidnapped at about the age of two and of his brave nurse beating off the attacker. His grateful family were so impressed with the nurse that they gave her a watch as a reward. Years later, the nurse confessed that she had made the whole story up. Even after he no longer believed that the event had taken place, Piaget still retained his vivid and detailed memory of it. © 2016 Guardian News and Media Limited

Keyword: Learning & Memory
Link ID: 22050 - Posted: 03.30.2016

By Ariana Eunjung Cha In the movie "Concussion," which is based on the life of Bennet Omalu, a doctor who studied traumatic brain injury, Omalu explains that the prognosis is so poor for so many patients because their symptoms went undiagnosed. When head injuries aren't treated or are under-treated, it puts patients at risk of more serious injury. This is why children with concussions are often asked not to return to class or sports until their symptoms have resolved and adults often have to take days off work. One of the challenges has been that concussions are tricky to diagnose, and it isn't uncommon for a patient to rush to the ER only to be met with a vague response from the doctor about whether there's anything worrisome. Symptoms often aren't apparent for hours or even days after the initial injury, and the imaging technology we have can't pick up anything other than larger bleeds and lesions. How different could things have been if there were a simple blood test to detect a concussion? In a paper published in JAMA Neurology on Monday, researchers reported that they may be closer than ever to such a test. The study involved 600 patients admitted to a trauma center from March 2010 to March 2014. All had suffered some kind of head injury resulting in loss of consciousness, amnesia or disorientation.

Keyword: Brain Injury/Concussion; Glia
Link ID: 22047 - Posted: 03.30.2016

Opioids are becoming the latest serious addiction problem in this country. Among these drugs manufactured from opium, heroin is the most serious: dangerous, cheap and available everywhere. In April's edition of Harper's Magazine, Dan Baum has examined a new response to this latest addiction problem: the legalization of drugs. NPR's Linda Wertheimer asks Baum about how he began to delve into the topic of America's war on drugs and why he calls attempts at legalization a big risk based on our approach to solving the widespread problem. Interview Highlights: You go back, covering the war on drugs, I wonder if you could tell us the story which kicks off your article. I was starting a book on the politics of drug enforcement. And in 1994 I got word that John Ehrlichman was doing minority recruitment at an engineering firm in Atlanta. Well, I'm 60. Ehrlichman was one of the great villains of American history, a Watergate villain. And he was Richard Nixon's drug policy advisor. And Richard Nixon was the one who coined the phrase "war on drugs." And he told me an amazing thing. I started asking him some earnest, wonky policy questions and he waved them away. He said, "Can we cut the B.S.? Can I just tell you what this was all about?" The Nixon campaign in '68 and the Nixon White House had two enemies: black people and the anti-war left. He said we knew that if we could associate heroin with black people and marijuana with the hippies, we could project the police into those communities, arrest their leaders, break up their meetings and most of all, demonize them night after night on the evening news. And he looked me in the eyes and said, "Did we know we were lying about the drugs? Of course we did." © 2016 npr

Keyword: Drug Abuse
Link ID: 22046 - Posted: 03.29.2016

by Sarah Zielinski There must be something wrong with the guy who never leaves home, right? Maybe not — at least if that guy is a male spotted hyena. Males that stay with their birth clan, instead of taking off to join a new group, may simply be making a good choice, a new study suggests. Spotted hyenas are a matriarchal society. Females are in charge. They rank higher than every male in the clan. And the females generally stay with the clan for their entire lives. But males face a choice when they reach two and a half years in age. They can stay with the clan, or they can leave and join a new clan. Each choice has its pros and cons. Staying with the clan means that a male hyena keeps a place at the top of the male pecking order. He’ll probably have his mother around to help. But he’ll be limited in the number of females he can mate with, because many of the female hyenas won’t mate with him because they might be related. If he joins a new clan, the male hyena might have access to more females — and they might even be better than the ones in his home clan — but he’ll start with the lowest social rank and have to spend years fighting his way to the top. Among most group-living mammal species, the guys that stay at home turn out to be losers, siring fewer offspring. But spotted hyenas, it appears, are an exception. Eve Davidian of the Leibniz Institute for Zoo and Wildlife Research in Berlin and colleagues tracked 254 male spotted hyenas that lived in eight clans in Ngorongoro Crater in Tanzania throughout their lives, a study lasting 20 years. When these males reached the age of maturity, they left their clans to take a look at the other options available to them. Forty-one hyenas returned to their home clans, and 213 settled with new ones. © Society for Science & the Public 2000 - 2016

Keyword: Sexual Behavior; Evolution
Link ID: 22045 - Posted: 03.29.2016

By Roni Caryn Rabin Here’s another reason to eat your fruits and veggies: You may reduce your risk of vision loss from cataracts. Cataracts that cloud the lenses of the eye develop naturally with age, but a new study is one of the first to suggest that diet may play a greater role than genetics in their progression. Researchers had about 1,000 pairs of female twins in Britain fill out detailed food questionnaires that tracked their nutrient intake. Their mean age was just over 60. The study participants underwent digital imaging of the eye to measure the progression of cataracts. The researchers found that women who consumed diets rich in vitamin C and who ate about two servings of fruit and two servings of vegetables a day had a 20 percent lower risk of cataracts than those who ate a less nutrient-rich diet. Ten years later, the scientists followed up with 324 of the twin pairs, and found that those who had reported consuming more vitamin C in their diet — at least twice the recommended dietary allowance of 75 milligrams a day for women (the R.D.A. for adult men is 90 milligrams) — had a 33 percent lower risk of their cataracts progressing than those who got less vitamin C. The researchers concluded that genetic factors account for about 35 percent of the difference in cataract progression, while environmental factors like diet account for 65 percent. “We found no beneficial effect from supplements, only from the vitamin C in the diet,” said Dr. Christopher Hammond, a professor of ophthalmology at King’s College London and an author of the study, published in Ophthalmology. Foods high in vitamin C include oranges, cantaloupe, kiwi, broccoli and dark leafy greens. © 2016 The New York Times Company

Keyword: Vision
Link ID: 22044 - Posted: 03.29.2016