Most Recent Links

Links 6281 - 6300 of 29538

Katherine Hobson The theory behind artificial sweeteners is simple: If you use them instead of sugar, you get the joy of sweet-tasting beverages and foods without the downer of extra calories, potential weight gain and related health issues. In practice, it's not so simple, as a review of the scientific evidence on non-nutritive sweeteners published Monday shows. After looking at two types of scientific research, the authors conclude that there is no solid evidence that sweeteners like aspartame and sucralose help people manage their weight. And observational data suggest that the people who regularly consume these sweeteners are also more likely to develop future health problems, though those studies can't say those problems are caused by the sweeteners. The health effects of artificial sweeteners are important to study, because so many people use them. Another study published earlier this year found that a quarter of U.S. children and 41 percent of adults reported consuming them, most doing so once per day. Even more people may be consuming them unwittingly in products such as granola bars or yogurt. "We were really interested in the everyday person who is consuming these products not to lose weight, but because they think it's the healthier choice, for many years on end," says Meghan Azad, lead author of the review and a research scientist at the University of Manitoba. While more research needs to be done, from what we know now, "there is no clear benefit for weight loss, and there's a potential association with increased weight gain, diabetes and other negative cardiovascular outcomes," says Azad. © 2017 npr

Keyword: Obesity
Link ID: 23845 - Posted: 07.18.2017

By PERRI KLASS, M.D. We want to believe we’re raising our kids to think for themselves, and not to do dumb or unhealthy things just because the cool kids are doing them. But research shows that when it comes to smoking, children are heavily influenced by some of the folks they consider the coolest of the cool: actors in movies. “There’s a dose-response relationship: The more smoking kids see onscreen, the more likely they are to smoke,” said Dr. Stanton Glantz, a professor and director of the University of California, San Francisco, Center for Tobacco Control Research and Education. He is one of the authors of a new study that found that popular movies are showing more tobacco use onscreen. “The evidence shows it’s the largest single stimulus” for smoking, he said; “it overpowers good parental role modeling, it’s more powerful than peer influence or even cigarette advertising.” He said that epidemiological studies have shown that if you control for all the other risk factors for smoking (whether parents smoke, attitudes toward risk taking, socioeconomic status, and so on), younger adolescents who are more heavily exposed to smoking on film are two to three times as likely to start smoking as kids who are more lightly exposed. Those whose parents smoke are more likely to smoke, he said, but exposure to smoking in movies can overcome the benefit of having nonsmoking parents. In one study, the children of nonsmoking parents with heavy exposure to movie smoking were as likely to smoke as the children of smoking parents with heavy movie exposure. To Dr. Glantz and the other people who study this topic, that makes smoking in movies an “environmental toxin,” a factor endangering children. © 2017 The New York Times Company

Keyword: Drug Abuse; Attention
Link ID: 23844 - Posted: 07.18.2017

By TALYA MINSBERG When Marti Noxon set out to make “To the Bone,” a film about a 20-year-old battling an eating disorder, she initially faced the question: Was the topic too niche? The answer came in the form of a rousing premiere in January at the Sundance Film Festival, Netflix’s reported $8 million purchase of the film, a trailer that went viral with 54 million views in the first week and arguments over whether it glamorized excessive thinness. The film, which debuted on Netflix on Friday, is loosely based on Ms. Noxon’s experience with an eating disorder. She and its star, Lily Collins, are among the 30 million Americans — a third of them men — who have struggled with one. Ms. Collins plays Ellen, an anorexia patient who enters her fifth eating disorder treatment center, an unorthodox group home run by a doctor played by Keanu Reeves. Many of those reacting to the film’s trailer worried that watching it could trigger unhealthy thoughts in viewers who may be prone to eating disorders or already struggling with them. Indeed, some experts said that people who have had eating disorders should consider the state of their health before watching the film. “If you don’t feel solid in your recovery, don’t watch it right now. It could be triggering at any part of your life if you aren’t feeling strong and solid in your recovery,” said Dr. Dena Cabrera, executive clinical director at Rosewood Centers for Eating Disorders. “It will always be there; you can look it up later.” Others say the film may help spur action. Eating disorders have the highest mortality rate of any psychiatric disorder and can affect individuals across every demographic. “If the film helps raise awareness and more people seek treatment, that would be a success that we can be pleased with,” said Dr. S. Bryn Austin, a professor at Boston Children’s Hospital and Harvard T.H. Chan School of Public Health. “Eating disorders can be successfully treated; they just need to take the first step in reaching out for care.” © 2017 The New York Times Company

Keyword: Anorexia & Bulimia
Link ID: 23843 - Posted: 07.17.2017

Tim Adams Henry Marsh made the decision to become a neurosurgeon after he had witnessed his three-month-old son survive the complex removal of a brain tumour. For two decades he was the senior consultant in the Atkinson Morley wing at St George’s hospital in London, one of the country’s largest specialist brain surgery units. He pioneered techniques in operating on the brain under local anaesthetic and was the subject of the BBC documentary Your Life in Their Hands. His first book, Do No Harm: Stories of Life, Death, and Brain Surgery, was published in 2014 to great acclaim, and became a bestseller across the world. Marsh retired from full-time work at St George’s in 2015, though he continues with long-standing surgical roles at hospitals in Ukraine and Nepal. He is also an avid carpenter. Earlier this year he published a second volume of memoir, Admissions: A Life in Brain Surgery, in which he looks back on his career as he takes up a “retirement project” of renovating a decrepit lock-keeper’s cottage near where he grew up in Oxfordshire. He lives with his second wife, the social anthropologist and author Kate Fox. They have homes in Oxford, and in south London, which is where the following conversation took place.

Have you officially retired now?
Well, I still do one day a week for the NHS, though apparently they want a “business case” for it, so I’m not getting paid at present.

Yes, well, people talk about the mind-matter problem – it’s not a problem for me: mind is matter. That’s not being reductionist. It is actually elevating matter. We don’t even begin to understand how electrochemistry and nerve cells generate thought and feeling. We have not the first idea. The relation of neurosurgery to neuroscience is a bit like the relationship between plumbing and quantum mechanics.

Keyword: Consciousness
Link ID: 23842 - Posted: 07.17.2017

Jon Hamilton Harsh life experiences appear to leave African-Americans vulnerable to Alzheimer's and other forms of dementia, researchers reported Sunday at the Alzheimer's Association International Conference in London. Several teams presented evidence that poverty, disadvantage and stressful life events are strongly associated with cognitive problems in middle age and dementia later in life among African-Americans. The findings could help explain why African-Americans are twice as likely as white Americans to develop dementia. And the research suggests genetic factors are not a major contributor. "The increased risk seems to be a matter of experience rather than ancestry," says Megan Zuelsdorff, a postdoctoral fellow in the Health Disparities Research Scholars Program at the University of Wisconsin-Madison. Scientists have struggled to understand why African-Americans are so likely to develop dementia. They are more likely to have conditions like high blood pressure and diabetes, which can affect the brain. And previous research has found some evidence that African-Americans are more likely to carry genes that raise the risk. But more recent studies suggest those explanations are incomplete, says Rachel Whitmer, an epidemiologist with Kaiser Permanente's Division of Research in Northern California. Whitmer has been involved in several studies that accounted for genetic and disease risks when comparing dementia in white and black Americans. "And we still saw these [racial] differences," she says. "So there is still something there that we are trying to get at." © 2017 npr

Keyword: Alzheimers; Stress
Link ID: 23841 - Posted: 07.17.2017

Nicola Davis People who experience hearing loss could be at greater risk of memory and thinking problems later in life than those without auditory issues, research suggests. The study focused on people who were at risk of Alzheimer’s disease, revealing that those who were diagnosed with hearing loss had a higher risk of “mild cognitive impairment” four years later. “It’s really not mild,” said Clive Ballard, professor of age-related disease at the University of Exeter. “They are in the lowest 5% of cognitive performance and about 50% of those individuals will go on to develop dementia.” In research presented at the Alzheimer’s Association International Conference in London, a US team looked at the memory and thinking skills of 783 cognitively healthy participants in late middle age, more than two-thirds of whom had at least one parent who had been diagnosed with Alzheimer’s disease. The team carried out a range of cognitive tests on the participants over a four-year period, aimed at probing memory and mental processing, revealing that those who had hearing loss at the start of the study were more than twice as likely to be found to have mild cognitive impairment four years later than those with no auditory problems, once a variety of other risk factors were taken into account. Taylor Fields, a PhD student at the University of Wisconsin who led the research, said that the findings suggest hearing loss could be an early warning sign that an individual might be at greater risk of future cognitive impairment – but added more research was necessary to unpick the link. “There is something here and it should be looked into,” she said. © 2017 Guardian News and Media Limited

Keyword: Hearing; Alzheimers
Link ID: 23840 - Posted: 07.17.2017

Pagan Kennedy In 2011, Ben Trumble emerged from the Bolivian jungle with a backpack containing hundreds of vials of saliva. He had spent six weeks following indigenous men as they tramped through the wilderness, shooting arrows at wild pigs. The men belonged to the Tsimane people, who live as our ancestors did thousands of years ago — hunting, foraging and farming small plots of land. Dr. Trumble had asked the men to spit into vials a few times a day so that he could map their testosterone levels. In return, he carried their kills and helped them field-dress their meat — a sort of roadie to the hunters. Dr. Trumble wanted to find out whether the hunters who successfully shot an animal would be rewarded with a spike in testosterone. (They were.) As a researcher with the Tsimane Health and Life History Project, he had joined a long-running investigation into human well-being and aging in the absence of industrialization. That day when he left the jungle, he stumbled across a new and more urgent question about human health. He dropped his backpack, called his mom and heard some terrible news: His 64-year-old uncle had learned he had dementia, probably Alzheimer’s. In just a few short years, his uncle, a vibrant former lawyer, would stop speaking, stop eating and die. “I couldn’t help my uncle,” Dr. Trumble said, but he was driven to understand the disease that killed him. He wondered: Do the Tsimane suffer from Alzheimer’s disease like we do? And if not, what can we learn from them about treating or preventing dementia? “There is really no cure yet for Alzheimer’s,” Dr. Trumble told me. “We have nothing that can undo the damage already done.” Why, he wondered, had billions of dollars and decades of research yielded so little? Perhaps major clues were being missed. Dr. Trumble was trained as an anthropologist, and his field — evolutionary medicine — taught him to see our surroundings as a blip in the timeline of human history. He thinks it’s a problem that medical research focuses almost exclusively on “people who live in cities like New York or L.A.” Scientists often refer to these places as “Weird” — Western, educated, industrialized, rich and democratic — and point out that our bodies are still designed for the not-Weird environment in which our species evolved. Yet we know almost nothing about how dementia affected humans during the 50,000 years before developments like antibiotics and mechanized farming. Studying the Tsimane, Dr. Trumble believes, could shed light on this modern plague. © 2017 The New York Times Company

Keyword: Alzheimers
Link ID: 23839 - Posted: 07.15.2017

Brandie Jefferson It wasn't long ago that there were no treatments for multiple sclerosis. In the 1970s, some doctors used chemotherapy to treat the degenerative neurological disease. Since then, more than a dozen drugs have been developed or approved, including infusions, oral medications and self-administered shots. None of these are a magic bullet for a disease that can be disabling and deadly. But now there is a new drug, Ocrevus, that looks like a game-changer. It uses a novel approach to blocking the inflammation that drives the disease and appears to be spectacularly effective. It also costs $65,000 a year. I have MS. Should I take Ocrevus? That, I discovered, is not a simple question to answer. But because I'm an MS patient and a science journalist, I was determined to try to figure it out. In March, the FDA approved Ocrevus (ocrelizumab) for the treatment of relapsing-remitting multiple sclerosis, the most common form of the disease. People with RRMS tend to have flare-ups when their symptoms worsen, followed by periods of remission and, in some cases, a full or partial recovery. In two clinical trials sponsored by the drug's eventual manufacturer, F. Hoffmann-La Roche, RRMS patients who were given ocrelizumab had about 50 percent fewer relapses and up to 95 percent fewer new lesions on the brain and spinal cord than those who were given Rebif, a common therapy. MS is an autoimmune disease, meaning the body attacks itself. The body's nerve fibers and the fatty tissue that coats them, called myelin, bear the brunt of the immune system's attacks. As a result, the central nervous system has difficulty communicating with the nerves, leading to a disease that manifests itself in different ways, such as pain, fatigue, disability and slurred speech. © 2017 npr

Keyword: Multiple Sclerosis
Link ID: 23838 - Posted: 07.15.2017

By Aaron Reuben, Jonathan Schaefer Most of us know at least one person who has struggled with a bout of debilitating mental illness. Despite their familiarity, these kinds of episodes are typically considered unusual, and even shameful. New research, from our lab and from others around the world, however, suggests mental illnesses are so common that almost everyone will develop at least one diagnosable mental disorder at some point in their lives. Most of these people will never receive treatment, and their relationships, job performance and life satisfaction will likely suffer. Meanwhile the few individuals who never seem to develop a disorder may offer psychology a new avenue of study, allowing researchers to ask what it takes to be abnormally, enduringly, mentally well. Epidemiologists have long known that, at any given point in time, roughly 20 to 25 percent of the population suffers from a mental illness, which means they experience psychological distress severe enough to impair functioning at work, school or in their relationships. Extensive national surveys, conducted from the mid-1990s through the early 2000s, suggested that a much higher percentage, close to half the population, would experience a mental illness at some point in their lives. These surveys were large, involving thousands of participants representative of the U.S. in age, sex, social class and ethnicity. They were also, however, retrospective, which means they relied on survey respondents’ accurate recollection of feelings and behaviors months, years and even decades in the past. Human memory is fallible, and modern science has demonstrated that people are notoriously inconsistent reporters about their own mental health history, leaving the final accuracy of these studies up for debate. Of further concern, up to a third of the people contacted by the national surveys failed to enroll in the studies. Follow-up tests suggested that these “nonresponders” tended to have worse mental health. © 2017 Scientific American

Keyword: Schizophrenia; Depression
Link ID: 23837 - Posted: 07.15.2017

Hannah Devlin Science correspondent Brash, brawny and keen to impose their will on anyone who enters their sphere of existence: the alpha male in action is unmistakable. Now scientists claim to have pinpointed the biological root of domineering behaviour. New research has located a brain circuit that, when activated in mice, transformed timid individuals into bold alpha mice that almost always prevailed in aggressive social encounters. In some cases, the social ranking of the subordinate mice soared after the scientists’ intervention, hinting that it might be possible to acquire “alphaness” simply by adopting the appropriate mental attitude. Or as Donald Trump might put it: “My whole life is about winning. I almost never lose.” Prof Hailan Hu, a neuroscientist at Zhejiang University in Hangzhou, China, who led the work, said: “We stimulate this brain region and we can make lower ranked mice move up the social ladder.” The brain region, called the dorsal medial prefrontal cortex (dmPFC), was already known to light up during social interactions involving decisions about whether to be assertive or submissive with others. But brain imaging alone could not determine whether the circuit was ultimately controlling how people behave. The latest findings answer the question, showing that when the circuit was artificially switched on, low-ranking mice were immediately emboldened. “It’s not aggressiveness per se,” Hu said. “It increases their perseverance, motivational drive, grit.” © 2017 Guardian News and Media Limited

Keyword: Aggression; Sexual Behavior
Link ID: 23836 - Posted: 07.14.2017

Susan Milius Ravens have passed what may be their toughest tests yet of powers that, at least on a good day, let people and other apes plan ahead. Lab-dwelling common ravens (Corvus corax) in Sweden at least matched the performance of nonhuman apes and young children in peculiar tests of advanced planning ability. The birds faced such challenges as selecting a rock useless at the moment but likely to be useful for working a puzzle box and getting food later. Ravens also reached apelike levels of self-control, picking a tool instead of a ho-hum treat when the tool would eventually allow them to get a fabulous bit of kibble 17 hours later, Mathias Osvath and Can Kabadayi of Lund University in Sweden report in the July 14 Science. “The insight we get from the experiment is that [ravens] can plan for the future outside behaviors observed in the wild,” Markus Böckle, of the University of Cambridge, said in an e-mail. Böckle, who has studied ravens, coauthored a commentary in the same issue of Science. In the wild, ravens cache some of their food, but that apparent foresight could be more of a specific adaptation that evolved with diet instead of as some broader power of planning. The Lund tests, based on experiments with apes, tried to challenge ravens in less natural ways. The researchers say the birds aren’t considered much of a tool-using species in nature, nor do they trade for food. “The study for the first time in any animal shows that future planning can be used in behaviors it was not originally selected for” in evolution, Böckle says. © Society for Science & the Public 2000 - 2017.

Keyword: Intelligence; Evolution
Link ID: 23835 - Posted: 07.14.2017

By Ryan Cross Can you imagine watching 20,000 videos, 16 minutes apiece, of fruit flies walking, grooming, and chasing mates? Fortunately, you don’t have to, because scientists have designed a computer program that can do it faster. Aided by artificial intelligence, researchers have made 100 billion annotations of behavior from 400,000 flies to create a collection of maps linking fly mannerisms to their corresponding brain regions. Experts say the work is a significant step toward understanding how both simple and complex behaviors can be tied to specific circuits in the brain. “The scale of the study is unprecedented,” says Thomas Serre, a computer vision expert and computational neuroscientist at Brown University. “This is going to be a huge and valuable tool for the community,” adds Bing Zhang, a fly neurobiologist at the University of Missouri in Columbia. “I am sure that follow-up studies will show this is a gold mine.” With a mere 100,000 neurons—compared with our 86 billion—the fly brain is a good place to pick apart the inner workings of neurobiology. Yet scientists are still far from being able to understand a fly’s every move. To conduct the new research, computer scientist Kristin Branson of the Howard Hughes Medical Institute in Ashburn, Virginia, and colleagues acquired 2204 different genetically modified fruit fly strains (Drosophila melanogaster). Each enables the researchers to control different, but sometimes overlapping, subsets of the brain by simply raising the temperature to activate the neurons. © 2017 American Association for the Advancement of Science.

Keyword: Brain imaging
Link ID: 23834 - Posted: 07.14.2017

By BENEDICT CAREY Keith Conners, whose work with hyperactive children established the first standards for diagnosing and treating what is now known as attention deficit hyperactivity disorder, or A.D.H.D. — and who late in life expressed misgivings about how loosely applied that label had become — died on July 5 in Durham, N.C. He was 84. His wife, Carolyn, said the cause was heart failure. The field of child psychiatry was itself still young when Dr. Conners joined the faculty of the Johns Hopkins University School of Medicine in the early 1960s as a clinical psychologist. Children with emotional and behavioral problems often got a variety of diagnoses, depending on the clinic, and often ended up being given strong tranquilizers as treatment. Working with Dr. Leon Eisenberg, a prominent child psychiatrist, Dr. Conners focused on a group of youngsters who were chronically restless, hyperactive and sometimes aggressive. Doctors had recognized this type — “hyperkinesis,” it was called, or “minimal brain dysfunction” — but Dr. Conners combined existing descriptions and, using statistical analysis, focused on the core symptoms. The 39-item questionnaire he devised, called the Conners Rating Scale, quickly became the worldwide standard for assessing the severity of such problems and measuring improvement. It was later abbreviated to 10 items, giving child psychiatry a scientific foothold and anticipating by more than a decade the kind of checklists that would come to define all psychiatric diagnosis. He used his scale to study the effects of stimulant drugs on hyperactive children. Doctors had known since the 1930s that amphetamines could, paradoxically, calm such youngsters; a Rhode Island doctor, Charles Bradley, had published a well-known report detailing striking improvements in attention and academic performance among many children at a children’s inpatient home he ran near Providence. But it was a series of rigorous studies by Dr. Conners, in the 1960s and ’70s, that established stimulants — namely Dexedrine and Ritalin — as the standard treatments. © 2017 The New York Times Company

Keyword: ADHD
Link ID: 23833 - Posted: 07.14.2017

By PAM BELLUCK How we look at other people’s faces is strongly influenced by our genes, scientists have found in new research that may be especially important for understanding autism because it suggests that people are born with neurological differences that affect how they develop socially. The study, published on Wednesday in the journal Nature, adds new pieces to the nature-versus-nurture puzzle, suggesting that genetics underlie how children seek out formative social experiences like making eye contact or observing facial expressions. Experts said the study may also provide a road map for scientists searching for genes linked to autism. “These are very convincing findings, novel findings,” said Charles A. Nelson III, a professor of pediatrics and neuroscience at Harvard Medical School and Boston Children’s Hospital, who was not involved in the research. “They seem to suggest that there’s a genetic underpinning that leads to different patterns of brain development, that leads some kids to develop autism.” Dr. Nelson, an expert in child development and autism who was an independent reviewer of the study for Nature, said that while autism is known to have a genetic basis, how specific genes influence autism’s development remains undetermined. The study provides detailed data on how children look at faces, including which features they focus on and when they move their eyes from one place to another. The information, Dr. Nelson said, could help scientists “work out the circuitry that controls these eye movements, and then we ought to be able to work out which genes are being expressed in that circuit.” “That would be a big advance in autism,” he said. In the study, scientists tracked the eye movements of 338 toddlers while they watched videos of motherly women as well as of children playing in a day care center. The toddlers, 18 months to 24 months old, included 250 children who were developing normally (41 pairs of identical twins, 42 pairs of nonidentical twins and 84 children unrelated to each other). There were also 88 children with autism. © 2017 The New York Times Company

Keyword: Autism; Vision
Link ID: 23832 - Posted: 07.13.2017

Carina Storrs In the late 1960s, a team of researchers began doling out a nutritional supplement to families with young children in rural Guatemala. They were testing the assumption that providing enough protein in the first few years of life would reduce the incidence of stunted growth. It did. Children who got supplements grew 1 to 2 centimetres taller than those in a control group. But the benefits didn't stop there. The children who received added nutrition went on to score higher on reading and knowledge tests as adolescents, and when researchers returned in the early 2000s, women who had received the supplements in the first three years of life completed more years of schooling and men had higher incomes [1]. “Had there not been these follow-ups, this study probably would have been largely forgotten,” says Reynaldo Martorell, a specialist in maternal and child nutrition at Emory University in Atlanta, Georgia, who led the follow-up studies. Instead, he says, the findings made financial institutions such as the World Bank think of early nutritional interventions as long-term investments in human health. Since the Guatemalan research, studies around the world — in Brazil, Peru, Jamaica, the Philippines, Kenya and Zimbabwe — have all associated poor or stunted growth in young children with lower cognitive test scores and worse school achievement [2]. A picture slowly emerged that being too short early in life is a sign of adverse conditions — such as poor diet and regular bouts of diarrhoeal disease — and a predictor for intellectual deficits and mortality. But not all stunted growth, which affects an estimated 160 million children worldwide, is connected with these bad outcomes. Now, researchers are trying to untangle the links between growth and neurological development. Is bad nutrition alone the culprit? What about emotional neglect, infectious disease or other challenges? © 2017 Macmillan Publishers Limited

Keyword: Development of the Brain
Link ID: 23831 - Posted: 07.13.2017

By Jane C. Hu In English the sky is blue, and the grass is green. But in Vietnamese there is just one color category for both sky and grass: xanh. For decades cognitive scientists have pointed to such examples as evidence that language largely determines how we see color. But new research with four- to six-month-old infants indicates that long before we learn language, we see up to five basic categories of hue—a finding that suggests a stronger biological component to color perception than previously thought. The study, published recently in the Proceedings of the National Academy of Sciences USA, tested the color-discrimination abilities of more than 170 British infants. Researchers at the University of Sussex in England measured how long babies spent gazing at color swatches, a metric known as looking time. First the team showed infants one swatch repeatedly until their looking time decreased—a sign they had grown bored with it. Then the researchers showed them a different swatch and noted their reaction. Longer looking times were interpreted to mean the babies considered the second swatch to be a new hue. Their cumulative responses showed that they distinguished among five colors: red, green, blue, purple and yellow. The finding “suggests we’re all working from the same template,” explains lead author Alice Skelton, a doctoral student at Sussex. “You come prepackaged to make [color] distinctions, but given your culture and language, certain distinctions may or may not be used.” For instance, infants learning Vietnamese most likely see green and blue, even if their native language does not use distinct words for the two colors. © 2017 Scientific American

Keyword: Vision; Development of the Brain
Link ID: 23830 - Posted: 07.13.2017

By Alice Klein FOR decades, new parents have been warned against sharing a bed with their babies. While snuggling up with your newborn may seem like the most natural thing in the world, prevailing medical advice says this increases the risk of sudden infant death syndrome (SIDS), sometimes called cot death. Instead, doctors say your little ones should sleep in a separate crib in your bedroom. On the other side of the argument are anthropologists and proponents of “attachment parenting”, who believe that infant-parent separation is unnatural and at odds with our evolutionary history. They favour not just room-sharing but bed-sharing – putting them in direct conflict with paediatric advice. This debate was recently reignited by a study suggesting that room-sharing for up to nine months reduces a baby’s sleep, which in theory could have future health consequences. So what’s a sleep-deprived parent to do? Our ancestors slept in direct contact with their young in order to protect them, just as other primates do today, says Helen Ball at Durham University, UK. “Babies respond to close contact – their breathing, blood oxygen and heart rate are on a more even keel.” In Asia and Africa, most babies still share their parents’ beds. But in the West, bed-sharing fell during the industrial revolution as increased wealth let people afford separate rooms and value was placed on teaching early independence. © Copyright New Scientist Ltd.

Keyword: Sleep
Link ID: 23829 - Posted: 07.13.2017

By Giorgia Guglielmi Semen has something in common with the brains of Alzheimer’s sufferers: Both contain bundles of protein filaments called amyloid fibrils. But although amyloid accumulation appears to damage brain cells, these fibrils may be critical for reproduction. A new study suggests that semen fibrils immobilize subpar sperm, ensuring that only the fittest ones make it to the egg. “I’m sure that from the very first time scientists described semen fibrils, they must have been speculating what their natural function was,” says Daniel Otzen, an expert in protein aggregates at Aarhus University in Denmark, who did not participate in the research. “This seems to be the smoking gun.” Researchers discovered semen fibrils in 2007. At first, they seemed like mostly bad news. Scientists showed that the fibrils, found in the seminal fluid together with sperm cells and other components, can bind to HIV, helping it get inside cells. But the fibrils are found in most primates, notes Nadia Roan, a mucosal biologist at the University of California, San Francisco. “If fibrils didn’t serve some beneficial purpose, they would have been eliminated over evolutionary time.” Because the way HIV fuses to cells is reminiscent of the way a sperm fuses to the egg, she wondered whether the fibrils facilitated fertilization. © 2017 American Association for the Advancement of Science.

Keyword: Alzheimers; Sexual Behavior
Link ID: 23828 - Posted: 07.12.2017

By Linda Geddes Many dangers stalk the bushlands of Tanzania while members of the Hadza people sleep, yet no one keeps watch. There is no need because it seems that natural variation in sleep means there’s rarely a moment when someone isn’t alert enough to raise the alarm. That’s the conclusion of a study that sheds new light on why teenagers sleep late while grandparents are often up at the crack of dawn. Fifty years ago, psychologist Frederick Snyder proposed that animals who live in groups stay vigilant during sleep, by having some stay awake while others rest. However, no one had tested this sentinel hypothesis in humans until now. One way of maintaining this constant vigilance might be by the evolution of different chronotypes – individual differences in when we tend to sleep. This changes as we age, with teenagers shifting towards later bedtimes, and older people towards earlier bedtimes. Would such variability be enough to keep a community safe at night? To investigate, David Samson, then at the University of Toronto in Canada, and his colleagues turned to the Hadza, a group of hunter-gatherers in northern Tanzania. The Hadza sleep in grass huts, each containing one or two adults and often several children. They live in camps of around 30 adults, although several other camps may be close by. Samson recruited 33 adults from two nearby groups of 22 huts and asked them to wear motion-sensors on their wrists to monitor sleep, for 20 days. “It turned out that it was extremely rare for there to be synchronous sleep,” says Samson, now at Duke University in Durham, North Carolina. © Copyright New Scientist Ltd.

Keyword: Sleep; Development of the Brain
Link ID: 23827 - Posted: 07.12.2017

By Jessica Wright, Spectrum on July 11, 2017 Treatment with the hormone oxytocin improves social skills in some children with autism, suggest results from a small clinical trial. The results appeared today in the Proceedings of the National Academy of Sciences [1]. Oxytocin, dubbed the ‘love hormone,’ enhances social behavior in animals. This effect makes it attractive as a potential autism treatment. But studies in people have been inconsistent: Some small trials have shown that the hormone improves social skills in people with autism, and others have shown no benefit. This may be because only a subset of people with autism respond to the treatment. In the new study, researchers tried to identify this subset. The same team showed in 2014 that children with relatively high blood levels of oxytocin have better social skills than do those with low levels [2]. In their new work, the researchers examined whether oxytocin levels in children with autism alter the children’s response to treatment with the hormone. They found that low levels of the hormone prior to treatment are associated with the most improvement in social skills. “We need to be thinking about a precision-medicine approach for autism,” says Karen Parker, associate professor of psychiatry at Stanford University in California, who co-led the study. “There’s been a reasonable number of failed [oxytocin] trials, and the question is: Could they have failed because all of the kids, by blind, dumb luck, had really high baseline oxytocin levels?” The study marks the first successful attempt to find a biological marker that predicts response to the therapy. © 2017 Scientific American

Keyword: Autism; Hormones & Behavior
Link ID: 23826 - Posted: 07.12.2017