Most Recent Links



Links 21 - 40 of 22107

By SABRINA TAVERNISE WASHINGTON — The Food and Drug Administration approved the first drug to treat patients with the most common childhood form of muscular dystrophy, a vivid example of the growing power that patients and their advocates wield over the federal government’s evaluation of drugs. The agency’s approval went against the recommendation of its experts. The main clinical trial of the drug was small, involving only 12 boys with the disease known as Duchenne muscular dystrophy, and did not have an adequate control group of boys who had the disease but did not take the drug. A group of independent experts convened by the agency this spring said there was not enough evidence that it was effective. But the vote was close. Large and impassioned groups of patients, including boys in wheelchairs, and their advocates, weighed in. The muscular dystrophy community is well organized and has lobbied for years to win approval for the drug, getting members of Congress to write letters to the agency. A decision on the drug had been delayed for months. The approval was so controversial that F.D.A. employees fought over it, a dispute that was taken to the agency’s commissioner, Dr. Robert M. Califf, who ultimately decided that it would stand. The approval delighted the drug’s advocates and sent the share price of the drug’s maker, Sarepta Therapeutics, soaring. But it was taken as a deeply troubling sign among drug policy experts who believe the F.D.A. has been far too influenced by patient advocates and drug companies, and has allowed the delicate balance in drug approvals to tilt toward speedy decisions based on preliminary data and away from more conclusive evidence of effectiveness and safety. © 2016 The New York Times Company

Keyword: Movement Disorders; Muscles
Link ID: 22671 - Posted: 09.20.2016

Researchers at the National Institutes of Health have discovered a two-way link between depression and gestational diabetes. Women who reported feeling depressed during the first two trimesters of pregnancy were nearly twice as likely to develop gestational diabetes, according to an analysis of pregnancy records. Conversely, a separate analysis found that women who developed gestational diabetes were more likely to report postpartum depression six weeks after giving birth, compared to a similar group of women who did not develop gestational diabetes. The study was published online in Diabetologia. Gestational diabetes is a form of diabetes (high blood sugar level) occurring only in pregnancy, which if untreated may cause serious health problems for mother and infant. “Our data suggest that depression and gestational diabetes may occur together,” said the study’s first author, Stefanie Hinkle, Ph.D., staff scientist in the Division of Intramural Population Health Research at the NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). “Until we learn more, physicians may want to consider observing pregnant women with depressive symptoms for signs of gestational diabetes. They also may want to monitor women who have had gestational diabetes for signs of postpartum depression.” Although obesity is known to increase the risk for gestational diabetes, the likelihood of gestational diabetes was higher for non-obese women reporting depression than for obese women with depression.

Keyword: Depression; Obesity
Link ID: 22670 - Posted: 09.20.2016

By Meredith Wadman Last year, in a move to counter charges that it has neglected the health and safety of its players, the National Football League (NFL) tapped Elizabeth “Betsy” Nabel as its first chief health and medical adviser, a paid position to which she told The Boston Globe she devotes about 1 day a month, plus some nights and weekends. (She and NFL have not disclosed her salary.) And last week, Nabel answered Science’s questions on the heels of NFL’s 14 September announcement that it will devote $40 million in new funding to medical research, primarily neuroscience relevant to repetitive head injuries—with grant applications judged by an NFL-convened panel of scientists, rather than by National Institutes of Health (NIH) study sections. Nabel is well known to many medical scientists as the cardiologist who directed the National Heart, Lung, and Blood Institute at NIH, then left that job in 2009 to become president of a prestigious Harvard University–affiliated teaching hospital: Brigham and Women’s Hospital in Boston. Nabel’s new role with NFL came under media scrutiny in May, when a report by Democrats on the House of Representatives Energy and Commerce Committee found that NFL inappropriately tried to influence the way its “unrestricted” donation to NIH was spent. It revealed, for example, that last year Nabel contacted NIH’s neurology institute director Walter Koroshetz to question the objectivity of an NIH study section and of a principal investigator whose team the peer reviewers had just awarded a $16 million grant. Robert Stern and his group at Boston University, with others, were proposing to image the brains and chart the symptoms of scores of college and professional football players across time. 
NFL suggested that the scientists, who have led in establishing the link between repetitive head injury and the neurodegenerative brain disease chronic traumatic encephalopathy (CTE), were not objective; Nabel described them in one email as “a more marginal group” whose influence it would be well to “dilute.” The scientists were to have been paid from $30 million that NFL donated to NIH in 2012. After the league objected to its $16 million going to fund the Boston University–led team—it did offer to fund $2 million of the amount—NIH’s neurology institute ended up wholly funding the 7-year grant with its own money. © 2016 American Association for the Advancement of Science

Keyword: Brain Injury/Concussion
Link ID: 22669 - Posted: 09.20.2016

By CATHERINE SAINT LOUIS Attention deficit disorder is the most common mental health diagnosis among children under 12 who die by suicide, a new study has found. Very few children aged 5 to 11 take their own lives, and little is known about these deaths. The new study, which included deaths in 17 states from 2003 to 2012, compared 87 children aged 5 to 11 who committed suicide with 606 adolescents aged 12 to 14 who did, to see how they differed. The research was published on Monday in the journal Pediatrics. About a third of the children of each group had a known mental health problem. The very young who died by suicide were most likely to have had attention deficit disorder, or A.D.D., with or without accompanying hyperactivity. By contrast, nearly two-thirds of early adolescents who took their lives struggled with depression. Suicide prevention has focused on identifying children struggling with depression; the new study provides an early hint that this strategy may not help the youngest suicide victims. “Maybe in young children, we need to look at behavioral markers,” said Jeffrey Bridge, the paper’s senior author and an epidemiologist at the Research Institute at Nationwide Children’s Hospital in Columbus, Ohio. Jill Harkavy-Friedman, the vice president of research at the American Foundation for Suicide Prevention, agreed. “Not everybody who is at risk for suicide has depression,” even among adults, said Dr. Harkavy-Friedman, who was not involved in the new research. Yet the new research does not definitively establish that attention deficit disorder and attention deficit hyperactivity disorder, or A.D.H.D., are causal risk factors for suicide in children, Dr. Bridge said. Instead, the findings suggest that “suicide is potentially a more impulsive act among children.” © 2016 The New York Times Company

Keyword: ADHD; Depression
Link ID: 22668 - Posted: 09.19.2016

By PAGAN KENNEDY In 1914, The Lancet reported on a clergyman who was found dead in a pool; he had left behind this suicide note: “Another sleepless night, no real sleep for weeks. Oh, my poor brain, I cannot bear the lengthy, dark hours of the night.” I came across that passage with a shock of recognition. Many people think that the worst part of insomnia is the daytime grogginess. But like that pastor, I suffered most in the dark hours after midnight, when my desire for sleep, my raging thirst for it, would drive me into temporary insanity. On the worst nights, my mind would turn into a mad dog that snapped and gnawed itself. Though one in 10 American adults suffer from chronic insomnia, we have yet to answer the most fundamental questions about the affliction. Scientists are still arguing about the mechanisms of sleep and the reasons it fails in seemingly healthy people. There are few — if any — reliable treatments for insomnia. At the same time, medical journals warn that bad sleep can fester into diseases like cancer and diabetes. Deep in the night, those warnings scuttle around my mind like rats. About 18 months ago, during a particularly grueling period, I felt so desperate that I consulted yet another doctor — but all he did was suggest the same drugs that had failed me in the past. I was thrown back once again on my own ways of coping. As a child, I had invented mental games to distract myself. For instance, I would compile a list of things and people that made me happy, starting with words that began with A and moving through the alphabet. One night, I was in the Qs, trying to figure out what to add to quesadillas, queer theory and Questlove. Then, suddenly, the game infuriated me — why, why, why did I have to spend hours doing this? In the red glare of the digital clock, my brain rattled its cage. I prepared for a wave of lunacy. 
But instead of a meltdown, I had a wild idea: What if there was another, easier, way to drive the miserable thoughts from my mind? I began to fantasize about a machine that would do the thinking for me. I pictured it like another brain that would fit on top of my head. The next day, I cobbled together my first insomnia machine. © 2016 The New York Times Company

Keyword: Sleep
Link ID: 22667 - Posted: 09.19.2016

By DAVID Z. HAMBRICK and ALEXANDER P. BURGOYNE ARE you intelligent — or rational? The question may sound redundant, but in recent years researchers have demonstrated just how distinct those two cognitive attributes actually are. It all started in the early 1970s, when the psychologists Daniel Kahneman and Amos Tversky conducted an influential series of experiments showing that all of us, even highly intelligent people, are prone to irrationality. Across a wide range of scenarios, the experiments revealed, people tend to make decisions based on intuition rather than reason. In one study, Professors Kahneman and Tversky had people read the following personality sketch for a woman named Linda: “Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in antinuclear demonstrations.” Then they asked the subjects which was more probable: (A) Linda is a bank teller or (B) Linda is a bank teller and is active in the feminist movement. Eighty-five percent of the subjects chose B, even though logically speaking, A is more probable. (All feminist bank tellers are bank tellers, though some bank tellers may not be feminists.) In the Linda problem, we fall prey to the conjunction fallacy — the belief that the co-occurrence of two events is more likely than the occurrence of one of the events. In other cases, we ignore information about the prevalence of events when judging their likelihood. We fail to consider alternative explanations. We evaluate evidence in a manner consistent with our prior beliefs. And so on. Humans, it seems, are fundamentally irrational. But starting in the late 1990s, researchers began to add a significant wrinkle to that view. As the psychologist Keith Stanovich and others observed, even the Kahneman and Tversky data show that some people are highly rational. 
In other words, there are individual differences in rationality, even if we all face cognitive challenges in being rational. So who are these more rational people? Presumably, the more intelligent people, right? © 2016 The New York Times Company
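The logic behind the correct answer can be made explicit: for any two events, the probability of both occurring together can never exceed the probability of either one alone, since feminist bank tellers are a subset of bank tellers. A minimal simulation illustrates this (a hypothetical sketch; the percentages are illustrative assumptions, not figures from the study):

```python
import random

random.seed(0)

# Simulate a hypothetical population. The base rates below are
# made-up illustrative values, not data from Kahneman and Tversky.
population = []
for _ in range(100_000):
    bank_teller = random.random() < 0.05   # assume 5% are bank tellers
    feminist = random.random() < 0.60      # assume 60% are feminists
    population.append((bank_teller, feminist))

n = len(population)
p_teller = sum(t for t, f in population) / n
p_teller_and_feminist = sum(t and f for t, f in population) / n

# The conjunction is never more probable than the single event:
# every (teller AND feminist) case is also a teller case.
assert p_teller_and_feminist <= p_teller
```

Whatever base rates are plugged in, the subset relationship guarantees option A is at least as probable as option B, which is why the 85% of subjects who chose B committed the conjunction fallacy.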

Keyword: Intelligence; Attention
Link ID: 22666 - Posted: 09.19.2016

By Colin Barras It is not just about speed. The only songbird known to perform a rapid tap dance during courtship makes more noise with its feet during its routines than at other times. The blue-capped cordon-bleu (Uraeginthus cyanocephalus) from East Africa is blessed with the attributes of a Broadway star: striking good looks, a strong singing voice – and fine tap-dancing skills. The dances are so fast that they went unnoticed until 2015, when Masayo Soma at Hokkaido University in Japan and her colleagues captured the performances on high-speed film. The bird’s speciality is a left-right-left shuffle – only with the feet striking the perch up to 50 times a second. The vision of some birds operates at a faster rate than that of humans, so the cordon-bleu’s dance may simply be about creating an impressive visual performance. But it could also be about winning over a potential mate with rhythm. To explore the idea, Soma and her colleagues recorded audio of the courtship dances, which both males and females perform. They found that the tap dances are unusually loud: the feet strike the branch with enough force to generate sound averaging 30 decibels. This typically drops to just 20 decibels when a bird’s feet strike the branch as it hops around when it is not performing, which means the step sounds are not just a by-product of movement. © Copyright Reed Business Information Ltd.

Keyword: Sexual Behavior
Link ID: 22665 - Posted: 09.19.2016

By Colin Barras Subtract 8 from 52. Did you see the calculation in your head? While a leading theory suggests our visual experiences are linked to our understanding of numbers, a study of people who have been blind from birth suggests the opposite. The link between vision and number processing is strong. Sighted people can estimate the number of people in a crowd just by looking, for instance, while children who can mentally rotate an object and correctly imagine how it might look from a different angle often develop better mathematical skills. “It’s actually hard to think of a situation when you might process numbers through any modality other than vision,” says Shipra Kanjlia at Johns Hopkins University in Baltimore, Maryland. But blind people can do maths too. To understand how they might compensate for their lack of visual experience, Kanjlia and her colleagues asked 36 volunteers – 17 of whom had been blind at birth – to do simple mental arithmetic inside an fMRI scanner. To level the playing field, the sighted participants wore blindfolds. We know that a region of the brain called the intraparietal sulcus (IPS) is involved in number processing in sighted people, and brain scans revealed that the same area is similarly active in blind people too. “It’s really surprising,” says Kanjlia. “It turns out brain activity is remarkably similar, at least in terms of classic number processing.” This may mean we have a deep understanding of how to handle numbers that is entirely independent of visual experience. This suggests we are all born with a natural understanding of numbers – an idea many researchers find difficult to accept. © Copyright Reed Business Information Ltd.

Keyword: Vision; Attention
Link ID: 22664 - Posted: 09.17.2016

By Catherine Caruso Most of us think little of hopping on Google Maps to look at everything from a bird’s-eye view of an entire continent to an on-the-ground view of a specific street, all carefully labeled. Thanks to a digital atlas published this week, the same is now possible with the human brain. Ed Lein and colleagues at the Allen Institute for Brain Science in Seattle have created a comprehensive, open-access digital atlas of the human brain, which was published this week in The Journal of Comparative Neurology. “Essentially what we were trying to do is to create a new reference standard for a very fine anatomical structural map of the complete human brain,” says Lein, the principal investigator on the project. “It may seem a little bit odd, but actually we are a bit lacking in types of basic reference materials for mapping the human brain that we have in other organisms like mouse or like monkey, and that is in large part because of the enormous size and complexity of the human brain.” The project, which spanned five years, focused on a single healthy postmortem brain from a 34-year-old woman. The researchers started with the big picture: They did a complete scan of the brain using two imaging techniques (magnetic resonance imaging and diffusion weighted imaging), which allowed them to capture both overall brain structure and the connectivity of brain fibers. Next the researchers took the brain and sliced it into 2,716 very thin sections for fine-scale, cellular analysis. They stained a portion of the sections with a traditional Nissl stain to gather information about general cell architecture. They then used two other stains to selectively label certain aspects of the brain, including structural elements of cells, fibers in the white matter, and specific types of neurons. © 2016 Scientific American

Keyword: Brain imaging; Development of the Brain
Link ID: 22663 - Posted: 09.17.2016

Dean Burnett You remember that time a children’s TV presenter, one who has been working in children’s television for decades and is now employed on a channel aimed at under-8-year-olds, decided to risk it all and say one of the worst possible swear words on a show for pre-schoolers that he is famous for co-hosting? Remember how he took a huge risk for no appreciable gain and uttered a context-free profanity to an audience of toddlers? How he must have wanted to swear on children’s TV but paradoxically didn’t want anyone to notice so “snuck it in” as part of a song, where it would be more ambiguous? How all the editors and regulators at the BBC happened to completely miss it and allow it to be aired? Remember this happening? Well you shouldn’t, because it clearly didn’t. No presenter and/or channel would risk their whole livelihood in such a pointless, meaningless way, especially not the ever-pressured BBC. And, yet, an alarming number of people do think it happened. Apparently, there have been some “outraged parents” who are aghast at the whole thing. This seems reasonable in some respects; if your toddler was subjected to extreme cursing then as a parent you probably would object. On the other hand, if your very small child is able to recognise strong expletives, then perhaps misheard lyrics on cheerful TV shows aren’t the most pressing issue in their life. Regardless, a surprising number of people report that they did genuinely “hear” the c-word. This is less likely to be due to a TV presenter having some sort of extremely-fleeting breakdown, and more likely due to the quirks and questionable processing of our senses by our powerful yet imperfect brains. © 2016 Guardian News and Media Limited

Keyword: Hearing; Attention
Link ID: 22662 - Posted: 09.17.2016

Napping for more than an hour during the day could be a warning sign for type-2 diabetes, Japanese researchers suggest. They found the link after analysing observational studies involving more than 300,000 people. UK experts said people with long-term illnesses and undiagnosed diabetes often felt tired during the day. But they said there was no evidence that napping caused or increased the risk of diabetes. The large study, carried out by scientists at the University of Tokyo, is being presented at a meeting of the European Association for the Study of Diabetes in Munich. Their research found there was a link between long daytime naps of more than 60 minutes and a 45% increased risk of type-2 diabetes, compared with no daytime napping - but there was no link with naps of less than 40 minutes. The researchers said long naps could be a result of disturbed sleep at night, potentially caused by sleep apnoea. And this sleeping disorder could increase the risk of heart attacks, stroke, cardiovascular problems and other metabolic disorders, including type-2 diabetes. Sleep deprivation, caused by work or social life patterns, could also lead to increased appetite, which could increase the risk of type-2 diabetes. But it was also possible that people who were less healthy or in the early stages of diabetes were more likely to nap for longer during the day. Shorter naps, in contrast, were more likely to increase alertness and motor skills, the authors said. © 2016 BBC.

Keyword: Sleep; Obesity
Link ID: 22661 - Posted: 09.17.2016

Tina Hesman Saey Color vision may actually work like a colorized version of a black-and-white movie, a new study suggests. Cone cells, which sense red, green or blue light, detect white more often than colors, researchers report September 14 in Science Advances. The textbook-rewriting discovery could change scientists’ thinking about how color vision works. For decades, researchers have known that three types of cone cells in the retina are responsible for color vision. Those cone cells were thought to send “red,” “green” and “blue” signals to the brain. The brain supposedly combines the colors, much the way a color printer does, to create a rainbow-hued picture of the world (including black and white). But the new findings indicate that “the retina is doing more of the work, and it’s doing it in a more simpleminded way,” says Jay Neitz, a color vision scientist at the University of Washington in Seattle who was not involved in the study. Red and green cone cells each come in two types: One type signals “white”; another signals color, vision researcher Ramkumar Sabesan and colleagues at the University of California, Berkeley, discovered. The large number of cells that detect white (and black — the absence of white) create a high-resolution black-and-white picture of a person’s surroundings, picking out edges and fine details. Red- and green-signaling cells fill in low-resolution color information. The process works much like filling in a coloring book or adding color to a black-and-white film, says Sabesan, who is now at the University of Washington. |© Society for Science & the Public 2000 - 2016

Keyword: Vision
Link ID: 22660 - Posted: 09.15.2016

By Brian Owens It’s certainly something to crow about. New Caledonian crows are known for their ingenious use of tools to get at hard-to-reach food. Now it turns out that their Hawaiian cousins are adept tool-users as well. Christian Rutz at the University of St Andrews in the UK has spent 10 years studying the New Caledonian crow and wondered whether any other crow species are disposed to use tools. So he looked for crows that have similar features to the New Caledonian crow – a straight bill and large, mobile eyes that allow it to manipulate tools, much as archaeologists use opposable thumbs as an evolutionary signature for tool use in early humans. “The Hawaiian crow really stood out,” he says. “They look quite similar.” Hawaiian crows are extinct in the wild, but 109 birds still live in two captive breeding facilities in Hawaii. That meant Rutz was able to test pretty much every member of the species. He stuffed tasty morsels into a variety of holes and crevices in a log, and gave the birds a variety of sticks to see if they would use them to dig out the food. Almost all of them did, and most extracted the food in less than a minute, faster than the researchers themselves could. “It’s mind-blowing,” says Rutz. “They’re very good at getting the tool in the right position, and if they’re not happy with it they’ll modify it or make their own.” © Copyright Reed Business Information Ltd.

Keyword: Intelligence; Learning & Memory
Link ID: 22659 - Posted: 09.15.2016

Richard J. McNally The welcoming letter to the class of 2020 in which Jay Ellison, a dean at the University of Chicago, told incoming students not to expect trigger warnings on campus struck a nerve in a highly polarized debate that is embroiling academia. Trigger warnings, critics claim, imperil academic freedom and further infantilize a cohort of young people accustomed to coddling by their helicopter parents. Proponents of trigger warnings point out that many students have suffered trauma, exemplified by alarming rates of sexual assault on campus. Accordingly, they urge professors to warn students about potentially upsetting course materials and to exempt distressed students from classes covering topics likely to trigger post-traumatic stress disorder, or P.T.S.D., symptoms, such as flashbacks, nightmares and intrusive thoughts about one’s personal trauma. Proponents of trigger warnings are deeply concerned about the emotional well-being of students, especially those with trauma histories. Yet lost in the debate are two key points: Trauma is common, but P.T.S.D. is rare. Epidemiological studies show that many people are exposed to trauma in their lives, and most have had transient stress symptoms. But only a minority fails to recover, thereby developing P.T.S.D. Students with P.T.S.D. are those most likely to have adverse emotional reactions to curricular material, not those with trauma histories whose acute stress responses have dissipated. However, trigger warnings are countertherapeutic because they encourage avoidance of reminders of trauma, and avoidance maintains P.T.S.D. Severe emotional reactions triggered by course material are a signal that students need to prioritize their mental health and obtain evidence-based, cognitive-behavioral therapies that will help them overcome P.T.S.D. 
These therapies involve gradual, systematic exposure to traumatic memories until their capacity to trigger distress diminishes. © 2015 The New York Times Company

Keyword: Stress; Learning & Memory
Link ID: 22658 - Posted: 09.15.2016

By Jessica Hamzelou After developing post-traumatic stress disorder following a rape, Karestan Koenen made it her career to study the condition. Now at Harvard University, Koenen is leading the largest ever genetic study of PTSD, by sifting through the genomes of tens of thousands of people (see “Why women are more at risk of PTSD – and how to prevent it”). She tells New Scientist how her experiences shaped her career. What was your idea of PTSD before you experienced it yourself? I would have associated it with men who served in the military – the stereotype of a Vietnam veteran who has experienced really horrible combat, and comes back and has nightmares about it. Do you think that is how PTSD is perceived by the public generally? Yes. People know that PTSD is related to trauma, and that people can have flashbacks and nightmares. But they tend to think it is associated with combat. A lot of popular images of PTSD come from war movies, and people tend to associate being a soldier with being a man. They are less aware that most PTSD is related to things that happen to civilians – things like rape, sexual assault and violence, which can affect women more than men. Is this misperception of PTSD problematic? It’s a problem in the sense that women or men who have PTSD from non-combat experiences might not recognise what they have as PTSD, and because of that, may not end up getting help. And if you saw it in a loved one, you may not understand what was going on with them. © Copyright Reed Business Information Ltd.

Keyword: Stress; Genes & Behavior
Link ID: 22657 - Posted: 09.15.2016

By Krystnell A. Storr This one goes out to the head bobbers, the window seat sleepers, and the open-mouth breathers — there is no shame in being able to fall asleep anywhere, and at any time. Be proud, and, if you can’t help it, snore loud. Scientists have come to a consensus that our bodies definitely need sleep, but we don’t all need the same amount. The next step for them is to figure out where the process of sleep starts and ends in the body. And, like a good movie, one revelation about sleep only leads to another. Think of yourself as a very minor character in the scientific story of fatigue. The real star of this cozy mystery is the fruit fly, an A-lister in sleep science. Thanks to fruit flies, we understand two of the basic factors that govern sleep: a biological clock, which scientists know a lot about, and a homeostatic switch, which they only just discovered and are beginning to understand. Let’s start with this biological clock. The clock that is connected to sleep is controlled by a circadian rhythm and uses environmental cues such as sunlight to tell the body when to wake up. This sun-sleep connection in humans and flies alike got scientists like Russell Foster, a professor at Oxford University in the United Kingdom, asking questions such as: What happens when we don’t have the mechanisms in our eye to distinguish dawn from dusk and send that message to the brain? Why can we still fall asleep according to the circadian rhythm? The answer, Foster said, is that mammals have a third layer of photoreceptors in the eye. It used to be that scientists thought rods and cones, cells that help us process images, were the only ones in the eye that worked to detect light. But when they removed these cells in mice, they noticed that the mice could still keep up with the circadian rhythm. The hidden cells, they found, were intrinsically sensitive to light and acted as a backup measure to keep us on our sleep schedule, whether we can see that the sun is up or not.

Keyword: Biological Rhythms; Sleep
Link ID: 22656 - Posted: 09.15.2016

By Rachel Feltman In the age of the quantified self, products that promise to track your habits and fix your behavior are a dime a dozen. Find out how much you walk; do that more. Find out how much junk you eat; do that less. Correct your posture in real time, and get feedback as you strengthen your pelvic floor muscles. More and more companies are built on the notion that any problem can be solved if you get enough numbers to find a pattern. In that sense, Sense — a sleep tracker made by the start-up Hello — isn't all that unusual. But the company's new lead scientist is just getting his hands on two years of user sleep data, and he seems particularly passionate about using it for good. Matthew Walker, a professor of neuroscience and psychology at the University of California, Berkeley, and director of the U.C. Berkeley Sleep and Neuroimaging Laboratory, does not mince words when it comes to snoozing. "It’s very clear right now that the sleep-loss epidemic is the greatest public health crisis in First World nations of the 21st century," Walker told The Washington Post. "Every disease that is killing us, in First World countries, can be linked to loss of sleep." Indeed, the Centers for Disease Control and Prevention states that lack of sleep — in addition to causing fatal accidents and injuries — has been linked to an increased risk of hypertension, diabetes, depression, obesity and even cancer. Just about all scientists and medical professionals agree that good sleep helps keep the body healthy. © 1996-2016 The Washington Post

Keyword: Sleep
Link ID: 22655 - Posted: 09.15.2016

By GINA KOLATA A few years ago, Richard Kahn, the now-retired chief scientific and medical officer of the American Diabetes Association, was charged with organizing a committee to prescribe a diet plan for people with diabetes. He began by looking at the evidence for different diets, asking which, if any, best controlled diabetes. “When you look at the literature, whoa is it weak. It is so weak,” Dr. Kahn said in a recent interview. Studies tended to be short term, diets unsustainable, differences between them clinically insignificant. The only thing that really seemed to help people with diabetes was weight loss — and for weight loss there is no magic diet. But people want diet advice, Dr. Kahn reasoned, and the association really should say something about diets. So it, like the National Institutes of Health, went with the Department of Agriculture’s food pyramid. Why? “It’s a diet for all America,” Dr. Kahn said. ”It has lots of fruits and vegetables and a reasonable amount of fat.” That advice, though, recently came under attack in a New York Times commentary written by Sarah Hallberg, an osteopath at a weight loss clinic in Indiana, and Osama Hamdy, the medical director of the obesity weight loss program at the Joslin Diabetes Center at Harvard Medical School. There is a diet that helps with diabetes, the two doctors said, one that restricts — or, according to Dr. Hallberg, severely restricts — carbohydrates. “If the goal is to get patients off their medications, including insulin, and resolve rather than just control their diabetes, significant carb restriction is by far the best nutrition plan,” Dr. Hallberg said in an email. “This would include elimination of grains, potatoes and sugars and all processed foods. There is a significant and ever growing body of literature that supports this method.” She is in private practice at Indiana University Health Arnett Hospital and is medical director of a startup developing nutrition-based medical interventions. 
© 2016 The New York Times Company

Keyword: Obesity
Link ID: 22654 - Posted: 09.15.2016

André Corrêa d’Almeida and Amanda Sue Grossi Development. Poverty. Africa. These are just three words on a page – almost no information at all – but how many realities did our readers just conjure? And how many thoughts filled the spaces in-between? Cover yourselves. Your biases are showing. In the last few decades, groundbreaking work by psychologists and behavioural economists has exposed unconscious biases in the way we think. And as the World Bank’s 2015 World Development Report points out, development professionals are not immune to these biases. There is a real possibility that seemingly unbiased and well-intentioned development professionals are capable of making consequential mistakes, with significant impacts upon the lives of others, namely the poor. The problem arises when mindsets are just that – set. As the work of Daniel Kahneman and Amos Tversky has shown, development professionals – like people generally – have two systems of thinking: the automatic and the deliberative. For the automatic, instead of performing complex rational calculations every time we need to make a decision, much of our thinking relies on pre-existing mental models and shortcuts. These are based on assumptions we create throughout our lives and that stem from our experiences and education. More often than not, these mental models are incomplete and shortcuts can lead us down the wrong path. Thinking automatically then becomes thinking harmfully. © 2016 Guardian News and Media Limited

Keyword: Attention
Link ID: 22653 - Posted: 09.15.2016

By Rachel Becker Optical illusions have a way of breaking the internet, and the latest visual trick looks like it’s well on its way. On Sunday afternoon, game developer Will Kerslake tweeted a picture of intersecting gray lines on a white background. Twelve black dots blink in and out of existence where the gray lines meet. In the six hours since he posted the photo to Twitter, it’s been shared more than 6,000 times, with commenters demanding to know why they can’t see all 12 dots at the same time. The optical illusion was first posted to Facebook about a day ago by Japanese psychology professor Akiyoshi Kitaoka, and it has been shared more than 4,600 times so far. But the origin of this bit of visual trickery is a scientific paper published in the journal Perception in 2000. To be clear, there really are 12 black dots in the image. But (most) people can’t see all 12 dots at the same time, which is driving people nuts. "They think, 'It’s an existential crisis,'" says Derek Arnold, a vision scientist at the University of Queensland in Australia. "'How can I ever know what the truth is?'" But, he adds, scientists who study the visual system know that perception doesn’t always equal reality. In this optical illusion, the black dot in the center of your vision should always appear. But the black dots around it seem to appear and disappear. That’s because humans have pretty bad peripheral vision. If you focus on a word in the center of this line you’ll probably see it clearly. But if you try to read the words at either end without moving your eyes, they most likely look blurry. As a result, the brain has to make its best guess about what’s most likely to be going on in the fuzzy periphery — and fill in the mental image accordingly. © 2016 Vox Media, Inc.

Keyword: Vision
Link ID: 22652 - Posted: 09.15.2016