Most Recent Links
Sam Doernberg and Joe DiPietro

It’s the first day of class, and we—a couple of instructors from Cornell—sit around a table with a few of our students as the rest trickle in. Anderson, one of the students seated across from us, smiles and says, “I’m going to get an A+ in your class.” “No,” retorts VanAntwerp, another student, “I’m getting the A+.”

You might think that this scene is typical of classes at a school like Cornell University, where driven students compete for top marks. But this didn’t happen on a college campus: It took place in a maximum-security prison. To the outside world, they are inmates, but in the classroom, they are students enrolled in the Cornell Prison Education Program, or “CPEP.” Per New York State Department of Corrections rules, we have permission to use the inmates’ last names only—which is also often how we know them best. Those who graduate from the program—taught by Cornell instructors—will receive an associate’s degree from Cayuga Community College.

Before teaching neuroscience to prison inmates, we taught it to Cornell undergraduates as part of the teaching staff for Cornell’s Introduction to Neuroscience course. Most Cornell neuroscience students are high-achieving biology majors and premeds who are well prepared to succeed in a demanding course. They have generally gone from one academic success to another, and it is no secret that they expect a similar level of success in a neuroscience class. © 2016 by The Atlantic Monthly Group
Keyword: Learning & Memory
Link ID: 22093 - Posted: 04.12.2016
By Nicholas Bakalar Hormone therapy for prostate cancer may increase the risk for depression, a new analysis has found. Hormone therapy, or androgen deprivation therapy, a widely used prostate cancer treatment, aims to reduce levels of testosterone and other male hormones, which helps limit the spread of prostate cancer cells. From 1992 to 2006, researchers studied 78,552 prostate cancer patients older than 65, of whom 33,382 had hormone therapy. Compared with those treated with other therapies, men who received androgen deprivation therapy were 23 percent more likely to receive a diagnosis of depression, and they had a 29 percent increased risk of having inpatient psychiatric treatment. Longer hormone treatment increased the risk: Researchers found a 12 percent increased relative risk with six or fewer months of treatment, a 26 percent increased risk with seven to 11 months, and a 37 percent increased risk with a year or more. The study, in The Journal of Clinical Oncology, is observational, and does not prove causation. The senior author, Dr. Paul L. Nguyen, of Brigham and Women’s Hospital, said that research is finding “almost an avalanche of side effects” with hormone therapy. Still, for some patients, especially those with severe disease, it can be a life saver. “You have to know what the potential upside is. For some guys it will still be worth it, but for some not.” © 2016 The New York Times Company
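For readers unfamiliar with how figures like "23 percent more likely" are derived: relative risk is simply the ratio of the outcome rate in one group to the rate in a comparison group. A minimal sketch, using made-up counts rather than the study's actual data:

```python
def relative_risk(treated_cases, treated_total, control_cases, control_total):
    """Ratio of the event rate in the treated group to that in the control group."""
    treated_rate = treated_cases / treated_total
    control_rate = control_cases / control_total
    return treated_rate / control_rate

# Hypothetical illustration (NOT the study's data): if 1,230 of 10,000
# hormone-therapy patients and 1,000 of 10,000 other patients received
# a depression diagnosis, the relative risk would be 1.23 -- reported
# in news accounts as a "23 percent increased risk."
rr = relative_risk(1230, 10_000, 1000, 10_000)
print(round(rr, 2))  # prints 1.23
```

A relative risk above 1.0 indicates an elevated rate in the treated group; it says nothing by itself about the absolute size of the risk, which is why observational findings like these are hedged.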
By FRANS de WAAL

TICKLING a juvenile chimpanzee is a lot like tickling a child. The ape has the same sensitive spots: under the armpits, on the side, in the belly. He opens his mouth wide, lips relaxed, panting audibly in the same “huh-huh-huh” rhythm of inhalation and exhalation as human laughter. The similarity makes it hard not to giggle yourself. The ape also shows the same ambivalence as a child. He pushes your tickling fingers away and tries to escape, but as soon as you stop he comes back for more, putting his belly right in front of you. At this point, you need only to point to a tickling spot, not even touching it, and he will throw another fit of laughter.

Laughter? Now wait a minute! A real scientist should avoid any and all anthropomorphism, which is why hard-nosed colleagues often ask us to change our terminology. Why not call the ape’s reaction something neutral, like, say, vocalized panting? That way we avoid confusion between the human and the animal.

The term anthropomorphism, which means “human form,” comes from the Greek philosopher Xenophanes, who protested in the fifth century B.C. against Homer’s poetry because it described the gods as though they looked human. Xenophanes mocked this assumption, reportedly saying that if horses had hands they would “draw their gods like horses.”

Nowadays the term has a broader meaning. It is typically used to censure the attribution of humanlike traits and experiences to other species. Animals don’t have “sex,” but engage in breeding behavior. They don’t have “friends,” but favorite affiliation partners. Given how partial our species is to intellectual distinctions, we apply such linguistic castrations even more vigorously in the cognitive domain. By explaining the smartness of animals either as a product of instinct or simple learning, we have kept human cognition on its pedestal under the guise of being scientific. Everything boiled down to genes and reinforcement.
To think otherwise opened you up to ridicule, which is what happened to Wolfgang Köhler, the German psychologist who, a century ago, was the first to demonstrate flashes of insight in chimpanzees. © 2016 The New York Times Company
By Neuroskeptic Do you want to be more successful? Happier? More intelligent? Don’t despair. The answer, we’re told, is right in front of your nose—or rather, right behind it. It’s your own brain. Thanks to neuroscience, you can hack your gray matter. According to the sales pitch, almost anything is possible, if you can master your brain—and if you can afford to buy the products that promise to help you do that. But how many of these neuroproducts are neurobullshit? And what makes neuroscience so attractive to people with something to sell? I’m a neuroscientist who has been blogging about the brain for the past eight years. Over this time I’ve noticed a steady increase in the number of neuroscience-themed commercial products. There are brain pills to optimize your mental focus. There are futuristic-looking headbands that promise to measure or stimulate your neural activity in order to make you smarter, or help you sleep better, or even meditate better. There is no end of “brain training” apps and neuroscience-themed self-help books. These products tend to have names based around “Neuro” or “Brain.” And they will come advertised as being “created by neuroscientists,” “based on the latest brain research,” or at least endorsed by some leading brain expert. Once you look beyond the “neuro” gloss, however, you’ll see that many of these products aren’t new at all, but just old products in new packaging. A recent, and notorious, example of this was “Fifth Quarter Fresh,” a brand of chocolate milk.
Link ID: 22090 - Posted: 04.11.2016
Carl Zimmer Five days a week, you can tune into “Paternity Court,” a television show featuring couples embroiled in disputes over fatherhood. It’s entertainment with a very old theme: Uncertainty over paternity goes back a long way in literature. Even Shakespeare and Chaucer cracked wise about cuckolds, who were often depicted wearing horns. But in a number of recent studies, researchers have found that our obsession with cuckolded fathers is seriously overblown. A number of recent genetic studies challenge the notion that mistaken paternity is commonplace. “It’s absolutely ridiculous,” said Maarten H.D. Larmuseau, a geneticist at the University of Leuven in Belgium who has led much of this new research. The term cuckold traditionally refers to the husband of an adulteress, but Dr. Larmuseau and other researchers focus on those cases that produce a child, which scientists politely call “extra-pair paternity.” Until the 20th century, it was difficult to prove that a particular man was the biological father of a particular child. In 1304 a British husband went to court to dispute the paternity of his wife’s child, born while he was abroad for three years. Despite the obvious logistical challenges, the court rejected the husband’s objection. “The privity between a man and his wife cannot be known,” the judge ruled. Modern biology lifted the veil from this mystery, albeit slowly. In the early 1900s, researchers discovered that people have distinct blood types inherited from their parents. In a 1943 lawsuit, Charlie Chaplin relied on blood-type testing to prove that he was not the father of the actress Joan Barry’s child. (The court refused to accept the evidence and forced Chaplin to pay child support anyway.) © 2016 The New York Times Company
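The exclusion logic behind blood-type paternity testing of the kind the Chaplin case relied on can be sketched in a few lines: each ABO phenotype corresponds to a small set of possible genotypes, and a man is excluded only if no combination of one allele from him and one from the mother could produce the child's type. This is a simplified illustration of textbook ABO inheritance, not code from any study, and the Chaplin-case blood types used below are as commonly reported, not verified here:

```python
from itertools import product

# Possible genotypes for each ABO phenotype (simplified textbook model).
GENOTYPES = {
    "A": [("A", "A"), ("A", "O")],
    "B": [("B", "B"), ("B", "O")],
    "AB": [("A", "B")],
    "O": [("O", "O")],
}

def phenotype(allele1, allele2):
    """ABO phenotype from an allele pair (A and B are codominant, O recessive)."""
    alleles = {allele1, allele2}
    if alleles == {"A", "B"}:
        return "AB"
    if "A" in alleles:
        return "A"
    if "B" in alleles:
        return "B"
    return "O"

def can_be_father(mother_type, alleged_father_type, child_type):
    """True if some allele from each parent could combine to give the child's type."""
    for m_geno, f_geno in product(GENOTYPES[mother_type],
                                  GENOTYPES[alleged_father_type]):
        for m_allele, f_allele in product(m_geno, f_geno):
            if phenotype(m_allele, f_allele) == child_type:
                return True
    return False

# Reportedly: Barry type A, the child type B, Chaplin type O.
# A type-O father carries no B allele, so he cannot be the source of the
# child's B allele -- biologically excluded.
print(can_be_father("A", "O", "B"))  # prints False (excluded)
print(can_be_father("A", "B", "B"))  # prints True (not excluded)
```

Note the asymmetry: blood typing can only exclude a man, never prove paternity, which is why it took later DNA methods to settle such disputes affirmatively.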
Keyword: Sexual Behavior
Link ID: 22089 - Posted: 04.09.2016
Modern humans diverged from Neanderthals some 600,000 years ago – and a new study shows the Y chromosome might be what kept the two species separate. It seems we were genetically incompatible with our ancient relatives – and male fetuses conceived through sex with Neanderthal males would have miscarried. We knew that some cross-breeding between us and Neanderthals happened more recently – around 100,000 to 60,000 years ago. Neanderthal genes have been found in our genomes, on X chromosomes, and have been linked to traits such as skin colour, fertility and even depression and addiction. Now, an analysis of a Y chromosome from a 49,000-year-old male Neanderthal found in El Sidrón, Spain, suggests the chromosome has gone extinct, seemingly without leaving any trace in modern humans. This could simply be because it drifted out of the human gene pool or, as the new study suggests, it could be because genetic differences meant that hybrid offspring who had this chromosome were infertile – a genetic dead end. Fernando Mendez of Stanford University and his colleagues compared the Neanderthal Y chromosome with that of chimps, and ancient and modern humans. They found mutations in four genes that could have prevented the passage of the Y chromosome down the paternal line to hybrid children. “Some of these mutations could have played a role in the loss of Neanderthal Y chromosomes in human populations,” says Mendez. © Copyright Reed Business Information Ltd.
By Jordana Cepelewicz The brain relies on a system of chemical messengers, known as neurotransmitters, to carry missives from cell to cell. When all is well, these communications enable the brain to coordinate various functions, from complex thought to quick, knee-jerk reactions—but when the system is out of whack, serious disease or disorder can ensue. A team of researchers at the Technical University of Denmark (D.T.U.) and University of Oxford have for the first time identified the molecular structure of dopamine beta-hydroxylase (DBH), the enzyme that controls the conversion between dopamine and norepinephrine, two major neurotransmitters. Understanding the crystal structure of the enzyme could provide an ideal target for drug development. Dopamine and norepinephrine play key roles in many brain functions such as learning, memory, movement and the fight-or-flight response. Imbalances in the levels of these neurotransmitters—and the role DBH plays in regulating them—have been implicated in a wide range of disorders, including hypertension, congestive heart failure, anxiety, depression, post-traumatic stress disorder, Alzheimer’s, schizophrenia, Parkinson’s and even cocaine addiction. DBH has long intrigued biochemists but it has been challenging to perform the analyses needed to determine the protein’s structure. “This enzyme has been particularly difficult,” says Hans Christensen, a chemist at D.T.U. and the study’s lead researcher. “We tried many different expression systems before we finally succeeded. Now that we have the structure it is clear why—[it] is very intricate, with different parts of the enzyme interacting very tightly.” © 2016 Scientific American,
For decades, it was thought that scar-forming cells called astrocytes were responsible for blocking neuronal regrowth across the site of a spinal cord injury, but recent findings challenge this idea. According to a new mouse study, astrocyte scars may actually be required for repair and regrowth following spinal cord injury. The research was funded by the National Institutes of Health and published in Nature. “At first, we were completely surprised when our early studies revealed that blocking scar formation after injury resulted in worse outcomes. Once we began looking specifically at regrowth, though, we became convinced that scars may actually be beneficial,” said Michael V. Sofroniew, M.D., Ph.D., professor of neurobiology at the University of California, Los Angeles, and senior author of the study. “Our results suggest that scars may be a bridge and not a barrier towards developing better treatments for paralyzing spinal cord injuries.” Neurons communicate with one another by sending messages down long extensions called axons. When axons in the brain or spinal cord are severed, they do not grow back automatically; damaged axons in the spinal cord, for example, can result in paralysis. When an injury occurs, astrocytes become activated and migrate to the injury site, where, together with cells from the immune system, they form a scar. Scars have immediate benefits, decreasing inflammation at the injury site and preventing the spread of tissue damage. In the long term, however, the scars were thought to interfere with axon regrowth.
By Melinda Wenner Moyer What if you could pop a pill that made you smarter? It sounds like a Hollywood movie plot, but a new systematic review suggests that the decades-long search for a safe and effective “smart drug” might have notched its first success. Researchers have found that modafinil boosts higher-order cognitive function without causing serious side effects. Modafinil, which has been prescribed in the U.S. since 1998 to treat sleep-related conditions such as narcolepsy and sleep apnea, heightens alertness much as caffeine does. A number of studies have suggested that it could provide other cognitive benefits, but results were uneven. To clear up the confusion, researchers then at the University of Oxford analyzed 24 studies published between 1990 and 2014 that specifically looked at how modafinil affects cognition. In their review, which was published last year in European Neuropsychopharmacology, they found that the methods used to evaluate modafinil strongly affected the outcomes. Research that looked at the drug's effects on the performance of simple tasks—such as pressing a particular button after seeing a certain color—did not detect many benefits. Yet studies that asked participants to do complex and difficult tasks after taking modafinil or a placebo found that those who took the drug were more accurate, which suggests that it may affect “higher cognitive functions—mainly executive functions but also attention and learning,” explains study co-author Ruairidh Battleday, now a medical doctor and Ph.D. student at the University of California, Berkeley. But don't run to the pharmacy just yet. Although many doctors very likely prescribe the drug off-label to help people concentrate—indeed, a 2008 survey by the journal Nature found that one in five of its readers had taken brain-boosting drugs, and half those people had used modafinil—trials have not yet been done on modafinil's long-term effectiveness or safety. © 2016 Scientific American
By Catherine Matacic How does sign language develop? A new study shows that it takes less than five generations for people to go from simple, unconventional pantomimes—essentially telling a story with your hands—to stable signs. Researchers asked a group of volunteers to invent their own signs for a set of 24 words in four separate categories: people, locations, objects, and actions. Examples included “photographer,” “darkroom,” and “camera.” After an initial group made up the signs—pretending to shoot a picture with an old-fashioned camera for “photographer,” for example—they taught the signs to a new generation of learners. That generation then played a game where they tried to guess what sign another player in their group was making. When they got the answer right, they taught that sign to a new generation of volunteers. After a few generations, the volunteers stopped acting out the words with inconsistent gestures and started making them in ways that were more systematic and efficient. What’s more, they added markers for the four categories—pointing to themselves if the category were “person” or making the outline of a house if the category were “location,” for example—and they stopped repeating gestures, the researchers reported last month at the Evolution of Language conference in New Orleans, Louisiana. In video from the experiment, the first version of “photographer” is unpredictable and long compared with the final version, which uses the person marker and takes just half the time. The researchers say their finding supports the work of researchers in the field, who have found similar patterns of development in newly emerging sign languages. The results also suggest that learning and social interaction are crucial to this development. © 2016 American Association for the Advancement of Science
Link ID: 22084 - Posted: 04.09.2016
Sara Reardon Prozac (fluoxetine) and similar antidepressants are among the most prescribed drugs in the United States, but scientists still don’t know exactly how they work. Now one piece of that puzzle — the structure of a protein targeted by several widely used antidepressants — has been solved. The finding, reported on 6 April in Nature, could enable the development of better, more-targeted depression drugs. But it may come too late for drug companies, many of which have abandoned the search for depression treatments. Prozac and its kin — drugs called selective serotonin reuptake inhibitors (SSRIs) — were first discovered in 1972. They address one hallmark of depression: low levels of the molecule serotonin, which neurons use to signal one another. By preventing a protein called the serotonin transporter (SERT) from absorbing the serotonin back into neurons that release it, the drugs boost serotonin levels in the junctions between cells. But the details of this mechanism have long eluded researchers, who have sought to crystallize and visualize the SERT protein since the early 1990s. “It’s tough to make, and once you make it, it tends to fall apart in your hands,” says Eric Gouaux, a crystallographer at Oregon Health & Science University in Portland. Gouaux and his colleagues finally succeeded by creating small mutations in the SERT gene to make the protein more stable. For the first time, they were able to see the pocket in which two SSRIs — Paxil (paroxetine) and Lexapro (escitalopram) — bind. They also identified a second pocket, called an allosteric site. When escitalopram binds to both sites, the transporter protein and the drug bond more tightly, which increases the medicine's effect. © 2016 Nature Publishing Group
Link ID: 22083 - Posted: 04.07.2016
JUST say no. That’s supposed to be our reaction to recreational drugs. The trouble is, lots of people say yes please. As a result, the world’s governments have been waging a war on drugs for more than a century. Since 1961, the battle has been orchestrated via international treaties targeting all parts of the supply chain, from the producers to the smugglers, the sellers to the buyers. Yet this supposedly united front has developed some conspicuous cracks. Now those countries backing a different approach have called a UN meeting later this month to make the case for change. The question is whether the UN is ready to soften its stance or whether it will plough on despite mountains of evidence suggesting its zero-tolerance approach has failed. As the reformers collate this to present at the meeting, New Scientist looks at how the approaches taken by different countries stack up (see “Drugs around the world”, below), and asks what can happen next. Some nations are already taking change into their own hands. Portugal allows personal use of any drug – including cocaine and heroin – and several South and Central American countries are moving in the same direction. As for cannabis, the number of places where its open sale has been decriminalised in some form grows ever larger. © Copyright Reed Business Information Ltd.
Keyword: Drug Abuse
Link ID: 22082 - Posted: 04.07.2016
By Chris Brown and Chris Corday Canada's infatuation with getting a legal high may soon lead straight to Mary Jean Dunsdon's Vancouver kitchen. The self-described diva of cooking with cannabis has been baking and selling intoxicating edibles for the better part of 20 years. "I've easily sold 700,000 to one million cookies," she told CBC News recently in her kitchen. To her customers, Dunsdon, best known by her nickname Watermelon, is a trusted brand. "I've done it all: 'nice cream cones', marijuana bacon, I've made 'weedish meatballs'," she said. With legalization on the way in Canada, Dunsdon is hoping her underground bakery and the goodies she sells to a loyal base of medical and recreational customers will finally emerge from the shadows and capture a slice of a new market for marijuana edibles. She has good reason to be optimistic about her future in the business of bud. In the U.S. states where recreational marijuana is already legal, edibles — basically any food or drinks containing marijuana — are the fastest growing segment of the market. New Frontier Financials, which tracks the growth of the U.S. marijuana industry, says Washington state's sale of about 280,000 units of edible marijuana in March is double what it was just 10 months ago. For Canada, it's a trend line that offers a glimpse into the future and also a cautionary tale. "Edibles will be more popular. Way more popular than smoking," said Dunsdon. During our visit, Dunsdon ground up marijuana leaf and bud and sprinkled the herb mixture over a fillet of wild B.C. chinook salmon. The topping bears a striking resemblance to pesto. "If you eat it, and eat just the right amount, it's probably the nicest thing you've ever felt," she said. ©2016 CBC/Radio-Canada.
Keyword: Drug Abuse
Link ID: 22081 - Posted: 04.07.2016
By DAN BILEFSKY LONDON — The model in the Gucci ad is young and waiflike, her frail body draped in a geometric-pattern dress as she leans back in front of a wall painted with a tree branch that appears to mimic the angle of her silhouette. On Wednesday, the Advertising Standards Authority of Britain ruled that the ad was “irresponsible” and that the model looked “unhealthily thin,” fanning a perennial debate in the fashion industry over when thin is too thin. The regulator said that the way the woman in the image had posed elongated her torso and accentuated her waist, so that it appeared to be very small. It said her “somber facial expression and dark makeup, particularly around her eyes, made her face look gaunt.” It said the offending image — a still photograph of the model that appeared in an online video posted on the website of The Times of London in December — should not appear again in its current form. The specific image was removed from the video on Gucci’s YouTube channel, though the model still appears in the ad directed by Glen Luchford. The Italian fashion brand, for its part, had defended the ad, saying it was part of a video that portrayed a dance party and that was aimed at an older and sophisticated audience. Nowhere in the ads were any models’ bones visible, it said, and they were all “toned and slim.” It noted that “it was, to some extent, a subjective issue as to whether a model looked unhealthily thin,” according to the authority. The decision by the advertising authority, an independent industry regulatory group, barred Gucci from using the image in advertisements in Britain. The ruling comes amid a longstanding debate on both sides of the Atlantic about the perils of overly thin models projecting an unhealthy body image for women.
As when critics lashed out against idealized images of “heroin chic” in the early 1990s, some have voiced concern that fashion houses are encouraging potentially hazardous behaviors by glamorizing models who are rail-thin. © 2016 The New York Times Company
Keyword: Anorexia & Bulimia
Link ID: 22080 - Posted: 04.07.2016
by Sarah Zielinski Spring has finally arrived, and birds’ nests all over the country will soon be filling up with eggs and then nestlings. Watch a nest long enough (the Science News staff is partial to the DC Eagle Cam) and you’ll see itty bitty baby birds begging for a meal. But mama birds don’t always reward that begging with food. In some species, like the tree swallow, birds that beg more will get more food. But in others, like the hoopoe, mom ignores who is begging and gives more food to the biggest chicks, researchers have found. This lack of an overall pattern has confounded ornithologists, but it seems that they may have been missing a key piece of the puzzle. A new study finds that the quality of the birds’ environment determines whether a mama bird can afford to feed all of her kids or if she has to ignore some to make sure the others survive. The study appears March 29 in Nature Communications. Stuart West of the University of Oxford and colleagues compiled data from 306 studies that looked at 143 bird species. When the birds were living in a good environment — one that had plenty of resources or a high amount of predictability — then mom would feed the chicks that beg the most, which were often the ones that needed the most help. But when the environment was poor in quality or unpredictable, then mama bird responded less to begging. © Society for Science & the Public 2000 - 2016.
Keyword: Sexual Behavior
Link ID: 22079 - Posted: 04.07.2016
By JOANNA KLEIN Misconception: Migraines are psychological manifestations of women’s inability to manage stress and emotions Actually: Neurologists are very clear that migraines are a real, debilitating medical condition related to temporary abnormal brain activity. The fact that they may be more common for some women during “that time of the month” has nothing to do with emotions. For centuries, doctors explained migraines as a woman’s problem caused by emotional disturbances like hysteria, depression or stress. “Bizarrely, the recommended cure was marriage!” said Dr. Anne MacGregor, the lead author of the British Association for the Study of Headache’s guidelines for diagnosing and managing migraines. While that prescription may be far behind us, the misconception that migraines are fueled by a woman’s inability to cope persists. “It was considered psychological, or that I was a nervous overachiever, so I would never tell people that I have them,” said Lorie Novak, an artist in her sixties who has suffered from migraines since she was 8. After reading Joan Didion’s 1968 essay “In Bed,” about the writer’s struggle with migraines, Ms. Novak decided to tackle the representation of these debilitating headaches. Starting in 2009, Ms. Novak photographed herself every time she got a migraine. Under the hashtag #notjustaheadache, hundreds of others on Twitter and Instagram have demonstrated their own frustration with a widespread lack of understanding of the reality of migraines. © 2016 The New York Times Company
Laura Sanders NEW YORK — Lip-readers’ minds seem to “hear” the words their eyes see being formed. And the better a person is at lipreading, the more neural activity there is in the brain’s auditory cortex, scientists reported April 4 at the annual meeting of the Cognitive Neuroscience Society. Earlier studies have found that auditory brain areas are active during lipreading. But most of those studies focused on small bits of language — simple sentences or even single words, said study coauthor Satu Saalasti of Aalto University in Finland. In contrast, Saalasti and colleagues studied lipreading in more natural situations. Twenty-nine people read the silent lips of a person who spoke Finnish for eight minutes in a video. “We can all lip-read to some extent,” Saalasti said, and the participants, who had no lipreading experience, varied widely in their comprehension of the eight-minute story. In the best lip-readers, activity in the auditory cortex was quite similar to that evoked when the story was read aloud, brain scans revealed. The results suggest that lipreading success depends on a person’s ability to “hear” the words formed by moving lips, Saalasti said. Citation: J. Alho et al. Similar brain responses to lip-read, read and listened narratives. Cognitive Neuroscience Society annual meeting, New York City, April 4, 2016. © Society for Science & the Public 2000 - 2016.
Link ID: 22077 - Posted: 04.07.2016
Laura Sanders NEW YORK — Cells in a brain structure known as the hippocampus are known to be cartographers, drawing mental maps of physical space. But new studies show that this seahorse-shaped hook of neural tissue can also keep track of social space, auditory space and even time, deftly mapping these various types of information into their proper places. Neuroscientist Rita Tavares described details of one of these new maps April 2 at the annual meeting of the Cognitive Neuroscience Society. Brain scans had previously revealed that activity in the hippocampus was linked to movement through social space. In an experiment reported last year in Neuron, people went on a virtual quest to find a house and job by interacting with a cast of characters. Through these social interactions, the participants formed opinions about how much power each character held, and how kindly they felt toward him or her. These judgments put each character in a position on a “social space” map. Activity in the hippocampus was related to this social mapmaking, Tavares and colleagues found. It turns out that this social map depends on the traits of the person who is drawing it, says Tavares, of Icahn School of Medicine at Mount Sinai in New York City. People with more social anxiety tended to give more power to characters they interacted with. What’s more, these people's social space maps were smaller overall, suggesting that they explored social space less, Tavares says. Tying these behavioral traits to the hippocampus may lead to a greater understanding of social behavior — and how this social mapping may go awry in psychiatric conditions, Tavares said. © Society for Science & the Public 2000 - 2016.
Keyword: Learning & Memory
Link ID: 22076 - Posted: 04.06.2016
Emily Anthes Type 'depression' into the Apple App Store and a list of at least a hundred programs will pop up on the screen. There are apps that diagnose depression (Depression Test), track moods (Optimism) and help people to “think more positive” (Affirmations!). There's Depression Cure Hypnosis (“The #1 Depression Cure Hypnosis App in the App Store”), Gratitude Journal (“the easiest and most effective way to rewire your brain in just five minutes a day”), and dozens more. And that's just for depression. There are apps pitched at people struggling with anxiety, schizophrenia, post-traumatic stress disorder (PTSD), eating disorders and addiction. This burgeoning industry may meet an important need. Estimates suggest that about 29% of people will experience a mental disorder in their lifetime. Data from the World Health Organization (WHO) show that many of those people — up to 55% in developed countries and 85% in developing ones — are not getting the treatment they need. Mobile health apps could help to fill the gap (see 'Mobilizing mental health'). Given the ubiquity of smartphones, apps might serve as a digital lifeline — particularly in rural and low-income regions — putting a portable therapist in every pocket. “We can now reach people that up until recently were completely unreachable to us,” says Dror Ben-Zeev, who directs the mHealth for Mental Health Program at the Dartmouth Psychiatric Research Center in Lebanon, New Hampshire. Public-health organizations have been buying into the concept. In its Mental Health Action Plan 2013–2020, the WHO recommended “the promotion of self-care, for instance, through the use of electronic and mobile health technologies.” And the UK National Health Service (NHS) website NHS Choices carries a short list of online mental-health resources, including a few apps, that it has formally endorsed. © 2016 Nature Publishing Group
By Sandhya Somashekhar African Americans are routinely under-treated for their pain compared with whites, according to research. A study released Monday sheds some disturbing light on why that might be the case. Researchers at the University of Virginia quizzed white medical students and residents to see how many believed inaccurate and at times "fantastical" claims about biological differences between the two races -- for example, that blacks have less sensitive nerve endings than whites or that black people's blood coagulates more quickly. They found that fully half thought at least one of the false statements presented was possibly, probably or definitely true. Moreover, those who held false beliefs often rated black patients' pain as lower than that of white patients and made less appropriate recommendations about how they should be treated. The study, published in the Proceedings of the National Academy of Sciences, could help illuminate one of the most vexing problems in pain treatment today: that whites are more likely than blacks to be prescribed strong pain medications for equivalent ailments. A 2000 study out of Emory University found that at a hospital emergency department in Atlanta, 74 percent of white patients with bone fractures received painkillers compared with 50 percent of black patients. Similarly, a paper last year found that black children with appendicitis were less likely to receive pain medication than their white counterparts. And a 2007 study found that physicians were more likely to underestimate the pain of black patients compared with other patients.