Most Recent Links
By Sandhya Somashekhar NEW WESTMINSTER, B.C. — Alanna Whitney was a weird kid. She had a strange knack for pronouncing long words. Anchovies on pizza could send her cowering under a table. Her ability to geek out on subjects such as Greek mythology and world religions could be unsettling. She drank liquids obsessively, and in her teens, her extreme water intake landed her in the hospital. Years later, she found a word that explained it all: Autistic. Instead of grieving, she felt a rush of relief. “It was the answer to every question I’d ever had,” she recalled. “It was kind of like a go-ahead to shed all of those things I could or couldn’t do and embrace myself for who I am.” So it came to be that Whitney, 24, was arranging strawberries and store-bought cookies on platters at the Queensborough Community Center for a celebration of “Autistic Pride Day,” her shoulder-length hair dyed mermaid green to match her purse and sandals. A bowl of orange earplugs sat nearby in case any of the guests found the ambient sounds overwhelming. Whitney is part of a growing movement of autistic adults who are finding a sense of community, identity and purpose in a diagnosis that most people greet with dread. These “neurodiversity” activists contend that autism — and other brain afflictions such as dyslexia and attention deficit hyperactivity disorder — ought to be treated not as a scourge to be eradicated but rather as a difference to be understood and accepted.
Link ID: 21206 - Posted: 07.23.2015
By Hanae Armitage Cataracts cloud the eyes of tens of millions of people around the world, including 17.2% of Americans over the age of 40. Currently, the only treatment is surgery—lasers or scalpels cut away the molecular grout that builds in the eye as cataracts develop, and surgeons sometimes replace the lens. But now, a team of scientists and ophthalmologists has tested a solution in dogs that may be able to dissolve the cataract right out of the eye’s lens. And the solution is itself a solution: a steroid-based eye drop. Though scientists don’t fully understand how cataracts form, they do know that the “fog” often seen by patients is a glob of broken proteins, stuck together in a malfunctioning clump. When healthy, these proteins, called crystallins, help the eye’s lens keep its structure and transparency. But as humans and animals alike get older, these crystallin proteins start to come unglued and lose their ability to function. Then they clump together and form a sheathlike obstruction in the lens, causing the signature “steamy glass” vision that accompanies cataracts. Coming up with a solution other than surgery has been tough. Scientists have been hunting for years for mutations in crystallin proteins that might offer new insights and pave the way to an alternate therapy. Now, it looks like a team led by University of California (UC), San Diego, molecular biologist Ling Zhao may have done just that. Her team came up with the eye drop idea after finding that children with a genetically inherited form of cataracts shared a mutation that stopped the production of lanosterol, an important steroid in the body. When their parents did not have the same mutation, the adults produced lanosterol and had no cataracts. © 2015 American Association for the Advancement of Science.
Link ID: 21205 - Posted: 07.23.2015
By Andrea Alfano Unexpectedly losing a loved one launched 18-year-old Debra* into an episode of major depression, triggering dangerous delusions that landed her in a hospital. Her doctor immediately started her on an antidepressant and on risperidone (Risperdal), an antipsychotic. In little more than a month, her weight shot up by 15 pounds. “Gaining weight made it even more difficult for me to want to leave my house because I felt self-conscious,” Debra says. In the medical community, antipsychotics are well known to cause significant weight gain. Gains of 20 to 35 pounds or more over the course of a year or two are not unusual. Debra's doctor never warned her, though, leaving her feeling like she was losing herself both mentally and physically. The situation is not uncommon, according to psychiatrist Matthew Rudorfer, chief of the somatic treatments program at the National Institute of Mental Health, who points out that although the U.S. Food and Drug Administration carefully tracks acute side effects such as seizures, it pays less attention to longer-term complications such as weight change. Perhaps taking their cue from the FDA, doctors tend to downplay weight-related risks that accompany many psychiatric drugs, Rudorfer says. But for Debra and many others, these side effects are not trivial. The three types of psychiatric drugs that can seriously affect body weight are reviewed below. According to a 2014 review of eight studies, as many as 55 percent of patients who take modern antipsychotics experience weight gain—a side effect that appears to be caused by a disruption of the chemical signals controlling appetite. Olanzapine (Zyprexa) and clozapine (Clozaril) are the top two offenders; studies have shown that on average these drugs cause patients to gain more than eight pounds in just 10 weeks. 
These two drugs also bear the highest risk of metabolic syndrome, which encompasses weight gain and other related disorders, including type 2 diabetes, according to a 2011 study of 90 people with schizophrenia. Although most antipsychotics are associated with weight gain, aripiprazole (Abilify) and ziprasidone (Geodon) stand out for their lower risk. © 2015 Scientific American
By James Gallagher Health editor, BBC News website The first hints a drug can slow the progression of Alzheimer's disease have emerged at a conference. Data from pharmaceutical company Eli Lilly suggests its solanezumab drug can cut the rate of the dementia's progression by about a third. The results are being met with cautious optimism, with a separate trial due to report next year. The death of brain cells in Alzheimer's is currently inexorable. Solanezumab may be able to keep them alive. Current medication, such as Aricept, can only manage the symptoms of dementia by helping the dying brain cells function. But solanezumab attacks the deformed proteins, called amyloid, that build up in the brain during Alzheimer's. It is thought the formation of sticky plaques of amyloid between nerve cells leads to damage and eventually brain cell death. Solanezumab has long been the great hope of dementia research, yet an 18-month trial of the drug seemingly ended in failure in 2012. But when Eli Lilly looked more closely at the data, there were hints it could be working for patients in the earliest stages of the disease. So the company asked just over 1,000 of the patients in the original trial with mild Alzheimer's to take the drug for another two years. And the results from this extension of the original trial have now been presented, at the Alzheimer's Association International Conference. Dr Eric Siemers, from the Lilly Research Laboratories, in Indiana, told the BBC: "It's another piece of evidence that solanezumab does have an effect on the underlying disease pathology. "We think there is a chance that solanezumab will be the first disease-modifying medication to be available." The company also started a completely separate trial in mild patients in 2012, and these results could prove to be the definitive moment for the drug. © 2015 BBC.
Link ID: 21203 - Posted: 07.22.2015
By BENEDICT CAREY Women who develop slight but detectable deficits in memory and mental acuity late in life tend to decline faster than men with mild impairment, researchers reported on Tuesday. Some two-thirds of the five million Americans with Alzheimer’s disease are women, in part because women live longer. Researchers have searched in vain for decades to determine other reasons for the disparity. The authors of the new study, who presented their work at the Alzheimer’s Association International Conference in Washington, said their findings indicated nothing about possible causes of gender differences and had no immediate implications for treatment. “All we can say at this point is that there appears to be a faster trajectory for women than men” toward dementia, said Dr. P. Murali Doraiswamy, a professor of psychiatry at the Duke Institute for Brain Sciences and the study’s senior author. Katherine Amy Lin, a student of Dr. Doraiswamy’s and a co-author, presented the study. Previous research had found a steeper decline in women with mild deficits over a period of about a year. The new study extends that finding to up to eight years. “It’s a very interesting finding, but it’s also still early, so we’re limited in what conclusions we can draw,” said Dr. Edward D. Huey, a geriatric psychiatrist at Columbia University, who was not involved in the study. “I think of this as an excellent hypothesis generator. It’s something we need to investigate more deeply.” In the study, the Duke researchers analyzed scores on standard cognitive tests taken by 398 men and women, most in their 70s, being followed as part of a large, continuing Alzheimer’s trial. The participants have been taking the cognitive tests — as well as other tests, like PET scans — on average for four years, and as long as eight years. 
Controlling for factors that influence memory and mental acuity, like age, education and genetic predisposition, the research team found that women’s scores slipped by an average of about two points a year, compared with one point for men. The team also looked at a standard measure of life quality, rating how well people functioned socially: at home, at work and with family. That, too, slipped faster for women than for men, at about the same rate. © 2015 The New York Times Company
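To make the difference in slopes concrete, here is a back-of-the-envelope linear projection. Only the one- and two-point annual declines come from the study; the baseline score of 28 and the cutoff of 24 are hypothetical numbers chosen for illustration:

```python
def years_to_threshold(baseline, annual_decline, threshold):
    """Years until a linearly declining score falls to a threshold."""
    return (baseline - threshold) / annual_decline

# Hypothetical cognitive score of 28 declining to a hypothetical cutoff of 24:
print(years_to_threshold(28, 2.0, 24))  # women (2 points/year) -> 2.0 years
print(years_to_threshold(28, 1.0, 24))  # men (1 point/year)    -> 4.0 years
```

Under a simple linear model, doubling the annual decline halves the time to reach any fixed threshold, which is what "a faster trajectory toward dementia" amounts to arithmetically.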
Ewen Callaway A mysterious group of humans crossed the Bering land bridge from Siberia into the Americas thousands of years ago, genetic analyses reveal. Modern-day signatures of this ‘ghost population’ survive in people who live deep in the Brazilian Amazon, but the two research teams who have made the discovery have different ideas about when and how these migrants reached the Americas [1, 2]. "This is an unexpected finding," says Jennifer Raff, an anthropological geneticist at the University of Texas at Austin who was not involved in either study. "It’s honestly one of the most exciting results we’ve seen in a while." North and South America were the last continents that humans settled. Previous studies of DNA from modern and ancient Native Americans suggest that the trek was made at least 15,000 years ago (although the timing is not clear-cut) by a single group dubbed the ‘First Americans’, who crossed the Bering land bridge linking Asia and North America. “The simplest hypothesis would be that a single population penetrated the ice sheets and gave rise to most of the Americans,” says David Reich, a population geneticist at Harvard Medical School in Boston, Massachusetts. In 2012, his team found evidence for a single founding migration in the genomes from members of 52 Native American groups [3]. So Reich was flabbergasted when a colleague called Pontus Skoglund mentioned during a conference last year that he had found signs of a second ancient migration to the Americas lurking in the DNA of contemporary Native Amazonians. Reich wasted no time in verifying the discovery. “During the session afterward, he passed his laptop over the crowd, and he had corroborated the results,” says Skoglund, who is now a researcher in Reich’s lab. © 2015 Nature Publishing Group
Keyword: Genes & Behavior
Link ID: 21201 - Posted: 07.22.2015
By Gretchen Reynolds A walk in the park may soothe the mind and, in the process, change the workings of our brains in ways that improve our mental health, according to an interesting new study of the physical effects on the brain of visiting nature. Most of us today live in cities and spend far less time outside in green, natural spaces than people did several generations ago. City dwellers also have a higher risk for anxiety, depression and other mental illnesses than people living outside urban centers, studies show. These developments seem to be linked to some extent, according to a growing body of research. Various studies have found that urban dwellers with little access to green spaces have a higher incidence of psychological problems than people living near parks and that city dwellers who visit natural environments have lower levels of stress hormones immediately afterward than people who have not recently been outside. But just how a visit to a park or other green space might alter mood has been unclear. Does experiencing nature actually change our brains in some way that affects our emotional health? That possibility intrigued Gregory Bratman, a graduate student at the Emmett Interdisciplinary Program in Environment and Resources at Stanford University, who has been studying the psychological effects of urban living. In an earlier study published last month, he and his colleagues found that volunteers who walked briefly through a lush, green portion of the Stanford campus were more attentive and happier afterward than volunteers who strolled for the same amount of time near heavy traffic. But that study did not examine the neurological mechanisms that might underlie the effects of being outside in nature. So for the new study, which was published last week in Proceedings of the National Academy of Sciences, Mr. Bratman and his collaborators decided to closely scrutinize what effect a walk might have on a person’s tendency to brood. 
© 2015 The New York Times Company
By Smitha Mundasad Health reporter A type of diabetes drug may offer a glimmer of hope in the fight against Parkinson's disease, research in the journal Plos Medicine suggests. Scientists found people taking glitazone pills were less likely to develop Parkinson's than patients on other diabetes drugs. But they caution the drugs can have serious side-effects and should not be given to healthy people. Instead, they suggest the findings should prompt further research. 'Unintended benefits' There are an estimated 127,000 people in the UK with Parkinson's disease, which can lead to tremor, slow movement and stiff muscles. And charities say with no drugs yet proven to treat the condition, much more work is needed in this area. The latest study focuses solely on people with diabetes who did not have Parkinson's disease at the beginning of the project. Researchers scoured UK electronic health records to compare 44,597 people prescribed glitazone pills with 120,373 people using other anti-diabetic treatment. They matched participants to ensure their age and stage of diabetes treatment were similar. Scientists found fewer people developed Parkinson's in the glitazone group - but the drug did not have a long-lasting benefit. Any potential protection disappeared once patients switched to another type of pill. Dr Ian Douglas, lead researcher at the London School of Hygiene and Tropical Medicine, said: "We often hear about negative side-effects associated with medications, but sometimes there can also be unintended beneficial effects. "Our findings provide unique evidence that we hope will drive further investigation into potential drug treatments for Parkinson's disease." © 2015 BBC
Link ID: 21199 - Posted: 07.22.2015
By BENEDICT CAREY Bill Cosby stands accused of committing date rape long before drugs like GHB or Rohypnol were widely used for that purpose. Many of Mr. Cosby’s accusers believed they had been drugged — but with what? And how? In a recently obtained legal deposition, Mr. Cosby acknowledged giving quaaludes to some women with whom he had sex, but said consumption of the drug was consensual, “the same as a person would say, ‘Have a drink.’ ” In a transcript of the deposition, reported on Sunday in The New York Times, the comedian told lawyers that he had obtained seven prescriptions for quaaludes. Originally approved and marketed as a “safer” sleeping pill, less addictive than barbiturates, the drug (known generically as methaqualone) was both sedating and hypnotic. Recreational use was common, but the federal government withdrew it from the market in 1982. “It was inevitable that it would be tried by people looking for a ‘better high,’ ” Dr. David Smith, medical director of the Haight-Ashbury Free Clinic, and Dr. Donald Wesson noted in The Journal of Psychedelic Drugs. Intoxication with quaaludes “soon developed a reputation for being especially pleasant.” Young people in the 1970s used quaaludes as they would a strong drink: to loosen up, to relax, to socialize. The pills also won a reputation for inducing periods of euphoria, as well as sexual arousal — “heroin for lovers,” some called it. By the middle of the decade, quaaludes were a staple of the club scene, often taken with alcohol. So embedded were quaaludes in the cultural scene that even years later the Dead Kennedys and Billy Idol were singing about the drug’s captivating effects. But reckless users risked overdose, especially when combining the pills with alcohol, which could lead to coma, convulsions and sometimes death. 
In a 1973 review of 252 hospital admissions for drug overdose, doctors in Edinburgh found that the third most common cause of “self-poisoning,” after barbiturates and LSD, was Mandrax — the British version of quaaludes, widely abused in South Africa as well. © 2015 The New York Times Company
Keyword: Drug Abuse
Link ID: 21198 - Posted: 07.22.2015
by Bethany Brookshire For some of us, a weekly case of the Mondays isn’t just because of traffic, work pileups or our soulless office space. It’s because we had to get up early, and sleeping in on the weekend was so incredibly glorious. Besides, because we slept in on Sunday, we didn’t get to the gym until the afternoon, we cooked a late dinner for a friend and then we couldn’t fall asleep at all and so stayed up playing around on the Internet. OK, maybe that’s just me. But you get the general idea. Our obligations — work, family and friends — often don’t line up with when our bodies want to sleep. Scientists call this phenomenon social jetlag. And it may make for more than just miserable Mondays. Social jetlag may also be associated with wider waistlines. As we learn more about how our body clocks work, it might help to think about how our own schedules can shift. Some of us love late nights and can’t help glaring at those who hop out of bed for a 5 a.m. workout (again, maybe that’s just me). But in fact our chronotypes aren’t a result of willpower. Instead they fall in a natural curve. About two-thirds of people are neutral, but a few fall at each end of the spectrum, rising extra early, or staying up until the wee hours. But even those in the middle are still getting up a little bit too early and staying up a little bit too late. We try to make up for it on days off, sleeping in or falling asleep early for a few extra hours of rest. But the result of that shift in sleep schedule? Jetlag. “It’s the equivalent of taking a flight one direction every Friday and going back every Sunday,” says Michael Parsons, a behavioral geneticist at the Medical Research Council Harwell in England. © Society for Science & the Public 2000 - 2015
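The "flight every Friday" analogy has a standard quantitative form in chronobiology: social jetlag is usually defined as the absolute difference between midsleep on free days and midsleep on workdays. A minimal Python sketch of that arithmetic follows; the sleep times in the example are hypothetical, not from the article:

```python
def midsleep(bedtime, waketime):
    """Midpoint of a sleep episode, in hours past midnight.

    Times are decimal hours on a 24-hour clock; a waketime at or
    before bedtime is taken to fall on the next morning.
    """
    if waketime <= bedtime:
        waketime += 24
    return ((bedtime + waketime) / 2) % 24

def social_jetlag(work_bed, work_wake, free_bed, free_wake):
    """Hours of social jetlag: |midsleep on free days - midsleep on workdays|."""
    diff = abs(midsleep(free_bed, free_wake) - midsleep(work_bed, work_wake))
    return min(diff, 24 - diff)  # take the shorter way around the clock

# Workdays: 11:30 p.m. to 6:30 a.m.  Free days: 1:00 a.m. to 10:00 a.m.
print(social_jetlag(23.5, 6.5, 1.0, 10.0))  # -> 2.5
```

In this example the weekend sleep schedule is shifted two and a half hours later than the workweek one, roughly the displacement of flying a couple of time zones west on Friday and back east on Sunday.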
By James Gallagher Health editor, BBC News website Irregular sleeping patterns have been "unequivocally" shown to lead to cancer in tests on mice, a study suggests. The report, in Current Biology, lends weight to concerns about the damaging impact of shift work on health. The researchers said women with a family risk of breast cancer should never work shifts, but cautioned that further tests in people were needed. The data also indicated the animals were 20% heavier despite eating the same amount of food. Studies in people have often suggested a higher risk of diseases such as breast cancer in shift workers and flight attendants. One argument is disrupting the body's internal rhythm - or body clock - increases the risk of disease. However, the link is uncertain because the type of person who works shifts may also be more likely to develop cancer due to factors such as social class, activity levels or the amount of vitamin D they get. Mice prone to developing breast cancer had their body clock delayed by 12 hours every week for a year. Normally they had tumours after 50 weeks - but with regular disruption to their sleeping patterns, the tumours appeared eight weeks earlier. The report said: "This is the first study that unequivocally shows a link between chronic light-dark inversions and breast cancer development." Interpreting the consequences for humans is fraught with difficulty, but the researchers guesstimated the equivalent effect could be an extra 10kg (1st 8lb) of body weight or for at-risk women getting cancer about five years earlier. "If you had a situation where a family is at risk for breast cancer, I would certainly advise those people not to work as a flight attendant or to do shift work," one of the researchers, Gijsbertus van der Horst, from the Erasmus University Medical Centre, in the Netherlands, said. © 2015 BBC.
Kashmira Gander Performing well at school and going on to have a complex job could lower the risk of dementia, scientists have found. Conversely, loneliness, watching too much TV and a sedentary lifestyle can make a person’s cognitive abilities decline more quickly, according to new research being presented at the Alzheimer's Association International Conference in Washington DC. Researchers are also due to show attendees the results from trials of solanezumab – believed to be the first drug to halt the progression of the disease if a patient is diagnosed early enough. One study involving 7,500 people aged 65 and above in Sweden over a 20-year period showed that dementia rates were 21 per cent higher in those whose grades were in the bottom fifth of the population. Meanwhile, participants with complex jobs involving data and numbers saw their chance of developing the disease cut by 23 per cent. In a separate study in Sweden, scientists followed the lives of 440 people aged 75 or over for nine years and discovered that those in the bottom fifth for school grades had a 50 per cent increase in the risk of developing dementia. © independent.co.uk
By Hanae Armitage Playing an instrument is good for your brain. Compared to nonmusicians, young children who strum a guitar or blow a trombone become better readers with better vocabularies. A new study shows that the benefits extend to teenagers as well. Neuroscientists compared two groups of high school students over 3 years: One began learning their first instrument in band class, whereas the other focused on physical fitness in Junior Reserve Officers’ Training Corps (JROTC). At the end of 3 years, those students who had played instruments were better at detecting speech sounds, like syllables and words that rhyme, than their JROTC peers, the team reports online today in the Proceedings of the National Academy of Sciences. Researchers know that as children grow up, their ability to soak up new information, especially language, starts to diminish. These findings suggest that musical training could keep that window open longer. But the benefits of music aren’t just for musicians; taking up piano could be the difference between an A and a B in Spanish class. © 2015 American Association for the Advancement of Science
Carl Zimmer An ant colony is an insect fortress: When enemies invade, soldier ants quickly detect the incursion and rip their foes apart with their oversize mandibles. But some invaders manage to slip in with ease, none more mystifyingly than the ant nest beetle. Adult beetles stride into an ant colony in search of a mate, without being harassed. They lay eggs, from which larvae hatch. As far as scientists can tell, workers feed the young beetles as if they were ants. When the beetles grow into adults, the ants swarm around them, grooming their bodies. In exchange for this hospitality, the beetles sink their jaws into ant larvae and freshly moulted adults in order to drink their body fluids. “They’re like vampire beetles wandering in the ant nests,” said Andrea Di Giulio, an entomologist at Roma Tre University in Rome. Dr. Di Giulio and his colleagues have now uncovered a remarkable trick that the beetles use to fool their hosts. It turns out they can perform uncanny impressions, mimicking a range of ant calls. Dr. Di Giulio and his colleagues study a species of ant nest beetle called Paussus favieri, which lives in the Atlas Mountains of Morocco, where it infiltrates the nests of Moroccan ants, known as Pheidole pallidula. Like many ant species, Pheidole pallidula makes noises by rubbing its legs against ridges on its body. The meanings of these signals vary from species to species; leaf-cutting ants summon bodyguards for the march back to the nest; in other species, a queen trills to her workers to attend to her. Scientists have found that Pheidole pallidula ants make three distinct sounds, each produced by a different caste: soldiers, workers and the queen. © 2015 The New York Times Company
by Stephen Buchmann Flowers, bugs and bees: Stephen Buchmann wanted to study them all when he was a kid. "I never grew out of my bug-and-dinosaur phase," he tells NPR's Arun Rath. "You know, since about the third grade, I decided I wanted to chase insects, especially bees." These days, he's living that dream. As a pollination ecologist, he's now taking a particular interest in how flowers attract insects. In his new book, The Reason for Flowers, he looks at more than just the biology of flowers — he dives into the ways they've laid down roots in human history and culture, too. On the real 'reason for flowers' The reason for flowers is actually one word: sex. So, flowers are literally living scented billboards that are advertising for sexual favors, whether those are from bees, flies, beetles, butterflies or us, because quite frankly most of the flowers in the world have gotten us to do their bidding. But that's only the first stage because flowers, if they're lucky, turn into fruits, and those fruits and seeds feed the world. On the raucous secret lives of beetles One of my favorite memories is roaming the Napa foothills as a UC Davis grad student. And I would go to the wineries, of course, and in between I would find western spice bush, which is this marvelous flower that kind of smells like a blend between a cabernet and rotten fruit. And when you find those flowers and open them up, you discover literally dozens of beetles in there, mating, defecating, pollinating — having a grand time. © 2015 NPR
By NANCY L. SEGAL, AARON T. GOETZ and ALBERTO C. MALDONADO SEVERAL years ago, while browsing the campus bookstore, one of us, Professor Segal, encountered a display table filled with Squirtles. A Squirtle is a plush-toy turtle manufactured by the company Russ Berrie. They were adorable and she couldn’t wait to take one home. Afterward, Professor Segal began wondering why this toy was so attractive and suspected that its large, round eyes played a major role. It’s well known that a preference for large eyes emerges in humans by 5 months of age. But the Squirtle was even more appealing than many of its big-eyed competitors. Was there something else about its eyes? Professor Segal consulted one of us, Professor Goetz, a colleague in evolutionary psychology, who suggested that because the Squirtle’s eyes were bordered in white, the cooperative eye hypothesis might have answers. This hypothesis, developed by the Japanese researchers Hiromi Kobayashi and Shiro Kohshima, holds that the opaque white outer coating of the human eye, or sclera, evolved to assist communication between people by signaling the direction of their gaze. The clear visibility of the sclera is a uniquely human characteristic. Other primates, such as the African great apes, also track the gaze direction of others, yet their sclera are pigmented or, if white, not visible. The great apes appear to use head direction more than other cues when following another’s gaze. Do humans have an instinctive preference for the whites of eyes, thus explaining the allure of the Squirtle? We conducted a study, to be published this year in the journal Evolution and Human Behavior, that suggested that the answer was yes. First we had to make some stuffed animals. We used six specially designed sets of three or four animals each (three cats, three dogs, three octopuses, four elephants, four snails and four turtles). 
The animals within each set were identical except for the eyes, which varied with respect to the size, color and presence of sclera. © 2015 The New York Times Company
Link ID: 21191 - Posted: 07.20.2015
Results from tests of the drug, announced this week, show that it breaks up plaques in mice affected with Alzheimer’s disease or Parkinson’s disease, and improves the memories and cognitive abilities of the animals. Other promising results in rats and monkeys mean that the drug developers, NeuroPhage Pharmaceuticals, are poised to apply for permission to start testing it in people, with trials starting perhaps as early as next year. The drug is the first that seems to target and destroy the multiple types of plaque implicated in human brain disease. Plaques are clumps of misfolded proteins that gradually accumulate into sticky, brain-clogging gunk that kills neurons and robs people of their memories and other mental faculties. Different kinds of misfolded proteins are implicated in different brain diseases, and some can be seen within the same condition (see “Proteins gone rogue”, below). One thing they share, however, is a structural kink known as a canonical amyloid fold, and it is this on which the new drug acts (Journal of Molecular Biology, DOI: 10.1016/j.jmb.2014.04.015). Animal tests show that the drug reduces levels of amyloid beta plaques and tau protein deposits implicated in Alzheimer’s disease, and the alpha-synuclein protein deposits thought to play a role in Parkinson’s disease. Tests on lab-made samples show that the drug also targets misfolded transthyretin, clumps of which can clog up the heart and kidney, and prion aggregates, the cause of CJD, another neurodegenerative condition. Because correctly folded proteins do not have the distinct “kink”, the drug has no effect on them. © Copyright Reed Business Information Ltd.
T. M. Luhrmann AMERICANS are a pretty anxious people. Nearly one in five of us — 18 percent — has an anxiety disorder. We spend over $2 billion a year on anti-anxiety medications. College students are often described as more stressed than ever before. There are many explanations for these nerves: a bad job market, less cohesive communities, the constant self-comparison that is social media. In 2002 the World Mental Health Survey found that Americans were the most anxious people in the 14 countries studied, with more clinically significant levels of anxiety than people in Nigeria, Lebanon and Ukraine. To be clear, research suggests that anxiety is at least partially temperamental. A recent study of 592 rhesus monkeys found that some of them responded more anxiously than others and that as much as 30 percent of early anxiety may be inherited. Yet what is inherited is the potential for anxiety, not anxiety itself. Life events obviously play a role. Another, less obvious factor may be the way we think about the mind: as an interior place that demands careful, constant attention. Humans seem to distinguish between mind and body in all cultures, but the sharp awareness of mind as a possession, distinct from soul and body, comes from the Enlightenment. It was then, in the aftermath of the crisis of religious authority and the scientific revolution, that there were intense debates about the nature of mental events. Between 1600 and 1815, the place where mental stuff happened — the “thing that thinks,” to use Descartes’s phrase — came to seem more and more important, as George Makari, a psychiatrist and psychoanalyst, explains in his forthcoming book, “Soul Machine: The Invention of the Modern Mind.” From this, Mr. Makari writes, was developed the psychological mind and psychoanalysis and an expectation that personal thoughts and feelings are the central drivers of human action — not roles, not values, not personal sensation, not God. 
In the United States, the enormous psychotherapeutic and self-help industry teaches us that we must pay scrupulous attention to inner experience. To succeed and be happy, we are taught, we need to know what we feel. © 2015 The New York Times Company
Link ID: 21189 - Posted: 07.20.2015
By THE EDITORIAL BOARD Scientific research has a gender gap, and not just among humans. In many disciplines, the animals used to study diseases and drugs are overwhelmingly male, which may significantly reduce the reliability of research and lead to drugs that won’t work in half the population. A new study published in the journal Nature Neuroscience suggests that research done on male animals may not hold up for women. Its authors reported that hypersensitivity to pain works differently in male and female mice. For males, immune cells called microglia appear to be required for pain hypersensitivity, and inhibiting their function also relieves the pain. But in female mice, different cells are involved, and targeting the microglia has no effect. If these differences occur in mice, they may occur in humans too. This means a pain drug targeting microglia might appear to work in male mice, but wouldn’t work on women. Failure to consider gender in research is very much the norm. According to one analysis of scientific studies that were published in 2009, male animals outnumbered females 5.5 to 1 in neuroscience, 5 to 1 in pharmacology, and 3.7 to 1 in physiology. Only 45 percent of animal studies involving depression or anxiety and only 38 percent involving strokes used females, even though these conditions are more common in women. In 1994, the National Institutes of Health confronted gender imbalance in clinical drug trials and began requiring that women and minorities be included in clinical studies; women now make up around half of clinical trial participants. In June, the N.I.H. announced that it would begin requiring researchers to take gender into account in preclinical research on animals as well. © 2015 The New York Times Company
Keyword: Sexual Behavior
Link ID: 21188 - Posted: 07.20.2015
By C. CLAIBORNE RAY Q. Can you hear without an intact eardrum? A. “When the eardrum is not intact, there is usually some degree of hearing loss until it heals,” said Dr. Ashutosh Kacker, an ear, nose and throat specialist at NewYork-Presbyterian Hospital and a professor at Weill Cornell Medical College, “but depending on the size of the hole, you may still be able to hear almost normally.” Typically, Dr. Kacker said, the larger an eardrum perforation is, the more severe the hearing loss it will cause. The eardrum, or tympanic membrane, is a thin, cone-shaped, pearly gray tissue separating the outer ear canal from the middle ear, he explained. Sound waves hit the eardrum, which in turn vibrates the bones of the middle ear. The bones pass the vibration to the cochlea, which triggers a signal cascade that culminates in the sound being processed by the brain and heard. There are several ways an eardrum can be ruptured, Dr. Kacker said, including trauma, exposure to sudden or very loud noises, foreign objects inserted deeply into the ear canal, and middle-ear infection. “Usually, the hole will heal by itself and hearing will improve within about two weeks to a few months, especially in cases where the hole is small,” he said. Sometimes, when the hole is larger or does not heal well, surgery will be required to repair the eardrum. Most such operations are done by placing a patch over the hole to allow it to heal, and the surgery is usually very successful in restoring hearing, Dr. Kacker said. © 2015 The New York Times Company
Link ID: 21187 - Posted: 07.20.2015