Chapter 16




By JANE E. BRODY Problems with estrogen and testosterone, the body’s main sex hormones, tend to attract widespread public interest. But we might all be better off paying more attention to a far more common endocrine disorder: abnormal levels of thyroid hormone. Thyroid disorders can affect a wide range of bodily functions and cause an array of confusing and often misdiagnosed symptoms. Because the thyroid, a small gland at the front of the neck below the larynx, regulates energy production and metabolism throughout the body, including in the heart, brain, skin and bowels, as well as body temperature, too much or too little of its hormones can have a major impact on health and well-being.

Yet in a significant number of people with thyroid deficiencies, routine blood tests fail to detect insufficient thyroid hormone, leaving patients without an accurate explanation for their symptoms. These can include excessive fatigue, depression, hair loss, unexplained weight gain, constipation, sleep problems, mental fogginess and anxiety. Women of childbearing age may have difficulty getting pregnant or staying pregnant. Although thyroid disorders are more common in adults, children, whose cognitive and physical development depend on normal thyroid function, are not necessarily spared. In a review article published last year in JAMA Pediatrics, doctors from the Children’s Hospital of Philadelphia pressed primary care doctors to recognize childhood thyroid disease and begin treatment as early as the second week of life to ensure normal development.

Hypothyroidism — low hormone levels — in particular is often misdiagnosed, its symptoms resembling those of other diseases or mistaken for “normal” effects of aging. Indeed, the risk of hypothyroidism rises with age. Twenty percent of people over 75, most of them women, lack sufficient thyroid hormone, a deficiency that, among other problems, can cause symptoms of confusion commonly mistaken for dementia. © 2017 The New York Times Company

Keyword: Hormones & Behavior
Link ID: 23866 - Posted: 07.24.2017

Susan Milius Sonar pings from a hungry bat closing in can inspire hawkmoths to get their genitals trilling. The ultrasonic “eeeee” of scraping moth sex organs may serve as a last-second acoustic defense, says behavioral ecologist Jesse Barber of Boise State University in Idaho. In theory, the right squeak could jam bats’ targeting sonar, remind them of a noisy moth that tasted terrible or just startle them enough for the hawkmoth to escape.

Males of at least three hawkmoth species in Malaysia squeak in response to recorded echolocation sounds of the final swoop in a bat attack, Barber and Akito Kawahara of the University of Florida in Gainesville report July 3 in Biology Letters. Female hawkmoths are hard to catch, but the few Barber and Kawahara have tested squeak too. Although they’re the same species as the males, they use their genitals in a different way to make ultrasound. Squeak power may have arisen during courtship and later proved useful during attacks. Until now, researchers knew of only two insect groups that talk back to bats: some tiger moths and tiger beetles. Neither is closely related to hawkmoths, so Barber speculates that anti-bat noises might be widespread among insects.

Slowed-down video shows first the male and then the female hawkmoth creating ultrasonic trills at the tips of their abdomens. Males use a pair of claspers that grasp females in mating. To sound off, these quickly slide in and out of the abdomen, rasping specialized scales against the sides. Females rub the left and right sides of their abdominal structures together.

J. Barber and A.Y. Kawahara. Hawkmoths produce anti-bat ultrasound. Biology Letters. Posted July 3, 2013. doi: 10.1098/rsbl.2013.0161 © Society for Science & the Public 2000 - 2017.

Keyword: Hearing
Link ID: 23864 - Posted: 07.24.2017

Joseph Jebelli The terror of Alzheimer’s is that it acts by degrees, and can therefore bewilder family members as much as its victims. Those who first notice the onset of Alzheimer’s in a loved one tell of forgotten names and unsettling behaviour, of car keys found in the fridge and clothing in the kitchen cabinet, of aimless wanderings. Naturally, they want to understand the boundaries of normal ageing and whether these are being crossed. Often, the answer arrives when they’re greeted as complete strangers, when the patient’s mind becomes irrevocably unmoored from its past. The disease is terrifying for its insidiousness as well as its long-term manifestations. Fear partly explains why Alzheimer’s has been ignored for so long. Yet it is now the leading cause of death among the oldest people, and according to Professor Sir Michael Marmot, an expert in health inequalities, it could be an “important part” of the stagnation in increases in life expectancy since 2010 that he has identified. As a researcher, I have been struck by how many patients speak openly about their condition only after receiving a diagnosis. “I knew something wasn’t right. Sometimes I don’t know what day of the week it is or what I have to do,” one newly diagnosed patient told me. “I look in my calendar but then I think: why am I looking at this? My husband was the one who made me see a GP. I was too frightened. I thought I might have it but I didn’t want to hear it.” © 2017 Guardian News and Media Limited

Keyword: Alzheimers
Link ID: 23862 - Posted: 07.22.2017

By Becca Cudmore A mother rat’s care for her pup reaches all the way into her offspring’s DNA. A young rat that gets licked and groomed a lot early on in life exhibits diminished responses to stress thanks to epigenetic changes in the hippocampus, a brain region that helps transform emotional information into memory. Specifically, maternal solicitude reduces DNA methylation and changes the structure of DNA-packaging proteins, triggering an uptick in the recycling of the neurotransmitter serotonin and the upregulation of the glucocorticoid receptor. These changes make the nurtured rat’s brain quicker to sense and tamp down the production of stress hormones in response to jarring experiences such as unexpected sound and light. That pup will likely grow into a calm adult, and two studies have shown that female rats who exhibit a dampened stress response are more likely to generously lick, groom, and nurse their own young. Caring for pups is one example of what casual observers of behavior might call an animal’s instinct—generally considered to be an innate, genetically encoded phenomenon. But could such epigenetic changes, when encoded as ancestral learning, also be at the root of maternal care and other seemingly instinctual behaviors we see across the animal kingdom? “We don’t have a general theory for the mechanics of instinct as we do for learning, and this is something that has troubled me for a very long time,” says University of Illinois entomologist Gene Robinson. He studies social evolution in the Western honey bee and recently coauthored a perspective piece in Science together with neurobiologist Andrew Barron of Macquarie University in Sydney, Australia, suggesting methylation as a possible mechanism for the transgenerational transmission of instinctual behavior, rather than those behaviors being hardwired in the genome (356:26-27, 2017). 
Robinson and Barron suggest that instinctual traits, such as honey bees’ well-known waggle dance or a bird’s in-born ability to sing its species’ songs, are the result of traits first learned by their ancestors and inherited across generations by the process of methylation. This differs from classical thoughts on animal learning, which say that if a behavior is learned, it is not innate, and will not be inherited. © 1986-2017 The Scientist

Keyword: Epigenetics; Evolution
Link ID: 23861 - Posted: 07.22.2017

Jon Hamilton Doctors use words like "aggressive" and "highly malignant" to describe the type of brain cancer discovered in Arizona Sen. John McCain. The cancer is a glioblastoma, the Mayo Clinic said in a statement Wednesday. It was diagnosed after doctors surgically removed a blood clot from above McCain's left eye. Doctors who were not involved in his care say the procedure likely removed much of the tumor as well. Glioblastomas, which are the most common malignant brain tumor, tend to be deadly. Each year in the U.S., about 12,000 people are diagnosed with the tumor. Most die within two years, though some survive more than a decade. "It's frustrating," says Nader Sanai, director of neurosurgical oncology at the Barrow Neurological Institute in Phoenix. Only "a very small number" of patients beat the disease, he says. And the odds are especially poor for older patients like McCain, who is 80. "The older you are, the worse your prognosis is," Sanai says, in part because older patients often aren't strong enough to tolerate aggressive radiation and chemotherapy.

Image caption: Arizona Sen. John McCain on Capitol Hill in April 2017, three months before he was diagnosed with brain cancer. © 2017 npr

Keyword: Glia
Link ID: 23857 - Posted: 07.21.2017

By Fergus Walsh, Medical correspondent. One in three cases of dementia could be prevented if more people looked after their brain health throughout life, according to an international study in the Lancet. It lists nine key risk factors including lack of education, hearing loss, smoking and physical inactivity. The study is being presented at the Alzheimer's Association International Conference in London. By 2050, 131 million people could be living with dementia globally. There are estimated to be 47 million people with the condition at the moment.

Nine factors that contribute to the risk of dementia:
Mid-life hearing loss - responsible for 9% of the risk
Failing to complete secondary education - 8%
Smoking - 5%
Failing to seek early treatment for depression - 4%
Physical inactivity - 3%
Social isolation - 2%
High blood pressure - 2%
Obesity - 1%
Type 2 diabetes - 1%

These risk factors - which are described as potentially modifiable - add up to 35%. The other 65% of dementia risk is thought to be potentially non-modifiable. Source: Lancet Commission on dementia prevention, intervention and care.

"Although dementia is diagnosed in later life, the brain changes usually begin to develop years before," said lead author Prof Gill Livingston, from University College London. "Acting now will vastly improve life for people with dementia and their families and, in doing so, will transform the future of society." The report, which combines the work of 24 international experts, says lifestyle factors can play a major role in increasing or reducing an individual's dementia risk. It examines the benefits of building a "cognitive reserve", which means strengthening the brain's networks so it can continue to function in later life despite damage.

Image caption: Eve Laird is taking part in a study on how to prevent dementia. © 2017 BBC
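The nine attributable-risk percentages reported by the Lancet Commission are presented as additive. A quick arithmetic check confirms they account for the 35% modifiable share, leaving 65% non-modifiable (a minimal Python sketch; the variable names are mine, the figures are transcribed from the list above):

```python
# Population-attributable risk percentages for dementia,
# as listed by the Lancet Commission report described above.
risk_factors = {
    "Mid-life hearing loss": 9,
    "Failing to complete secondary education": 8,
    "Smoking": 5,
    "Failing to seek early treatment for depression": 4,
    "Physical inactivity": 3,
    "Social isolation": 2,
    "High blood pressure": 2,
    "Obesity": 1,
    "Type 2 diabetes": 1,
}

modifiable = sum(risk_factors.values())
print(f"Potentially modifiable: {modifiable}%")        # 35%
print(f"Potentially non-modifiable: {100 - modifiable}%")  # 65%
```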

Keyword: Alzheimers
Link ID: 23856 - Posted: 07.21.2017

Ashley Yeager DNA might reveal how dogs became man’s best friend. A new study shows that some of the same genes linked to the behavior of extremely social people can also make dogs friendlier. The result, published July 19 in Science Advances, suggests that dogs’ domestication may be the result of just a few genetic changes rather than hundreds or thousands of them. “It is great to see initial genetic evidence supporting the self-domestication hypothesis or ‘survival of the friendliest,’” says evolutionary anthropologist Brian Hare of Duke University, who studies how dogs think and learn. “This is another piece of the puzzle suggesting that humans did not create dogs intentionally, but instead wolves that were friendliest toward humans were at an evolutionary advantage as our two species began to interact.” Not much is known about the underlying genetics of how dogs became domesticated. In 2010, evolutionary geneticist Bridgett vonHoldt of Princeton University and colleagues published a study comparing dogs’ and wolves’ DNA. The biggest genetic differences gave clues to why dogs and wolves don’t look the same. But major differences were also found in WBSCR17, a gene linked to Williams-Beuren syndrome in humans. Williams-Beuren syndrome leads to delayed development, impaired thinking ability and hypersociability. VonHoldt and colleagues wondered if changes to the same gene in dogs would make the animals more social than wolves, and whether that might have influenced dogs’ domestication. © Society for Science & the Public 2000 - 2017.

Keyword: Aggression; Genes & Behavior
Link ID: 23855 - Posted: 07.20.2017

How well cancer patients fared after chemotherapy was affected by their social interaction with other patients during treatment, according to a new study by researchers at the National Human Genome Research Institute (NHGRI), part of the National Institutes of Health, and the University of Oxford in the United Kingdom. Cancer patients were a little more likely to survive for five years or more after chemotherapy if they interacted during chemotherapy with other patients who also survived for five years or more. Patients were a little more likely to die in less than five years after chemotherapy when they interacted during chemotherapy with those who died in less than five years. The findings were published online July 12, 2017, in the journal Network Science. “People model behavior based on what’s around them,” said Jeff Lienert, lead author in NHGRI’s Social and Behavioral Research Branch and a National Institutes of Health Oxford-Cambridge Scholars Program fellow. “For example, you will often eat more when you’re dining with friends, even if you can’t see what they’re eating. When you’re bicycling, you will often perform better when you’re cycling with others, regardless of their performance.” Lienert set out to see if the impact of social interaction extended to cancer patients undergoing chemotherapy. Joining this research effort were Lienert’s adviser, Felix Reed-Tsochas, Ph.D., at Oxford’s CABDyN Complexity Centre at the Saïd Business School, Laura Koehly, Ph.D., chief of NHGRI’s Social and Behavioral Research Branch, and Christopher Marcum, Ph.D., a staff scientist also in the Social and Behavioral Research Branch at NHGRI.

Keyword: Stress
Link ID: 23854 - Posted: 07.20.2017

By LISA SANDERS, M.D. The 35-year-old man lay on the bed with his eyes closed, motionless except for the regular jerking of his abdomen and chest — what is known medically as a singultus (from the Latin for “sob”) but popularly and onomatopoeically as a hiccup. The man was exhausted. He couldn’t eat, could barely drink and hadn’t slept much since the hiccups began, nearly three weeks earlier.

Unending Contractions

At first it was just annoying — these spasms that interrupted his life every 10 to 12 seconds. Friends and family suggested remedies, and he tried them all: holding his breath, drinking cold water, drinking hot water, drinking out of the wrong side of the glass, drinking water while holding his nose. Sometimes they even worked for a while. He would find himself waiting for the next jerk, and when it didn’t come, he’d get this tiny sense of triumph that the ridiculous ordeal was over. But after 15 minutes, maybe 30, they would suddenly return: hiccup, hiccup, hiccup. His neck, stomach and chest muscles ached from the constant regular contractions. On this evening, the man had one of the all too rare breaks from the spasms and fell asleep. When his wife heard the regular sound start up again, she came into their bedroom to check on him. He looked awful — thin, tired and uncomfortable. And suddenly she was scared. They needed to go to the hospital, she told him. He was too weak, he told her, “and so very tired.” He would go, but first he’d rest. They had been to the emergency room several times already. During their first visit — nearly two weeks earlier — the doctors at the local hospital in their Queens neighborhood gave him a medication, chlorpromazine, an antipsychotic that has been shown to stop hiccups, though it’s not clear why. It was like a miracle; the rhythmic spasms stopped. But a few hours later, when the drug wore off, the hiccups returned. The couple went back a few days later because he started throwing up while hiccupping.
Those doctors offered an acid reducer for his stomach and more chlorpromazine. They encouraged the man to have patience. Sometimes these things can last, they said. © 2017 The New York Times Company

Keyword: Miscellaneous
Link ID: 23853 - Posted: 07.20.2017

By Diana Kwon Using CRISPR, researchers have successfully treated congenital muscular dystrophy type 1A (MDC1A), a rare disease that can lead to severe muscle wasting and paralysis, in mice. The team was able to restore muscle function by correcting a splicing site mutation that causes the disorder, according to a study published today (July 17) in Nature Medicine. “Instead of inserting the corrected piece of information, we used CRISPR to cut DNA in two strategic places,” study coauthor Dwi Kemaladewi, a research fellow at the Hospital for Sick Children (Sick Kids) in Toronto, explains in a statement. “This tricked the two ends of the gene to come back together and create a normal splice site.” By targeting both the skeletal muscles and peripheral nerves, the team was able to improve the animals’ motor function and mobility. “This is important because the development of therapeutic strategies for muscular dystrophies have largely focused on improving the muscle conditions,” Kemaladewi says in the release. “Experts know the peripheral nerves are important, but the skeletal muscles have been perceived as the main culprit in MDC1A and have traditionally been the focus of treatment options.” “The robustness of the correction we see in animal models to me is very encouraging,” Amy Wagers, a biologist at Harvard University who was not involved in this study, tells the Toronto Star. © 1986-2017 The Scientist

Keyword: Muscles
Link ID: 23851 - Posted: 07.19.2017

Shirley S. Wang In nursing homes and residential facilities around the world, health care workers are increasingly asking dementia patients questions: What are your interests? How do you want to address us? What should we do to celebrate the life of a friend who has passed away? The questions are part of an approach to care aimed at giving people with memory loss and other cognitive problems a greater sense of control and independence. At its core is the idea that an individual with dementia should be treated as a whole person and not "just" a patient. Scientists sometimes call this approach an ecopsychosocial intervention. The goal is to create environments that better meet patients' psychological and emotional needs through strategies other than medication. At the Alzheimer's Association International Conference this week in London, researchers from the U.S., the U.K. and Israel presented data from four trials demonstrating that such interventions significantly improve residents' mood and quality of life. The interventions can also reduce their use of antipsychotic drugs and improve their ability to care for themselves. Taken together, these studies and others suggest that relatively simple and potentially cost-effective interventions can yield significant benefits for people with dementia, even those in residential facilities in the later stages of disease. Behavioral Therapy Helps More Than Drugs For Dementia Patients As the population continues to age, and the number of people with dementia continues to rise, these interventions are likely to increase in importance as well. © 2017 npr

Keyword: Alzheimers; Attention
Link ID: 23847 - Posted: 07.19.2017

By LISA FELDMAN BARRETT Imagine that a bully threatens to punch you in the face. A week later, he walks up to you and breaks your nose with his fist. Which is more harmful: the punch or the threat? The answer might seem obvious: Physical violence is physically damaging; verbal statements aren’t. “Sticks and stones can break my bones, but words will never hurt me.” But scientifically speaking, it’s not that simple. Words can have a powerful effect on your nervous system. Certain types of adversity, even those involving no physical contact, can make you sick, alter your brain — even kill neurons — and shorten your life. Your body’s immune system includes little proteins called proinflammatory cytokines that cause inflammation when you’re physically injured. Under certain conditions, however, these cytokines themselves can cause physical illness. What are those conditions? One of them is chronic stress. Your body also contains little packets of genetic material that sit on the ends of your chromosomes. They’re called telomeres. Each time your cells divide, their telomeres get a little shorter, and when they become too short, you die. This is normal aging. But guess what else shrinks your telomeres? Chronic stress. If words can cause stress, and if prolonged stress can cause physical harm, then it seems that speech — at least certain types of speech — can be a form of violence. But which types? This question has taken on some urgency in the past few years, as professed defenders of social justice have clashed with professed defenders of free speech on college campuses. Student advocates have protested vigorously, even violently, against invited speakers whose views they consider not just offensive but harmful — hence the desire to silence, not debate, the speaker. “Trigger warnings” are based on a similar principle: that discussions of certain topics will trigger, or reproduce, past trauma — as opposed to merely challenging or discomfiting the student. 
The same goes for “microaggressions.” © 2017 The New York Times Company

Keyword: Aggression; Brain imaging
Link ID: 23846 - Posted: 07.18.2017

Katherine Hobson The theory behind artificial sweeteners is simple: If you use them instead of sugar, you get the joy of sweet-tasting beverages and foods without the downer of extra calories, potential weight gain and related health issues. In practice, it's not so simple, as a review of the scientific evidence on non-nutritive sweeteners published Monday shows. After looking at two types of scientific research, the authors conclude that there is no solid evidence that sweeteners like aspartame and sucralose help people manage their weight. And observational data suggest that the people who regularly consume these sweeteners are also more likely to develop future health problems, though those studies can't say those problems are caused by the sweeteners. The health effects of artificial sweeteners are important to study, because so many people use them. Another study published earlier this year found that a quarter of U.S. children and 41 percent of adults reported consuming them, most of them once per day. Even more people may be consuming them unwittingly in products such as granola bars or yogurt. "We were really interested in the everyday person who is consuming these products not to lose weight, but because they think it's the healthier choice, for many years on end," says Meghan Azad, lead author of the review and a research scientist at the University of Manitoba. While more research needs to be done, from what we know now, "there is no clear benefit for weight loss, and there's a potential association with increased weight gain, diabetes and other negative cardiovascular outcomes," says Azad. © 2017 npr

Keyword: Obesity
Link ID: 23845 - Posted: 07.18.2017

By TALYA MINSBERG When Marti Noxon set out to make “To the Bone,” a film about a 20-year-old battling an eating disorder, she initially faced the question: Was the topic too niche? The answer came in the form of a rousing premiere in January at the Sundance Film Festival, Netflix’s reported $8 million purchase of the film, a trailer that went viral with 54 million views in the first week and arguments over whether it glamorized excessive thinness. The film debuted on Netflix on Friday. The film is loosely based on Ms. Noxon’s experience with an eating disorder. She and its star, Lily Collins, are among the 30 million Americans — a third of them men — who have struggled with one. Ms. Collins plays Ellen, an anorexia patient who enters her fifth eating disorder treatment center, an unorthodox group home run by a doctor played by Keanu Reeves. Many of those reacting to the film’s trailer worried that watching it could trigger unhealthy thoughts in viewers who may be prone to eating disorders or already struggling with them. Indeed, some experts said that people who have had eating disorders should consider the state of their health before watching the film. “If you don’t feel solid in your recovery, don’t watch it right now. It could be triggering at any part of your life if you aren’t feeling strong and solid in your recovery,” said Dr. Dena Cabrera, executive clinical director at Rosewood Centers for Eating Disorders. “It will always be there; you can look it up later.” Others say the film may help spur action. Eating disorders have the highest mortality rate of any psychiatric disorder, and can affect individuals across every demographic. “If the film helps raise awareness and more people seek treatment, that would be a success that we can be pleased with,” said Dr. S. Bryn Austin, a professor at Boston Children’s Hospital and Harvard T.H. Chan School of Public Health.
“Eating disorders can be successfully treated, they just need to take the first step in reaching out for care.” © 2017 The New York Times Company

Keyword: Anorexia & Bulimia
Link ID: 23843 - Posted: 07.17.2017

Tim Adams Henry Marsh made the decision to become a neurosurgeon after he had witnessed his three-month-old son survive the complex removal of a brain tumour. For two decades he was the senior consultant in the Atkinson Morley wing at St George’s hospital in London, one of the country’s largest specialist brain surgery units. He pioneered techniques in operating on the brain under local anaesthetic and was the subject of the BBC documentary Your Life in Their Hands. His first book, Do No Harm: Stories of Life, Death, and Brain Surgery, was published in 2014 to great acclaim, and became a bestseller across the world. Marsh retired from full-time work at St George’s in 2015, though he continues with long-standing surgical roles at hospitals in Ukraine and Nepal. He is also an avid carpenter. Earlier this year he published a second volume of memoir, Admissions: A Life in Brain Surgery, in which he looks back on his career as he takes up a “retirement project” of renovating a decrepit lock-keeper’s cottage near where he grew up in Oxfordshire. He lives with his second wife, the social anthropologist and author Kate Fox. They have homes in Oxford, and in south London, which is where the following conversation took place.

Have you officially retired now?
Well, I still do one day a week for the NHS, though apparently they want a “business case” for it, so I’m not getting paid at present.

Yes, well, people talk about the mind-matter problem – it’s not a problem for me: mind is matter. That’s not being reductionist. It is actually elevating matter. We don’t even begin to understand how electrochemistry and nerve cells generate thought and feeling. We have not the first idea. The relation of neurosurgery to neuroscience is a bit like the relationship between plumbing and quantum mechanics.

Keyword: Consciousness
Link ID: 23842 - Posted: 07.17.2017

Pagan Kennedy In 2011, Ben Trumble emerged from the Bolivian jungle with a backpack containing hundreds of vials of saliva. He had spent six weeks following indigenous men as they tramped through the wilderness, shooting arrows at wild pigs. The men belonged to the Tsimane people, who live as our ancestors did thousands of years ago — hunting, foraging and farming small plots of land. Dr. Trumble had asked the men to spit into vials a few times a day so that he could map their testosterone levels. In return, he carried their kills and helped them field-dress their meat — a sort of roadie to the hunters. Dr. Trumble wanted to find out whether the hunters who successfully shot an animal would be rewarded with a spike in testosterone. (They were.) As a researcher with the Tsimane Health and Life History Project, he had joined a long-running investigation into human well-being and aging in the absence of industrialization. That day when he left the jungle, he stumbled across a new and more urgent question about human health. He dropped his backpack, called his mom and heard some terrible news: His 64-year-old uncle had learned he had dementia, probably Alzheimer’s. In just a few short years, his uncle, a vibrant former lawyer, would stop speaking, stop eating and die. “I couldn’t help my uncle,” Dr. Trumble said, but he was driven to understand the disease that killed him. He wondered: Do the Tsimane suffer from Alzheimer’s disease like we do? And if not, what can we learn from them about treating or preventing dementia? “There is really no cure yet for Alzheimer’s,” Dr. Trumble told me. “We have nothing that can undo the damage already done.” Why, he wondered, had billions of dollars and decades of research yielded so little? Perhaps major clues were being missed. Dr. Trumble was trained as an anthropologist, and his field — evolutionary medicine — taught him to see our surroundings as a blip in the timeline of human history. 
He thinks it’s a problem that medical research focuses almost exclusively on “people who live in cities like New York or L.A.” Scientists often refer to these places as “Weird” — Western, educated, industrialized, rich and democratic — and point out that our bodies are still designed for the not-Weird environment in which our species evolved. Yet we know almost nothing about how dementia affected humans during the 50,000 years before developments like antibiotics and mechanized farming. Studying the Tsimane, Dr. Trumble believes, could shed light on this modern plague. © 2017 The New York Times Company

Keyword: Alzheimers
Link ID: 23839 - Posted: 07.14.2017

By Aaron Reuben, Jonathan Schaefer  Most of us know at least one person who has struggled with a bout of debilitating mental illness. Despite their familiarity, these kinds of episodes are typically considered unusual, even shameful. New research, from our lab and from others around the world, however, suggests that mental illnesses are so common that almost everyone will develop at least one diagnosable mental disorder at some point in their lives. Most of these people will never receive treatment, and their relationships, job performance and life satisfaction will likely suffer. Meanwhile the few individuals who never seem to develop a disorder may offer psychology a new avenue of study, allowing researchers to ask what it takes to be abnormally, enduringly, mentally well.

Epidemiologists have long known that, at any given point in time, roughly 20 to 25 percent of the population suffers from a mental illness, meaning psychological distress severe enough to impair functioning at work, at school or in relationships. Extensive national surveys, conducted from the mid-1990s through the early 2000s, suggested that a much higher percentage, close to half the population, would experience a mental illness at some point in their lives. These surveys were large, involving thousands of participants representative of the U.S. in age, sex, social class and ethnicity. They were also, however, retrospective: they relied on respondents’ accurate recollection of feelings and behaviors months, years and even decades in the past. Human memory is fallible, and modern science has demonstrated that people are notoriously inconsistent reporters about their own mental health history, leaving the accuracy of these studies open to debate. Of further concern, up to a third of the people contacted by the national surveys failed to enroll, and follow-up tests suggested that these “nonresponders” tended to have worse mental health. © 2017 Scientific American

Keyword: Schizophrenia; Depression
Link ID: 23837 - Posted: 07.14.2017

Susan Milius  Ravens have passed what may be their toughest tests yet of powers that, at least on a good day, let people and other apes plan ahead. Lab-dwelling common ravens (Corvus corax) in Sweden at least matched the performance of nonhuman apes and young children in peculiar tests of advanced planning ability. The birds faced such challenges as selecting a rock useless at the moment but likely to be useful later for working a puzzle box and getting food. Ravens also reached apelike levels of self-control, picking a tool instead of a ho-hum treat when the tool would eventually allow them to get a fabulous bit of kibble 17 hours later, Mathias Osvath and Can Kabadayi of Lund University in Sweden report in the July 14 Science.

“The insight we get from the experiment is that [ravens] can plan for the future outside behaviors observed in the wild,” Markus Böckle of the University of Cambridge said in an e-mail. Böckle, who has studied ravens, coauthored a commentary in the same issue of Science.

In the wild, ravens cache some of their food, but that apparent foresight could be a specific adaptation that evolved with diet rather than some broader power of planning. The Lund tests, based on experiments with apes, tried to challenge ravens in less natural ways. The researchers say the birds aren’t considered much of a tool-using species in nature, nor do they trade for food. “The study for the first time in any animal shows that future planning can be used in behaviors it was not originally selected for” in evolution, Böckle says. © Society for Science & the Public 2000 - 2017.

Keyword: Intelligence; Evolution
Link ID: 23835 - Posted: 07.14.2017

By Ryan Cross  Can you imagine watching 20,000 videos, 16 minutes apiece, of fruit flies walking, grooming, and chasing mates? Fortunately, you don’t have to, because scientists have designed a computer program that can do it faster. Aided by artificial intelligence, researchers have made 100 billion annotations of behavior from 400,000 flies to create a collection of maps linking fly mannerisms to their corresponding brain regions. Experts say the work is a significant step toward understanding how both simple and complex behaviors can be tied to specific circuits in the brain.

“The scale of the study is unprecedented,” says Thomas Serre, a computer vision expert and computational neuroscientist at Brown University. “This is going to be a huge and valuable tool for the community,” adds Bing Zhang, a fly neurobiologist at the University of Missouri in Columbia. “I am sure that follow-up studies will show this is a gold mine.”

At a mere 100,000 neurons—compared with our 86 billion—the small size of the fly brain makes it a good place to pick apart the inner workings of neurobiology. Yet scientists are still far from being able to understand a fly’s every move. To conduct the new research, computer scientist Kristin Branson of the Howard Hughes Medical Institute in Ashburn, Virginia, and colleagues acquired 2204 different genetically modified fruit fly strains (Drosophila melanogaster). Each enables the researchers to control different, but sometimes overlapping, subsets of the brain by simply raising the temperature to activate the neurons. © 2017 American Association for the Advancement of Science.

Keyword: Brain imaging
Link ID: 23834 - Posted: 07.14.2017

By BENEDICT CAREY  Keith Conners, whose work with hyperactive children established the first standards for diagnosing and treating what is now known as attention deficit hyperactivity disorder, or A.D.H.D. — and who late in life expressed misgivings about how loosely applied that label had become — died on July 5 in Durham, N.C. He was 84. His wife, Carolyn, said the cause was heart failure.

The field of child psychiatry was itself still young when Dr. Conners joined the faculty of the Johns Hopkins University School of Medicine in the early 1960s as a clinical psychologist. Children with emotional and behavioral problems often got a variety of diagnoses, depending on the clinic, and often ended up being given strong tranquilizers as treatment. Working with Dr. Leon Eisenberg, a prominent child psychiatrist, Dr. Conners focused on a group of youngsters who were chronically restless, hyperactive and sometimes aggressive. Doctors had recognized this type — “hyperkinesis,” it was called, or “minimal brain dysfunction” — but Dr. Conners combined existing descriptions and, using statistical analysis, focused on the core symptoms.

The 39-item questionnaire he devised, called the Conners Rating Scale, quickly became the worldwide standard for assessing the severity of such problems and measuring improvement. It was later abbreviated to 10 items, giving child psychiatry a scientific foothold and anticipating by more than a decade the kind of checklists that would come to define all psychiatric diagnosis. He used his scale to study the effects of stimulant drugs on hyperactive children. Doctors had known since the 1930s that amphetamines could, paradoxically, calm such youngsters; a Rhode Island doctor, Charles Bradley, had published a well-known report detailing striking improvements in attention and academic performance among many children at a children’s inpatient home he ran near Providence. But it was a series of rigorous studies by Dr. Conners, in the 1960s and ’70s, that established stimulants — namely Dexedrine and Ritalin — as the standard treatments. © 2017 The New York Times Company

Keyword: ADHD
Link ID: 23833 - Posted: 07.14.2017