Chapter 12. Psychopathology: The Biology of Behavioral Disorders
By Ann Gibbons Depressed? Your inner Neandertal may be to blame. Modern humans met and mated with these archaic people in Europe or Asia about 50,000 years ago, and researchers have long suspected that genes picked up in these trysts might be shaping health and well-being today. Now, a study in the current issue of Science details their impact. It uses a powerful new method for scanning the electronic health records of 28,000 Americans to show that some Neandertal gene variants today can raise the risk of depression, skin lesions, blood clots, and other disorders. Neandertal genes aren’t all bad. “These variants sometimes protect against a disease, sometimes make people more susceptible to disease,” says paleogeneticist Svante Pääbo of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany. Two other new studies identified three archaic genes that boost immune response. And most archaic genes that persist in humans were likely beneficial in prehistoric times. But some now cause disease because modern lifestyles and environments are so different. Living people carry only trace amounts of Neandertal DNA, which makes its impact on health more striking. “The Neandertal genetic contribution to present-day people seems to have larger physiological effects than I would have naïvely thought,” says Pääbo, who helped launch this avenue of research by sequencing the first ancient genomes but was not involved in these studies. On average, Europeans and Asians have inherited about 1.5% of their genomes from Neandertals. Island Melanesians carry an additional 2% to 3% of DNA inherited from another extinct group, the Denisovans. Most Africans lack this archaic DNA because the interbreeding happened after modern humans left Africa. © 2016 American Association for the Advancement of Science
Bruce Bower Winter doesn’t deserve its dour reputation as the season of depression, scientists say. Rates of major depression, a psychiatric condition marked by intense sadness, hopelessness, insomnia and a general loss of interest or pleasure, don’t markedly change from one season to another among U.S. adults, says a team led by psychologist Steven LoBello of Auburn University at Montgomery in Alabama. Nor do symptoms intensify or become more numerous during winter among those already suffering from depression, the researchers report online January 19 in Clinical Psychological Science. A small number of people with regular fall or winter depression may have gone undetected in the new study, which surveyed more than 30,000 U.S. adults. Still, it’s becoming harder to justify the current psychiatric diagnosis of major depression “with seasonal pattern,” LoBello and Auburn colleagues Megan Traffanstedt and Sheila Mehta conclude. Because it’s a recurring disorder, depression can strike in two consecutive winters by chance, the researchers say. Depression in three or more consecutive winters could be due to personal and social factors unrelated to shorter days, they add. “Being depressed during winter is not evidence that one is depressed because of winter,” LoBello says. © Society for Science & the Public 2000 - 2016
By Jordana Cepelewicz As the Panthers and Broncos faced off in the third quarter of last night’s Super Bowl, wide receiver Philly Brown suffered a possible concussion—and to the disappointment of Panthers fans, he never returned to the game. But for good reason: concussions are now known to be much more serious injuries than once thought. And the danger may not be limited to the immediate repercussions. Researchers have already linked more severe traumatic brain injury to later suicide—particularly in military veterans and professional athletes—and have more recently explored the connection between concussion and depression. Now, new research published in the Canadian Medical Association Journal shows that even mild concussions sustained in ordinary community settings might be more detrimental than anyone anticipated; the long-term risk of suicide increases threefold in adults if they have experienced even one concussion. That risk increases by a third if the concussion is sustained on a weekend instead of a weekday—suggesting recreational concussions are riskier long-term than those sustained on the job. “The typical patient I see is a middle-aged adult, not an elite athlete,” says Donald Redelmeier, a senior scientist at the University of Toronto and one of the study’s lead authors. “And the usual circumstances for acquiring a concussion are not while playing football; it is when driving in traffic and getting into a crash, when missing a step and falling down a staircase, when getting overly ambitious about home repairs—the everyday activities of life.” Redelmeier and his team wanted to examine the risks of the concussions acquired under those circumstances. © 2016 Scientific American
By Steven Petrow I have slogged through a number of difficult situations in recent months, among them the ongoing crises of my elderly parents’ illnesses and the suicide of a friend. I never lost my appetite nor burst into tears, and I didn’t suffer from any of the other typical symptoms of depression. Maybe I was more irritable than usual, a bit more prone to snap. And yes, I buried myself in my work. But I didn’t think I’d tripped down into the rabbit hole of depression. You would think I would have been more self-aware, both personally and professionally. As a health journalist, I have often used my own stories to write about difficult-to-discuss medical conditions, including learning I had testicular cancer at age 26 and my misdiagnosis with H.I.V./AIDS — back when it was a death sentence. But I had never written about suffering from depression, even though it’s plagued me since I first put pen to paper, at age 11, when I started keeping a diary. Still, I’m far from alone. At least six million men in the United States suffer from depression, according to the National Institute of Mental Health. The true number is likely to be even higher, said Dr. Matthew Rudorfer, the institute’s associate director for treatment research, since men are less likely than women to report classic symptoms like low mood, sadness or crying, so they often go undiagnosed. Men, he told me, more often demonstrate “externalizing” symptoms like irritability, anger and aggressiveness, substance and alcohol abuse, risk-taking behaviors and “workaholism.” Oh, that macho thing: Men don’t get depressed; they just work, drink and compete harder. Andrew Solomon, author of the pathbreaking memoir about depression, “Noonday Demon,” told me that ridiculous attitude is part of the mind-set that guys should “cover up our moods with militarism or athleticism.” © 2016 The New York Times Company
Link ID: 21874 - Posted: 02.09.2016
It’s well known that some people report that their mood is influenced by the seasons. But can the time of year affect other cognitive functions? To find out, Gilles Vandewalle and colleagues at the University of Liege in Belgium scanned the brains of 28 volunteers while they performed attention and working memory tests at different times of the year. To ensure that the results reflected the seasons rather than the environmental conditions on the test day, the participants were confined to a lab for 4.5 days prior to the test, exposed to a constant light level and temperature. Although their test scores didn’t change with the seasons, activity in some brain areas showed a consistent seasonal pattern among the volunteers: brain activity peaked in the summer on the attention task and in the autumn on the memory task. Many seasonally changing factors could regulate such a pattern, including day length (known as photoperiod), temperature, humidity, social interaction and physical activity. Since these weren’t all controlled for in the study, it’s impossible to say what is responsible for the seasonal changes seen. “In our data it seems that photoperiod, or the rate of change of photoperiod, was more likely to explain what we were seeing. But we can’t exclude all the others,” says Vandewalle. The results suggest that over the course of a year, the brain might work in different ways to compensate for seasonal factors that could affect its function, enabling it to maintain a stable performance. Vandewalle speculates that these mechanisms might not work as well in some people, for example, those vulnerable to the winter blues. © Copyright Reed Business Information Ltd.
By Jonathan Leo Last week, according to many media accounts, scientists from Harvard Medical School, Boston Children’s Hospital, and the Broad Institute discovered the genetic basis of schizophrenia. The researchers reported in Nature that people with schizophrenia were more likely to have the overactive forms of a gene called complement component 4, or C4, which is involved in pruning synapses during adolescence. However, suggesting a biologic mechanism for a small subset of those diagnosed with schizophrenia is not the same as confirming the genetic theory of schizophrenia. Benedict Carey, science reporter for the New York Times, delved into the details and reported the all-important fact that having the C4 variant would increase a person’s risk by about 25 percent over the 1-percent base rate of schizophrenia—that is, to 1.25 percent. Genes for schizophrenia and depression have been discovered before, and in those cases, the subsequent enthusiastic headlines were shortly followed by retractions and more sober thinking. There are so many open questions (for instance, why do many people with the problematic variant not develop schizophrenia, and why do many people who don’t have the variant develop schizophrenia?) that the same may occur with the C4 discovery. The idea that mental illness is the result of a genetic predisposition is the foundation for modern-day psychiatry and has been the driving force for how research money is allocated, how patients are treated, and how society views people diagnosed with conditions identified in the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition. Schizophrenia holds a unique spot in the annals of mental health research because of its perceived anatomical underpinnings and is often cited as evidence in favor of a genetic predisposition to other conditions.
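The distinction Carey drew between relative and absolute risk is worth making explicit; a minimal sketch of the arithmetic, using only the figures quoted above:

```python
# Absolute vs. relative risk for the C4 finding, using the figures
# quoted above: a ~1% lifetime base rate of schizophrenia and a ~25%
# relative increase in risk for carriers of the overactive C4 variant.

base_rate = 0.01          # lifetime base rate of schizophrenia (~1%)
relative_increase = 0.25  # ~25% higher risk for C4-variant carriers

carrier_rate = base_rate * (1 + relative_increase)

print(f"Carrier risk: {carrier_rate:.4f}")                   # 0.0125, i.e. 1.25%
print(f"Absolute increase: {carrier_rate - base_rate:.4f}")  # 0.0025, i.e. 0.25 points
```

A 25 percent relative increase sounds dramatic in a headline, but applied to a 1 percent base rate it moves a carrier's absolute risk by only a quarter of a percentage point, which is the core of Leo's caution about the coverage.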
By Diana Kwon Antidepressants are some of the most commonly prescribed medications out there. More than one out of 10 Americans over age 12—roughly 11 percent—take these drugs, according to a 2011 report by the National Center for Health Statistics. And yet, recent reports have revealed that important data about the safety of these drugs—especially their risks for children and adolescents—has been withheld from the medical community and the public. In the latest and most comprehensive analysis, published last week in BMJ (the British Medical Journal), a group of researchers at the Nordic Cochrane Center in Copenhagen showed that pharmaceutical companies were not presenting the full extent of serious harm in clinical study reports, which are detailed documents sent to regulatory authorities such as the U.S. Food and Drug Administration and the European Medicines Agency (EMA) when applying for approval of a new drug. The researchers examined documents from 70 double-blind, placebo-controlled trials of two common types of antidepressants—selective serotonin reuptake inhibitors (SSRIs) and serotonin and norepinephrine reuptake inhibitors (SNRIs)—and found that the occurrence of suicidal thoughts and aggressive behavior doubled in children and adolescents who used these medications. This paper comes on the heels of disturbing charges about conflicts of interest in reports on antidepressant trials. Last September a study published in the Journal of Clinical Epidemiology revealed that a third of meta-analyses of antidepressant studies were written by pharma employees and that these were 22 times less likely than other meta-studies to include negative statements about the drug. © 2016 Scientific American
Link ID: 21860 - Posted: 02.04.2016
Heidi Ledford Difficulty with concentration, memory and other cognitive tasks is often associated with depression. In the past quarter of a century, a wave of drugs has transformed the treatment of depression. But the advances have struggled to come to grips with symptoms that often linger long after people start to feel better: cognitive problems such as memory loss and trouble concentrating. On 3 February, the US Food and Drug Administration (FDA) will convene a meeting of its scientific advisers to discuss whether such cognitive impairments are components of the disorder that drugs might be able to target — or just a result of depressed mood. The discussion will help the agency to decide whether two companies that sell the antidepressant vortioxetine should be allowed to label it as a treatment for the cognitive effects. A ‘yes’ could spur drug developers to invest in ways to test cognitive function during their antidepressant trials. Psychiatrists have long noted that some people with depression also struggle to concentrate and to make decisions. The question has been whether such difficulties are merely an offshoot of altered mood and would thus clear up without specific treatment, says Diego Pizzagalli, a neuroscientist at McLean Hospital, an affiliate of Harvard Medical School in Belmont, Massachusetts. But some patients who report improved mood after treatment still struggle with cognitive deficits — so psychiatrists sometimes prescribe concentration-enhancing drugs that are approved to treat attention deficit hyperactivity disorder to people with depression. © 2016 Nature Publishing Group
By Sara Solovitch It was November 2012 when Dennis Hartman, a Seattle business executive, managed to pull himself out of bed, force himself to shower for the first time in days and board a plane that would carry him across the country to a clinical trial at the National Institute of Mental Health (NIMH) in Bethesda. After a lifetime of profound depression, 25 years of therapy and cycling through 18 antidepressants and mood stabilizers, Hartman, then 46, had settled on a date and a plan to end it all. This clinical trial would be his last stab at salvation. For 40 minutes, he sat in a hospital room as an IV drip delivered ketamine into his system. Several more hours passed before it occurred to him that all his thoughts of suicide had evaporated. “My life will always be divided into the time before that first infusion and the time after,” Hartman says today. “That sense of suffering and pain draining away. I was bewildered by the absence of pain.” Ketamine, popularly known as the psychedelic club drug Special K, has been around since the early 1960s. It is a staple anesthetic in emergency rooms, regularly used for children when they come in with broken bones and dislocated shoulders. It’s an important tool in burn centers and veterinary medicine, as well as a notorious date-rape drug, known for its power to quickly numb and render someone immobile.
By BENEDICT CAREY A new approach to treating early schizophrenia, which includes family counseling, results in improvements in quality of life that make it worth the added expense, researchers reported on Monday. The study, published by the journal Schizophrenia Bulletin, is the first rigorous cost analysis of a federally backed treatment program that more than a dozen states have begun trying. In contrast to traditional outpatient care, which generally provides only services covered by insurance, like drugs and some psychotherapy, the new program offers other forms of support, such as help with jobs and school, as well as family counseling. The program also tries to include the patients — people struggling with a first psychotic “break” from reality, most of them in their late teens and 20s — as equals in decisions about care, including drug dosage. In a widely anticipated study last fall, called the Raise trial, researchers reported that after two years, people who got this more comprehensive care did better on a variety of measures than those who received the standard care. But the study found no evidence of related cost savings or differences in hospitalization rates, a prime driver of expense. As lawmakers in Washington are considering broad changes in mental health care, cost issues loom especially large. Outside experts said this analysis — which was based on the Raise trial data — was an important test of the new care program’s value. “This is the way cost analysis should be done,” Sherry Glied, a professor of public service and the dean of New York University’s graduate school of public service, said. “One way to think about it is to ask, if this program were a drug, would we pay for it? And the answer is yes.” © 2016 The New York Times Company
Link ID: 21842 - Posted: 02.01.2016
By CHARLES SIEBERT Nearly 30 years ago, Lilly Love lost her way. She had just completed her five-year tour of duty as an Alaska-based Coast Guard helicopter rescue swimmer, one of an elite team of specialists who are lowered into rough, frigid seas to save foundering fishermen working in dangerous conditions. The day after she left active service, the helicopter she had flown in for the previous three years crashed in severe weather into the side of a mountain, killing six of her former crewmates. Devastated by the loss and overcome with guilt, Love chose as her penance to become one of the very fishermen she spent much of her time in the Coast Guard rescuing. In less than a year on the job, she nearly drowned twice after being dragged overboard in high seas by the hooks of heavy fishing lines. Love would not formally receive a diagnosis of severe post-traumatic stress disorder for another 15 years. In that time, she was married and divorced three times, came out as transgender and retreated periodically to Yelapa, Mexico, where she lived in an isolated cabin accessible only by water. She eventually ended up living on a boat in a Los Angeles marina, drinking heavily and taking an array of psychotropic drugs that doctors at the West Los Angeles Veterans Administration Medical Center began to prescribe with increasing frequency as Love proved resistant to traditional treatments like counseling and group therapy. One night, after her fifth stay in the center’s psych ward, she crashed her boat into a sea wall. Finally, in 2006, she was in the veterans’ garden and happened to catch sight of the parrots being housed in an unusual facility that opened a year earlier on the grounds of the center. ‘‘This place is why I’m still here,’’ Love, now 54, told me one day last summer as I watched her undergo one of her daily therapy sessions at the facility, known as Serenity Park, a name that would seem an utter anomaly to anyone who has ever been within 200 yards of the place. 
© 2016 The New York Times Company
Link ID: 21839 - Posted: 01.30.2016
By BENEDICT CAREY Scientists reported on Wednesday that they had taken a significant step toward understanding the cause of schizophrenia, in a landmark study that provides the first rigorously tested insight into the biology behind any common psychiatric disorder. More than two million Americans have a diagnosis of schizophrenia, which is characterized by delusional thinking and hallucinations. The drugs available to treat it blunt some of its symptoms but do not touch the underlying cause. The finding, published in the journal Nature, will not lead to new treatments soon, experts said, nor to widely available testing for individual risk. But the results provide researchers with their first biological handle on an ancient disorder whose cause has confounded modern science for generations. The finding also helps explain some other mysteries, including why the disorder often begins in adolescence or young adulthood. “They did a phenomenal job,” said David B. Goldstein, a professor of genetics at Columbia University who has been critical of previous large-scale projects focused on the genetics of psychiatric disorders. “This paper gives us a foothold, something we can work on, and that’s what we’ve been looking for now, for a long, long time.” The researchers pieced together the steps by which genes can increase a person’s risk of developing schizophrenia. That risk, they found, is tied to a natural process called synaptic pruning, in which the brain sheds weak or redundant connections between neurons as it matures. During adolescence and early adulthood, this activity takes place primarily in the section of the brain where thinking and planning skills are centered, known as the prefrontal cortex. People who carry genes that accelerate or intensify that pruning are at higher risk of developing schizophrenia than those who do not, the new study suggests. 
Some researchers had suspected that the pruning must somehow go awry in people with schizophrenia, because previous studies showed that their prefrontal areas tended to have a diminished number of neural connections, compared with those of unaffected people. © 2016 The New York Times Company
Angus Chen When she was 22, Rachel Star Withers uploaded a video to YouTube called "Normal: Living With Schizophrenia." It starts with her striding across her family's property in Fort Mill, S.C. She looks across the rolling grounds, unsmiling. Her eyes are narrow and grim. She sits down in front of a deserted white cottage and starts sharing. "I see monsters. I see myself chopped up and bloody a lot. Sometimes I'll be walking, and the whole room will just tilt. Like this," she grasps the camera and jerks the frame crooked. She surfaces a fleeting grin. "Try and imagine walking." She becomes serious again. "I'm making this because I don't want you to feel alone whether you're struggling with any kind of mental illness or just struggling." At the time, 2008, there were very few people who had done anything like this online. "As I got diagnosed [with schizophrenia], I started researching everything. The only stuff I could find was like every horror movie," she says. "I felt so alone for years." She decided that schizophrenia was really not that scary. "I want people to find me and see a real person." Over the past eight years, she has made 53 videos documenting her journey with schizophrenia and depression and her therapy. And she is not the only one. There are hundreds of videos online of people publicly sharing their experiences with mental illness. © 2016 npr
Link ID: 21834 - Posted: 01.28.2016
Heidi Ledford Addie plays hard for an 11-year-old greater Swiss mountain dog — she will occasionally ignore her advanced years to hurl her 37-kilogram body at an unwitting house guest in greeting. But she carries a mysterious burden: when she was 18 months old, she started licking her front legs aggressively enough to wear off patches of fur and draw blood. Addie has canine compulsive disorder — a condition that is thought to be similar to human obsessive–compulsive disorder (OCD). Canine compulsive disorder can cause dogs to chase their tails for hours on end, or to suck on a toy or body part so compulsively that it interferes with their eating or sleeping. Addie may soon help researchers to determine why some dogs are more prone to the disorder than others. Her owner, Marjie Alonso of Somerville, Massachusetts, has enrolled her in a project called Darwin’s Dogs, which aims to compare information about the behaviour of thousands of dogs against the animals’ DNA profiles. The hope is that genetic links will emerge to conditions such as canine compulsive disorder and canine cognitive dysfunction — a dog analogue of dementia and possibly Alzheimer’s disease. The project organizers have enrolled 3,000 dogs so far, but hope to gather data from at least 5,000, and they expect to begin analysing DNA samples in March. “It’s very exciting, and in many ways it’s way overdue,” says Clive Wynne, who studies canine behaviour at Arizona State University in Tempe. © 2016 Nature Publishing Group,
By Ellen Hendriksen This topic comes by request on the Savvy Psychologist Facebook page from listener Anita M. of Detroit. Anita works with foster kids and, too often, sees disadvantaged kids who have been on a cocktail of psychiatric medications from as early as age 6. She asks, does such early use alter a child’s brain or body? And have the effects of lifelong psychiatric medication been studied? Childhood mental illness (and resulting medication) is equally overblown and under-recognized. Approximately 21% of American kids (that’s 1 in 5) will battle a diagnosable mental illness before they reach the age of 17, whether or not they actually get treatment. The problem is anything but simple. Some childhood illnesses (ADHD and autism, for example) often get misused as “grab-bag” diagnoses when something’s wrong but no one knows what. This leads to overdiagnosis and, sometimes, overmedication. Other illnesses, like substance abuse, get overlooked or written off as rebellion or experimentation, leading to underdiagnosis and kids slipping through the cracks. But the most common problem is inconsistent diagnosis. For example, a 2008 study found that fewer than half of individuals diagnosed with bipolar disorder actually had the illness, while 5% of those diagnosed with something completely different actually had bipolar disorder. But let’s get back to Anita’s questions: Does early psychotropic medication alter a child’s brain? The short answer is yes, but the long answer might be different than you think. © 2016 Scientific American
By PAM BELLUCK Women should be screened for depression during pregnancy and after giving birth, an influential government-appointed health panel said Tuesday, the first time it has recommended screening for maternal mental illness. The recommendation, expected to galvanize many more health providers to provide screening, comes in the wake of new evidence that maternal mental illness is more common than previously thought; that many cases of what has been called postpartum depression actually start during pregnancy; and that left untreated, these mood disorders can be detrimental to the well-being of children. It also follows growing efforts by states, medical organizations and health advocates to help women having these symptoms — an estimated one in seven postpartum mothers, some experts say. “There’s better evidence for identifying and treating women with depression” during and after pregnancy, said Dr. Michael Pignone, a professor of medicine at the University of North Carolina at Chapel Hill and an author of the recommendation, which was issued by the United States Preventive Services Task Force. As a result, he said, “we specifically called out the need for screening during this period.” Answers to questions about depression screening and maternal mental illness, following new recommendations saying that women should be screened for depression during pregnancy and after childbirth. The recommendation was part of updated depression screening guidelines issued by the panel, an independent group of experts appointed by the Department of Health and Human Services. In 2009, the group said adults should be screened if clinicians had the staff to provide support and treatment; the new guidelines recommend adult screening even without such staff members, saying mental health support is now more widely available. The 2009 guidelines did not mention depression during or after pregnancy. © 2016 The New York Times Company
by Bethany Brookshire Unless you’re in the middle of biting into a delicious Reuben sandwich, you might forget that taste is one of the fundamental senses. “It’s required for our enjoyment of food,” explains Emily Liman, a taste researcher at the University of Southern California in Los Angeles. “Without taste … people stop eating. They don’t enjoy their food.” A life without the sweet jolt of sugar or the savory delights of umami seems, well, tasteless. When you put that mouthwatering combination of corned beef, Swiss cheese, Thousand Island dressing, sauerkraut and rye in your mouth, the chemicals in the sandwich stimulate taste buds on your tongue and soft palate. Those taste buds connect to the ends of nerve fibers extending delicately into the mouth. Those nerve fibers are the ends of cells located in the geniculate ganglion, a ball of cells nestled up against the ear canal on the side of your head. From there, taste sensations head toward the brain. Chemical messengers bridge the gap between the taste bud and the end of the nerve fiber. But what chemical is involved depends on the type of cell within the bud. There are three types of taste cells (imaginatively titled I, II and III). Type I is not well-understood, but it may be a kind of support cell for other taste cells. Type II, in contrast, is better known. These taste cells sense the slight bitterness of the rye seeds, the sweet edge of the Thousand Island dressing and the savory umami of the beef. They pass that delightful message on using the chemical ATP. © Society for Science & the Public 2000 - 2016
Alison Abbott For the second time in four months, researchers have reported autopsy results that suggest Alzheimer’s disease might occasionally be transmitted to people during certain medical treatments — although scientists say that neither set of findings is conclusive. The latest autopsies, described in the Swiss Medical Weekly on 26 January, were conducted on the brains of seven people who died of the rare, brain-wasting Creutzfeldt–Jakob disease (CJD). Decades before their deaths, the individuals had all received surgical grafts of dura mater — the membrane that covers the brain and spinal cord. These grafts had been prepared from human cadavers and were contaminated with the prion protein that causes CJD. But in addition to the damage caused by the prions, five of the brains displayed some of the pathological signs that are associated with Alzheimer’s disease, researchers from Switzerland and Austria report. Plaques formed from amyloid-β protein were discovered in the grey matter and blood vessels. The individuals, aged between 28 and 63, were unusually young to have developed such plaques. A set of 21 controls, who had not had surgical grafts of dura mater but died of sporadic CJD at similar ages, did not have this amyloid signature. According to the authors, it is possible that the transplanted dura mater was contaminated with small ‘seeds’ of amyloid-β protein — which some scientists think could be a trigger for Alzheimer’s — along with the prion protein that gave the recipients CJD. © 2016 Nature Publishing Group
By Anne Pycha Future doctors may ask us to say more than “Ahhh.” Several groups of neuroscientists, psychiatrists and computer scientists are now investigating the extent to which patients' language use can provide diagnostic clues—before a single laboratory test is run. Increased computing power and new methods to measure the relation between behavior and brain activity have advanced such efforts. And although tests based on the spoken word may not be as accurate as gene sequencing or MRI scans, for diseases lacking clear biological indicators, language mining could help fill the gap. Psychiatrists at Columbia University interviewed 34 young adults at risk for psychosis, a common sign of schizophrenia that includes delusions and hallucinations. Two and a half years later five of the subjects had developed psychosis, and the remaining 29 remained free of the disorder. A specially designed algorithm combed the initial interviews collectively to look for language features that distinguished the two groups and found that psychosis correlated with shorter sentences, loss of flow in meaning from one sentence to the next and less frequent use of the words “that,” “what” and “which.” When later tested on each individual interview, the computer program predicted who did and who did not develop psychosis with 100 percent accuracy. The results were recently published in Schizophrenia, and a second round of testing with another group of at-risk subjects is now under way.
Parkinson's Disease
Twenty-seven subjects in a study at Favaloro University in Argentina listened to recorded sentences containing verbs associated with specific hand shapes (such as “applaud” or “punch”). As soon as they understood the sentence, participants pressed a button while keeping both hands in either a flat or clenched-fist position. © 2016 Scientific American
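The three kinds of language features the Columbia algorithm weighed can be sketched in a few lines. This is an illustrative approximation, not the study's actual pipeline: the function name is hypothetical, and "flow of meaning" is proxied here by crude word overlap between adjacent sentences, whereas the study measured semantic similarity in a vector space.

```python
# Hypothetical sketch of the three feature families described above:
# sentence length, sentence-to-sentence coherence, and the rate of the
# words "that", "what" and "which". Not the published pipeline.

def language_features(text):
    # Naive sentence segmentation on periods (assumes non-empty input)
    sentences = [s.split() for s in text.split(".") if s.strip()]
    n = len(sentences)

    # 1. Mean sentence length: shorter sentences correlated with later psychosis
    mean_len = sum(len(s) for s in sentences) / n

    # 2. Flow of meaning between consecutive sentences, proxied by
    #    Jaccard word overlap (the study used semantic embeddings)
    def overlap(a, b):
        sa, sb = {w.lower() for w in a}, {w.lower() for w in b}
        return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

    coherence = (sum(overlap(a, b) for a, b in zip(sentences, sentences[1:]))
                 / (n - 1)) if n > 1 else 0.0

    # 3. Rate of "that", "what", "which": used less often by those
    #    who later developed psychosis
    words = [w.lower() for s in sentences for w in s]
    det_rate = sum(w in {"that", "what", "which"} for w in words) / len(words)

    return {"mean_len": mean_len, "coherence": coherence, "det_rate": det_rate}

feats = language_features("I saw the dog that ran. The dog ran fast. It was a day.")
```

In the study these kinds of per-interview features were fed to a classifier trained on the group-level differences; the sketch stops at feature extraction, which is where the linguistic signal actually lives.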
Richard A. Friedman Who among us hasn’t wanted to let go of anxiety or forget about fear? Phobias, panic attacks and disorders like post-traumatic stress are extremely common: 29 percent of American adults will suffer from anxiety at some point in their lives. Sitting at the heart of much anxiety and fear is emotional memory — all the associations that you have between various stimuli and experiences and your emotional response to them. Whether it’s the fear of being embarrassed while talking to strangers (typical of social phobia) or the dread of being attacked while walking down a dark street after you’ve been assaulted (a symptom of PTSD), you have learned that a previously harmless situation predicts something dangerous. It has been an article of faith in neuroscience and psychiatry that, once formed, emotional memories are permanent. Afraid of heights or spiders? The best we could do was to get you to tolerate them, but we could never really rid you of your initial fear. Or so the thinking has gone. The current standard of treatment for such phobias revolves around exposure therapy. This involves repeatedly presenting the feared object or frightening memory in a safe setting, so that the patient acquires a new safe memory that resides in his brain alongside the bad memory. As long as the new memory has the upper hand, his fear is suppressed. But if he is re-traumatized or re-exposed with sufficient intensity to the original experience, his old fear will awaken with a vengeance. This is one of the limitations of exposure therapy, along with the fact that it generally works in only about half of the PTSD patients who try it. Many also find it upsetting or intolerable to relive memories of assaults and other traumatizing experiences. © 2016 The New York Times Company
Link ID: 21815 - Posted: 01.23.2016