Chapter 16. Psychopathology: Biological Basis of Behavior Disorders
By Simon Makin

The Claim: Casual cannabis use harms young people's brains.

The Facts: A study found differences in the brains of users and nonusers, but it did not establish that marijuana use caused the variations or that they had any functional significance.

The Details: Researchers at Northwestern University and Harvard Medical School conducted MRI scans of two groups of 20 young adults ages 18 to 25. One group reported using marijuana at least once a week, smoking 11 joints a week on average, whereas the other had used it less than five times total and not at all during the last year. Neither group had any psychiatric disorders, and the users were psychiatrically assessed as not dependent on the drug. The study focused on two brain regions involved in processing rewards, the nucleus accumbens and the amygdala. These areas create pleasurable experiences of things such as food and sex, as well as the high associated with drugs, and have been shown to change in animals given THC, the main psychoactive component of cannabis. The researchers found that cannabis users had more gray matter density in the left nucleus accumbens and left amygdala, as well as differences in the shape of the left nucleus accumbens and right amygdala. The left nucleus accumbens also tended to be slightly larger in users. They concluded that recreational cannabis use might be associated with abnormalities in the brain's reward system. News reports have proclaimed that scientists have shown that even casual cannabis use harms young people's brains.

The Caveats: The most obvious problem with leaping to that conclusion is that the scans were conducted at only one point in time.

© 2014 Scientific American
By Corinne Iozzio Albert “Skip” Rizzo of the University of Southern California began studying virtual reality (VR) as psychological treatment in 1993. Since then, dozens of studies, his included, have shown the immersion technique to be effective for everything from post-traumatic stress disorder (PTSD) and anxiety to phobias and addiction. But a lack of practical hardware has kept VR out of reach for clinicians. The requirements for a VR headset seem simple—a high-resolution, fast-reacting screen, a field of vision that is wide enough to convince patients they are in another world and a reasonable price tag—yet such a product has proved elusive. Says Rizzo, “It’s been 20 frustrating years.” In 2013 VR stepped into the consumer spotlight in the form of a prototype head-mounted display called the Oculus Rift. Inventor Palmer Luckey’s goal was to create a platform for immersive video games, but developers from many fields—medicine, aviation, tourism—are running wild with possibilities. The Rift’s reach is so broad that Oculus, now owned by Facebook, hosted a conference for developers in September. The Rift, slated for public release in 2015, is built largely from off-the-shelf parts, such as the screens used in smartphones. A multi-axis motion sensor lets the headset refresh imagery in real time as the wearer’s head moves. The kicker is the price: $350. (Laboratory systems start at $20,000.) Rizzo has been among the first in line. His work focuses on combat PTSD. In a 2010 study, he placed patients into controlled traumatic scenarios, including a simulated battlefield, so they could confront and process emotions triggered in those situations. © 2014 Scientific American
By Victoria Stern Many studies show that teens who use marijuana face a greater risk of later developing schizophrenia or symptoms of it, especially if they have a genetic predisposition. For instance, one 15-year study followed more than 45,000 Swedes who initially had no psychotic symptoms. The researchers determined that subjects who smoked marijuana by age 18 were 2.4 times more likely to be diagnosed with schizophrenia than their nonsmoking peers, and this risk increased with the frequency of cannabis use. The connection still held when researchers accounted for participants' use of other drugs. Yet despite these results and an uptick in marijuana use in the 1970s and 1980s, other researchers have not uncovered an increase in the incidence of schizophrenia in the general Swedish population—suggesting that perhaps people who were going to develop schizophrenia anyway were more likely to use marijuana. Another study, conducted in Australia over a 30-year period, also found no increase in schizophrenia diagnoses among the general population, despite rising rates of teen marijuana use. These authors concluded that although cannabis most likely does not cause schizophrenia, its use might trigger psychosis in vulnerable people or exacerbate an existing condition. © 2014 Scientific American
By CLYDE HABERMAN When it came to pharmacological solutions to life’s despairs, Aldous Huxley was ahead of the curve. In Huxley’s 1932 novel about a dystopian future, the Alphas, Betas and others populating his “Brave New World” have at their disposal a drug called soma. A little bit of it chases the blues away: “A gramme” — Huxley was English, remember, spelling included — “is better than a damn.” With a swallow, negative feelings are dispelled. Prozac, the subject of this week’s video documentary from Retro Report, is hardly soma. But its guiding spirit is not dissimilar: A few milligrams of this drug are preferable to the many damns that lie at the core of some people’s lives. Looking back at Prozac’s introduction by Eli Lilly and Company in 1988, and hopscotching to today, the documentary explores the enormous influence, both chemical and cultural, that Prozac and its brethren have had in treating depression, a concern that gained new resonance with the recent suicide of the comedian Robin Williams. In the late 1980s and the 90s, Prozac was widely viewed as a miracle pill, a life preserver thrown to those who felt themselves drowning in the high waters of mental anguish. It was the star in a class of new pharmaceuticals known as S.S.R.I.s — selective serotonin reuptake inhibitors. Underlying their use is a belief that depression is caused by a shortage of the neurotransmitter serotonin. Pump up the levels of this brain chemical and, voilà, the mood lifts. Indeed, millions have embraced Prozac, and swear by it. Depression left them emotionally paralyzed, they say. Now, for the first time in years, they think clearly and can embrace life. Pharmacological merits aside, the green-and-cream pill was also a marvel of commercial branding, down to its market-tested name. Its chemical name is fluoxetine hydrochloride, not the most felicitous of terms. A company called Interbrand went to work for Eli Lilly and came up with Prozac. “Pro” sounds positive. 
Professional, too. “Ac”? That could signify action. As for the Z, it suggests a certain strength, perhaps with a faint high-techy quality. © 2014 The New York Times Company
By Douglas Main Researchers have created a blood test that they have used to accurately diagnose depression in a small sample of people, and they hope that with time and funding it could be used on a widespread basis. It is the first blood test—and thus the first “objective” gauge—for any type of mental disorder in adults, says study co-author Eva Redei, a neuroscientist at Northwestern University in Evanston, Ill. Outside experts caution, however, that the results are preliminary and not close to ready for use in the doctor’s office. Meanwhile, diagnosing depression the “old-fashioned way” through an interview works quite well, and should only take 10 to 15 minutes, says Todd Essig, a clinical psychologist in New York. But many doctors are increasingly overburdened and often not reimbursed for taking the time to talk to their patients, he says. The test works by measuring blood levels of nine different types of RNA, molecules that cells produce from DNA. Besides accurately diagnosing depression, which affects perhaps 10 percent of American adults and is becoming more common, the technique may also be able to tell who could benefit from talk therapy and who may be vulnerable to the condition in the first place. In a study describing the test, published in the journal Translational Psychiatry, the scientists recruited 32 patients who were diagnosed with depression using a clinical interview, the standard technique. They also got 32 non-depressed patients to participate as a control group. © 2014 Newsweek LLC
By ANNA FELS THE idea of putting a mind-altering drug in the drinking water is the stuff of sci-fi, terrorist plots and totalitarian governments. Considering the outcry that occurred when putting fluoride in the water was first proposed, one can only imagine the furor that would ensue if such a thing were ever suggested. The debate, however, is moot. It’s a done deal. Mother Nature has already put a psychotropic drug in the drinking water, and that drug is lithium. Although this fact has been largely ignored for over half a century, it appears to have important medical implications. Lithium is a naturally occurring element, not a molecule like most medications, and it is present in the United States, depending on the geographic area, at concentrations that can range widely, from undetectable to around .170 milligrams per liter. This amount is less than a thousandth of the minimum daily dose given for bipolar disorders and for depression that doesn’t respond to antidepressants. Although it seems strange that the microscopic amounts of lithium found in groundwater could have any substantial medical impact, the more scientists look for such effects, the more they seem to discover. Evidence is slowly accumulating that relatively tiny doses of lithium can have beneficial effects. They appear to decrease suicide rates significantly and may even promote brain health and improve mood. Yet despite the studies demonstrating the benefits of relatively high natural lithium levels present in the drinking water of certain communities, few seem to be aware of its potential. Intermittently, stories appear in the scientific journals and media, but they seem to have little traction in the medical community or with the general public. The New York Times Company
by Bethany Brookshire Post-traumatic stress disorder, or PTSD, has many different symptoms. Patients may suffer from anxiety, flashbacks, memory problems and a host of other reactions to a traumatic event. But one symptom is especially common: 70 percent of civilian patients and 90 percent of combat veterans with PTSD just can’t get a decent night’s sleep. Problems with sleep, including rapid-eye movement — or REM — sleep, have long been associated with PTSD. “We know that sleep difficulties in the weeks following trauma predict the development of PTSD, and we know that bad sleep makes PTSD symptoms worse,” says Sean Drummond, a clinical psychologist who studies sleep at the University of California at San Diego. Studies in rats show that exposing the animals to traumatic, fearful experiences such as foot shocks disrupts their REM sleep. Drummond and his research assistant Anisa Marshall wanted to connect those findings to humans. But he soon found out that in humans, it’s not fear that predicts REM sleep. Instead, it’s safety. The scientists tested this in 42 people without PTSD using a measure called fear-potentiated startle. Subjects sit in a comfortable chair with an electrode on their wrists. A screen shows blue squares or yellow squares. If participants see blue squares, they run a high risk of receiving an annoying shock to the wrist. If they see yellow squares, they can relax; no shocks are headed their way. During this time, they also hear random, loud bursts of white noise. The scientists measure how much the subjects startle by gauging the strength of their eyeblinks in response to the noise. In the presence of the blue squares, the blinks become much stronger, an effect called fear-potentiated startle. With yellow squares, the blinks weaken. © Society for Science & the Public 2000 - 2014.
by Michael Slezak "Cannabis catastrophic for young brains" screamed the headline on an Australian medical news website this week. The article, and others like it, were reporting on a study linking teenage cannabis use with school dropouts, addiction and suicide, published in The Lancet Psychiatry. Echoing the research findings, the articles declared that if teenagers smoke cannabis daily, it makes them seven times more likely to commit suicide compared with non-users. Indeed, "there is no safe level of use", most reported. They also urged caution to legislators around the world who are gingerly taking steps towards weakening prohibition of cannabis, lest young people get easier access to it. So does smoking pot cause suicide? The Lancet authors say it probably does. Their study combined data from three previous longitudinal studies which together tracked cannabis use in more than 3000 people in Australia and New Zealand over many years. The authors looked for associations between the frequency of cannabis use before the age of 17 and outcomes, such as high school completion, until the people reached the age of 30. They found that those who smoked cannabis daily before they were 17 had lower odds of finishing high school and getting a degree than people who had never used cannabis. Larger increased odds were associated with cannabis dependence later in life, trying other illicit drugs and suicide attempts. But longitudinal studies only show correlation, not causation. The difficulty is that people take drugs for a reason, and that reason could be what's causing the outcome. In the case of school dropout, suicide and daily pot smoking, it is not hard to imagine what else could be going on in someone's life to engender these behaviours. © Copyright Reed Business Information Ltd
By MICHAEL HEDRICK I can remember the early days of having schizophrenia. I was so afraid of the implications of subtle body language, like a lingering millisecond of eye contact, the way my feet hit the ground when I walked or the way I held my hands to my side. It was a struggle to go into a store or, really, anywhere I was bound to see another living member of the human species. With a simple scratch of the head, someone could be telling me to go forward, or that what I was doing was right or wrong, or that they were acknowledging the symbolic crown on my head that made me a king or a prophet. It’s not hard to imagine that I was having a tough time in the midst of all the anxiety and delusions. Several months after my diagnosis, I took a job at a small town newspaper as a reporter. I sat in on City Council meetings, covering issues related to the lowering water table and interviewing local business owners for small blurbs in the local section, all the while wondering if I was uncovering some vague connections to an international conspiracy. The nights were altogether different. Every day, I would come home to my apartment and smoke pot, then lie on my couch watching television or head out to the bar and get so hammered that I couldn’t walk. It’s hard to admit, but the only time I felt relaxed was when I was drunk. I eventually lost my newspaper job, but that wasn’t the catalyst for change. It all came to a head one night in July. I had been out drinking all night and, in a haze, I decided it would be a good idea to drive the two miles back to my apartment. This is something I had done several times before, but it had never dawned on me that it was a serious deal. I thought I was doing well, not swerving and being only several blocks from my house, when I saw flashing lights behind me. What started as a trip to the bar to unwind ended with me calling my parents to bail me out of jail at 3 a.m. © 2014 The New York Times Company
Being bullied regularly by a sibling could put children at risk of depression when they are older, a study led by the University of Oxford suggests. Around 7,000 children aged 12 were asked if they had experienced a sibling saying hurtful things, hitting, ignoring or lying about them. The children were followed up at 18 and asked about their mental health. A charity said parents should deal with sibling rivalry before it escalates. Previous research has suggested that victims of peer bullying can be more susceptible to depression, anxiety and self-harm. This study claims to be the first to examine bullying by brothers or sisters during childhood for the same psychiatric problems in early adulthood. Researchers from the Universities of Oxford, Warwick and Bristol and University College London sent questionnaires to thousands of families with 12-year-old children in 2003-04 and went back to them six years later to assess their mental health. If they had siblings they were asked about bullying by brothers and sisters. The questionnaire said: "This means when a brother or sister tries to upset you by saying nasty and hurtful things, or completely ignores you from their group of friends, hits, kicks, pushes or shoves you around, tells lies or makes up false rumours about you." Most children said they had not experienced bullying. Of these, at 18, 6.4% had depression scores in the clinically significant range, 9.3% experienced anxiety and 7.6% had self-harmed in the previous year. The 786 children who said they had been bullied by a sibling several times a week were found to be twice as likely to have depression, self-harm and anxiety as the other children. BBC © 2014
By S. Matthew Liao As many as 20 percent of war veterans return from combat in Afghanistan and Iraq with post-traumatic stress disorder (PTSD) or major depression, according to a 2008 report from the RAND Corporation. Many experience constant nightmares and flashbacks and many can’t live normal lives. For a significant number of veterans, available medications do not seem to help. In 2010, at least 22 veterans committed suicide each day, according to the Department of Veterans Affairs. In her book, Demon Camp, the author Jen Percy describes damaged veterans who have even resorted to exorcism to alleviate their PTSD symptoms. As part of President Obama’s BRAIN Initiative, the federal Defense Advanced Research Projects Agency (DARPA) plans to spend more than $70 million over five years to develop novel devices that would address neurological disorders such as PTSD. DARPA is particularly interested in a technology called deep brain stimulation (DBS). DBS involves inserting a thin electrode through a small opening in the skull into a specific area in the brain; the electrode is then connected by an insulated wire to a battery pack underneath the skin; the battery pack then sends electrical pulses via the wire to the brain. About 100,000 people around the world today have a DBS implant to ameliorate the effects of Parkinson’s disease, epilepsy and major depression. There is evidence that DBS can also help with PTSD. Functional neuroimaging studies indicate that amygdala hyperactivity is responsible for the symptoms of PTSD and that DBS can functionally reduce the activity of the amygdala. In animal PTSD models, DBS has been found to be more effective than current treatment using selective serotonin reuptake inhibitors. © 2014 Scientific American
On 5th May, 1953, the novelist Aldous Huxley dissolved four-tenths of a gram of mescaline in a glass of water, drank it, then sat back and waited for the drug to take effect. Huxley took the drug in his California home under the direct supervision of psychiatrist Humphry Osmond, to whom Huxley had volunteered himself as “a willing and eager guinea pig”. Osmond was one of a small group of psychiatrists who pioneered the use of LSD as a treatment for alcoholism and various mental disorders in the early 1950s. He coined the term psychedelic, meaning ‘mind manifesting’, and although his research into the therapeutic potential of LSD produced promising initial results, it was halted during the 1960s for social and political reasons. Born in Surrey in 1917, Osmond studied medicine at Guy’s Hospital, London. He served in the navy as a ship’s psychiatrist during World War II, and afterwards worked in the psychiatric unit at St. George’s Hospital, London, where he became a senior registrar. While at St. George’s, Osmond and his colleague John Smythies learned about Albert Hofmann’s discovery of LSD at the Sandoz Pharmaceutical Company in Basel, Switzerland. Osmond and Smythies started their own investigation into the properties of hallucinogens and observed that mescaline produced effects similar to the symptoms of schizophrenia, and that its chemical structure was very similar to that of the hormone and neurotransmitter adrenaline. This led them to postulate that schizophrenia was caused by a chemical imbalance in the brain, but these ideas were not favourably received by their colleagues. In 1951 Osmond took a post as deputy director of psychiatry at the Weyburn Mental Hospital in Saskatchewan, Canada and moved there with his family. Within a year, he began collaborating on experiments using LSD with Abram Hoffer. Osmond tried LSD himself and concluded that the drug could produce profound changes in consciousness. 
Osmond and Hoffer also recruited volunteers to take LSD and theorised that the drug was capable of inducing a new level of self-awareness which may have enormous therapeutic potential. © 2014 Guardian News and Media Limited
By Greta Kaul. Stanford researchers say poor sleep may be an independent risk factor for suicide in adults over 65. Researchers used data from a previous epidemiological study to compare the sleep quality of 20 older adults who committed suicide and 400 who didn't, over 10 years. Researchers found that those who didn't sleep well were 1.4 times more likely to commit suicide within a decade. Older adults have disproportionately high suicide rates in the first place, especially older men. The Stanford researchers believe that on its own, sleeping poorly could be a risk factor for suicide later in life. It may even be a more powerful predictor of suicide risk than symptoms of depression. They found that the strongest predictor of suicide was the combination of bad sleep and depression. Unlike many biological, psychological and social risk factors for suicide, sleep disorders tend to be treatable, said Rebecca Bernert, the lead author of the study. Sleep disorders are also less stigmatized than other suicide risk factors. Bernert is now studying whether treating insomnia is effective in preventing depression and suicide. The study was published in JAMA Psychiatry in August. © 2014 Hearst Communications, Inc.
By RONI CARYN RABIN Pregnant women often go to great lengths to give their babies a healthy start in life. They quit smoking, skip the chardonnay, switch to decaf, forgo aspirin. They say no to swordfish and politely decline Brie. Yet they rarely wean themselves from popular selective serotonin reuptake inhibitor antidepressants like Prozac, Celexa and Zoloft despite an increasing number of studies linking prenatal exposure to birth defects, complications after birth and even developmental delays and autism. Up to 14 percent of pregnant women take antidepressants, and the Food and Drug Administration has issued strong warnings that one of them, paroxetine (Paxil), may cause birth defects. But the prevailing attitude among doctors has been that depression during pregnancy is more dangerous to mother and child than any drug could be. Now a growing number of critics are challenging that assumption. “If antidepressants made such a big difference, and women on them were eating better, sleeping better and taking better care of themselves, then one would expect to see better birth outcomes among the women who took medication than among similar women who did not,” said Barbara Mintzes, an associate professor at the University of British Columbia School of Population and Public Health. “What’s striking is that there’s no research evidence showing that.” On the contrary, she said, “when you look for it, all you find are harms.” S.S.R.I.s are believed to work in part by blocking reabsorption (or reuptake) of serotonin, altering levels of this important neurotransmitter in the brain and elsewhere in the body. Taken by a pregnant woman, the drugs cross the placental barrier, affecting the fetus. © 2014 The New York Times Company
By GARY GREENBERG Joel Gold first observed the Truman Show delusion — in which people believe they are the involuntary subjects of a reality television show whose producers are scripting the vicissitudes of their lives — on Halloween night 2003 at Bellevue Hospital, where he was the chief attending psychiatrist. “Suspicious Minds,” which he wrote with his brother, Ian, an associate professor of philosophy and psychology at McGill University, is an attempt to use this delusion, which has been observed by many clinicians, to pose questions that have gone out of fashion in psychiatry over the last half-century: Why does a mentally ill person have the delusions he or she has? And, following the lead of the medical historian Roy Porter, who once wrote that “every age gets the lunatics it deserves,” what can we learn about ourselves and our times from examining the content of madness? The Golds’ answer is a dual broadside: against a psychiatric profession that has become infatuated with neuroscience as part of its longstanding attempt to establish itself as “real medicine,” and against a culture that has become too networked for its own good. Current psychiatric practice is to treat delusions as the random noise generated by a malfunctioning (and mindless) brain — a strategy that would be more convincing if doctors had a better idea of how the brain produced madness and how to cure it. According to the Golds, ignoring the content of delusions like T.S.D. can only make mentally ill people feel more misunderstood, even as it distracts the rest of us from the true significance of the delusion: that we live in a society that has put us all under surveillance. T.S.D. sufferers may be paranoid, but that does not mean they are wrong to think the whole world is watching. This is not to say they aren’t crazy. Mental illness may be “just a frayed, weakened version of mental health,” but what is in tatters for T.S.D. patients is something crucial to negotiating social life, and that, according to the Golds, is the primary purpose toward which our big brains have evolved: the ability to read other people’s intentions or, as cognitive scientists put it, to have a theory of mind. This capacity is double-edged. “The better you are at ToM,” they write, “the greater your capacity for friendship.” © 2014 The New York Times Company
One of the best things about being a neuroscientist used to be the aura of mystery around it. It was once so mysterious that some people didn’t even know it was a thing. When I first went to university and people asked what I studied, they thought I was saying I was a “Euroscientist”, which is presumably someone who studies the science of Europe. I’d get weird questions such as “what do you think of Belgium?” and I’d have to admit that, in all honesty, I never think of Belgium. That’s how mysterious neuroscience was, once. Of course, you could say this confusion was due to my dense Welsh accent, or the fact that I only had the confidence to talk to strangers after consuming a fair amount of alcohol, but I prefer to go with the mystery. It’s not like that any more. Neuroscience is “mainstream” now, to the point where the press coverage of it can be studied extensively. When there’s such a thing as Neuromarketing (well, there isn’t actually such a thing, but there’s a whole industry that would claim otherwise), it’s impossible to maintain that neuroscience is “cool” or “edgy”. It’s a bad time for us neurohipsters (which are the same as regular hipsters, except the designer beards are on the frontal lobes rather than the jaw-line). One way that we professional neuroscientists could maintain our superiority was by correcting misconceptions about the brain, but lately even that avenue looks to be closing to us. The recent film Lucy is based on the most classic brain misconception: that we only use 10% of our brain. But it’s had a considerable amount of flack for this already, suggesting that many people are wise to this myth. We also saw the recent release of Susan Greenfield’s new book Mind Change, all about how technology is changing (damaging?) our brains. This is a worryingly evidence-free but very common claim by Greenfield. Depressingly common, as this blog has pointed out many times. But now even the non-neuroscientist reviewers aren’t buying her claims. 
© 2014 Guardian News and Media Limited
By Roni Jacobson Almost immediately after Albert Hofmann discovered the hallucinogenic properties of LSD in the 1940s, research on psychedelic drugs took off. These consciousness-altering drugs showed promise for treating anxiety, depression, post-traumatic stress disorder (PTSD), obsessive-compulsive disorder (OCD) and addiction, but increasing government conservatism caused a research blackout that lasted decades. Lately, however, there has been a resurgence of interest in psychedelics as possible therapeutic agents. This past spring Swiss researchers published results from the first drug trial involving LSD in more than 40 years. Although the freeze on psychedelic research is thawing, scientists say that restrictive drug policies are continuing to hinder their progress. In the U.S., LSD, psilocybin, MDMA, DMT, peyote, cannabis and ibogaine (a hallucinogen derived from an African shrub) are all classified as Schedule I illegal drugs, which the U.S. Drug Enforcement Administration defines as having a high potential for abuse and no currently accepted medical applications—despite extensive scientific evidence to the contrary. In a joint report released in June, the Drug Policy Alliance and the Multidisciplinary Association for Psychedelic Studies catalogue several ways in which they say that the DEA has unfairly obstructed research on psychedelics, including by overruling an internal recommendation in 1986 that MDMA be placed on a less restrictive schedule. The DEA and the U.S. Food and Drug Administration maintain that there is insufficient research to justify recategorization. This stance creates a catch-22 by basing the decision on the need for more research while limiting the ability of scientists to conduct that research. © 2014 Scientific American
By James Gallagher Health editor, BBC News website Breastfeeding can halve the risk of post-natal depression, according to a large study of 14,000 new mothers. However, there is a large increase in the risk of depression in women planning to breastfeed who are then unable to do so. The study, published in the journal Maternal and Child Health, called for more support for women unable to breastfeed. A parenting charity said mental health was a "huge issue" for many mothers. The health benefits of breastfeeding to the baby are clear-cut and the World Health Organization recommends feeding a child nothing but breast milk for the first six months. However, researchers at the University of Cambridge said the impact on the mother was not as clearly understood. 'Highest risk' One in 10 women will develop depression after the birth of their child. The researchers analysed data from 13,998 births in the south-west of England. It showed that, out of women who were planning to breastfeed, there was a 50% reduction in the risk of post-natal depression if they started breastfeeding. But the risk of depression more than doubled among women who wanted to, but were not able to, breastfeed. Dr Maria Iacovou, one of the researchers, told the BBC: "Breastfeeding does appear to have a protective effect, but there's the other side of the coin as well. "Those who wanted to and didn't end up breastfeeding had the highest risk of all the groups." BBC © 2014
Claudia M. Gold When I hear debate over the association between SSRIs (selective serotonin reuptake inhibitors, a class of antidepressant medication) and suicidal behavior in children and adolescents, I am immediately brought back to a night in the early 2000s. As the covering pediatrician I was called to the emergency room to see a young man, a patient of a pediatrician in a neighboring town, who had attempted suicide by taking a nearly lethal overdose. That night, as I watched over him in the intensive care unit, I learned that he was a high-achieving student and athlete who, struggling under the pressures of the college application process, had been prescribed an SSRI by his pediatrician. His parents described a transformation in his personality over the months preceding the suicide attempt that was so dramatic that I ordered a CT scan to see if he had a brain tumor. It was normal. When, in the coming years, the data emerged about increasing suicidal behavior following use of SSRIs, I recognized in retrospect that his change in behavior was a result of the medication. But at the time I knew nothing of these serious side effects. At that time, coinciding with the pharmaceutical industry's aggressive marketing campaign directed at the public as well as a professional audience, these drugs were becoming increasingly popular with pediatricians. As the possible serious side effects of these medications came increasingly into awareness, the FDA issued the controversial "black box warning" that the drugs carried an increased risk of suicidal behavior. Following the black box warning, pediatricians, myself included, became reluctant to prescribe these medications. We did not have the time or experience to provide the recommended increased monitoring and close follow-up.
By Rachel Feltman At every waking moment, your brain is juggling two very different sets of information. Input from the world around you, like sights and smells, has to be processed. But so does internal information — your memories and thoughts. Right now, for example, I’m looking at a peach: It’s yellow and pink, and has a lot of fuzz. But I also know that it smells nice (a personal assessment) and I’m imagining how good it will taste, based on my previous experience with fragrant pink fruits. The brain’s ability to handle these different signals is key to cognitive function. In some disorders, particularly autism and schizophrenia, this ability is disrupted. The brain has difficulty keeping internal and external input straight. In a new study published Thursday in Cell, researchers observe the switching method in action for the first time. While the research used mice, not humans, principal investigator and NYU Langone Medical Center assistant professor Michael Halassa sees this as a huge step toward understanding and manipulating the same functions in humans. “This is one of the few moments in my life where I’d actually say yes, absolutely this is going to translate to humans,” Halassa said. “This isn’t something based on genes or molecules that are specific to one organism. The underlying principles of how the brain circuitry works are likely to be very similar in humans and mice.” That circuitry has been hypothesized for decades. Neurologists know that the cortex of the brain is responsible for higher cognitive functions, like music and language. And the thalamus, which is an egg-like structure in the center of the brain, works to direct the flow of internal and external information before it gets to the cortex.