Most Recent Links
By NATALIE ANGIER At birth, the least weasel is as small and light as a paper clip, and the tiny ribs that press visibly against its silvery pink skin give it a segmented look, like that of an insect. A newborn kit is exceptionally underdeveloped, with sealed eyes and ears that won’t open for five or six weeks, an age when puppies and kittens are ready to be weaned. A mother weasel, it seems, has no choice but to deliver her young half-baked. As a member of the mustelid clan — a noble but often misunderstood family of carnivorous mammals that includes ferrets, badgers, minks and wolverines — she holds to a slender, elongated body plan, the better to pursue prey through tight spaces that most carnivores can’t penetrate. Bulging baby bumps would jeopardize that sylphish hunting physique. The solution? Give birth to the equivalent of fetuses and then finish gestating them externally on mother’s milk. “If you want access to small environments, you can’t have a big belly,” said William J. Zielinski, a mustelid researcher with the United States Forest Service in Arcata, Calif. “You don’t see fat weasels.” For Dr. Zielinski and other mustelid-minded scientists, weasels exemplify evolutionary genius and compromise in equal measure, the piecing together of exaggerated and often contradictory traits to yield a lineage of fierce, fleet, quick-witted carnivores that can compete for food against larger celebrity predators like the big cats, wolves and bears. Researchers admit that wild mustelids can be maddening to study. Most species are secretive loners, shrug off standard radio collars with ease, and run close to the ground “like small bolts of brown lightning,” as one team noted. Now you see them, no, you didn’t. Nevertheless, through a mix of dogged field and laboratory studies, scientists have lately made progress in delineating the weasel playbook, and it’s a page turner, or a page burner. © 2016 The New York Times Company
Link ID: 22321 - Posted: 06.14.2016
By Julia Shaw Can you trust your memory? Picture this. You are in a room full of strangers and you are going around introducing yourself. You say your name to about a dozen people, and they say their names to you. How many of these names are you going to remember? More importantly, how many of these names are you going to misremember? Perhaps you call a person you just met John instead of Jack. This kind of thing happens all the time. Now magnify the situation. You are talking to a close friend, and you disclose something important to them, perhaps even something traumatic. You might, for example, say you witnessed the Paris attacks in 2015. But, how can you know for sure that your memory is accurate? Like most people, you probably feel that misremembering someone’s name is totally different from misremembering an important and emotional life event. That you could never forget #JeSuisParis, and will always have stable and reliable memories of such atrocities. I’m sure that is what those who witnessed 9/11, the 7/7 bombings in London or the assassination of JFK also thought. However, when experimenters conduct research on the accuracy of these so-called “flashbulb memories,” they find that many people make grave errors in their recollections of important historical and personal events. And these errors are more than just omissions. © 2016 Scientific American
Keyword: Learning & Memory
Link ID: 22320 - Posted: 06.14.2016
By Laura Sanders Any parent trying to hustle a school-bound kid out the door in the morning knows that her child’s skull possesses a strange and powerful form of black magic: It can repel parents’ voices. Important messages like “find your shoes” bounce off the impenetrable fortress and drift unheeded to the floor. But when this perplexing force field is off, it turns out that mothers’ voices actually have profound effects on kids. Children’s brains practically buzz when they hear their moms’ voices, scientists report in the May 31 Proceedings of the National Academy of Sciences. (Fun and not surprising side note: Babies’ voices get into moms’ brains, too.) The parts of kids’ brains that handle emotions, face recognition and reward were prodded into action by mothers’ voices, brain scans of 24 children ages 7 to 12 revealed. And words were not required to get this big reaction. In the study, children listened to nonsense words said by either their mother or one of two unfamiliar women. Even when the words were fake, mothers’ voices still prompted lots of neural action. The study was done in older kids, but children are known to tune into their mothers’ voices early. Really early, in fact. One study found that fetuses’ heart rates change when they hear their moms read a story. For a fetus crammed into a dark, muffled cabin, voices may take on outsized importance. © Society for Science & the Public 2000 - 2016.
By Brian Platzer It started in 2010 when I smoked pot for the first time since college. It was cheap, gristly weed I’d had in my freezer for nearly six years, but four hours after taking one hit I was still so dizzy I couldn’t stand up without holding on to the furniture. The next day I was still dizzy, and the next, and the next, but it tapered off gradually until about a month later I was mostly fine. Over the following year I got married, started teaching seventh and eighth grade, and began work on a novel. Every week or so the disequilibrium sneaked up on me. The feeling was one of disorientation as much as dizziness, with some cloudy vision, light nausea and the sensation of being overwhelmed by my surroundings. During one eighth-grade English class, when I turned around to write on the blackboard, I stumbled and couldn’t stabilize myself. I fell in front of my students and was too disoriented to stand. My students stared at me slumped on the floor until I mustered enough focus to climb up to a chair and did my best to laugh it off. I was only 29, but my father had had a benign brain tumor around the same age, so I had a brain scan. My brain appeared to be fine. A neurologist recommended I see an ear, nose and throat specialist. A technician flooded my ear canal with water to see if my acoustic nerve reacted properly. The doctor suspected either benign positional vertigo (dizziness caused by a small piece of bonelike calcium stuck in the inner ear) or Ménière’s disease (which leads to dizziness from pressure). Unfortunately, the test showed my inner ear was most likely fine. But just as the marijuana had triggered the dizziness the year before, the test itself catalyzed the dizziness now. In spite of the negative results, doctors still believed I had an inner ear problem. They prescribed exercises to unblock crystals, and salt pills and then prednisone to fight Ménière’s disease. All this took months, and I continued to be dizzy, all day, every day. 
It felt as though I woke up every morning having already drunk a dozen beers — some days, depending on how active and stressful my day was, it felt like much more. Most days ended with me in tears. © 2016 The New York Times Company
By Agata Blaszczak-Boxe People who use marijuana for many years respond differently to natural rewards than people who don't use the drug, according to a new study. Researchers found that people who had used marijuana for 12 years, on average, showed greater activity in the brain's reward system when they looked at pictures of objects used for smoking marijuana than when they looked at pictures of a natural reward — their favorite fruits. "This study shows that marijuana disrupts the natural reward circuitry of the brain, making marijuana highly salient to those who use it heavily," study author Dr. Francesca Filbey, an associate professor of behavioral and brain science at the University of Texas at Dallas, said in a statement. "In essence, these brain alterations could be a marker of transition from recreational marijuana use to problematic use." In the study, researchers looked at 59 marijuana users who had used marijuana daily for the past 60 days, and had used the drug on at least 5,000 occasions during their lives. The researchers wanted to see whether the brains of these long-term marijuana users would respond differently to pictures of objects related to marijuana use than they did to natural rewards, such as their favorite fruits, compared with people who did not use marijuana.
Keyword: Drug Abuse
Link ID: 22317 - Posted: 06.14.2016
By C. CLAIBORNE RAY Insects have an odor-sensing system that is roughly analogous to that of vertebrates, according to “The Neurobiology of Olfaction,” a survey published in 2010. Different species have varying numbers of odor receptors, special molecules that are attuned to specific odor molecules. Genes govern the production of each kind of receptor; the more genes, the more kinds of receptor. A big difference with insects is that their olfactory receptors are basically external, often within hairlike groups of cells, called sensilla, on the antennas, not inside a collection organ like a nose. The odorant molecules encounter odorant-binding proteins, assumed to guide them to the receptor nerve cells, whose long fibers are called axons. Electrical signals are sent along the axons. The axons are usually connected to specific processing centers in the brain called glomeruli, held in a region called the antennal lobe. There the signals are analyzed. Depending on the nature, quantity and timing of the odor signals received, still other cells appear to excite or inhibit reactions. Exactly how the reaction system works is not yet fully understood. The Florida carpenter ant and the Indian jumping ant both have wide-ranging abilities to sense odors, with more than 400 genes to make different odor receptors, a 2012 study found. The fruit fly has only 61. The research also found marked differences in the smelling ability of the sexes, with the female ants well ahead. © 2016 The New York Times Company
By Devi Shastri Calling someone a “bird brain” might not be the zinger of an insult you thought it was: A new study shows that—by the total number of forebrain neurons—some birds are much brainier than we thought. The study, published online today in the Proceedings of the National Academy of Sciences, found that 28 bird species have more neurons in their pallial telencephalons, the brain region responsible for higher level learning, than mammals with similar-sized brains. Parrots and songbirds in particular packed in the neurons, with parrots (like the gray parrot, above) ranging from 227 million to 3.14 billion, and songbirds—including the notoriously intelligent crow—from 136 million to 2.17 billion. That’s about twice as many neurons as primates with brains of the same mass and four times as many as rodent brains of the same mass. To come up with their count, the researchers dissected the bird brains and then dissolved them in a detergent solution, ensuring that the cells were suspended in what neuroscientist Suzana Herculano-Houzel of Vanderbilt University in Nashville calls “brain soup.” This allowed them to label, count, and estimate how many neurons were in a particular brain region. The region that they focused on allows some birds to hone skills like tool use, planning for the future, learning birdsong, and mimicking human speech. One surprising finding was that the neurons were much smaller than expected, with shorter and more compact connections between cells. The team’s next step is to examine whether these neurons started out small or instead shrank in order to keep the birds light enough for flight. One thing, at least, is clear: It’s time to find a new insult for your less brainy friends. © 2016 American Association for the Advancement of Science
Aggressive chemotherapy followed by a stem cell transplant can halt the progression of multiple sclerosis (MS), a small study has suggested. The research, published in The Lancet, looked at 24 patients aged between 18 and 50 from three hospitals in Canada. In 23 of the patients the treatment halted the progression of the disease, but one patient died. An MS Society spokeswoman said this type of treatment does "offer hope" but also comes with "significant risks". Around 100,000 people in the UK have MS, which is an incurable neurological disease. The condition causes the immune system to attack the lining of nerves in the brain and spinal cord. Most patients are diagnosed in their 20s and 30s. One existing treatment is for the immune system to be suppressed with chemotherapy before stem cells are introduced to the patient's bloodstream - a procedure known as an autologous haematopoietic stem cell transplant (HSCT). But in this study, Canadian researchers went further - not just suppressing the immune system, but destroying it altogether. It is then rebuilt with stem cells harvested from the patient's own blood, which are at such an early stage that they have not developed the flaws that trigger MS. The authors said that among the survivors, over a period of up to 13 years, there were no relapses and no new detectable disease activity. All the patients who took part in the trial had a "poor prognosis" and had previously undergone standard immunosuppressive therapy that had not controlled the MS - a disease that affects around two million people worldwide. © 2016 BBC.
By Monique Brouillette The brain presents a unique challenge for medical treatment: it is locked away behind an impenetrable layer of tightly packed cells. Although the blood-brain barrier prevents harmful chemicals and bacteria from reaching our control center, it also blocks roughly 95 percent of medicine delivered orally or intravenously. As a result, doctors who treat patients with neurodegenerative diseases, such as Parkinson's, often have to inject drugs directly into the brain, an invasive approach that requires drilling into the skull. Some scientists have had minor successes getting intravenous drugs past the barrier with the help of ultrasound or in the form of nanoparticles, but those methods can target only small areas. Now neuroscientist Viviana Gradinaru and her colleagues at the California Institute of Technology show that a harmless virus can pass through the barricade and deliver treatment throughout the brain. Gradinaru's team turned to viruses because the infective agents are small and adept at entering cells and hijacking the DNA within. They also have protein shells that can hold beneficial deliveries, such as drugs or genetic therapies. To find a suitable virus to enter the brain, the researchers engineered a strain of an adeno-associated virus into millions of variants with slightly different shell structures. They then injected these variants into a mouse and, after a week, recovered the strains that made it into the brain. A virus named AAV-PHP.B most reliably crossed the barrier. © 2016 Scientific American,
Link ID: 22313 - Posted: 06.13.2016
Angus Chen Rachel Star Withers runs a YouTube channel where she performs goofy stunts on camera and talks about her schizophrenia. Since 2008, when the then 22-year-old revealed her diagnosis online, tens of thousands of people have seen her videos. Some of them have a psychotic disorder or mood disorders themselves, or know people who do. They say her explanation about what a symptom like hallucinations feels like can be really helpful. So can Rachel's advice on ways to cope with them, like getting a dog or a cat. If the animal doesn't react to the hallucination, then it's probably not real, she says. We talked with people about how Withers' videos have helped them understand these diseases. What follows is a Q&A with two of these people. The interviews have been edited for length and clarity. Julia Billingsley is 22 years old and from Peoria, Ill. She learned she has schizophrenia last year, but she says her earliest encounter with the disease was back when she was very young. Her mother has schizophrenia, too, Billingsley says, and often had a delusion that their home was bugged. Julia, you started developing symptoms last year. Do you remember the first thing that happened to you? I'd just started dating my current boyfriend. And I'd be over at his house and I'd go to the bathroom. And this thought, this intrusive thought that wasn't my own at all would pop into my head like with force. And it would be like, hey. This room is bugged. And I was like, what? It made me stop. I stopped what I was doing and I didn't understand why my brain was thinking that. © 2016 npr
Link ID: 22312 - Posted: 06.13.2016
By ROBERT F. WORTH In early 2012, a neuropathologist named Daniel Perl was examining a slide of human brain tissue when he saw something odd and unfamiliar in the wormlike squiggles and folds. It looked like brown dust: a distinctive pattern of tiny scars. Perl was intrigued. At 69, he had examined 20,000 brains over a four-decade career, focusing mostly on Alzheimer’s and other degenerative disorders. He had peered through his microscope at countless malformed proteins and twisted axons. He knew as much about the biology of brain disease as just about anyone on earth. But he had never seen anything like this. The brain under Perl’s microscope belonged to an American soldier who had been five feet away when a suicide bomber detonated his belt of explosives in 2009. The soldier survived the blast, thanks to his body armor, but died two years later of an apparent drug overdose after suffering symptoms that have become the hallmark of the recent wars in Iraq and Afghanistan: memory loss, cognitive problems, inability to sleep and profound, often suicidal depression. Nearly 350,000 service members have been given a diagnosis of traumatic brain injury over the past 15 years, many of them from blast exposure. The real number is likely to be much higher, because so many who have enlisted are too proud to report a wound that remains invisible. For years, many scientists have assumed that explosive blasts affect the brain in much the same way as concussions from football or car accidents. Perl himself was a leading researcher on chronic traumatic encephalopathy, or C.T.E., which has caused dementia in N.F.L. players. Several veterans who died after suffering blast wounds have in fact developed C.T.E. But those veterans had other, nonblast injuries too. No one had done a systematic post-mortem study of blast-injured troops. That was exactly what the Pentagon asked Perl to do in 2010, offering him access to the brains they had gathered for research.
It was a rare opportunity, and Perl left his post as director of neuropathology at the medical school at Mount Sinai to come to Washington. © 2016 The New York Times Company
By Teal Burrell Sociability may be skin deep. The social impairments and high anxiety seen in people with autism or related disorders may be partly due to a disruption in the nerves of the skin that sense touch, a new study in mice suggests. Autism spectrum disorders are primarily thought of as disorders of the brain, generally characterized by repetitive behaviors and deficits in communication skills and social interaction. But a majority of people with autism spectrum disorders also have an altered tactile sense; they are often hypersensitive to light touch and can be overwhelmed by certain textures. “They tend to be very wary of social touch [like a hug or handshake], or if they go outside and feel a gust of wind, it can be very unnerving,” says neuroscientist Lauren Orefice from Harvard Medical School in Boston. An appreciation for this sensory aspect of autism has grown in recent years. The newest version of psychiatry’s bible, the Diagnostic and Statistical Manual of Mental Disorders, includes the sensory abnormalities of autism as core features of the disease. “That was a big nod and a recognition that this is a really important aspect of autism,” says Kevin Pelphrey, a cognitive neuroscientist at The George Washington University in Washington, D.C., who was not involved in the work. The sensation of touch starts in the peripheral nervous system—in receptors at the surface of the skin—and travels along nerves that connect into the central nervous system. Whereas many autism researchers focus on the end of the pathway—the brain—Orefice and colleagues wondered about the first leg of the trip. So the group introduced mutations that silenced genes associated with autism spectrum disorders in mice, adding them in a way that restricted the effects to peripheral nerve cells, they report today in Cell. The team singled out the gene Mecp2, which encodes a protein that regulates the expression of genes that help forge connections between nerve cells. 
© 2016 American Association for the Advancement of Science
By Rita Celli This is what Jennifer Molson remembers doctors saying to her about the high-stakes procedure she would undergo in 2002 as part of an Ottawa study that has yielded some promising results in multiple sclerosis patients. The 41-year-old Ottawa woman was in a wheelchair before the treatment. She now walks, runs and works full time. "I had no feeling from my chest down. I could barely cut my food," Molson remembers. Molson was diagnosed with MS when she was 21, and within five years she needed full-time care. "It was scary. [The procedure] was my last shot at living." MS is among the most common chronic inflammatory diseases of the central nervous system, affecting an estimated two million people worldwide. New Canadian research led by two Ottawa doctors and published in The Lancet medical journal on Thursday suggests the high-risk therapy may stop the disease from progressing. "This is the first treatment to produce this level of disease control or neurological recovery" from MS, said The Lancet in a news release. But The Lancet also highlights the high mortality rate associated with the procedure — one patient out of 24 involved in the clinical trial died from liver failure. "Treatment related risks limit [the therapy's] widespread use," The Lancet concludes. Nevertheless, in the journal's accompanying editorial a German doctor calls the results "impressive." ©2016 CBC/Radio-Canada.
By ALAN COWELL LONDON — When Muhammad Ali died last week, the memories spooled back inevitably to the glory days of the man who called himself the Greatest, a champion whose life intertwined with America’s traumas of race, faith and war. It was a chronicle of valor asserted in the most public of arenas scrutinized by an audience that spanned the globe. But there was another narrative, just as striking to some admirers, of a private courage beyond his klieg-lit renown. For the minority afflicted by Parkinson’s disease, Ali’s 30-year struggle with the same illness magnified the broader status he built from his boxing prowess as a black man who embraced radical Islam, refused to fight in Vietnam, earned the opprobrium of the establishment and yet emerged as an icon. “It was his longest bout, and one that ultimately he could not win,” the reporter Patrick Sawer wrote in The Telegraph, referring to Ali’s illness. Yet the affliction “only served to increase the worldwide admiration he had gained before the disease robbed him of his powers.” As a global superstar, Ali touched many lands, and Britain felt a particular bond. Boxing fans recalled his far-flung bouts — the “Rumble in the Jungle” against George Foreman in Zaire, as the Democratic Republic of Congo was then called, in 1974; “The Thrilla in Manila” in the Philippines against Joe Frazier a year later. But in Britain, his two defeats in the 1960s of Henry Cooper, a much-loved British heavyweight who died in 2011, and his feisty appearances in prime-time television interviews left an indelible mark. © 2016 The New York Times Company
Link ID: 22308 - Posted: 06.11.2016
By Linda Marsa Helen Epstein felt deeply isolated and alone. Haunted by her parents’ harrowing experiences in Nazi concentration camps in World War II, she was troubled as a child by images of piles of skeletons and barbed wire, and, in her words, “a floating sense of danger and incipient harm.” But her Czech-born parents’ defense against the horrific memories was to detach. “Their survival strategy in the war was denial and dissociation, and that carried into their behavior afterward,” recalls Epstein, who was born shortly after the war and grew up in Manhattan. “They believed in action over reflection. Introspection was not encouraged, but a full schedule of activities was.” It was only when she was a student at Israel’s Hebrew University in the late 1960s that she realized she was part of a community that shared a cultural and historical legacy that included both pain and fear. “I met dozens of kids of survivors,” she says, “one after the other who shared certain characteristics: preoccupation with a family past and Israel, and who spoke several middle European languages — just like me.” Epstein’s 1979 book about her observations, Children of the Holocaust, gave voice to that sense of alienation and free-floating anxiety. In the years since, mental health professionals have largely attributed the second generation’s moodiness, hypervigilance and depression to learned behavior. It is only now, more than three decades later, that science has the tools to see that this legacy of trauma becomes etched in our DNA — a process known as epigenetics, in which environmental factors trigger genetic changes that may be passed on, just as surely as blue eyes and crooked smiles.
Michael Graziano Ever since Charles Darwin published On the Origin of Species in 1859, evolution has been the grand unifying theory of biology. Yet one of our most important biological traits, consciousness, is rarely studied in the context of evolution. Theories of consciousness come from religion, from philosophy, from cognitive science, but not so much from evolutionary biology. Maybe that’s why so few theories have been able to tackle basic questions such as: What is the adaptive value of consciousness? When did it evolve and what animals have it? The Attention Schema Theory (AST), developed over the past five years, may be able to answer those questions. The theory suggests that consciousness arises as a solution to one of the most fundamental problems facing any nervous system: Too much information constantly flows in to be fully processed. The brain evolved increasingly sophisticated mechanisms for deeply processing a few select signals at the expense of others, and in the AST, consciousness is the ultimate result of that evolutionary sequence. If the theory is right—and that has yet to be determined—then consciousness evolved gradually over the past half billion years and is present in a range of vertebrate species. Even before the evolution of a central brain, nervous systems took advantage of a simple computing trick: competition. Neurons act like candidates in an election, each one shouting and trying to suppress its fellows. At any moment only a few neurons win that intense competition, their signals rising up above the noise and impacting the animal’s behavior. This process is called selective signal enhancement, and without it, a nervous system can do almost nothing. © 2016 by The Atlantic Monthly Group
Tina Hesman Saey Gut microbes cause obesity by sending messages via the vagus nerve to pack on pounds, new research in rodents suggests. Bacteria in the intestines produce a molecule called acetate, which works through the brain and nervous system to make rats and mice fat, researchers report in the June 9 Nature. If the results hold up in humans, scientists would understand one mechanism by which gut microbes induce obesity: First, the microbes convert fats in food to a short-chain fatty acid called acetate. Acetate in the blood somehow makes its way to the brain. The brain sends a signal through the vagus nerve to the pancreas to increase insulin production. Insulin tells fat cells to store more energy. Fat builds up, leading to obesity. Acetate also increases levels of a hunger hormone called ghrelin, which could lead animals and people to eat even more, says Yale University endocrinologist Gerald Shulman, who led the study. “This is a tour-de-force paper,” says biochemist Jonathan Schertzer of McMaster University in Hamilton, Canada. Most studies that examine the health effects of intestinal microbes just list which bacteria, viruses, fungi and other microorganisms make up the gut microbiome, Schertzer says. But a catalog of differences between lean and obese individuals doesn’t address what those microbes do, he says. “What’s in a name?” he asks. “When you find a factor that actually influences metabolism, that’s important.” © Society for Science & the Public 2000 - 2016.
Link ID: 22305 - Posted: 06.09.2016
By Esther Landhuis About 100 times rarer than Parkinson’s, and often mistaken for it, progressive supranuclear palsy afflicts fewer than 20,000 people in the U.S.—and two thirds do not even know they have it. Yet this little-known brain disorder that killed comic actor Dudley Moore in 2002 is quietly becoming a gateway for research that could lead to powerful therapies for a range of intractable neurodegenerative conditions including Alzheimer’s and chronic traumatic encephalopathy, a disorder linked to concussions and head trauma. All these diseases share a common feature: abnormal buildup of a protein called tau in the brains of patients. Progressive supranuclear palsy has no cure and is hard to diagnose. Although doctors may have heard of the disease, many know little about it. It was not described in medical literature until 1964 but some experts believe one of the earliest accounts of the debilitating illness appeared in an 1857 short story by Charles Dickens and his friend Wilkie Collins: “A cadaverous man of measured speech. A man who seemed as unable to wink, as if his eyelids had been nailed to his forehead. A man whose eyes—two spots of fire—had no more motion than if they had been connected with the back of his skull by screws driven through them, and riveted and bolted outside among his gray hair. He had come in and shut the door, and he now sat down. He did not bend himself to sit as other people do, but seemed to sink bolt upright, as if in water, until the chair stopped him.” © 2016 Scientific American
Most available antidepressants do not help children and teenagers with serious mental health problems and some may be unsafe, experts have warned. A review of clinical trial evidence found that of 14 antidepressant drugs, only one, fluoxetine – marketed as Prozac – was better than a placebo at relieving the symptoms of young people with major depression. Another drug, venlafaxine, was associated with an increased risk of suicidal thoughts and suicide attempts. But the authors stressed that the true effectiveness and safety of antidepressants taken by children and teenagers remained unclear because of the poor design and selective reporting of trials, which were mostly funded by drug companies. They recommended close monitoring of young people on antidepressants, regardless of what drugs they were prescribed, especially at the start of treatment. Professor Peng Xie, a member of the team from Chongqing Medical University in China, said: “The balance of risks and benefits of antidepressants for the treatment of major depression does not seem to offer a clear advantage in children and teenagers, with probably only the exception of fluoxetine.” Major depressive disorder affects around 3% of children aged six to 12 and 6% of teenagers aged 13 to 18. In 2004 the US Food and Drug Administration (FDA) issued a warning against the use of antidepressants in young people up to the age of 24 because of concerns about suicide risk. Yet the number of young people taking the drugs increased between 2005 and 2012, both in the US and UK, said the study authors writing in the Lancet medical journal. In the UK the proportion of children and teenagers aged 19 and under taking antidepressants rose from 0.7% to 1.1%. © 2016 Guardian News and Media Limited
By Amina Zafar When Susan Robertson's fingers and left arm felt funny while she was Christmas shopping, they were signs of a stroke she experienced at age 36. The stroke survivor is now concerned about her increased risk of dementia. The link between stroke and dementia is stronger than many Canadians realize, the Heart and Stroke Foundation says. The group's annual report, released Thursday, is titled "Mind the connection: preventing stroke and dementia." Stroke happens when blood stops flowing to parts of the brain. Robertson, 41, of Windsor, Ont., said her short-term memory, word-finding and organizational skills were impaired after her 2011 stroke. She's extremely grateful to have recovered the ability to speak and walk after doctors found clots had damaged her brain's left parietal lobe. "I knew what was happening, but I couldn't say it," the occupational nurse recalled. A stroke more than doubles the risk of dementia, said Dr. Rick Swartz, a spokesman for the foundation and a stroke neurologist in Toronto. Raising awareness about the link is not meant to scare people, but to show how controlling blood pressure, not smoking (or quitting if you do), eating a balanced diet and being physically active reduce the risk to individuals and could make a difference at a society level, Swartz said. While aging is a common risk factor in stroke and dementia, evidence in Canada and other developed countries shows younger people are also increasingly affected. ©2016 CBC/Radio-Canada.