Most Recent Links
Erin Ross

The teenage brain has been characterized as a risk-taking machine, looking for quick rewards and thrills instead of acting responsibly. But these behaviors could actually make teens better than adults at certain kinds of learning.

"In neuroscience, we tend to think that if healthy brains act in a certain way, there should be a reason for it," says Juliet Davidow, a postdoctoral researcher at Harvard University in the Affective Neuroscience and Development Lab and the lead author of the study, which was published Wednesday in the journal Neuron. But scientists and the public often focus on the negatives of teen behavior, so she and her colleagues set out to test the hypothesis that teenagers' drive for rewards, and the risk-taking that comes from it, exist for a reason.

When it comes to what drives reward-seeking in teens, fingers have always been pointed at the striatum, a lobster-claw-shaped structure in the brain. When something surprising and good happens — say, you find $20 on the street — your body produces the pleasure-related chemical dopamine, and the striatum responds.

"Research shows that the teenage striatum is very active," says Davidow. This suggests that teens are hard-wired to seek immediate rewards. But, she adds, it's also shown that their prefrontal cortex, which helps with impulse control, isn't fully developed. Combined, these two things have given teens their risky rep.

But the striatum isn't just involved in reward-seeking. It's also involved in learning from rewards, explains Daphna Shohamy, a cognitive neuroscientist at the Zuckerman Mind Brain Behavior Institute at Columbia University who worked on the study. She wanted to see if teenagers would be better at this type of learning than adults would. © 2016 npr
Richard A. Friedman

There’s a reason adults don’t pick up Japanese or learn how to kite surf. It’s ridiculously hard. In stark contrast, young people can learn the most difficult things relatively easily. Polynomials, Chinese, skateboarding — no problem!

Neuroplasticity — the brain’s ability to form new neural connections and be influenced by the environment — is greatest in childhood and adolescence, when the brain is still a work in progress. But this window of opportunity is finite. Eventually it slams shut. Or so we thought.

Until recently, the conventional wisdom within the fields of neuroscience and psychiatry has been that development is a one-way street, and once a person has passed through his formative years, experiences and abilities are very hard, if not impossible, to change. What if we could turn back the clock in the brain and recapture its earlier plasticity? This possibility is the focus of recent research in animals and humans. The basic idea is that during critical periods of brain development, the neural circuits that help give rise to mental states and behaviors are being sculpted and are particularly sensitive to the effects of experience. If we can understand what starts and stops these periods, perhaps we can restart them.

Think of the brain’s sensitive periods as blown glass: The molten glass is very malleable, but you have a relatively brief time before it cools and becomes crystalline. Put it back into the furnace, and it can once again change shape. © 2016 The New York Times Company
Dean Burnett

Throughout history, people have always worried about new technologies. The fear that the human brain cannot cope with the onslaught of information made possible by the latest development was first voiced in response to the printing press, back in the sixteenth century. Swap “printing press” for “internet” and you have the exact same concerns today, regularly voiced in the mainstream media, and usually focused on children. But is there any legitimacy to these claims? Or are they just needless scaremongering? There are several things to bear in mind when considering how our brains deal with the internet.

The human brain is always dealing with a constant stream of rich information — that’s what the real world is.

First, don’t forget that “the internet” is a very vague term, given that it contains so many things across so many formats. You could, for instance, develop a gambling addiction via online casinos or poker sites. This is an example of someone’s brain being negatively affected via the internet, but it would be difficult to argue that the internet is the main culprit, any more than a gambling addiction acquired in a real-world casino can be blamed on “buildings”; it’s just the context in which the problem occurred.

However, the internet does give us far more direct, constant and wide-ranging access to information than pretty much anything else in human history. So how could, or does, this affect us and our brains? © 2016 Guardian News and Media Limited
Keyword: Learning & Memory
Link ID: 22736 - Posted: 10.10.2016
Annette Heist

Nisha Pradhan is worried. The recent college graduate just turned 21 and plans to live on her own. But she's afraid she won't be able to stay safe. That's because Pradhan is anosmic — she isn't able to smell. She can't tell if milk is sour, or if she's burning something on the stove, or if there's a gas leak, and that worries her.

"It actually didn't even strike me as being a big deal until I got to college," Pradhan says. Back home in Pennington, N.J., her family did her smelling for her, she says. She's moved in with them for now, but she's looking for a place of her own. "Now that I'm searching for ways or places to live as an independent person, I find more and more that the sense of smell is crucial to how we live our lives," Pradhan says.

There's no good estimate for how many people live with smell loss. Congenital anosmia, being born without a sense of smell, is a rare condition. Acquired smell loss is more common. That loss can be total, or it can take the form of hyposmia, a diminished sense of smell.

Pradhan doesn't know how she lost her sense of smell. She thinks she was born able to smell because, as a child, she liked to eat and ate a lot. But there came a point where she lost interest in food. "That's actually one of the first things that people notice whenever they have a smell problem, is food doesn't taste right anymore," says Beverly Cowart, a researcher at the Monell Chemical Senses Center in Philadelphia. That's because eating and smell go hand in hand. How food tastes often relies on what we smell. © 2016 npr
By Anna Azvolinsky

The human cerebral cortex experiences a burst of growth late in fetal development thanks to the expansion and migration of progenitor cells that ultimately form excitatory neurons. For a fully functional brain, in addition to excitatory neurons, inhibitory ones (called interneurons) are also necessary. Yet scientists have not been able to account for the increase in inhibitory neurons that occurs after birth. Now, in a paper published today (October 6) in Science, researchers from the University of California, San Francisco (UCSF), have shown that there is a reserve of young neurons that continue to migrate and integrate into the frontal lobes of infants.

“It was thought previously that addition of new neurons to the human cortex [mostly] happens only during fetal development. This new study shows that young neurons continue to migrate on a large scale into the cerebral cortex of infants,” Benedikt Berninger, who studies brain development at the Johannes Gutenberg University of Mainz, Germany, and was not involved in the work, wrote in an email to The Scientist. “This implies that experience during the first few months could affect this migration and thereby contribute to brain plasticity.”

Aside from the migration of neurons into the olfactory bulb in infants, “this is the first time anyone has been able to catch neurons in the act of moving into the cortex,” said New York University neuroscientist Gord Fishell, who penned an accompanying editorial but was not involved in the work. “We kept expecting these interneurons to be new cells but, in fact, they are immature ones hanging around and taking the long road from the bottom of the brain to the cortex.” © 1986-2016 The Scientist
Bruce Bower

Apes understand what others believe to be true. What’s more, they realize that those beliefs can be wrong, researchers say.

To make this discovery, researchers devised experiments involving a concealed, gorilla-suited person or a squirreled-away rock that had been moved from their original hiding places — something the apes knew, but a person looking for King Kong or the stone didn’t. “Apes anticipated that an individual would search for an object where he last saw it, even though the apes knew that the object was no longer there,” says evolutionary anthropologist Christopher Krupenye.

If this first-of-its-kind finding holds up, it means that chimpanzees, bonobos and orangutans can understand that others’ actions sometimes reflect mistaken assumptions about reality. Apes’ grasp of others’ false beliefs roughly equals that of human 2-year-olds tested in much the same way, say Krupenye of the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, and his colleagues.

Considering their targeted gazes during brief experiments, apes must rapidly assess others’ beliefs about the world in wild and captive communities, the researchers propose in the October 7 Science. Understanding the concept of false beliefs helps wild and captive chimps deceive their comrades, such as hiding food from those who don’t share, Krupenye suggests. © Society for Science & the Public 2000 - 2016.
In his memoir Do No Harm, Henry Marsh confesses to the uncertainties he's dealt with as a surgeon and reflects on the enigmas of the brain and consciousness. Originally broadcast May 26, 2015.

DAVE DAVIES, HOST: This is FRESH AIR. I'm Dave Davies, sitting in for Terry Gross. Our guest has opened heads and cut into brains, performing delicate and risky surgery on the part of the body that controls everything - breathing, movement, memory, and consciousness. In his work as a neurosurgeon, Dr. Henry Marsh has fixed aneurysms and spinal problems and spent many years operating on brain tumors. In his memoir, Dr. Marsh discusses some of his most challenging cases, triumphs and failures, and confesses to the fears and uncertainties he's dealt with. He explains the surgical instruments he uses and how procedures have changed since he started practicing. And he reflects on the state of his profession and the mysteries of the brain and consciousness.

Last year, he retired as the senior consultant neurosurgeon at St. George's Hospital in London, where he practiced for 28 years. He was the subject of the Emmy Award-winning 2007 documentary "The English Surgeon," which followed him in Ukraine, trying to help patients and improve conditions at a run-down hospital. Marsh's book, "Do No Harm," is now out in paperback. Terry spoke to him when it was published in hardback. © 2016 npr
Link ID: 22732 - Posted: 10.08.2016
By Andy Coghlan

More men inevitably means more testosterone-fuelled violence, right? Wrong, according to a comprehensive analysis exploring how a surplus of men or women affects crime rates across the US. In areas where men outnumber women, there were lower rates of murders and assaults as well as fewer sex-related crimes, such as rapes, sex offences and prostitution. Conversely, higher rates of these crimes occurred in areas where there were more women than men.

Ryan Schacht of the University of Utah in Salt Lake City and his colleagues analysed sex-ratio data from all 3,082 US counties, provided by the US Census Bureau in 2010. They compared this with crime data for the same year, issued by the US Federal Bureau of Investigation. They included only information about women and men of reproductive age.

For all five types of offence analysed, a rising proportion of men in a county correlated with fewer crimes, even when accounting for other potential contributing factors such as poverty. The results suggest that current policies aimed at defusing violence and crime by reducing the number of men in male-dominated areas may backfire.

According to Schacht, when women are in short supply, men must be more dutiful to win and retain a partner. With an abundance of women, men are spoilt for choice and adopt more promiscuous behaviour that brings them into conflict with other men and makes them more likely to commit sex-related offences. © Copyright Reed Business Information Ltd.
Emily Badger

One of the newest chew toys in the presidential campaign is “implicit bias,” a term Mike Pence repeatedly took exception to in the vice-presidential debate on Tuesday. Police officers hear all this badmouthing, said Mr. Pence, Donald J. Trump’s running mate, in response to a question about whether society demands too much of law enforcement. They hear politicians painting them with one broad brush, with disdain, with automatic cries of implicit bias.

He criticized Hillary Clinton for saying, in the first presidential debate, that everyone experiences implicit bias. He suggested a black police officer who shoots a black civilian could not logically experience such bias. “Senator, please,” Mr. Pence said, addressing his Democratic opponent, Tim Kaine, “enough of this seeking every opportunity to demean law enforcement broadly by making the accusation of implicit bias every time tragedy occurs.”

The concept, in his words, came across as an insult, a put-down on par with branding police as racists. Many Americans may hear it as academic code for “racist.” But that connotation does not line up with scientific research on what implicit bias is and how it really operates. Researchers in this growing field say it isn’t just white police officers, but all of us, who have biases that are subconscious, hidden even to ourselves.

Implicit bias is the mind’s way of making uncontrolled and automatic associations between two concepts very quickly. In many forms, implicit bias is a healthy human adaptation — it’s among the mental tools that help you mindlessly navigate your commute each morning. It crops up in contexts far beyond policing and race (if you make the rote assumption that fruit stands have fresher produce, that’s implicit bias). But the same process can also take the form of unconsciously associating certain identities, like African-American, with undesirable attributes, like violence. © 2016 The New York Times Company
Link ID: 22730 - Posted: 10.08.2016
By Seth Mnookin

When Henry Molaison died at a Connecticut nursing home in 2008, at the age of 82, a front-page obituary in The New York Times called him “the most important patient in the history of brain science.” It was no exaggeration: Much of what we know about how memory works is derived from experiments on Molaison, a patient with severe epilepsy who in 1953 had undergone an operation that left him without medial temporal lobes and the ability to form new memories.

The operation didn’t completely stop Molaison’s seizures — the surgeon, William Beecher Scoville, had done little more than guess at the locus of his affliction — but by chance, it rendered him a near-perfect research subject. Not only could postoperative changes in his behavior be attributed to the precise area of his brain that had been removed, but the fact that he couldn’t remember what had happened 30 seconds earlier made him endlessly patient and eternally willing to endure all manner of experiments.

It didn’t take long for those experiments to upend our understanding of the human brain. By the mid-1950s, studies on Molaison (known until his death only as Patient H.M.) had shown that, contrary to popular belief, memories were created not in the brain as a whole, but in specific regions — and that different types of memories were formed in different ways. Molaison remained a research subject until his death, and for the last 41 years of his life, the person who controlled access to him, and was involved in virtually all the research on him, was an MIT neuroscientist named Suzanne Corkin. Copyright 2016 Undark
Keyword: Learning & Memory
Link ID: 22729 - Posted: 10.05.2016
By GRETCHEN REYNOLDS

A single concussion experienced by a child or teenager may have lasting repercussions on mental health and intellectual and physical functioning throughout adulthood, and multiple head injuries increase the risks of later problems, according to one of the largest, most elaborate studies to date of the impacts of head trauma on the young.

You cannot be an athlete, parent of an athlete, sports fan or reader of this newspaper and not be aware that concussions appear to be both more common — and more dangerous — than most of us once thought. According to a report released last week by the health insurer Blue Cross Blue Shield, based on data from medical claims nationwide, the incidence of diagnosed concussions among people under the age of 20 climbed 71 percent between 2010 and 2015. The rates rose most steeply among girls, with the incidence soaring by 119 percent during that time, although almost twice as many concussions over all were diagnosed in boys.

The report acknowledges that the startling increase may partly reflect a growing awareness of the injury among parents, sports officials and physicians, which has led to more diagnoses. But the sheer numbers also suggest that more young people, particularly young athletes, are experiencing head injuries than in the past. Similar increases have been noted among young people in other nations. But the consequences, if any, for their health during adulthood have largely remained unknown.

So for the new study, which was funded primarily by the Wellcome Trust and published in August in PLOS Medicine, scientists from Oxford University, Indiana University, the Karolinska Institute in Stockholm and other universities turned to an extensive trove of data about the health of people in Sweden. © 2016 The New York Times Company
Jon Hamilton

Want to be smarter? More focused? Free of memory problems as you age? If so, don't count on brain games to help you. That's the conclusion of an exhaustive evaluation of the scientific literature on brain training games and programs. It was published Monday in the journal Psychological Science in the Public Interest.

"It's disappointing that the evidence isn't stronger," says Daniel Simons, an author of the article and a psychology professor at the University of Illinois at Urbana-Champaign. "It would be really nice if you could play some games and have it radically change your cognitive abilities," Simons says. "But the studies don't show that on objectively measured real-world outcomes."

The evaluation, done by a team of seven scientists, is a response to a very public disagreement about the effectiveness of brain games, Simons says. In October 2014, more than 70 scientists published an open letter objecting to marketing claims made by brain training companies. Pretty soon, another group, with more than 100 scientists, published a rebuttal saying brain training has a solid scientific base. "So you had two consensus statements, each signed by many, many people, that came to essentially opposite conclusions," Simons says. © 2016 npr
Keyword: Learning & Memory
Link ID: 22727 - Posted: 10.05.2016
By Michelle Roberts

Some people are genetically wired to prefer the taste of fatty foods, putting them at increased risk of obesity, according to UK researchers. The University of Cambridge team offered 54 volunteers unlimited portions of chicken korma, followed by an Eton mess-style dessert. Some of the meals were packed with fat while others were low-fat versions. Those with a gene already linked to obesity showed a preference for the high-fat food and ate more of it.

Fat genes

The gene in question is called MC4R. It is thought about one in every 1,000 people carries a defective version of this gene, which controls hunger and appetite as well as how well we burn off calories. Mutations in MC4R are the most common genetic cause of severe obesity within families that has so far been identified.

Humans probably evolved hunger genes to cope in times of famine, say experts. When food is scarce it makes sense to eat and store more fat to fend off starvation. But having a defect in the MC4R gene means hunger can become insatiable.

In the study, published in the journal Nature Communications, the researchers created a test menu that varied only in fat or sugar content. The three versions of the main meal on offer - chicken korma - were identical in appearance and, as far as possible, taste, but ranged in fat from low to medium and high. The volunteers were offered a small sample of each and then left to eat as much as they liked of the three dishes. The same was then done for a pudding of strawberries, meringue and cream, but this time varying the sugar content rather than the fat. © 2016 BBC.
Joe Palca

Most of us have been tempted at one time or another by the lure of sugar. Think of all the cakes and cookies you consume between Thanksgiving and Christmastime! But why are some people unable to resist that second cupcake or slice of pie? That's the question driving the research of Monica Dus, a molecular biologist at the University of Michigan. She wants to understand how excess sugar leads to obesity by understanding the effect of sugar on the brain.

Dus's interest in how animals control the amount they eat started with a curious incident involving her two Bichon Frise dogs. One day, Cupcake and Sprinkles got into a bag of dog treats when Dus wasn't around. The dogs overdid it. "I couldn't believe that these two tiny, 15-pound animals had huge bellies for three days and they couldn't stop themselves from eating," she recalls.

Dus was already an expert in fruit fly genetics, so she decided to study flies to see if she could unravel the puzzle of how the brain controls eating behavior. Her lab has a working hypothesis: Dus believes a diet high in sugar actually changes the brain, so it no longer does a good job of knowing how many calories the body is taking in. She thinks there are persistent molecular changes in the brain over time – changes that pave the way for excessive eating and, eventually, obesity.

Monica Dus is a researcher at the University of Michigan. She just won a $1.5 million Young Innovator grant from the National Institutes of Health to study how a high-sugar diet may lead to obesity by changing brain chemistry. © 2016 npr
Link ID: 22725 - Posted: 10.05.2016
Urine could potentially be used as a quick and simple way to test for CJD, or "human mad cow disease", say scientists in the journal JAMA Neurology. The Medical Research Council team say their prototype test still needs honing before it could be used routinely. Currently there is no easy test available for this rare but fatal brain condition. Instead, doctors have to take a sample of spinal fluid or brain tissue, or wait for a post-mortem after death. What they look for is tell-tale deposits of abnormal proteins called prions, which cause the brain damage.

Building on earlier US work, Dr Graham Jackson and colleagues, from University College London, have now found it is also possible to detect prions in urine. This might offer a way to diagnose CJD rapidly and earlier, they say, although there is no cure.

Creutzfeldt-Jakob disease (CJD): CJD is a rare but fatal degenerative brain disorder caused by abnormal proteins called prions that damage brain cells. There are several forms of the disease:
- sporadic CJD, which occurs naturally in the human population and accounts for 85% of all CJD cases
- variant CJD, linked to eating beef infected by bovine spongiform encephalopathy (BSE)
- iatrogenic infection, caused by contamination during medical or surgical treatment

In the 1990s it became clear that a brain disease could be passed from cows to humans. The British government introduced a ban on beef on the bone. Since then, officials have kept a close check on how many people have become sick or died from CJD. © 2016 BBC
Link ID: 22724 - Posted: 10.05.2016
By Emily Underwood

When you let forth a big, embarrassing yawn during a boring lecture or concert, you succumb to a reflex so universal among animals that Charles Darwin mentioned it in his field notes. “Seeing a dog & horse & man yawn, makes me feel how much all animals are built on one structure,” he wrote in 1838. Scientists, however, still don’t agree on why we yawn or where it came from.

So in a new study, researchers watched YouTube videos of 29 different yawning mammals, including mice, kittens, foxes, hedgehogs, walruses, elephants, and humans. They discovered a pattern: Small-brained animals with fewer neurons in the wrinkly outer layer of the brain, called the cortex, had shorter yawns than large-brained animals with more cortical neurons, the scientists report today in Biology Letters.

Primates tended to yawn longer than nonprimates, and humans, with about 12,000 million cortical neurons, had the longest average yawn, lasting a little more than 6 seconds. African elephants, whose brains are close to the same weight as humans’ and which have a similar number of cortical neurons, also yawned for about 6 seconds. The yawns of tiny-brained mice, in contrast, lasted less than 1.5 seconds.

The study lends support to a long-held hypothesis that yawning has an important physiological effect, such as increasing blood flow to the brain and cooling it down, the scientists say. © 2016 American Association for the Advancement of Science.
By GINA KOLATA

It is not easy to be fat in America, even though more than a third of adults are obese. Donald J. Trump brought the issue of fat shaming to the fore during and after last week’s debate, when he disparaged a former Miss Universe winner who gained weight and when he said the hacking of the Democratic National Committee’s emails might have been done by “somebody sitting on their bed that weighs 400 pounds.”

But there also is a body of evidence showing that the effects of fat shaming and stigmatizing go far beyond such remarks, beyond the stares fat people get on the street, the cutting comments strangers make about their weight and the “funny” greeting cards featuring overweight people. It turns out that fat prejudice differs from other forms in ways that make it especially difficult to overcome.

The problems with fat shaming start early. Rebecca Puhl, the deputy director of the University of Connecticut’s Rudd Center for Food Policy and Obesity, and her colleagues find that weight is the most common reason children are bullied in school. In one study, nearly 85 percent of adolescents reported seeing overweight classmates teased in gym class. Dr. Puhl and her colleagues asked fat kids who was doing the bullying. It turned out that it was not just friends and classmates but also teachers and — for more than a third of the bullied — parents. “If these kids are not safe at school or at home, where are they going to be supported?” Dr. Puhl asked. The bullying problem is not limited to the United States. Dr. Puhl and her colleagues found the same situation in Canada, Australia and Iceland.

Women face harsher judgment than men, Dr. Puhl reports. The cutting remarks can begin when a woman’s body mass index is in the overweight range, while for men the shaming tends to start when they are obese. And women who are obese report more than three times as much shaming and discrimination as men of equal obesity. © 2016 The New York Times Company
By Rebecca Robbins

In the months before his death, Robin Williams was besieged by paranoia and so confused he couldn’t remember his lines while filming a movie, as his brain was ambushed by what doctors later identified as an unusually severe case of Lewy body dementia.

“Robin was losing his mind and he was aware of it. Can you imagine the pain he felt as he experienced himself disintegrating?” the actor’s widow, Susan Schneider Williams, wrote in a wrenching editorial published this week in the journal Neurology. The title of her piece: “The terrorist inside my husband’s brain.”

Susan Williams addressed the editorial to neurologists, writing that she hoped her husband’s story would “help you understand your patients along with their spouses and caregivers a little more.” Susan Williams has previously blamed Lewy body dementia for her husband’s death by suicide in 2014. About 1.3 million Americans have the disease, which is caused by protein deposits in the brain. Williams was diagnosed with Parkinson’s disease a few months before he died; the telltale signs of Lewy body dementia in his brain were not discovered until an autopsy.

The editorial chronicles Williams’s desperation as he sought to understand a bewildering array of symptoms that started with insomnia, constipation, and an impaired sense of smell and soon spiraled into extreme anxiety, tremors, and difficulty reasoning. © 2016 Scientific American
Link ID: 22721 - Posted: 10.02.2016
By Carl Luepker

For the past 35 years, a relentless neurological disorder has taken over my body, causing often painful muscle spasms that make it hard for me to walk and write and that cause my speech to be garbled enough that people often can’t understand me.

I can live with my bad luck in getting this condition, which showed up when I was 10; what’s harder to accept is that I have passed on this disorder, carried in my genes, to my 11-year-old son, Liam. As a parent, you hope that your child’s life will follow an upward trend, one of emotional and physical growth toward an adulthood of wide-open possibilities where they can explore the world, challenge themselves emotionally and physically, and perhaps play on a sports team. And you hope that you can pass down to your child at least some of what was passed down to you. Yet my generalized dystonia, as my progressive condition is called, was one thing I had hoped would end with me.

[Photo caption: Liam poses for a photograph just months before his diagnosis with dystonia. He “has just moved into middle school,” his father writes, where “he will have to both advocate for himself and educate his new teachers and peers about this genetic disorder.”]

When my wife and I started thinking of having kids, the statistics were fairly reassuring: There was a 1-in-2 chance that our child would inherit the gene that causes the disorder, but most people who have the gene don’t go on to manifest dystonia. We wanted a family and rolled the dice — twice. Our daughter does not have the gene. © 1996-2016 The Washington Post
Ben Allen

Louis Casanova is playing cards with a friend on the back deck of a recovery house in Philadelphia's northern suburbs. He's warm and open as he talks about his past few years. The guy everyone calls Louie started using drugs like Xanax and Valium during his freshman year of high school. At age 18, Casanova turned to heroin. About two years later, the rehab shuffle began.

"I relapsed and then I was just getting high. And then I went to treatment again in February of 2015," he says. "Then I relapsed again and went back to treatment." He's 23 now. He's hurt people close to him, and his criminal record, fueled by his drug addiction, is long. By Louie's count, he has been through eight inpatient rehabs. Louis says his stays have ranged from about 18 to 45 days. "I did 30 days, and after that I came here," he concludes, talking about his latest visit.

A month's stay can be pretty typical among people who go to an inpatient facility. But why? "As far as I know, there's nothing magical about 28 days," says Kimberly Johnson, director of the Center for Substance Abuse Treatment at SAMHSA, the federal agency that studies treatment services.

Anne Fletcher, author of the book Inside Rehab, agrees. "It certainly is not scientifically based," she says. "I live in Minnesota where the model was developed and a lot of treatment across the country really stemmed from that." She says the late Daniel Anderson was one of the primary architects of the "Minnesota model," which became the prevailing treatment protocol for addiction specialists. At a state hospital in Minnesota in the 1950s, Anderson saw alcoholics living in locked wards, leaving only to be put to work on a farm. © 2016 npr
Keyword: Drug Abuse
Link ID: 22719 - Posted: 10.02.2016