Chapter 13. Memory, Learning, and Development
By Maggie Fox
Researchers looking for simple ways to treat autism say they may have explained why at least some cases occur: It all has to do with the stress babies undergo at birth. They’re already testing a simple drug for treating kids with autism and say their findings may point to ways to treat the disorder earlier in life. It’s all experimental, but the study, published in the journal Science, should inspire other researchers to take a closer look. “This is exciting stuff to people in the field, because it’s getting at a basic mechanism,” says Andrew Zimmerman of the University of Massachusetts Medical School, who reviewed the study. Yehezkel Ben-Ari of the Mediterranean Institute of Neurobiology in Marseille, France, and colleagues have been treating children with autism with a diuretic called bumetanide that reduces levels of chloride in cells. Diuretics lower blood pressure by making people urinate more, reducing fluid. Ben-Ari has had mixed success in his trials in kids, and wanted to prove his theory that chloride was the key. He worked with two rodent “models” of autism — the closest things scientists have to replicating human autism in animals. One model has mutated genes linked with autism; the other develops autism-like behavior when given valproate, an epilepsy drug blamed for causing autism in the children of mothers who take it while pregnant. The researchers looked at what was going on in the brains of the mouse and rat pups just before and after birth. Then they gave the mouse and rat mothers bumetanide — and fewer of their newborns showed autistic-like behaviors.
Ewen Callaway
A study in mice and rats suggests that an imbalance in chloride ions during a child's development in the womb could be a factor in autism. Children with autism typically begin showing obvious symptoms, such as trouble making eye contact and slow language development, a year or more after birth. A study in mice and rats now hints that prenatal drug treatment could head off these problems. The findings, reported today in Science, do not suggest that autism spectrum disorders can be prevented in children. But researchers not involved in the study say that they add support to a controversial clinical trial suggesting that some children with autism benefited from taking a common diuretic medication called bumetanide. In that trial, a team led by neuroscientist Yehezkel Ben-Ari at the Mediterranean Institute of Neurobiology in Marseille gave 60 children bumetanide or a placebo daily for three months. Children who had less severe forms of autism showed mild improvements in social behaviour after taking the drug, and almost no adverse side effects were observed (see 'Diuretic drug improves symptoms of autism'). But autism researchers greeted the results with caution. Many pointed out that the study did not provide a clear biological mechanism that could explain how the drug improved the symptoms of the disorder. The latest study is an attempt to answer such criticisms by identifying a role for the neurotransmitter GABA. Studies in humans and animals have suggested that GABA, which in healthy people typically inhibits activity in neurons, is altered in autism and instead activates some brain cells. © 2014 Nature Publishing Group
Link ID: 19225 - Posted: 02.08.2014
Memory can be altered by new experience, and isn't nearly as accurate as courtroom testimony might have us believe, a new study suggests. The results suggest a cheeky answer to the question posed by comedian Richard Pryor: "Who you gonna believe: me, or your lyin' eyes?" Turns out, Pryor was onto something. The brain behind our eyes can distort reality or verify it, based on subsequent experience. And somewhat paradoxically, the same area of the brain appears to be strongly involved in both activities, according to a study published online Tuesday in the Journal of Neuroscience. Northwestern University cognitive neuroscientist Donna Bridge was testing how memory is either consolidated or altered, by giving 17 subjects a deceptively simple task. They studied the location of dozens of objects briefly flashed at varied locations on a standard computer screen, then were asked to recall the object's original location on a new screen with a different background. When subjects were told to use a mouse to drag the re-presented object from the center of the new screen to the place where they recalled it had been located, 16 of 17 got it wrong, by an average of about 3 inches. When the same subjects then were given three choices - the original location, the wrong guess and a neutral spot between them - they almost unfailingly dragged the object to the incorrectly recalled location, regardless of the background screen. Their new memory was false. © 2014 Hearst Communications, Inc.
Keyword: Learning & Memory
Link ID: 19224 - Posted: 02.08.2014
by Isaac Saul
Multi-step puzzles can be difficult for humans, but what if I told you there was a bird that could solve them on its own? In this BBC special, Dr. Alex Taylor has set up an eight-step puzzle to try to stump one of the smartest crows he's seen in captivity. They describe the puzzle as "one of the most complex tests of the animal mind ever." This isn't the first time crows' intelligence has been tested, either. Along with being problem solvers, these animals have an eerie tendency towards complex human-like memory skills. Through several different studies, we've learned that crows can recognize faces, communicate details of an event to each other and even avoid places they recognize as dangerous. This bird, dubbed "007" for its crafty mind, flies into the caged puzzle and spends only seconds analyzing it before getting down to business. Despite the puzzle's difficulty, the bird seems stumped only momentarily. At the end of the puzzle is a food reward, but how he gets there is what will really blow your mind. © 2014 TheHuffingtonPost.com, Inc.
By NICHOLAS BAKALAR
There are many well established risk factors for cardiovascular death, but researchers may have found one more: slower reaction time. In the late 1980s and early ’90s, researchers measured the reaction times of 5,134 adults ages 20 to 59, having them press a button as quickly as possible after a light flashed on a computer screen. The researchers then followed the participants to see how many would still be alive after 15 years. The study is in the January issue of PLOS ONE. Unsurprisingly, men, smokers, heavy drinkers and the physically inactive were more likely to die. But after controlling for these and other factors, the researchers found that those with slower reaction times were 25 percent more likely to die of any cause, and 36 percent more likely to die of cardiovascular disease, than those with faster reactions. Reaction time made no difference in cancer mortality. The reasons for the connection are unclear, but the lead author, Gareth Hagger-Johnson, a senior research associate at University College London, said it may reflect problems with the brain or nervous system. He stressed, though, that “a single test of reaction time is not going to tell you when you’re going to die. There’s a link at a population level. We didn’t look at individual people.” © 2014 The New York Times Company
Keyword: Development of the Brain
Link ID: 19210 - Posted: 02.06.2014
By Geoffrey Giller
Working memory—our ability to store pieces of information temporarily—is crucial both for everyday activities like dialing a phone number and for more taxing tasks like arithmetic and accurate note-taking. The strength of working memory is often measured with cognitive tests, such as repeating lists of numbers in reverse order or recalling sequences of dots on a screen. For children, performance on working memory assessments is considered a strong predictor of future academic performance. Yet cognitive tests can fail to identify children whose brain development is lagging in subtle ways that may lead to future deficits in working memory and, thus, in learning. Doctors give the tests periodically and plot the results along a development curve, much like a child’s height and weight. By the time these tests reveal that a child’s working memory is below average, however, it may be too late to do much about it. But in a new study, published January 29 in The Journal of Neuroscience, scientists demonstrated that they could predict the future working memory of children and adolescents by examining brain scans from two different types of magnetic resonance imaging (MRI), instead of looking only at cognitive tests. Henrik Ullman, a PhD student at the Karolinska Institute in Stockholm and the lead author on the paper, says that this was the first study attempting to use MRI scans to predict future working memory capacity. “We were pretty surprised when we found what we actually found,” Ullman says. © 2014 Scientific American
by Andy Coghlan
If you flinch where others merely frown, you might want to take a look at your lifestyle. That's because environmental factors may have retuned your genes to make you more sensitive to pain. "We know that stressful life events such as diet, smoking, drinking and exposure to pollution all have effects on your genes, but we didn't know if they specifically affected pain genes," says Tim Spector of King's College London. Now, a study of identical twins suggests they do. It seems that epigenetic changes – environmentally triggered chemical alterations that affect how active your genes are – can dial your pain threshold up or down. This implies that genetic tweaks of this kind, such as the addition of one or more methyl groups to a gene, may account for some differences in how our senses operate. Spector and his colleagues assessed the ability of hundreds of pairs of twins to withstand the heat of a laser on their skin, a standard pain test. They selected 25 pairs who showed the greatest difference in the highest temperature they could bear. Since identical twins have the same genes, any variation in pain sensitivity can be attributed to epigenetic differences.
Pain thermostat
The researchers screened the twins' DNA for differences in methylation levels across 10 million gene regions. They found a significant difference in nine genes, most of which had previously been implicated in pain sensitivity in animal experiments. © Copyright Reed Business Information Ltd.
One thing marijuana isn’t known to do is improve your memory. But there’s another reason why scientists believe it could fight Alzheimer’s disease. Gary Wenk, PhD, professor of neuroscience, immunology and medical genetics at Ohio State University, has studied how to combat brain inflammation for over 25 years. His research has led him to a class of compounds known as cannabinoids, which includes many of the common ingredients in marijuana. He says that throughout all of his research, cannabinoids have been the only class of drugs he’s found to work. What’s more, he believes early intervention may be the best way of fighting Alzheimer’s. Dr. Wenk doesn’t see cannabinoids – or anything else – as a cure. But he took the time to discuss with us how marijuana might prevent the disorder from developing.
Q: What’s so important about brain inflammation?
Over the past few years, there’s been a focus on inflammation in the brain as causing a lot more than Alzheimer’s. We now know it plays a role in ALS, Parkinson’s disease, AIDS dementia, multiple sclerosis, autism, schizophrenia, etc. We’re beginning to see that inflammation in the brain, if it lasts too long, can be quite detrimental. And if you do anything, such as smoke a bunch of marijuana in your 20s and 30s, you may wipe out all of the inflammation in your brain and then things start over again. And you simply die of old age before inflammation becomes an issue for you. © 2013-2014 All rights reserved
By Ariana Eunjung Cha
The National Institutes of Health is undertaking an ambitious collaboration with private industry in an attempt to speed up the search for treatments for some of the world’s most devastating diseases — Alzheimer’s, type 2 diabetes, rheumatoid arthritis and lupus. The pilot projects announced Tuesday will involve the sharing not only of scientists but also of data, blood samples and tissue specimens among 10 rival companies, the federal government and several nonprofit groups and research foundations. The companies that have signed up to participate include most of the large drug makers, which in the past had resisted calls to share detailed data and samples from experiments, preferring instead to use the information to gain lucrative patents. The agreement with NIH represents a major break from how they used to do business. The competing pharmaceutical companies have said they will hold off launching commercial ventures based on discoveries from the partnership until after the data have been made publicly available. The idea behind the collaboration is similar to that of the “open source” movement among some computer scientists who believe that sharing their code with anyone who wants it is the best way to innovate. The first group of projects, which will last three to five years, will involve an investment of more than $230 million from industry participants including Bristol-Myers Squibb, GlaxoSmithKline, Johnson & Johnson, Eli Lilly, Merck, Pfizer, Sanofi and Takeda, as well as a few smaller biotech companies. © 1996-2014 The Washington Post
Link ID: 19206 - Posted: 02.05.2014
Karen Weintraub
Every time you pull up a memory – say of your first kiss – your mind reinterprets it for the present day, new research suggests. If you're in the middle of an ugly divorce, for example, you might recall it differently than if you're happily married and life is going well. This makes your memory quite unlike the video camera you may imagine it to be. But new research in the Journal of Neuroscience suggests it's very effective for helping us adapt to our environments, said co-author Joel Voss, a researcher at Northwestern University's Feinberg School of Medicine. Voss' findings build on others and may also explain why we can be thoroughly convinced that something happened when it didn't, and why eyewitness testimony is notoriously unreliable. The new research also suggests that memory problems like those seen in Alzheimer's could involve a "freezing" of these memories — an inability to adapt the memory to the present, Voss said. Our memories are thus less a snapshot of the past than "a record of our current view on the past," said Donna Rose Addis, a researcher and associate professor at the University of Auckland in New Zealand, who was not involved in the research. Using brain scans of 17 healthy volunteers as they were taught new data and recalled previously learned information, Voss and his colleagues were able to show for the first time precisely when and where new information gets implanted into existing memories.
Keyword: Learning & Memory
Link ID: 19205 - Posted: 02.05.2014
by Laura Sanders
Despite seeming like a bystander, your baby is attuned to your social life (assuming you have one, which, with a baby, would be amazing). Every time you interact with someone, your wee babe is watching, eagerly slurping up social conventions. Scientists already know that babies expect some social graces: They expect people in a conversation to look at each other and talk to other people, not objects, and are eager to see good guys rewarded and bad guys punished. Now, a new study shows that babies are also attuned to other people’s relationships, even when those relationships have nothing to do with them. Babies are pretty good at figuring out who they want to interact with. The answer in most cases: Nice people. And that makes sense. The helpless wailers need someone reliable around to feed, change and entertain them. So to find out how good babies are at reading other people’s social relationships, University of Chicago psychologists showed 64 9-month-old babies a video of two women eating. Sometimes the women ate from the same bowl and agreed that the food was delicious, or agreed that it was gross. Sometimes the women disagreed. Later, the women interacted again, either warmly greeting each other and smiling, or giving each other the cold shoulder, arms crossed with a “hmph.” Researchers then timed how long the babies spent looking at this last scene, with the idea that the longer the baby spent looking, the more surprising the scene was. © Society for Science & the Public 2000 - 2014.
By Jennifer Ouellette
It was a brisk October day in a Greenwich Village café when New York University neuroscientist David Poeppel crushed my dream of writing the definitive book on the science of the self. I had naively thought I could take a light-hearted romp through genotyping, brain scans, and a few personality tests and explain how a fully conscious unique individual emerges from the genetic primordial ooze. Instead, I found myself scrambling to navigate bumpy empirical ground that was constantly shifting beneath my feet. How could a humble science writer possibly make sense of something so elusively complex when the world’s most brilliant thinkers are still grappling with this marvelous integration that makes us us? “You can’t. Why should you?” Poeppel asked bluntly when I poured out my woes. “We work for years and years on seemingly simple problems, so why should a very complicated problem yield an intuition? It’s not going to happen that way. You’re not going to find the answer.” Well, he was right. Darn it. But while I might not have found the Ultimate Answer to the source of the self, it proved to be an exciting journey and I learned some fascinating things along the way.
1. Genes are deterministic but they are not destiny. Except for earwax consistency. My earwax is my destiny.
We tend to think of our genome as following a “one gene for one trait” model, but the real story is far more complicated. True, there is one gene that codes for a protein that determines whether you will have wet or dry earwax, but most genes serve many more than one function and do not act alone. Height is a simple trait that is almost entirely hereditary, but there is no single gene helpfully labeled height. Rather, there are several genes interacting with one another that determine how tall we will be. Ditto for eye color.
It’s even more complicated for personality traits, health risk factors, and behaviors, where traits are influenced, to varying degrees, by parenting, peer pressure, cultural influences, unique life experiences, and even the hormones churning around us as we develop in the womb.
Madhusree Mukerjee
By displaying images on an iPad, researchers tested patients' ability to detect contrast after their vision was restored by cataract surgery.
In a study of congenitally blind children who underwent surgery to restore vision, researchers have found that the brain can still learn to use the newly acquired sense much later in life than previously thought. Healthy infants start learning to discern objects, typically by their form and colour, from the moment they open their eyes. By the time a baby is a year old vision development is more or less complete, although refinements continue through childhood. But as the brain grows older, it becomes less adaptable, neuroscientists generally believe. "The dogma is that after a certain age the brain is unable to process visual inputs it has never received before," explains cognitive scientist Amy Kalia of the Massachusetts Institute of Technology (MIT) in Cambridge. Consequently, eye surgeons in India often refuse to treat children blinded by cataracts since infancy if they are over the age of seven. Such children are not usually found in wealthier countries such as the United States — where cataracts are treated as early as possible — but are tragically plentiful in India. In the study, which was published last week in Proceedings of the National Academy of Sciences, Kalia and her collaborators followed 11 children enrolled in Project Prakash, a humanitarian and scientific effort in India that provides corrective surgery to children with treatable cataracts and subsequently studies their visual abilities. ('Prakash' is Sanskrit for light.) © 2014 Nature Publishing Group
By GINA KOLATA
For many obese adults, the die was cast by the time they were 5 years old. A major new study of more than 7,000 children has found that a third of children who were overweight in kindergarten were obese by eighth grade. And almost every child who was very obese remained that way. Some obese or overweight kindergartners lost their excess weight, and some children of normal weight got fat over the years. But every year, the chances that a child would slide into or out of being overweight or obese diminished. By age 11, there were few additional changes: Those who were obese or overweight stayed that way, and those whose weight was normal did not become fat. “The main message is that obesity is established very early in life, and that it basically tracks through adolescence to adulthood,” said Ruth Loos, a professor of preventive medicine at the Icahn School of Medicine at Mount Sinai in New York, who was not involved in the study. These results, surprising to many experts, arose from a rare study that tracked children’s body weight for years, from kindergarten through eighth grade. Experts say they may reshape approaches to combating the nation’s obesity epidemic, suggesting that efforts must start much earlier and focus more on the children at greatest risk. The findings, to be published Thursday in The New England Journal of Medicine, do not explain why the effect occurs. Researchers say it may be a combination of genetic predispositions to being heavy and environments that encourage overeating in those prone to it. But the results do provide a possible explanation for why efforts to help children lose weight have often had disappointing results. The steps may have aimed too broadly at all schoolchildren, rather than starting before children enrolled in kindergarten and concentrating on those who were already fat at very young ages. © 2014 The New York Times Company
by Susan Milius
Male bee flies fooled into trying to copulate with a daisy may learn from the awkward incident. Certain orchids and several forms of South Africa’s Gorteria diffusa daisy lure pollinators by mimicking female insects. The most effective daisy seducers grow a dark, somewhat fly-shaped bump on one of their otherwise yellow-to-orange petals. Males of small, dark Megapalpus capensis bee flies go wild. But tests show the daisy’s victims waste less time trying to mate with a second deceptive daisy than with the first. “Far from being slow and stupid, these males are actually quite keen observers and fairly perceptive for a fly,” says Marinus L. de Jager of Stellenbosch University in South Africa. Males’ success at locating a female bee fly drops in the presence of deceitful daisies, de Jager and Stellenbosch University colleague Allan Ellis report January 29 in the Proceedings of the Royal Society B. That’s the first clear demonstration of sexual deceit’s cost to a pollinator, Ellis says. Such evolutionary costs might push the bee fly to learn from mating mistakes. How long bee flies stay daisy-wary remains unknown. In other studies, wasps tricked by an Australian orchid forgot their lesson after about 24 hours. © Society for Science & the Public 2000 - 2014
Alison Abbott
By slicing up and reconstructing the brain of Henry Gustav Molaison, researchers have confirmed predictions about a patient who has already contributed more than most to neuroscience. No big scientific surprises emerge from the anatomical analysis, which was carried out by Jacopo Annese of the Brain Observatory at the University of California, San Diego, and his colleagues, and published today in Nature Communications. But it has confirmed scientists’ deductions about the parts of the brain involved in learning and memory. “The confirmation is surely important,” says Richard Morris, who studies learning and memory at the University of Edinburgh, UK. “The patient is a classic case, and so the paper will be extensively cited.” Molaison, known in the scientific literature as patient H.M., lost his ability to store new memories in 1953 after surgeon William Scoville removed part of his brain — including a large swathe of the hippocampus — to treat his epilepsy. That provided the first conclusive evidence that the hippocampus is fundamental for memory. H.M. was studied extensively by cognitive neuroscientists during his life. After H.M. died in 2008, Annese set out to discover exactly what Scoville had excised. The surgeon had made sketches during the operation, and brain-imaging studies in the 1990s confirmed that the lesion corresponded to the sketches, although it was slightly smaller. But whereas brain imaging is relatively low-resolution, Annese and his colleagues were able to carry out an analysis at the micrometre scale. © 2014 Nature Publishing Group
Henry Molaison, the famous amnesic patient better known as “H.M.,” was unable to form new long-term memories following brain surgery to treat his epilepsy. Scientists who studied his condition made groundbreaking discoveries that revealed how memory works, and before his 2008 death, H.M. and his guardian agreed that his brain would be donated to science. One year after his death, H.M.’s brain was sliced into 2,401 sections, each 70 microns thick, for further study. MIT neuroscience professor emerita Suzanne Corkin studied H.M. during his life and is now part of a team that is analyzing his brain. She is an author of a paper appearing in Nature Communications today reporting preliminary results of the postmortem study. The research team was led by Jacopo Annese at the University of California at San Diego (UCSD).
Q: What can we learn from studying H.M.’s brain after his death? And when did you begin laying the groundwork for these postmortem studies?
A: It was important to get H.M.’s brain after he died, for three reasons: first, to document the exact locus and extent of his lesions, in order to identify the neural substrate for declarative memory. Second, to evaluate the status of the intact brain tissue, revealing the possible brain substrates for the many cognitive functions that H.M. performed normally, including nondeclarative learning without awareness. The third reason was to identify any new abnormalities that occurred as a result of his getting old and were unrelated to the operation. In 1992, I explained to H.M. and his conservator that it would be extremely valuable to have his brain after he died. I told them how important he was to the science of memory, and that he had already made amazing contributions. It would make those even more significant to actually have his brain and see exactly where the damage was. That year, they signed a brain donation form leaving his brain to Massachusetts General Hospital [MGH] and MIT.
Keyword: Learning & Memory
Link ID: 19182 - Posted: 01.29.2014
By James Gallagher, health and science reporter, BBC News
Exposure to a once widely used pesticide, DDT, may increase the chances of developing Alzheimer's disease, suggest US researchers. A study, published in JAMA Neurology, showed that patients with Alzheimer's had four times the levels of DDT lingering in the body as healthy people. Some countries still use the pesticide to control malaria. Alzheimer's Research UK said more evidence was needed to prove DDT had a role in dementia. DDT was a massively successful pesticide, initially used to control malaria at the end of World War Two and then to protect crops in commercial agriculture. However, there were questions about its impact on human health and wider environmental concerns, particularly for predators. It was banned in the US in 1972 and in many other countries. But the World Health Organization still recommends using DDT to keep malaria in check.
Not clear
DDT also lingers in the human body, where it is broken down into DDE. The team at Rutgers University and Emory University tested levels of DDE in the blood of 86 people with Alzheimer's disease and compared the results with 79 healthy people of a similar age and background. The results showed those with Alzheimer's had 3.8 times the level of DDE. However, the picture is not clear-cut. Some healthy people had high levels of DDE while some with Alzheimer's had low levels. Alzheimer's also predates the use of DDT. The researchers believe the chemical increases the chance of Alzheimer's and may be involved in the development of amyloid plaques in the brain, a hallmark of the disease, which contribute to the death of brain cells. BBC © 2014
By BENEDICT CAREY
People of a certain age (and we know who we are) don’t spend much leisure time reviewing the research into cognitive performance and aging. The story is grim, for one thing: Memory’s speed and accuracy begin to slip around age 25 and keep on slipping. The story is familiar, too, for anyone who is over 50 and, having finally learned to live fully in the moment, discovers it’s a senior moment. The finding that the brain slows with age is one of the strongest in all of psychology. Over the years, some scientists have questioned this dotage curve. But these challenges have had an ornery-old-person slant: that the tests were biased toward the young, for example. Or that older people have learned not to care about clearly trivial things, like memory tests. Or that an older mind must organize information differently from one attached to some 22-year-old who records his every Ultimate Frisbee move on Instagram. Now comes a new kind of challenge to the evidence of a cognitive decline, from a decidedly digital quarter: data mining, based on theories of information processing. In a paper published in Topics in Cognitive Science, a team of linguistic researchers from the University of Tübingen in Germany used advanced learning models to search enormous databases of words and phrases. Since educated older people generally know more words than younger people, simply by virtue of having been around longer, the experiment simulates what an older brain has to do to retrieve a word. And when the researchers incorporated that difference into the models, the aging “deficits” largely disappeared. “What shocked me, to be honest, is that for the first half of the time we were doing this project, I totally bought into the idea of age-related cognitive decline in healthy adults,” the lead author, Michael Ramscar, said by email.
But the simulations, he added, “fit so well to human data that it slowly forced me to entertain this idea that I didn’t need to invoke decline at all.” © 2014 The New York Times Company
by Helen Thomson
When the criteria for diagnosing autism were changed last year, concerns were raised that people already diagnosed might be re-evaluated and end up losing access to treatments and services. The American Psychiatric Association (APA), which publishes the diagnostic guidelines, recommends that children who are receiving appropriate treatment as a result of the old criteria should not be required by insurance companies to undergo a re-examination under the new criteria. But a small survey revealed to New Scientist suggests that not everyone is following the party line. In May, the APA published the DSM-5, the latest edition of what has come to be known as psychiatry's diagnostic bible. One controversial change was to the criteria used to diagnose different kinds of autism, which are now combined under the umbrella term "Autism Spectrum Disorder" (ASD). Under the previous criteria of DSM-IV, a person would be diagnosed with ASD by exhibiting at least six of 12 behaviours, which include problems with communication, interaction and repetition. Now, that same person would need to exhibit three deficits in social communication and interaction and at least two repetitive behaviours – the latter, say critics, makes the new criteria more restrictive. To see how the change in criteria was affecting people, Autism Speaks, a US science and advocacy organisation, asked users of its website to complete an online survey about their experiences. "We wanted to ensure that people are still maintaining access to the services they need," says Michael Rosanoff, Autism Speaks' associate director for public health research and scientific review. © Copyright Reed Business Information Ltd.
Link ID: 19174 - Posted: 01.27.2014