Chapter 13. Memory, Learning, and Development
Sam Doernberg and Joe DiPietro

It’s the first day of class, and we—a couple of instructors from Cornell—sit around a table with a few of our students as the rest trickle in. Anderson, one of the students seated across from us, smiles and says, “I’m going to get an A+ in your class.” “No,” VanAntwerp retorts, “I’m getting the A+.”

You might think that this scene is typical of classes at a school like Cornell University, where driven students compete for top marks. But this didn’t happen on a college campus: It took place in a maximum-security prison. To the outside world, they are inmates, but in the classroom, they are students enrolled in the Cornell Prison Education Program, or “CPEP.” Per New York State Department of Corrections rules, we have permission to use the inmates’ last names only—which is also often how we know them best. Those who graduate from the program—taught by Cornell instructors—will receive an associate’s degree from Cayuga Community College.

Before teaching neuroscience to prison inmates, we taught it to Cornell undergraduates as part of the teaching staff for Cornell’s Introduction to Neuroscience course. Most Cornell neuroscience students are high-achieving biology majors and premeds who are well prepared to succeed in a demanding course. They generally have gone from one academic success to another, and it is no secret that they expect a similar level of success in a neuroscience class. © 2016 by The Atlantic Monthly Group
Keyword: Learning & Memory
Link ID: 22093 - Posted: 04.12.2016
By Jordana Cepelewicz

The brain relies on a system of chemical messengers, known as neurotransmitters, to carry missives from cell to cell. When all is well, these communications enable the brain to coordinate various functions, from complex thought to quick, knee-jerk reactions—but when the system is out of whack, serious disease or disorder can ensue. A team of researchers at the Technical University of Denmark (D.T.U.) and the University of Oxford has for the first time identified the molecular structure of dopamine beta-hydroxylase (DBH), the enzyme that controls the conversion of dopamine into norepinephrine, two major neurotransmitters. Understanding the crystal structure of the enzyme could provide an ideal target for drug development.

Dopamine and norepinephrine play key roles in many brain functions such as learning, memory, movement and the fight-or-flight response. Imbalances in the levels of these neurotransmitters—and the role DBH plays in regulating them—have been implicated in a wide range of disorders, including hypertension, congestive heart failure, anxiety, depression, post-traumatic stress disorder, Alzheimer’s, schizophrenia, Parkinson’s and even cocaine addiction.

DBH has long intrigued biochemists, but it has been challenging to perform the analyses needed to determine the protein’s structure. “This enzyme has been particularly difficult,” says Hans Christensen, a chemist at D.T.U. and the study’s lead researcher. “We tried many different expression systems before we finally succeeded. Now that we have the structure it is clear why—[it] is very intricate, with different parts of the enzyme interacting very tightly.” © 2016 Scientific American
By Melinda Wenner Moyer

What if you could pop a pill that made you smarter? It sounds like a Hollywood movie plot, but a new systematic review suggests that the decades-long search for a safe and effective “smart drug” might have notched its first success. Researchers have found that modafinil boosts higher-order cognitive function without causing serious side effects.

Modafinil, which has been prescribed in the U.S. since 1998 to treat sleep-related conditions such as narcolepsy and sleep apnea, heightens alertness much as caffeine does. A number of studies have suggested that it could provide other cognitive benefits, but results were uneven. To clear up the confusion, researchers then at the University of Oxford analyzed 24 studies published between 1990 and 2014 that specifically looked at how modafinil affects cognition. In their review, which was published last year in European Neuropsychopharmacology, they found that the methods used to evaluate modafinil strongly affected the outcomes. Research that looked at the drug's effects on the performance of simple tasks—such as pressing a particular button after seeing a certain color—did not detect many benefits. Yet studies that asked participants to do complex and difficult tasks after taking modafinil or a placebo found that those who took the drug were more accurate, which suggests that it may affect “higher cognitive functions—mainly executive functions but also attention and learning,” explains study co-author Ruairidh Battleday, now a medical doctor and Ph.D. student at the University of California, Berkeley.

But don't run to the pharmacy just yet. Although many doctors very likely prescribe the drug off-label to help people concentrate—indeed, a 2008 survey by the journal Nature found that one in five of its readers had taken brain-boosting drugs, and half of those people had used modafinil—trials have not yet been done on modafinil's long-term effectiveness or safety. © 2016 Scientific American
Laura Sanders

NEW YORK — Cells in a brain structure known as the hippocampus are known to be cartographers, drawing mental maps of physical space. But new studies show that this seahorse-shaped hook of neural tissue can also keep track of social space, auditory space and even time, deftly mapping these various types of information into their proper places. Neuroscientist Rita Tavares described details of one of these new maps April 2 at the annual meeting of the Cognitive Neuroscience Society.

Brain scans had previously revealed that activity in the hippocampus was linked to movement through social space. In an experiment reported last year in Neuron, people went on a virtual quest to find a house and a job by interacting with a cast of characters. Through these social interactions, the participants formed opinions about how much power each character held, and how kindly they felt toward him or her. These judgments put each character in a position on a “social space” map. Activity in the hippocampus was related to this social mapmaking, Tavares and colleagues found.

It turns out that this social map depends on the traits of the person who is drawing it, says Tavares, of the Icahn School of Medicine at Mount Sinai in New York City. People with more social anxiety tended to give more power to characters they interacted with. What’s more, these people's social space maps were smaller overall, suggesting that they explored social space less, Tavares says. Tying these behavioral traits to the hippocampus may lead to a greater understanding of social behavior — and how this social mapping may go awry in psychiatric conditions, Tavares said. © Society for Science & the Public 2000 - 2016.
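The study describes each character as occupying a position on two judged axes, power and affiliation, and a participant's "map size" as how widely those positions spread. A minimal sketch of that idea follows; the character names, coordinate values, and the particular spread metric are all invented for illustration and are not the authors' actual measure.

```python
import numpy as np

# Hypothetical judgments from the virtual quest: each character gets a
# position on the two axes the study describes, power and affiliation.
# All values here are invented for illustration.
characters = {
    "landlord": (0.9, -0.2),
    "boss": (0.7, 0.1),
    "friend": (0.1, 0.8),
    "clerk": (-0.3, 0.3),
}

positions = np.array(list(characters.values()))  # shape: (n_characters, 2)

# One simple way to quantify how much "social space" a participant
# explored: the average distance of characters from the map's center.
map_size = float(
    np.mean(np.linalg.norm(positions - positions.mean(axis=0), axis=1))
)
```

Under this toy metric, a participant who assigns everyone similar power and affiliation (as the socially anxious participants' compressed maps suggest) would get a smaller `map_size` than one whose judgments spread characters widely.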
Keyword: Learning & Memory
Link ID: 22076 - Posted: 04.06.2016
by Daniel Galef

Footage from a revolutionary behavioural experiment showed non-primates making and using tools just like humans. In the video, a crow is trying to get food out of a narrow vessel, but its beak is too short for it to reach through the container. Nearby, the researchers placed a straight wire, which the crow bent against a nearby surface into a hook. Then, holding the hook in its beak, it fished the food from the bottle.

Corvids—the family of birds that includes crows, ravens, rooks, jackdaws, and jays—are pretty smart overall. Although not to the level of parrots and cockatoos, ravens can also mimic human speech. They also have a highly developed system of communication and are believed to be among the most intelligent non-primate animals in existence.

McGill Professor Andrew Reisner recalls meeting a graduate student studying corvid intelligence at Oxford University when these results were first published. “I had read early in the year that some crows had been observed making tools, and I mentioned this to him,” Reisner explained. “He said that he knew about that, as it had been he who had first observed it happening. Evidently the graduate students took turns watching the ‘bird box,’ […] and the tool making first occurred there on his shift.”
By Roni Caryn Rabin

Alzheimer’s disease is a progressive brain disorder that causes dementia, destroying memory, cognitive skills, and the ability to care for oneself, speak and walk, said Ruth Drew, director of family and information services at the Alzheimer’s Association. “And since the brain affects everything, Alzheimer’s ultimately affects everything,” she said, “including the ability to swallow, cough and breathe.”

Once patients reach the advanced stages of Alzheimer’s, they may stop eating and become weak and susceptible to infections, said Dr. Jason Karlawish, a professor of medicine at the University of Pennsylvania. Unable to swallow or cough, they are at high risk of choking, aspirating food particles or water into the lungs and developing pneumonia, which is often the immediate cause of death, he said. “You see a general decline in the contribution the brain makes, not just in thinking, but in maintaining the body’s homeostasis,” Dr. Karlawish said. Using a feeding tube to nourish patients and hospitalizing them for infections does not significantly extend life at the advanced stages of the disease and is discouraged because it can prolong suffering with no hope of recovery, he said.

Alzheimer's is the sixth leading cause of death in the United States, according to the Centers for Disease Control and Prevention, but that figure may underestimate the actual number of deaths, Dr. Karlawish said, since some deaths may be attributed to other causes like pneumonia. © 2016 The New York Times Company
Link ID: 22071 - Posted: 04.06.2016
Laura Sanders

NEW YORK — Sometimes forgetting can be harder than remembering. When people forced themselves to forget a recently seen image, select brain activity was higher than when they tried to remember that image. Forgetting is often a passive process, one in which the memory slips out of the brain, Tracy Wang of the University of Texas at Austin said April 2 at the annual meeting of the Cognitive Neuroscience Society. But in some cases, forgetting can be deliberate.

Twenty adults saw images of faces, scenes and objects while an fMRI scanner recorded their brains’ reactions to the images. If instructed to forget the preceding image, people were less likely to remember that image later. Researchers used the scan data to build a computer model that could infer how strongly the brain responds to each particular kind of image. In the ventral temporal cortex, a part of the brain above the ear, brain patterns elicited by a particular image were stronger when a participant was told to forget the sight than when instructed to remember it.

Of course, everyone knows that it’s easy to forget something without even trying. But these results show that intentional forgetting isn’t a passive process — the brain has to actively work to wipe out a memory on purpose.

Citations: T.H. Wang et al. Forgetting is more work than remembering. Annual meeting of the Cognitive Neuroscience Society, New York City, April 2, 2016. © Society for Science & the Public 2000 - 2016
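The "computer model" step described above is an instance of fMRI pattern decoding: learn a characteristic response pattern for each image category, then measure how strongly a new brain response matches each one. The following is a toy sketch of that general idea on synthetic data, not the authors' actual analysis pipeline; the voxel counts, noise level, and correlation-based matching are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for voxel responses: three stimulus categories
# (faces, scenes, objects), each with a characteristic pattern.
n_voxels = 50
templates = {cat: rng.normal(size=n_voxels) for cat in ("face", "scene", "object")}

def simulate_trial(category, noise=1.0):
    """One noisy simulated 'brain response' to an image of the given category."""
    return templates[category] + rng.normal(scale=noise, size=n_voxels)

def category_evidence(trial_pattern):
    """Correlate the trial pattern with each category template.

    Each correlation is the decoder's 'evidence' that the brain is
    currently representing that category of image.
    """
    return {
        cat: float(np.corrcoef(trial_pattern, tmpl)[0, 1])
        for cat, tmpl in templates.items()
    }

trial = simulate_trial("face")
evidence = category_evidence(trial)
best = max(evidence, key=evidence.get)
```

In this framing, the study's finding corresponds to the decoder's evidence for the just-seen category being *higher* on forget trials than on remember trials, which is what makes the result surprising.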
Keyword: Learning & Memory
Link ID: 22068 - Posted: 04.05.2016
Mo Costandi

This spectacular image – which took the best part of a year to create – shows the fine structure of a nerve terminal at high resolution, revealing, for the very first time, an intricate network of fine filaments that controls the movements of synaptic vesicles.

The brain is soft and wet, with the consistency of a lump of jelly. Yet it is the most complex and highly organized structure that we know of, containing hundreds of billions of neurons and glial cells, and something on the order of one quadrillion synaptic connections, all of which are arranged in a very specific manner.

This high degree of specificity extends down to the deepest levels of brain organization. Just beneath the membrane at the nerve terminal, synaptic vesicles store neurotransmitter molecules and await the arrival of a nervous impulse, whereupon they fuse with the membrane and release their contents into the synaptic cleft, the minuscule gap at the junction between nerve cells; the neurotransmitters diffuse across the cleft to bind to receptor protein molecules embedded at the surface of the partner cell.

[3D model of a nerve terminal in atomic detail]

The process of neurotransmitter release is tightly orchestrated. Ready vesicles are ‘docked’ in the ‘active zone’ lying beneath the cell membrane, and are depleted when they fuse with the membrane, only to be replenished from a reservoir of pre-prepared vesicles located further inside the cell. Spent vesicles are quickly pulled back out of the membrane, reformed, refilled with neurotransmitter molecules, and then returned to the reservoir, so that they can be shuttled into the active zone when needed. An individual nerve cell may use up hundreds, or perhaps thousands, of vesicles every second, and so this recycling process occurs continuously to maintain the signalling between nerve cells. © 2016 Guardian News and Media Limited
Keyword: Development of the Brain
Link ID: 22067 - Posted: 04.04.2016
By DONALD G. McNEIL Jr

The World Health Organization said on Thursday that there is “strong scientific consensus” that Zika virus is a cause of microcephaly (unusually small heads with brain damage in infants) as well as other neurological disorders. Yet a surge in microcephaly has been reported only in Brazil; a small increase was reported in French Polynesia, and a cluster of 32 cases is now under investigation in Colombia.

For proof of the connection between infection with the virus and birth defects, scientists are waiting for the results of a large study of 5,000 pregnant women, most of them in Colombia. Women with past Zika infections will be compared with similar women without infections to see if they have more microcephalic children. The epidemic peaked in Colombia in early February, according to the W.H.O. Most of the women in the study are due to give birth in May and June.

Virtually all public health agencies already believe the virus is to blame for these birth defects and are giving medical advice based on that assumption. Here are the lines of evidence they cite. As early as last August, hospitals in northeast Brazil realized that something unheard of was happening: Neonatal wards that normally saw one or two microcephalic babies a year were seeing five or more at the same time. Doctors learned from the mothers that many of them had had Zika symptoms months earlier. © 2016 The New York Times Company
Keyword: Development of the Brain
Link ID: 22065 - Posted: 04.04.2016
The mystery is starting to untangle. It has long been known that twisted fibres of a protein called tau collect in the brain cells of people with Alzheimer’s, but their exact role in the disease is unclear. Now a study in mice has shown how tau interferes with the strengthening of connections between neurons – the key mechanism by which we form memories.

In healthy cells, the tau protein helps to stabilise microtubules that act as rails for transporting materials around the cell. In people with Alzheimer’s, these proteins become toxic, but an important unanswered question is which forms of tau are toxic: the tangles may not be the whole story.

In the new study, Li Gan and her colleagues at the Gladstone Institute of Neurological Disease in San Francisco found that the brains of those with Alzheimer’s have high levels of tau with a particular modification, called acetylated tau. They then looked at what acetylated tau does in a mouse model of Alzheimer’s, finding that it accumulates at synapses – the connections between neurons. When we form memories, synapses become strengthened through extra receptors inserted into the cell membranes, and this heightens their response. But acetylated tau depletes another protein called KIBRA, which is essential for this synapse-strengthening mechanism.

“We’re excited because we think we now have a handle on the link between tau and memory,” says Gan. “We’re also cautious because we know this may not be the only link. It’s still early days in understanding the mechanism.” © Copyright Reed Business Information Ltd.
By Emily Underwood

More than 99% of clinical trials for Alzheimer’s drugs have failed, leading many to wonder whether pharmaceutical companies have gone after the wrong targets. Now, research in mice points to a potential new target: a developmental process gone awry, which causes some immune cells to feast on the connections between neurons. “It is beautiful new work,” which “brings into light what’s happening in the early stage of the disease,” says Jonathan Kipnis, a neuroscientist at the University of Virginia School of Medicine in Charlottesville.

Most new Alzheimer’s drugs aim to eliminate β amyloid, a protein that forms telltale sticky plaques around neurons in people with the disease. Those with Alzheimer’s tend to have more of these deposits in their brains than do healthy people, yet more plaques don’t always mean more severe symptoms such as memory loss or poor attention, says Beth Stevens of Boston Children’s Hospital, who led the new work. What does track well with the cognitive decline seen in Alzheimer’s disease—at least in mice that carry genes that confer high risk for the condition in people—is a marked loss of synapses, particularly in brain regions key to memory, Stevens says. These junctions between nerve cells are where neurotransmitters are released to spark the brain’s electrical activity.

Stevens has spent much of her career studying a normal immune mechanism that prunes weak or unnecessary synapses as the brain matures from the womb through adolescence, allowing more important connections to become stronger. In this process, a protein called C1q sets off a series of chemical reactions that ultimately mark a synapse for destruction. After a synapse has been “tagged,” immune cells called microglia—the brain’s trash disposal service—know to “eat” it, Stevens says. © 2016 American Association for the Advancement of Science
By Nicholas Bakalar

Stress in childhood may be linked to hardening of the arteries in adulthood, new research suggests. Finnish researchers studied 311 children 12 to 18 years old, scoring their levels of stress according to a variety of components, including the family’s economic circumstances, the emotional environment in the home, whether parents engaged in healthy behaviors, stressful events (such as divorce, moves or the death of a family member) and parental concerns about the child’s social adjustment. Using these criteria, they calculated a stress score.

When the members of the group were 40 to 46 years old, the researchers used computed tomography to measure coronary artery calcification, a marker of atherosclerosis and a risk factor for cardiovascular disease. The study, in JAMA Pediatrics, controlled for sex, cholesterol, body mass index and other factors, but still found that the higher the childhood stress score, the greater the risk for coronary artery calcification.

The study is observational, and the data are based largely on parental reports, which can be biased. Still, its long follow-up time and careful control of other variables give it considerable strength. There are plausible mechanisms for the connection, including stress-induced increases in inflammation, which in animal models have been linked to a variety of ailments.

“I think that economic conditions are important here,” said the lead author, Dr. Markus Juonala, a professor of internal medicine at the University of Turku in Finland. “Public health interventions should focus on how to intervene in better ways with people with higher stress and lower socioeconomic status.” © 2016 The New York Times Company
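"Controlled for sex, cholesterol, body mass index and other factors" describes covariate adjustment, commonly done with multiple regression: the stress coefficient then reflects its association with calcification while the covariates are held fixed. Here is a toy sketch of that technique on fully simulated data; every number, including the planted coefficient of 0.8, is invented and has nothing to do with the study's actual estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300

# Simulated predictors: childhood stress score plus the covariates
# the study adjusted for (sex, cholesterol, body mass index).
stress = rng.normal(5, 2, n)
sex = rng.integers(0, 2, n).astype(float)
chol = rng.normal(200, 30, n)
bmi = rng.normal(26, 4, n)

# Simulated outcome with a planted stress effect of 0.8; all
# coefficients here are invented for illustration.
calcification = (
    0.8 * stress + 0.5 * sex + 0.01 * chol + 0.05 * bmi + rng.normal(0, 1, n)
)

# Ordinary least squares with an intercept column. The fitted
# coefficient on stress estimates its association with the outcome
# holding the covariates fixed.
X = np.column_stack([np.ones(n), stress, sex, chol, bmi])
coefs, *_ = np.linalg.lstsq(X, calcification, rcond=None)
stress_coef = float(coefs[1])
```

Because the data are simulated with independent predictors, the recovered `stress_coef` lands near the planted 0.8; in real observational data like the study's, unmeasured confounders can still bias such an estimate, which is why the article stresses the observational caveat.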
By Matthew Hutson

Earlier this month, a computer program called AlphaGo defeated a (human) world champion of the board game Go, years before most experts expected computers to rival the best flesh-and-bone players. But then last week, Microsoft was forced to silence its millennial-imitating chatbot Tay for blithely parroting Nazi propaganda and misogynistic attacks after just one day online, her failure a testimony to the often underestimated role of human sensibility in intelligent behavior. Why are we so compelled to pit human against machine, and why are we so bad at predicting the outcome?

As the number of jobs susceptible to automation rises, and as Stephen Hawking, Elon Musk, and Bill Gates warn that artificial intelligence poses an existential threat to humanity, it’s natural to wonder how humans measure up to our future robot overlords. But even those tracking technology’s progress in taking on human skills have a hard time setting an accurate date for the uprising. That’s in part because one prediction strategy popular among both scientists and journalists—benchmarking the human brain with digital metrics such as bits, hertz, and million instructions per second, or MIPS—is severely misguided. And doing so could warp our expectations of what technology can do for us and to us.

Since their development, digital computers have become a standard metaphor for the mind and brain. The comparison makes sense, in that brains and computers both transform input into output. Most human brains, like computers, can also manipulate abstract symbols. (Think arithmetic or language processing.) But like any metaphor, this one has limitations.
By David Z. Hambrick

Nearly a century after James Truslow Adams coined the phrase, the “American dream” has become a staple of presidential campaign speeches. Kicking off her 2016 campaign, Hillary Clinton told supporters that “we need to do a better job of getting our economy growing again and producing results and renewing the American dream.” Marco Rubio lamented that “too many Americans are starting to doubt” that it is still possible to achieve the American dream, and Ted Cruz asked his supporters to “imagine a legal immigration system that welcomes and celebrates those who come to achieve the American dream.” Donald Trump claimed that “the American dream is dead” and Bernie Sanders quipped that for many “the American dream has become a nightmare.”

But the American dream is not just a pie-in-the-sky notion—it’s a scientifically testable proposition. The American dream, Adams wrote, “is not a dream of motor cars and high wages merely, but a dream of social order in which each man and each woman shall be able to attain to the fullest stature of which they are innately capable…regardless of the fortuitous circumstances of birth or position.” In the parlance of behavioral genetics—the scientific study of genetic influences on individual differences in behavior—Adams’ idea was that all Americans should have an equal opportunity to realize their genetic potential.

A study just published in Psychological Science by psychologists Elliot Tucker-Drob and Timothy Bates reveals that this version of the American dream is in serious trouble. Tucker-Drob and Bates set out to evaluate evidence for the influence of genetic factors on IQ-type measures (aptitude and achievement) that predict success in school, work, and everyday life. Their specific question was how the contribution of genes to these measures would compare at low versus high levels of socioeconomic status (or SES), and whether the results would differ across countries.
The results reveal, ironically, that the American dream is more of a reality for other countries than it is for America: genetic influences on IQ were uniform across levels of SES in Western Europe and Australia, but, in the United States, were much higher for the rich than for the poor. © 2016 Scientific American
Chris French

The fallibility of human memory is one of the best-established findings in psychology. There have been thousands of demonstrations of the unreliability of eyewitness testimony under well-controlled conditions, dating back to the very earliest years of the discipline. Relatively recently, it was discovered that some apparent memories are not just distorted memories of witnessed events: they are false memories of events that simply never took place at all. Psychologists have developed several reliable methods for implanting false memories in a sizeable proportion of experimental participants.

It is only in the last few years, however, that scientists have begun to systematically investigate the phenomenon of non-believed memories. These are subjectively vivid memories of personal experiences that an individual once believed were accurate but now accepts are not based upon real events.

Prior to this, there were occasional anecdotal reports of non-believed memories. One of the most famous was provided by the influential developmental psychologist Jean Piaget. He had a clear memory of almost being kidnapped at about the age of two and of his brave nurse beating off the attacker. His grateful family were so impressed with the nurse that they gave her a watch as a reward. Years later, the nurse confessed that she had made the whole story up. Even after he no longer believed that the event had taken place, Piaget still retained his vivid and detailed memory of it. © 2016 Guardian News and Media Limited
Keyword: Learning & Memory
Link ID: 22050 - Posted: 03.30.2016
Brendan Maher

It took less than a minute of playing League of Legends for a homophobic slur to pop up on my screen. Actually, I hadn't even started playing. It was my first attempt to join what many agree to be the world's leading online game, and I was slow to pick a character. The messages started to pour in. “Pick one, kidd,” one nudged. Then, “Choose FA GO TT.” It was an unusual spelling, and the spaces may have been added to ease the word past the game's default vulgarity filter, but the message was clear.

Online gamers have a reputation for hostility. In a largely consequence-free environment inhabited mostly by anonymous and competitive young men, the antics can be downright nasty. Players harass one another for not performing well and can cheat, sabotage games and do any number of things to intentionally ruin the experience for others — a practice that gamers refer to as griefing. Racist, sexist and homophobic language is rampant; aggressors often threaten violence or urge a player to commit suicide; and from time to time, the vitriol spills beyond the confines of the game. In the notorious 'gamergate' controversy that erupted in late 2014, several women involved in the gaming industry were subjected to a campaign of harassment, including invasions of privacy and threats of death and rape.

League of Legends has 67 million players and grossed an estimated US$1.25 billion in revenue last year. But it also has a reputation for toxic in-game behaviour, which its parent company, Riot Games in Los Angeles, California, sees as an obstacle to attracting and retaining players. © 2016 Nature Publishing Group
By Patrick Monahan

Yesterday, mountaineer Richard Parks set out for Kathmandu to begin some highly unusual data-gathering. As part of Project Everest Cynllun, he will climb Mount Everest without supplemental oxygen and perform—on himself—a series of blood draws, muscle biopsies, and cognitive tests. If he makes it to the summit, these will be the highest-elevation blood and tissue samples ever collected. Damian Bailey, a physiologist at the University of South Wales, Pontypridd, in the United Kingdom and the project’s lead scientist, hopes the risky experiment will yield new information about how the human body responds to low-oxygen conditions, and how similar mechanisms might drive cognitive decline with aging. As Parks began the acclimatization process with warm-up climbs on two smaller peaks, Bailey told ScienceInsider about his ambitions for the project. This interview has been edited for clarity and brevity.

Q: Parks is an extreme athlete who has climbed Everest before. What can his performance tell us about regular people?

A: What we’re trying to understand is, what is it about Richard’s brain that is potentially different from other people’s brains, and can that provide us with some clues to accelerated cognitive decline, which occurs with aging [and] dementia. We know that sedentary aging is associated with a progressive decline in blood flow to the brain. … And the main challenge for sedentary aging is we have to wait so long to see the changes occurring. So this is almost a snapshot, a day in the life of a patient with cognitive decline. © 2016 American Association for the Advancement of Science.
By Esther Hsieh

Spinal implants have suffered problems similar to those of brain implants—they tend to abrade tissue, causing inflammation and ultimately rejection by the body. Now an interdisciplinary research collaboration based in Switzerland has made a stretchable implant that appears to solve this problem. Like Lieber's new brain implant, it matches the physical qualities of the tissue where it is embedded.

The “e-dura” implant is made from a silicone rubber that has the same elasticity as dura mater, the protective skin that surrounds the spinal cord and brain, explains Stéphanie Lacour, a professor at the school of engineering at the Swiss Federal Institute of Technology in Lausanne. This feature allows the implant to mimic the movement of the surrounding tissues. Embedded in the e-dura are electrodes for stimulation and microchannels for drug therapy. Ultrathin gold wires are made with microscopic cracks that allow them to stretch. Also, the electrodes are coated with a special platinum-silicone mixture that is stretchable.

In an experiment that lasted two months, the scientists found that healthy rats with an e-dura spinal implant could walk across a ladder as well as a control group with no implant. Yet rats with a traditional plastic implant (which is flexible but not stretchable) started stumbling and missing rungs a few weeks after surgery. The researchers removed the implants and found that rats with a traditional implant had flattened, damaged spinal cords—but the e-dura implants had left spinal cords intact. Cellular testing also showed a strong immune response to the traditional implant, which was minimal in rats with the e-dura implant. © 2016 Scientific American
New York's Tribeca Film Festival will not show Vaxxed, a controversial film about the MMR vaccine, its founder Robert De Niro says. As recently as Friday, Mr De Niro stood by his decision to include the film by anti-vaccination activist Andrew Wakefield in next month's festival. The link the film makes between the measles, mumps and rubella vaccine and autism has been widely discredited.

"We have concerns with certain things in this film," said Mr De Niro. Mr De Niro, who has a child with autism, said he had hoped the film would provide the opportunity for discussion of the issue. But after reviewing the film with festival organisers and scientists, he said: "We do not believe it contributes to or furthers the discussion I had hoped for."

Vaxxed was directed and co-written by Mr Wakefield, who described it as a "whistle-blower documentary". In a statement issued following the Tribeca Film Festival's decision, he and the film's producer Del Bigtree said that "we have just witnessed yet another example of the power of corporate interests censoring free speech, art and truth".

The British doctor was the lead author of a controversial study published in 1998, which argued there might be a link between MMR and autism and bowel disease. Mr Wakefield suggested that parents should opt for single jabs against mumps, measles and rubella instead of the three-in-one vaccine. His comments and the subsequent media furore led to a sharp drop in the number of children being vaccinated against these diseases. But the study, first published in The Lancet, was later retracted by the medical journal. Mr Wakefield's research methods were subsequently investigated by the General Medical Council and he was struck off the medical register.
Link ID: 22037 - Posted: 03.28.2016
John Consentino

After multiple doctors had given me conflicting opinions about ADHD, I decided to move away from psychiatry and seek out a neuropsychologist. I thought that autism made sense, but what ultimately led me to seek help was my focus problem.

When I was 8 years old, it would take me HOURS to do homework. On Wednesdays, we got out of school at noon, and I wouldn't finish homework until about 8 p.m. No one understood why this was happening, and with all of the screaming and punishments I withstood, nothing improved. I still had GPAs near the high 90s, so all was OK, supposedly.

I struggled with eye contact during that time, and this is very much apparent now. I struggled speaking to waiters/waitresses, to teachers, to family members. Speaking to members of the opposite sex was a near-impossible task. I never understood social groups.

I went through all of high school in the same fashion. However, my family felt that everything was OK. I still had a mid-90 GPA, and I had made numerous friends. Unfortunately, my GPA had dropped by about 15-plus points by my senior year. I struggled badly during my first two years of college. I was constantly unhappy, and I made few to no friends. My GPA was horrid, and my time at the university was dwindling. I dropped out of school twice, and my future felt bleak. After transferring schools, I did great. So everything was OK yet again. © 2016 npr
Link ID: 22036 - Posted: 03.28.2016