Chapter 13. Memory, Learning, and Development
By Emily Underwood When pharmaceutical company Eli Lilly in Indianapolis last week announced a major change to its closely watched clinical trial for the Alzheimer’s drug solanezumab, some in the scientific community and drug development industry cried foul. To critics, the company’s decision to eliminate changes in a person’s daily ability to function as a primary measure of solanezumab’s efficacy and focus solely on a cognitive test seemed like a last-ditch attempt to keep a doomed drug from failing its third trial. Lilly’s stock plunged by nearly 5%, apparently reflecting that sentiment. Largely lost in the online “chatter,” however, was that Lilly’s move reflects a growing scientific consensus about how the early stages of Alzheimer’s disease progress, says Dennis Selkoe, a neurologist at Brigham and Women’s Hospital in Boston, who is not involved in the Lilly trial. “From the point of view of a neurologist who’s seen hundreds of patients, [Lilly’s decision] makes clinical sense,” he says. Solanezumab is an antibody designed to bind to and promote the clearance of the β-amyloid protein, which forms plaques around the neurons of people with Alzheimer’s. Not everyone agrees that these plaques are at the root of the disease—a concept called the amyloid hypothesis, of which Selkoe is a major proponent—but fighting them is the foundation of nearly all current efforts in Alzheimer’s drug development. By helping destroy the plaques in people with early stages of Alzheimer’s, Lilly hopes solanezumab can slow the disease’s progression. © 2016 American Association for the Advancement of Science.
Link ID: 22011 - Posted: 03.22.2016
By Emily Underwood People with autism spectrum disorder (ASD) die on average 18 years before the general population, according to a report released today by Autistica, a philanthropic group based in the United Kingdom. People with both ASD and an intellectual disability die even younger, on average 30 years earlier than those without the conditions. Fatal accidents—often by drowning, when a child or adult with ASD wanders away from caregivers—are one of the classic causes of premature death in people who have both ASD and an intellectual disability, says Sven Bölte, a clinical psychologist at the Karolinska Institute in Stockholm, whose research is cited in the Autistica report. Epilepsy, along with several other neurological disorders, is another common cause of death among people with both ASD and learning difficulties, suggesting that early disruption of neurodevelopment is to blame. These “classic” causes of premature death in autism, however, do not fully account for a decades-long life span gap between autistic and nonautistic people, or the difference in mortality between autistic people with and without an intellectual disability, Bölte says. To explore these gaps, in 2015 Bölte’s group published a large epidemiological study of more than 27,000 Swedish people with ASD, 6500 of whom had an intellectual disability. They found that risk of premature death was about 2.5 times higher for the entire group, a gap largely due to increased prevalence of common health problems such as diabetes and respiratory disease. Patients may be diagnosed too late because they do not know how to express health concerns to their doctors, Bölte says, making it “extremely important” for general practitioners to thoroughly explore autistic patients’ symptoms and histories. © 2016 American Association for the Advancement of Science
Link ID: 22010 - Posted: 03.19.2016
By John Elder Robison What happens to your relationships when your emotional perception changes overnight? Because I’m autistic, I have always been oblivious to unspoken cues from other people. My wife, my son and my friends liked my unflappable demeanor and my predictable behavior. They told me I was great the way I was, but I never really agreed. For 50 years I made the best of how I was, because there was nothing else I could do. Then I was offered a chance to participate in a study at Beth Israel Deaconess Medical Center, a teaching hospital of Harvard Medical School. Investigators at the Berenson-Allen Center there were studying transcranial magnetic stimulation, or T.M.S., a noninvasive procedure that applies magnetic pulses to stimulate the brain. It offers promise for many brain disorders. Several T.M.S. devices have been approved by the Food and Drug Administration for the treatment of severe depression, and others are under study for different conditions. (It’s still in the experimental phase for autism.) The doctors wondered if changing activity in a particular part of the autistic brain could change the way we sense emotions. That sounded exciting. I hoped it would help me read people a little better. They say, be careful what you wish for. The intervention succeeded beyond my wildest dreams — and it turned my life upside down. After one of my first T.M.S. sessions, in 2008, I thought nothing had happened. But when I got home and closed my eyes, I felt as if I were on a ship at sea. And there were dreams — so real they felt like hallucinations. It sounds like a fairy tale, but the next morning when I went to work, everything was different. Emotions came at me from all directions, so fast that I didn’t have a moment to process them. © 2016 The New York Times Company
By Daisy Yuhas Something was wrong with Brayson Thibodeaux. At 15 months old, he still was not walking; his parents and grandparents were certain that his development was slower than normal. After pushing doctors for answers, they finally got him to a neurologist who recommended a genetic test. Brayson had fragile X syndrome, the leading heritable cause of intellectual disability and of autism. The discovery sent ripples through the extended family, who live outside New Orleans. Brayson’s great-grandmother, Cheryl, recalled having heard of fragile X and discovered a cousin whose grandson had the same condition. She soon learned that many members of her family were confirmed carriers of a genetic condition—the fragile X pre-mutation—that put them at risk of having children with this syndrome. “Fragile X” refers to a mutation that alters the X chromosome in such a way that, viewed under a microscope, it would look like a piece was about to break off. That is because one gene contains multiple repetitions of noncoding DNA—specifically CGG (cytosine, guanine, guanine). The exact number of CGG repetitions is variable, but when it reaches more than 200, it is considered to be the full mutation, which causes the syndrome. People with between 55 and 200 repeats are said to have a partial or pre-mutation, an unstable gene that can expand into the full mutation in future generations. © 2016 Scientific American
Mo Costandi In order to remember, we must forget. Recent research shows that when your brain retrieves newly encoded information, it suppresses older related information so that it does not interfere with the process of recall. Now a team of European researchers has identified a neural pathway that induces forgetting by actively erasing memories. The findings could eventually lead to novel treatments for conditions such as post-traumatic stress disorder (PTSD). We’ve known since the early 1950s that a brain structure called the hippocampus is critical for memory formation and retrieval, and subsequent work using modern techniques has revealed a great deal of information about the underlying cellular mechanisms. The hippocampus contains neural circuits that loop through three of its sub-regions – the dentate gyrus and the CA3 and CA1 areas – and it’s widely believed that memories form by the strengthening and weakening of synaptic connections within these circuits. The dentate gyrus gives rise to so-called mossy fibres, which form the main ‘input’ to the hippocampus, relaying sensory information from an upstream region called the entorhinal cortex first to CA3 and then on to CA1. It’s thought that the CA3 region integrates the information to encode, store, and retrieve new memories, before transferring them to the cerebral cortex for long-term storage. Exactly how each of these hippocampal sub-regions contributes to memory formation, storage, and retrieval is still not entirely clear, however. © 2016 Guardian News and Media Limited
Keyword: Learning & Memory
Link ID: 22007 - Posted: 03.19.2016
Alison Abbott In the 25 years that John Collinge has studied neurology, he has seen hundreds of human brains. But the ones he was looking at under the microscope in January 2015 were like nothing he had seen before. He and his team of pathologists were examining the autopsied brains of four people who had once received injections of growth hormone derived from human cadavers. It turned out that some of the preparations were contaminated with a misfolded protein — a prion — that causes a rare and deadly condition called Creutzfeldt–Jakob disease (CJD), and all four had died in their 40s or 50s as a result. But for Collinge, the reason that these brains looked extraordinary was not the damage wrought by prion disease; it was that they were scarred in another way. “It was very clear that something was there beyond what you'd expect,” he says. The brains were spotted with the whitish plaques typical of people with Alzheimer's disease. They looked, in other words, like young people with an old person's disease. For Collinge, this led to a worrying conclusion: that the plaques might have been transmitted, alongside the prions, in the injections of growth hormone — the first evidence that Alzheimer's could be transmitted from one person to another. If true, that could have far-reaching implications: the possibility that 'seeds' of the amyloid-β protein involved in Alzheimer's could be transferred during other procedures in which fluid or tissues from one person are introduced into another, such as blood transfusions, organ transplants and other common medical procedures. © 2016 Nature Publishing Group
Laura Sanders Using flashes of blue light, scientists have pulled forgotten memories out of the foggy brains of mice engineered to have signs of early Alzheimer’s disease. This memory rehab feat, described online March 16 in Nature, offers new clues about how the brain handles memories, and how that process can go awry. The result “provides a theoretical mechanism for reviving old, forgotten memories,” says Yale School of Medicine neurologist Arash Salardini. Memory manipulations, such as the retrieval of lost memories and the creation of false memories, were “once the realm of science fiction,” he says. But this experiment and other recent work have now accomplished these feats, at least in rodents (SN: 12/27/14, p. 19), he says. To recover a lost memory, scientists first had to mark it. Neuroscientist Susumu Tonegawa of MIT and colleagues devised a system that tagged the specific nerve cells that stored a memory — in this case, an association between a particular cage and a shock. A virus delivered a gene for a protein that allowed researchers to control this collection of memory-holding nerve cells. The genetic tweak caused these cells to fire off signals in response to blue laser light, letting Tonegawa and colleagues call up the memory with light delivered by an optic fiber implanted in the brain. A day after receiving a shock in a particular cage, mice carrying two genes associated with Alzheimer’s seemed to have forgotten their ordeal; when put back in that cage, these mice didn’t seem as frightened as mice without the Alzheimer’s-related genes. But when the researchers used light to restore this frightening memory, it caused the mice to freeze in place in a different cage. (Freezing in a new venue showed that laser activation of the memory cells, and not environmental cues, caused the fear reaction.) © Society for Science & the Public 2000 - 2016. All rights reserved.
THERE they are! Newborn neurons vital for memory have been viewed in a live brain for the first time. The work could aid treatments for anxiety and stress disorders. Attila Losonczy at Columbia University Medical Center in New York and his team implanted a tiny microscope into the brains of live mice, the brain cells of which had been modified to make newly made neurons glow. The mice then ran on a treadmill as the team tweaked the surrounding sights, smells and sounds. The researchers paired a small electric shock with some cues, so the mice learned to associate these with an unpleasant experience. They then deactivated the newborn neurons – present in areas of the brain responsible for learning and memory – using optogenetics, which switches off specific cells with light. After this, the mice were unable to tell the difference between the scary and safe cues, becoming fearful of them all (Neuron, doi.org/bc7v). “It suggests that newborn cells do something special that allows animals to tell apart and separate memories,” says Losonczy. An inability to discriminate between similar sensory information triggered by different events – such as the sound of a gunshot and a car backfiring – is often seen in panic and anxiety disorders, such as PTSD. This suggests that new neurons, or a lack of them, plays a part in such conditions and could guide novel treatments. © Copyright Reed Business Information Ltd.
Barbara Bradley Hagerty Faced with her own forgetfulness, former NPR correspondent and author Barbara Bradley Hagerty tried to do something about it. She's written about her efforts in her book on midlife, called Life Reimagined. To her surprise, she discovered that an older dog can learn new tricks. A confession: I loathe standardized tests, and one of the perks of reaching midlife is that I thought I'd never have to take another. But lately I've noticed that in my 50s, my memory isn't the same as it once was. And so I decided to take a radical leap into the world of brain training. At the memory laboratory at the University of Maryland, manager Ally Stegman slides a sheet of paper in front of me. It has a series of boxes containing different patterns and one blank space. My job is to figure out the missing pattern. The test measures a sort of raw intelligence, the ability to figure out novel problems. Time races by. It takes me two minutes to crack the first question. I am stumped by the second and third. Finally, I begin to guess. After 25 minutes, the test is over, and to my relief, Stegman walks in. This test was really, really hard. The reason I am here, voluntarily reliving my nightmare, is simple: I want to tune up my 50-something brain. So over the next month, I will do brain-training exercises, then come back, take the test again and see if I made myself smarter. © 2016 npr
Linda Geddes The health effects of a bad diet can carry over to offspring through eggs and sperm cells without DNA mutations, researchers have found. The mouse study, published in Nature Genetics, provides some of the strongest evidence yet for the non-genetic inheritance of traits acquired during an organism’s lifetime. And although previous work has suggested that sperm cells can carry 'epigenetic' factors, this is the first time that such an effect has been observed with egg cells. Researchers have suspected for some time that parents' lifestyle and behaviour choices can affect their children's health through epigenetics. These are chemical modifications to DNA or the proteins in chromosomes that affect how genes are expressed, but that do not alter the gene sequences themselves. Whether those changes can be inherited is still controversial. In particular, there have been suggestions that parental eating habits might shape the offspring's risk of obesity and diabetes. However, it has been difficult to disentangle the possibility that the parents’ behaviour during pregnancy or during the offspring's early childhood was to blame, rather than epigenetic changes that had occurred before conception. To get around this issue, endocrinologist Peter Huypens at the German Research Center for Environmental Health in Neuherberg, Germany, and his colleagues gave genetically identical mice one of three diets — high fat, low fat or standard laboratory chow — for six weeks. As expected, those fed the high-fat diet became obese and had impaired tolerance to glucose, an early sign of type 2 diabetes. © 2016 Nature Publishing Group
By Perri Klass, M.D. I got my good sleeper second. My oldest child, my first darling baby, did not reliably sleep through the night till he was well past 2. Since he is now an adult, I can skip right over all the questions of whether we could have trained him to self-soothe and stop calling for us in the night — we tried; we failed; we eventually gave up. The good sleeper was a good sleeper right from the beginning. She followed the timeline in the books, slept longer and longer between feedings, till she was reliably giving us a real night while she was still an infant and she never looked back. Had we matured as parents, become less anxious, more willing to let her learn how to soothe herself? Were our lives calmer? Well, no. In fact, kind of the opposite. We just got dealt two very different babies. I supervise pediatric residents as they learn to provide primary care, to offer guidance to parents as they struggle with all the complexities of baby and toddler sleep, eating, potty training, discipline and tantrums. All of the stuff that shapes your daily life with a small child, and I’m talking about an essentially healthy, normally developing small child. And the hardest thing to teach, especially to people who haven’t yet done any child-rearing, is how different those healthy, normal babies can be, right from the beginning. So we review our sensible pediatric rubrics that deal with these questions, from establishing good sleep patterns to setting limits to encouraging a healthy varied diet. But sometimes it seems that these rubrics work best with the children and families who need them least. Every child is a different assignment — and we can all pay lip service to that cheerfully enough. But the hard thing to believe is how different the assignments can be. Within the range of developmentally normal children, some parents have a much, much harder job than others: more drudge work, less gratification, more public shaming. 
It sometimes feels like the great undiscussed secret of pediatrics — and of parenting. © 2016 The New York Times Company
Keyword: Development of the Brain
Link ID: 21988 - Posted: 03.15.2016
By Julia Shaw Our brains play tricks on us all the time, and these tricks can mislead us into believing we can accurately reconstruct our personal past. In reality, false memories are everywhere. False memories are recollections of things that you never actually experienced. These can be small memory errors, such as thinking you saw a yield sign when you actually saw a stop sign, or big errors like thinking you took a hot air balloon ride that never actually happened. If you want to know more about how we can come to misremember complex autobiographical events, here is a recipe and here is a video with footage from my own research. A few weeks ago I reached out to see what you actually wanted to know about this phenomenon on Reddit, and here are the answers to my six favorite questions. 1. Is there any way a person can check if their own memories are real or false? The way that I have interpreted the academic literature, once they take hold, false memories are no different from true memories in the brain. This means that they have the same properties as any other memories, and are indistinguishable from memories of events that actually happened. The only way to check is to find corroborating evidence for any particular memory that you are interested in “validating”. © 2016 Scientific American
Keyword: Learning & Memory
Link ID: 21985 - Posted: 03.15.2016
Deborah Orr The psychologist Oliver James has for many years been a part of the cultural landscape, writing best-selling books, making television programmes, contributing articles to newspapers and generally offering his views. As a practising psychotherapist of many years’ standing, he has good reason to believe that he has important insights to offer. James is particularly exercised by the damage caused by casual emotional abuse – the explosive parent who shouts and swears at their kids, displays resentment against them or tries to coerce them into doing things instead of employing reason. No sensible person disagrees with him on this, and only a harsh critic would deny that James has played a strong and positive part in popularising these simple, important wisdoms. That’s why it’s so very odd that James has chosen now to perpetrate casual emotional abuse on a grand scale. His latest book, Not in Your Genes: The Real Reason Parents Are Like Their Children, expands on an argument he’s been making for years: that there is no scientific basis for belief in the idea that there is any genetic element to any psychological trait. Even illnesses such as schizophrenia and bipolar disorder are completely down to the environment in which you grew up, not the complex interplay between nature and nurture that mainstream science espouses. Even if James had conclusive evidence to back up his absolutist claim – which he does not – I would suggest that such news should be broken gently. © 2016 Guardian News and Media Limited
A senior British doctor, who has been an expert defence witness for parents accused of killing their children, has been found guilty of multiple charges that include giving misleading evidence in court. The Medical Practitioners Tribunal Service said that Waney Squier, a consultant pathologist at John Radcliffe Hospital in Oxford, UK, had failed to work within the limits of her competence, failed to be objective and unbiased, and failed to heed the views of other experts. In many of the cases investigated, her actions were deliberately misleading and irresponsible. The MPTS had considered Squier’s work as an expert witness in six child abuse cases and one appeal in which parents faced charges of non-accidental head injury, formerly known as shaken-baby syndrome. Squier is prominent among several researchers worldwide who have challenged a long-standing belief that a trio of symptoms of head injury provide unequivocal evidence of abusive behaviour. Squier has argued in the scientific literature and in court that the symptoms in question – haemorrhages on the surface of the brain, haemorrhages in the retinas, and a swollen brain – can have innocent causes, such as choking or other difficulties in breathing. These symptoms, these researchers say, can also arise from the birthing process itself. Michele Codd, chair of the tribunal, gave examples of where the panel felt Squier’s court evidence had strayed outside her field of expertise. These included offering opinions on biomechanics in relation to injuries from falling, pathology of the eyes, and paediatric medicine. © Copyright Reed Business Information Ltd.
By Emily Underwood Nestled deep within a brain region that processes memory is a sliver of tissue that continually sprouts brand-new neurons, at least into late adulthood. A study in mice now provides the first glimpse at how these newborn neurons behave in animals as they learn, and hints at the purpose of the new arrivals: to keep closely related but separate memories distinct. A number of previous studies have suggested that the birth of new neurons is key to memory formation. In particular, scientists believe the new cell production—known as neurogenesis—plays a role in pattern separation, the ability to discriminate between similar experiences, events, or contexts based on sensory cues such as a certain smell or visual landmark. Pattern separation helps us use cues such as the presence of a particular tree or cars nearby, for example, to distinguish which parking space we chose today, as opposed to yesterday or the day before. This ability appears to be particularly diminished in people with anxiety and mood disorders. Scientists can produce deficits in pattern separation in animals by blocking neurogenesis, using x-ray radiation to kill targeted populations of cells in the dentate gyrus. Because such studies have not established the precise identity of which cells are being recorded from, however, no one has been able to address the “burning question” in the field: "how young, adult-born neurons and mature dentate granule neurons differ in their activity," says Amar Sahay, a neuroscientist at the Massachusetts General Hospital and Harvard Medical School. © 2016 American Association for the Advancement of Science
How is the brain able to use past experiences to guide decision-making? A few years ago, researchers supported by the National Institutes of Health discovered in rats that awake mental replay of past experiences is critical for learning and making informed choices. Now, the team has discovered key secrets of the underlying brain circuitry – including a unique system that encodes location during inactive periods. “Advances such as these in understanding cellular and circuit-level processes underlying such basic functions as executive function, social cognition, and memory fit into NIMH’s mission of discovering the roots of complex behaviors,” said NIMH acting director Bruce Cuthbert, Ph.D. While a rat is moving through a maze — or just mentally replaying the experience — an area in the brain’s memory hub, or hippocampus, specialized for locations, called CA1, communicates with a decision-making area in the executive hub or prefrontal cortex (PFC). A distinct subset of PFC neurons excited during mental replay of the experience are activated during movement, while another distinct subset, less engaged during movement in the maze – and therefore potentially distracting – are inhibited during replay. “Such strongly coordinated activity within this CA1-PFC circuit during awake replay is likely to optimize the brain’s ability to consolidate memories and use them to decide on future action,” explained Shantanu Jadhav, Ph.D., now an assistant professor at Brandeis University, Waltham, MA, the study’s co-first author. His contributions to this line of research were made possible, in part, by a Pathway to Independence award from the Office of Research Training and Career Development of the NIH’s National Institute of Mental Health (NIMH).
By Kj Dell’Antonia New research shows that the youngest students in a classroom are more likely to be given a diagnosis of attention deficit hyperactivity disorder than the oldest. The findings raise questions about how we regard those wiggly children who just can’t seem to sit still – and who also happen to be the youngest in their class. Researchers in Taiwan looked at data from 378,881 children ages 4 to 17 and found that students born in August, the cut-off month for school entry in that country, were more likely to be given diagnoses of A.D.H.D. than students born in September. The children born in September would have missed the previous year’s cut-off date for school entry, and thus had nearly a full extra year to mature before entering school. The findings were published Thursday in The Journal of Pediatrics. While few dispute that A.D.H.D. is a legitimate disability that can impede a child’s personal and school success and that treatment can be effective, “our findings emphasize the importance of considering the age of a child within a grade when diagnosing A.D.H.D. and prescribing medication for treating A.D.H.D.,” the authors concluded. Dr. Mu-Hong Chen, a member of the department of psychiatry at Taipei Veterans General Hospital in Taiwan and the lead author of the study, hopes that a better understanding of the data linking relative age at school entry to an A.D.H.D. diagnosis will encourage parents, teachers and clinicians to give the youngest children in a grade enough time and help to allow them to prove their ability. Other research has shown similar results. An earlier study in the United States, for example, found that roughly 8.4 percent of children born in the month before their state’s cutoff date for kindergarten eligibility are given A.D.H.D. diagnoses, compared to 5.1 percent of children born in the month immediately afterward. © 2016 The New York Times Company
By Dominic Howell BBC News Gum disease has been linked to a greater rate of cognitive decline in people with Alzheimer's disease, early stage research has suggested. The small study, published in PLOS ONE, looked at 59 people who were all deemed to have mild to moderate dementia. It is thought the body's response to gum inflammation may be hastening the brain's decline. The Alzheimer's Society said if the link was proven to be true, then good oral health may help slow dementia. The body's response to inflammatory conditions was cited as a possible reason for the quicker decline. Inflammation causes immune cells to swell and has long been associated with Alzheimer's. Researchers believe their findings add weight to evidence that inflammation in the brain is what drives the disease. The study, jointly led by the University of Southampton and King's College London, cognitively assessed the participants, and took blood samples to measure inflammatory markers in their blood. Their oral health was also assessed by a dental hygienist who was unaware of the cognitive outcomes. Of the sample group, 22 were found to have considerable gum disease while for the remaining 37 patients the disease was much less apparent. The average age of the group with gum disease was 75, and in the other group it was 79. A majority of participants - 52 - were followed up at six months, and all assessments were repeated. The presence of gum disease - or periodontitis as it is known - was associated with a six-fold increase in the rate of cognitive decline, the study suggested. © 2016 BBC
Link ID: 21976 - Posted: 03.12.2016
Susan Gaidos Most people would be happy to get rid of excess body fat. Even better: Trade the spare tire for something useful — say, better-functioning knees or hips, or a fix for an ailing heart or a broken bone. The idea is not far-fetched, some scientists say. Researchers worldwide are repurposing discarded fat to repair body parts damaged by injury, disease or age. Recent studies in lab animals and humans show that the much-maligned material can be a source of cells useful for treating a wide range of ills. At the University of Pittsburgh, bioengineer Rocky Tuan and colleagues extract buckets full of yellow fat from volunteers’ bellies and thighs and turn the liposuctioned material into tissue that resembles shock-absorbing cartilage. If the cartilage works as well in people as it has in animals, Tuan’s approach might someday offer a kind of self-repair for osteoarthritis, the painful degeneration of cartilage in the joints. He’s also using fat cells to grow replacement parts for the tendons and ligaments that support the joints. Foremost among fat’s virtues is its richness of stem cells, which have the ability to divide and grow into a wide variety of tissue types. Fat stem cells — also known as adipose-derived stem cells — can be coerced to grow into bone, cartilage, muscle tissue or, of course, more fat. Cells from fat are being tested to mend tissues found in damaged joints, hearts and muscle, and to regrow bone and heal wounds. © Society for Science & the Public 2000 - 2016
Rich Stanton In 1976, the driving simulation Death Race was removed from an Illinois amusement park. There had, according to a news story at the time, been complaints that it encouraged players to run over pedestrians to score points. Through a series of subsequent newspaper reports, the US National Safety Council labelled the game “gross” and motoring groups demanded its removal from distribution. The first moral panic over video game violence had begun. This January, a group of four scholars published a paper analysing the links between playing violent video games at a young age and aggressive behaviour in later life. The titles mentioned in the report are around 15 years old – one of several troubling ambiguities to be found in the research. Nevertheless, the quality and quantity of the data make this an uncommonly valuable study. Given that game violence remains a favoured bogeyman for politicians, press and pressure groups, it should be shocking that such a robust study of the phenomenon is rare. But it is, and it’s important to ask why. A history of violence With the arrival of Pong in 1973, video games became a commercial reality, but now, in 2016, they are still on the rocky path to mass acceptance that all new media must traverse. The truth is that the big targets of moral concern – Doom, Grand Theft Auto, Call of Duty – are undeniably about killing and they are undeniably popular among male teenagers. An industry report estimates that 80% of the audience for the Call of Duty series is male, and 21% is aged 10-14. Going by the 18 rating on the last three entries, that means at least a fifth of the game’s vast audience shouldn’t be playing. © 2016 Guardian News and Media Limited