Chapter 13. Memory, Learning, and Development


In 1938, an Austrian pediatrician named Hans Asperger gave the first public talk on autism in history. Asperger was speaking to an audience of Nazis, and he feared that his patients — children who fell onto what we now call the autism spectrum — were in danger of being sent to Nazi extermination camps. As Asperger spoke, he highlighted his "most promising" patients, a framing that would stick to the autism spectrum for decades to come. "That is where the idea of so-called high-functioning versus low-functioning autistic people comes from really — it comes from Asperger's attempt to save the lives of the children in his clinic," science writer Steve Silberman tells Fresh Air's Terry Gross. Silberman chronicles the history of autism and examines some of the myths surrounding our current understanding of the condition in his new book, NeuroTribes. Along the way, he revisits Asperger's calculated efforts to save his patients. Steve Silberman's articles have been published in Wired, The New Yorker, Nature and Salon. Silberman shies away from using the terms high-functioning and low-functioning, because "both of those terms can be off base," he says. But he praises Asperger's courage in speaking to the Nazis. "I would literally weep while I was writing that chapter," he says. NeuroTribes also explores how a 1987 expansion of the medical definition of autism (which was previously much narrower and led to less frequent diagnoses) contributed to the perception that there was an autism epidemic. © 2015 NPR

Keyword: Autism
Link ID: 21378 - Posted: 09.03.2015

By Jennifer Couzin-Frankel Some rare diseases pull researchers in and don’t let them go, and the unusual bone condition called fibrodysplasia ossificans progressiva (FOP) has long had its hooks in Aris Economides. “The minute you experience it, it’s impossible to step back and forget it,” says the functional geneticist who runs the skeletal disease program at Regeneron Pharmaceuticals in Tarrytown, New York. “It’s devastating in the most profound way.” The few thousand people with FOP worldwide live with grueling uncertainty: Some of their muscles or other soft tissues periodically, and abruptly, transform into new bone that permanently immobilizes parts of their bodies. Joints such as elbows or ankles may become frozen in place; jaw motion can be impeded and the rib cage fixed, making eating or even breathing difficult. Twenty years after he first stumbled on FOP, Economides and his colleagues report today that the gene mutation shared by 97% of people with the disease can trigger its symptoms in a manner different from what had been assumed—through a single molecule not previously eyed as a suspect. And by sheer chance, Regeneron had a treatment for this particular target in its freezers. The company tested that potential therapy, a type of protein known as a monoclonal antibody, on mice with their own form of FOP, and, lo and behold, the animals stopped growing unwelcome new bone. © 2015 American Association for the Advancement of Science.

Keyword: Movement Disorders; Trophic Factors
Link ID: 21377 - Posted: 09.03.2015

Aftab Ali People who were born prematurely tend to have lower intelligence later in life and to earn less money as a result, according to a new study by the University of Warwick. Researchers at the Coventry-based institution said they found a link connecting pre-term birth with low reading and, in particular, maths skills, which in turn affect the amount of wealth accumulated in adulthood. Funded by the Nuffield Foundation, the researchers examined data from two large cohort studies following children born more than a decade apart, one group from 1958 and the other from 1970. Each study recruited all children born in a single week in England, Scotland, and Wales; in total, more than 15,000 individuals were surveyed. Data were examined for all individuals who were born at between 28 and 42 weeks gestational age and who had available wealth information at the age of 42. Participants who were born pre-term (at less than 37 weeks) were compared with those who were born full-term; in both groups, mathematical ability in childhood had a direct effect on how much they earned as adults, regardless of later educational qualifications. In order to measure adult wealth, the researchers looked at factors including family income and social class, housing and employment status, and participants’ own perceptions of their financial situation. To gauge academic abilities, they examined validated measures of mathematics, reading, and intelligence, along with ratings from teachers and parents. © independent.co.uk

Keyword: Development of the Brain; Intelligence
Link ID: 21375 - Posted: 09.02.2015

By Gretchen Reynolds At the age of 93, Olga Kotelko — one of the most successful and acclaimed nonagenarian track-and-field athletes in history — traveled to the University of Illinois to let scientists study her brain. Ms. Kotelko held a number of world records and had won hundreds of gold medals in masters events. But she was of particular interest to the scientific community because she hadn’t begun serious athletic training until age 77. So scanning her brain could potentially show scientists what late-life exercise might do for brains. Ms. Kotelko died last year at the age of 95, but the results of that summer brain scan were published last month in Neurocase. And indeed, Ms. Kotelko’s brain looked quite different from those of other volunteers aged 90-plus who participated in the study, the scans showed. The white matter of her brain — the cells that connect neurons and help to transmit messages from one part of the brain to another — showed fewer abnormalities than the brains of other people her age. And her hippocampus, a portion of the brain involved in memory, was larger than that of similarly aged volunteers (although it was somewhat shrunken in comparison to the brains of volunteers decades younger than her). Over all, her brain seemed younger than her age. But because the scientists didn’t have a scan showing Ms. Kotelko’s brain before she began training, it’s impossible to know whether becoming an athlete late in life improved her brain’s health or whether her naturally healthy brain allowed her to become a stellar masters athlete. © 2015 The New York Times Company

Keyword: Learning & Memory; Development of the Brain
Link ID: 21372 - Posted: 09.02.2015

By SINDYA N. BHANOO The human eye has a blind spot, though few of us realize it. Now, a new study suggests that it is possible to reduce the spot with training. The optic nerve, which carries visual signals to the brain, exits the eye through the retina, a light-sensitive layer of tissue. There are no photoreceptors at the point where the optic nerve intersects the retina. The right eye generally compensates for the left eye’s blind spot and vice versa, so the spot is hardly noticed. Researchers trained 10 people using a computer monitor and an eye patch. Day after day, the participants were shown a waveform stimulus positioned within the visual field of their blind spot. After 20 days of this repeated stimulation, the blind spot shrank by about 10 percent. The researchers believe that neurons at the periphery of the blind spot became more responsive, effectively reducing the extent of functional blindness. The findings add to a growing body of research suggesting that the human eye can be trained, said Paul Miller, a psychologist at the University of Queensland in Australia and an author of the study, which appeared in the journal Current Biology. This kind of training may help researchers develop better treatments for visual impairments like macular degeneration. “This is the leading cause of blindness in the western world,” Mr. Miller said. © 2015 The New York Times Company

Keyword: Vision; Learning & Memory
Link ID: 21367 - Posted: 09.01.2015

An experimental gene therapy reduces the rate at which nerve cells in the brains of Alzheimer’s patients degenerate and die, according to new results from a small clinical trial, published in the current issue of the journal JAMA Neurology. Targeted injection of the nerve growth factor gene into the patients’ brains rescued dying cells around the injection site, enhancing their growth and inducing them to sprout new fibres. In some cases, these beneficial effects persisted for 10 years after the therapy was first delivered. Alzheimer’s is the world’s leading form of dementia, affecting an estimated 47 million people worldwide. This figure is predicted to almost double every 20 years, with much of the increase likely to occur in the developing world. And despite the huge amounts of time, effort, and money devoted to developing an effective cure, the vast majority of new drugs have failed in clinical trials. The new results are preliminary findings from the very first human trials designed to test the potential benefits of nerve growth factor (NGF) gene therapy for Alzheimer’s patients. NGF was discovered in the 1940s by Rita Levi-Montalcini, who convincingly demonstrated that the small protein promotes the survival of certain sub-types of sensory neurons during development of the nervous system. Since then, others have shown that it also promotes the survival of acetylcholine-producing cells in the basal forebrain, which die off in Alzheimer’s. © 2015 Guardian News and Media Limited

Keyword: Alzheimers; Trophic Factors
Link ID: 21360 - Posted: 08.29.2015

We all have days when we feel like our brain is going at a snail’s pace, when our neurons seem to have forgotten to get out of bed. And psychologists have shown that IQ can fluctuate from day to day. So if we’re in good health and don’t have a sleep deficit from last night’s shenanigans to blame, what’s the explanation? Sophie von Stumm, a psychologist at Goldsmiths, University of London, set about finding out. In particular, she wanted to know whether mood might explain the brain’s dimmer switch. Although it seems intuitively obvious that feeling low could compromise intellectual performance, von Stumm says research to date has been inconclusive, with some studies finding an effect and others not. “On bad mood days, we tend to feel that our brains are lame and work or study is particularly challenging. But scientists still don’t really know if our brains work better when we are happy compared to when we are sad.” To see if she could pin down mood’s effect on IQ more convincingly, von Stumm recruited 98 participants. Over five consecutive days they completed questionnaires to assess their mood, as well as tests to measure cognitive functions such as short-term memory, working memory and processing speed. Surprisingly, being in a bad mood didn’t translate into worse cognitive performance. However, when people reported feeling positive, von Stumm saw a modest boost in their processing speed. © Copyright Reed Business Information Ltd.

Keyword: Depression; Intelligence
Link ID: 21359 - Posted: 08.29.2015

By Dina Fine Maron Whenever the fictional character Popeye the Sailor Man managed to down a can of spinach, the results were almost instantaneous: he gained superhuman strength. Devouring any solid object similarly did the trick for one of the X-Men. As we age and begin to struggle with memory problems, many of us would love to reach for an edible mental fix. Sadly, such supernatural effects remain fantastical. Yet making the right food choices may well yield more modest gains. A growing body of evidence suggests that adopting the Mediterranean diet, or one much like it, can help slow memory loss as people age. The diet's hallmarks include lots of fruits and vegetables and whole grains (as opposed to ultrarefined ones) and a moderate intake of fish, poultry and red wine. Dining mainly on single ingredients, such as pumpkin seeds or blueberries, however, will not do the trick. What is more, this dietary approach appears to yield brain benefits even when adopted later in life—sometimes aiding cognition in as little as two years. “You will not be Superman or Superwoman,” says Miguel A. Martínez González, chair of the department of preventive medicine at the University of Navarra in Pamplona. “You can keep your cognitive abilities or even improve them slightly, but diet is not magic.” Those small gains, however, can be meaningful in day-to-day life. Scientists long believed that altering diet could not improve memory. But evidence to the contrary started to emerge about 10 years ago. © 2015 Scientific American

Keyword: Alzheimers
Link ID: 21350 - Posted: 08.28.2015

By Emily Underwood It is famous for robbing Lou Gehrig of his life and Stephen Hawking of his mobility and voice, but just how amyotrophic lateral sclerosis (ALS) destroys motor neurons in the brain and spinal cord remains a mystery. Now, scientists are converging on an explanation, at least for the fraction of ALS cases caused by a specific mutation. In cells with the mutation, the new work shows, pores in the membrane separating the nucleus and cytoplasm become clogged, preventing vital molecules from passing through and creating a fatal cellular traffic jam. For now, the work applies only to the mutation dubbed C9orf72—a DNA stutter in which a short nucleotide sequence, GGGGCC, is repeated hundreds to thousands of times in a gene on chromosome 9. Nor do the multiple labs reporting results this week agree on exactly what plugs those nuclear pores and how the cells die. Still, the work is “a major breakthrough” in ALS research, says Amelie Gubitz, program director of the neurodegeneration division at the National Institute of Neurological Disorders and Stroke in Bethesda, Maryland. The groups worked independently, starting with different hypotheses and experimental designs, yet reached similar conclusions, making the finding more convincing. And it suggests that boosting traffic through nuclear pores could be a new strategy for treating some cases of ALS and frontotemporal dementia (FTD), another neurodegenerative condition C9orf72 can cause. Based on past work by their own and other groups, neuroscientists Jeff Rothstein and Tom Lloyd at Johns Hopkins University in Baltimore, Maryland, suspected that the long strands of excess RNA produced by C9orf72 cause neurodegeneration by binding to, and thus sequestering, key cellular proteins. The team tested the idea in fruit flies with the mutation, which display damage in the nerve cells of their eyes and in motor neurons. © 2015 American Association for the Advancement of Science
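
The "DNA stutter" at the heart of these studies is concrete enough to sketch in code. Below is a minimal, illustrative Python snippet (not from any of the papers; the function name, toy sequences, and copy numbers are invented for demonstration) showing how one might measure the longest uninterrupted run of the GGGGCC motif in a DNA sequence, the quantity that separates a typical C9orf72 allele from a pathogenic expansion.

    import re

    def longest_repeat_run(seq, motif="GGGGCC"):
        # Find every uninterrupted tandem run of the motif, then report
        # the longest run's length in copies (0 if the motif never occurs).
        runs = re.findall("(?:%s)+" % motif, seq.upper())
        return max((len(run) // len(motif) for run in runs), default=0)

    # Toy alleles: unaffected carriers have only a handful of copies, while
    # the article describes expansions of hundreds to thousands of copies.
    healthy = "ATG" + "GGGGCC" * 8 + "TGA"
    expanded = "ATG" + "GGGGCC" * 900 + "TGA"
    print(longest_repeat_run(healthy))   # 8
    print(longest_repeat_run(expanded))  # 900

In practice, genotyping expansions of this size is far harder than the arithmetic suggests, since runs of thousands of copies exceed the read lengths of standard sequencing; the sketch captures only the repeat-counting logic.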

Keyword: Alzheimers; ALS-Lou Gehrig's Disease
Link ID: 21349 - Posted: 08.27.2015

By Elizabeth Kolbert C57BL/6J mice are black, with pink ears and long pink tails. Inbred for the purposes of experimentation, they exhibit a number of infelicitous traits, including a susceptibility to obesity, a taste for morphine, and a tendency to nibble off their cage mates’ hair. They’re also tipplers. Given access to ethanol, C57BL/6J mice routinely suck away until the point that, were they to get behind the wheel of a Stuart Little-size roadster, they’d get pulled over for D.U.I. Not long ago, a team of researchers at Temple University decided to take advantage of C57BL/6Js’ bad habits to test a hunch. They gathered eighty-six mice and placed them in Plexiglas cages, either singly or in groups of three. Then they spiked the water with ethanol and videotaped the results. Half of the test mice were four weeks old, which, in murine terms, qualifies them as adolescents. The other half were twelve-week-old adults. When the researchers watched the videos, they found that the youngsters had, on average, outdrunk their elders. More striking still was the pattern of consumption. Young male C57BL/6Js who were alone drank roughly the same amount as adult males. But adolescent males with cage mates went on a bender; they spent, on average, twice as much time drinking as solo boy mice and about thirty per cent more time than solo girls. The researchers published the results in the journal Developmental Science. In their paper, they noted that it was “not possible” to conduct a similar study on human adolescents, owing to the obvious ethical concerns. But, of course, similar experiments are performed all the time, under far less controlled circumstances. Just ask any college dean. Or ask a teen-ager.

Keyword: Development of the Brain; Attention
Link ID: 21345 - Posted: 08.27.2015

While some research suggests that a diet high in omega-3 fatty acids can protect brain health, a large clinical trial by researchers at the National Institutes of Health found that omega-3 supplements did not slow cognitive decline in older persons. With 4,000 patients followed over a five-year period, the study is one of the largest and longest of its kind. It was published today in the Journal of the American Medical Association. “Contrary to popular belief, we didn’t see any benefit of omega-3 supplements for stopping cognitive decline,” said Emily Chew, M.D. Dr. Chew leads the Age-Related Eye Disease Study (AREDS), which was designed to investigate a combination of nutritional supplements for slowing age-related macular degeneration (AMD), a major cause of vision loss among older Americans. That study established that daily high doses of certain antioxidants and minerals — called the AREDS formulation — can help slow the progression to advanced AMD. A later study, called AREDS2, tested the addition of omega-3 fatty acids to the AREDS formula. But the omega-3s made no difference. Omega-3 fatty acids are made by marine algae and are concentrated in fish oils; they are believed to be responsible for the health benefits associated with regularly eating fish such as salmon, tuna, and halibut. Where studies have surveyed people on their dietary habits and health, they’ve found that regular consumption of fish is associated with lower rates of AMD, cardiovascular disease, and possibly dementia. “We’ve seen data that eating foods with omega-3 may have a benefit for eye, brain, and heart health,” Dr. Chew explained.

Keyword: Alzheimers
Link ID: 21340 - Posted: 08.26.2015

By Laura Sanders By tweaking a single gene, scientists have turned average mice into supersmart daredevils. The findings are preliminary but hint at therapies that may one day ease the symptoms of such disorders as Alzheimer’s disease and schizophrenia, scientists report August 14 in Neuropsychopharmacology. The altered gene provides instructions for a protein called phosphodiesterase-4B, or PDE4B, which has been implicated in schizophrenia. It’s too early to say whether PDE4B will turn out to be a useful target for drugs that treat these disorders, cautions pharmacologist Ernesto Fedele of the University of Genoa in Italy. Nonetheless, the protein certainly deserves further investigation, he says. The genetic change interfered with PDE4B’s ability to do its job breaking down a molecular messenger called cAMP. Mice designed to have this disabled form of PDE4B showed a suite of curious behaviors, including signs of smarts, says study coauthor Alexander McGirr of the University of British Columbia. Compared with normal mice, these mice more quickly learned which objects in a cage had been moved to a new location, for instance, and could better recognize a familiar mouse after 24 hours. “The system is primed and ready to learn, and it doesn’t require the same kind of input as a normal mouse,” McGirr says. These mice also spent more time than usual exploring brightly lit spaces, spots that normal mice avoid. But this devil-may-care attitude sometimes made the “smart” mice blind to risky situations. The mice were happy to spend time poking around an area that had been sprinkled with bobcat urine. “Not being afraid of cat urine is not a good thing for a mouse,” McGirr says. © Society for Science & the Public 2000 - 2015

Keyword: Learning & Memory; Schizophrenia
Link ID: 21338 - Posted: 08.26.2015

Helen Thomson Genetic changes stemming from the trauma suffered by Holocaust survivors are capable of being passed on to their children, the clearest sign yet that one person’s life experience can affect subsequent generations. The conclusion from a research team at New York’s Mount Sinai hospital led by Rachel Yehuda stems from the genetic study of 32 Jewish men and women who had either been interned in a Nazi concentration camp, witnessed or experienced torture, or had had to hide during the second world war. The team also analysed the genes of the survivors’ children, who are known to have an increased likelihood of stress disorders, and compared the results with Jewish families who were living outside of Europe during the war. “The gene changes in the children could only be attributed to Holocaust exposure in the parents,” said Yehuda. Her team’s work is the clearest example in humans of the transmission of trauma to a child via what is called “epigenetic inheritance” - the idea that environmental influences such as smoking, diet and stress can affect the genes of your children and possibly even grandchildren. The idea is controversial, as scientific convention states that genes contained in DNA are the only way to transmit biological information between generations. However, our genes are modified by the environment all the time, through chemical tags that attach themselves to our DNA, switching genes on and off. Recent studies suggest that some of these tags might somehow be passed through generations, meaning our environment could have an impact on our children’s health. © 2015 Guardian News and Media Limited

Keyword: Epigenetics; Stress
Link ID: 21325 - Posted: 08.22.2015

“Almost fully-formed brain grown in a lab.” “Woah: Scientists grow first nearly fully-formed human brain.” “Boffins raise five-week-old fetal human brain in the lab for experimentation.” On Tuesday, all of the above appeared as headlines for one particular story. What was it all about? Mini-brains 3 to 4 millimetres across have been grown in the lab before, but if a larger brain had been created – and the press release publicising the claim said it was the size of a pencil eraser – that would be a major breakthrough. New Scientist investigated the claims. The announcement was made by Rene Anand, a neuroscientist at Ohio State University in Columbus, at a military health research meeting in Florida. Anand says he has grown a brain – complete with a cortex, midbrain and brainstem – in a dish, comparable in maturity to that of a fetus aged 5 weeks. Anand and his colleague Susan McKay started with human skin cells, which they turned into induced pluripotent stem cells (iPSCs) using a tried-and-tested method. By applying an undisclosed technique, one for which a patent has been applied, the pair say they were able to encourage these stem cells to form a brain. “We are replicating normal development,” says Anand. He says they hope to be able to create miniature models of brains experiencing a range of diseases, such as Parkinson’s and Alzheimer’s. But not everyone is convinced, especially as Anand hasn’t published his results. Scientists we sent Anand’s poster presentation to said that although the team has indeed grown some kind of miniature collection of cells, or “organoid”, in a dish, the structure isn’t much like a fetal brain. © Copyright Reed Business Information Ltd.

Keyword: Development of the Brain
Link ID: 21322 - Posted: 08.22.2015

Tina Hesman Saey Researchers have discovered a “genetic switch” that determines whether people will burn extra calories or store them as fat. A genetic variant tightly linked to obesity causes fat-producing cells to become energy-storing white fat cells instead of energy-burning beige fat, researchers report online August 19 in the New England Journal of Medicine. Previously, scientists thought that the variant, in a gene known as FTO (originally called fatso), worked in the brain to increase appetite. The new work shows that the FTO gene itself has nothing to do with obesity, says coauthor Manolis Kellis, a computational biologist at MIT and the Broad Institute. But the work may point to a new way to control body fat. In humans and many other organisms, genes are interrupted by stretches of DNA known as introns. Kellis and Melina Claussnitzer of Harvard Medical School and colleagues discovered that a genetic variant linked to increased risk of obesity affects one of the introns in the FTO gene. It does not change the protein produced from the FTO gene or change the gene’s activity. Instead, the variant doubles the activity of two genes, IRX3 and IRX5, which are involved in determining which kind of fat cells will be produced. The FTO intron harbors an enhancer, a stretch of DNA that controls the activity of faraway genes, the researchers discovered. Normally, a protein called ARID5B squats on the enhancer and prevents it from dialing up activity of the fat-determining genes. In fat cells of people who have the obesity-risk variant, ARID5B can’t do its job and the IRX genes crank up production of energy-storing white fat. © Society for Science & the Public 2000 - 2015.

Keyword: Obesity; Genes & Behavior
Link ID: 21321 - Posted: 08.20.2015

Helen Thomson An almost fully-formed human brain has been grown in a lab for the first time, claim scientists from Ohio State University. The team behind the feat hope the brain could transform our understanding of neurological disease. Though not conscious, the miniature brain, which resembles that of a five-week-old foetus, could potentially be useful for scientists who want to study the progression of developmental diseases. It could also be used to test drugs for conditions such as Alzheimer’s and Parkinson’s, since the regions they affect are in place during an early stage of brain development. The brain, which is about the size of a pencil eraser, is engineered from adult human skin cells and is the most complete human brain model yet developed, claimed Rene Anand of Ohio State University, Columbus, who presented the work today at the Military Health System Research Symposium in Fort Lauderdale, Florida. Previous attempts at growing whole brains have at best achieved mini-organs that resemble those of nine-week-old foetuses, although these “cerebral organoids” were not complete and only contained certain aspects of the brain. “We have grown the entire brain from the get-go,” said Anand. Anand and his colleagues claim to have reproduced 99% of the brain’s diverse cell types and genes. They say their brain also contains a spinal cord, signalling circuitry and even a retina. The ethical concerns were non-existent, said Anand. “We don’t have any sensory stimuli entering the brain. This brain is not thinking in any way.” © 2015 Guardian News and Media Limited

Keyword: Development of the Brain
Link ID: 21316 - Posted: 08.19.2015

By Chris Mooney It is still considered highly uncool to ascribe a person's political beliefs, even in part, to that person's biology: hormones, physiological responses, even brain structures and genes. And no wonder: Doing so raises all kinds of thorny, non-PC issues involving free will, determinism, toleration, and much else. There's just one problem: Published scientific research keeps going there, with ever-increasing audacity (not to mention growing stacks of data). The past two weeks have seen not one but two studies published in scientific journals on the biological underpinnings of political ideology. And these studies go straight at the role of genes and the brain in shaping our views, and even our votes. First, in the American Journal of Political Science, a team of researchers including Peter Hatemi of Penn State University and Rose McDermott of Brown University studied the relationship between our deep-seated tendencies to experience fear—tendencies that vary from person to person, partly for reasons that seem rooted in our genes—and our political beliefs. What they found is that people who have a more fearful disposition also tend to be more politically conservative, and less tolerant of immigrants and people of races different from their own. As McDermott carefully emphasizes, that does not mean that every conservative has a high fear disposition. "It's not that conservative people are more fearful, it's that fearful people are more conservative," as she puts it. I interviewed the paper's lead author, Peter Hatemi, about his research for my 2012 book The Republican Brain. Hatemi is both a political scientist and a microbiologist, and as he stressed to me, "nothing is all genes, or all environment." These forces combine to make us who we are, in incredibly intricate ways. ©2015 Mother Jones

Keyword: Emotions; Genes & Behavior
Link ID: 21313 - Posted: 08.19.2015

Helen Thomson Serious mood disorders such as bipolar may be the price humans have had to pay for our intelligence and creativity. That’s according to new research which links high childhood IQ to an increased risk of experiencing manic bipolar traits in later life. Researchers examined data from a large birth cohort to identify the IQ of 1,881 individuals at age eight. These same individuals were then assessed for manic traits at the age of 22 or 23. The statements they provided were part of a checklist widely used to diagnose bipolar disorder. Each person was given a score out of 100 related to how many manic traits they had previously experienced. Individuals who scored in the top 10% of manic features had a childhood IQ almost 10 points higher than those who scored in the lowest 10%. This correlation appeared strongest for those with high verbal IQ. “Our study offers a possible explanation for how bipolar disorder may have been selected through generations,” said Daniel Smith of the University of Glasgow, who led the study. “There is something about the genetics underlying the disorder that are advantageous. One possibility is that serious disorders of mood - such as bipolar disorder - are the price that human beings have had to pay for more adaptive traits such as intelligence, creativity and verbal proficiency.” Smith emphasises that as things stand, having a high IQ is only an advantage: “A high IQ is not a clear-cut risk factor for bipolar, but perhaps the genes that confer intelligence can get expressed as illness in the context of other risk factors, such as exposure to maternal influenza in the womb or childhood sexual abuse.” © 2015 Guardian News and Media Limited
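
To make the reported decile contrast concrete, here is a small, purely illustrative Python sketch of the kind of comparison described (the data are randomly generated stand-ins, not the cohort's; because the synthetic values carry no real association, the printed gap will hover near zero, whereas the study reported a gap of almost 10 IQ points):

    import random
    import statistics

    random.seed(1)
    # Invented (childhood IQ, manic-trait score) pairs standing in for the
    # 1,881 cohort members; both values are random, so no true association.
    records = [(random.gauss(100, 15), random.uniform(0, 100)) for _ in range(1881)]

    records.sort(key=lambda r: r[1])        # order by manic-trait score
    n = len(records) // 10                  # size of one decile (188 people)
    bottom = [iq for iq, _ in records[:n]]  # lowest 10% of manic features
    top = [iq for iq, _ in records[-n:]]    # highest 10% of manic features
    print(statistics.mean(top) - statistics.mean(bottom))  # mean IQ gap

Note that a raw top-versus-bottom-decile contrast like this ignores confounders; the published analysis would have adjusted for them.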

Keyword: Schizophrenia; Genes & Behavior
Link ID: 21312 - Posted: 08.19.2015

By Zoe Kleinman Technology reporter, BBC News More than 200 academics have signed an open letter criticising controversial new research suggesting a link between violent video games and aggression. The findings were released by the American Psychological Association, which set up a taskforce that reviewed hundreds of studies and papers published between 2005 and 2013. The association concluded that while there was "no single risk factor" to blame for aggression, violent video games did contribute. "The research demonstrates a consistent relation between violent video game use and increases in aggressive behaviour, aggressive cognitions and aggressive affect, and decreases in pro-social behaviour, empathy and sensitivity to aggression," said the report. "It is the accumulation of risk factors that tends to lead to aggressive or violent behaviour. The research reviewed here demonstrates that violent video game use is one such risk factor." However, a large group of academics said they felt the methodology of the research was deeply flawed, as a significant part of the material included in the study had not been subjected to peer review. "I fully acknowledge that exposure to repeated violence may have short-term effects - you would be a fool to deny that - but the long-term consequences of crime and actual violent behaviour, there is just no evidence linking violent video games with that," Dr Mark Coulson, associate professor of psychology at Middlesex University and one of the signatories of the letter, told the BBC. "If you play three hours of Call of Duty you might feel a little bit pumped, but you are not going to go out and mug someone." © 2015 BBC

Keyword: Aggression
Link ID: 21310 - Posted: 08.19.2015

By Perri Klass A little more than a year ago, the American Academy of Pediatrics issued a policy statement saying that all pediatric primary care should include literacy promotion, starting at birth. That means pediatricians taking care of infants and toddlers should routinely be advising parents about how important it is to read to even very young children. The policy statement, which I wrote with Dr. Pamela C. High, included a review of the extensive research on the links between growing up with books and reading aloud, and later language development and school success. But while we know that reading to a young child is associated with good outcomes, there is only limited understanding of what the mechanism might be. Two new studies examine the unexpectedly complex interactions that happen when you put a small child on your lap and open a picture book. This month, the journal Pediatrics published a study that used functional magnetic resonance imaging to study brain activity in 3- to 5-year-old children as they listened to age-appropriate stories. The researchers found differences in brain activation according to how much the children had been read to at home. Children whose parents reported more reading at home and more books in the home showed significantly greater activation of brain areas in a region of the left hemisphere called the parietal-temporal-occipital association cortex. This brain area is “a watershed region, all about multisensory integration, integrating sound and then visual stimulation,” said the lead author, Dr. John S. Hutton, a clinical research fellow at Cincinnati Children’s Hospital Medical Center. This region of the brain is known to be very active when older children read to themselves, but Dr. Hutton notes that it also lights up when younger children are hearing stories. What was especially novel was that children who were exposed to more books and home reading showed significantly more activity in the areas of the brain that process visual association, even though the child was in the scanner just listening to a story and could not see any pictures. © 2015 The New York Times Company

Keyword: Language; Development of the Brain
Link ID: 21308 - Posted: 08.18.2015