Chapter 13. Memory, Learning, and Development
By JAN HOFFMAN BOWLING GREEN, Ky. — Crosby J. Gardner has never had a girlfriend. Now 20 and living for the first time in a dorm here at Western Kentucky University, he has designed a fast-track experiment to find her. He ticks off the math. Two meals a day at the student dining hall, three courses per meal. Girls make up 57 percent of the 20,068 students. And so, he sums up, gray-blue eyes triumphant, if he sits at a table with at least four new girls for every course, he should be able to meet all 11,439 by graduation. “I’m Crosby Gardner!” he announces each time he descends upon a fresh group, trying out the social-skills script he had practiced in the university’s autism support program. “What is your name and what is your major?” The first generation of college students with an autism diagnosis is fanning out to campuses across the country. These growing numbers reflect the sharp rise in diagnosis rates since the 1990s, as well as the success of early-learning interventions and efforts to include these students in mainstream activities. But while these young adults have opportunities that could not have been imagined had they been born even a decade earlier, their success in college is still a long shot. Increasingly, schools are realizing that most of these students will not graduate without comprehensive support like the Kelly Autism Program at Western Kentucky. Similar programs have been taking root at nearly 40 colleges around the country, including large public institutions like Eastern Michigan University, California State University, Long Beach, the University of Connecticut and Rutgers. For decades, universities have provided academic safety nets to students with physical disabilities and learning challenges like dyslexia. But students on the autism spectrum need a web of support that is far more nuanced and complex. © 2016 The New York Times Company
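The back-of-the-envelope figure quoted above checks out; a minimal verification in Python, using only the numbers reported in the article:

```python
# Figures as reported in the article: 20,068 students, 57 percent women.
total_students = 20_068
share_women = 0.57

# Crosby's target pool: all the women on campus, rounded to the nearest whole person.
women = round(total_students * share_women)
print(women)  # 11439, matching the 11,439 quoted in the article
```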
Link ID: 22894 - Posted: 11.21.2016
Ian Sample Science editor A leading psychologist whose research on human memory exposed her to death threats, lawsuits, personal abuse and a campaign to have her sacked has won a prestigious prize for her courage in standing up for science. Professor Elizabeth Loftus endured a torrent of abuse from critics who objected to her work on the unreliable nature of eyewitness testimonies, and her defining research on how people can develop rich memories for events that never happened. The work propelled Loftus into the heart of the 1990s “memory wars”, when scores of people who had gone into therapy with depression, eating disorders and other common psychological problems came out believing they had recovered repressed memories for traumatic events, often involving childhood abuse. Loftus, now a professor of law and cognitive science at the University of California, Irvine, performed a series of experiments that showed how exposure to inaccurate information and leading questions could corrupt eyewitness testimonies. More controversially, she demonstrated how therapy and hypnosis could plant completely false childhood memories in patients. She went on to become an expert witness or consultant for hundreds of court cases. In the 1990s, thousands of repressed memory cases came to light, with affected patients taking legal action against family members, former neighbours, doctors, dentists and teachers. The accusations tore many families apart. As an expert witness in such cases, Loftus came under sustained attack from therapists and patients who were convinced the new-found memories were accurate. The abuse marked a distinct shift away from the good-natured debates she was used to having in academic journals. © 2016 Guardian News and Media Limited
Keyword: Learning & Memory
Link ID: 22890 - Posted: 11.19.2016
Nicola Davis Smart bottles that dispense the correct dose of medication at the correct time, digital assistants, and chairs that know how long you’ve sat in them are among the devices set to change the face of care for those living with dementia. Dementia is now the leading cause of death in England and Wales, and is thought to affect more than 850,000 people in the UK. But a new wave of connected devices, dubbed “the internet of things”, could offer new ways to help people live independently for longer. “We have got an elderly population, and children in their 40s and 50s are looking after their elderly parents – and they may not have the capabilities to coordinate that care effectively,” said Idris Jahn, head of health and data at IoTUK, a programme within the government-backed Digital Catapult. While phone calls and text messages help to keep people in touch, says Jahn, problems can still arise, from missed appointments to difficulties in taking medication correctly. But he adds, connected sensors and devices that collect and process data in real time could help solve the problem. “For [people living with dementia] the sensors would be more in the environment itself, so embedded into the plug sockets, into the lights – so it is effectively invisible. You carry on living your life but in the background things will monitor you and provide feedback to people who need to know,” he said. “That might be your carer, it might be your family, it might be your clinician.” The approach, he added, has the potential to change the way care is given. “It is having that cohesive mechanism to put everyone into the loop, which I think hasn’t existed in the past and it is something that people need.” © 2016 Guardian News and Media Limited
Link ID: 22887 - Posted: 11.19.2016
Laura Sanders SAN DIEGO — Mice raised in cages bombarded with glowing lights and sounds have profound brain abnormalities and behavioral trouble. Hours of daily stimulation led to behaviors reminiscent of attention-deficit/hyperactivity disorder, scientists reported November 14 at the annual meeting of the Society for Neuroscience. Certain kinds of sensory stimulation, such as sights and sounds, are known to help the brain develop correctly. But scientists from Seattle Children’s Research Institute wondered whether too much stimulation or stimulation of the wrong sort could have negative effects on the growing brain. To mimic extreme screen exposure, mice were blasted with flashing lights and TV audio for six hours a day. The cacophony began when the mice were 10 days old and lasted for six weeks. After the end of the ordeal, scientists examined the mice’s brains. “We found dramatic changes everywhere in the brain,” said study coauthor Jan-Marino Ramirez. Mice that had been stimulated had fewer newborn nerve cells in the hippocampus, a brain structure important for learning and memory, than unstimulated mice, Ramirez said. The stimulation also made certain nerve cells more active in general. Stimulated mice also displayed behaviors similar to some associated with ADHD in children. These mice were noticeably more active and had trouble remembering whether they had encountered an object. The mice also seemed more inclined to take risks, venturing into open areas that mice normally shy away from, for instance. © Society for Science & the Public 2000 - 2016.
By Jessica Hamzelou You’ve got a spare hour before a big exam. How should you spend it? It seems napping is just as effective as revising, and could even have a longer-lasting impact. Repeatedly revising information to learn it makes sense. “Any kind of reactivation of a memory trace will lead to it being strengthened and reconsolidated,” says James Cousins at the Duke-NUS Medical School in Singapore. “With any memory, the more you recall it, the stronger the memory trace.” However, sleep is also thought to be vital for memory. A good night’s sleep seems to help our brains consolidate what we’ve learned in the day, and learning anything when you’re not well rested is tricky. Many people swear by a quick afternoon kip. So if you’ve got an hour free, is it better to nap or revise? Cousins, along with Michael Chee and their colleagues, also at Duke-NUS Medical School, set out to compare the two options. The team mocked up a real student experience, and had 72 volunteers sit through presentations of about 12 different species of ants and crabs. The participants were asked to learn all about these animals, including their diets and habitats, for example. After 80 minutes of this, the students were given an hour to either watch a film, have a nap, or revise what they had just learned. After this hour, they had another 80 minutes of learning. Then they had to sit an exam in which they were asked 360 questions about the ants and the crabs. “The napping group got the best scores,” says Cousins, whose work was presented at the Society for Neuroscience annual meeting in San Diego, California on Tuesday. © Copyright Reed Business Information Ltd.
By Jessica Hamzelou Having an agile mind in your 90s might sound like wishful thinking, but some people manage to retain youthful memories until their dying days. Now post mortems have revealed that these “superagers” manage to do this even when their brains have the hallmarks of Alzheimer’s disease. Superagers have the memory and cognition of the average person almost half their age, and manage to avoid Alzheimer’s symptoms. Aras Rezvanian at Northwestern University in Chicago, Illinois, and his colleagues have been looking at brain samples donated by such people to try to understand what their secret might be. The group looked at eight brains, all from people who had lived into their 90s, and had memory and cognition scores of the average 50-year-old until their final days. Specifically, the team studied two brain regions – the hippocampus, which is involved in memory, and the prefrontal cortex, which is key for cognition. They found that the brain samples of the superagers had plaques and tangles in them to varying degrees. These are sticky clumps and twisted fibres of protein that seem to be linked to the death of neurons, and are usually found in the brains of people with Alzheimer’s disease after they die. Of the eight superager samples, two had such a high density and distribution of these proteins that they resembled the most severe cases of Alzheimer’s. © Copyright Reed Business Information Ltd.
Link ID: 22868 - Posted: 11.15.2016
Tom Siegfried SAN DIEGO — Babies as young as 5 months old possess networks of brain cell activity that react to facial emotions, especially fear, a new study finds. “Networks for recognizing facial expressions are in place shortly after birth,” Catherine Stamoulis of Harvard Medical School said November 13 during a news conference at the annual meeting of the Society for Neuroscience. “This work … is the first evidence that networks that are involved in a function that is critical to survival, such as the recognition of facial expressions, come online very early in life.” Stamoulis and colleagues at Harvard and Boston Children’s Hospital analyzed a database of brain electrical activity collected from 58 infants as they aged from 5 months to 3 years. Brain activity was measured as the infants viewed pictures of female faces expressing happiness, anger or fear. Computer models of the brain activity showed that networks responding to fear were activated much more dramatically than those for happy or angry faces, even in the youngest infants. As babies grew older, their brain networks responding to facial emotions became less complex as redundant nerve cell connections were pruned. But the fear network remained more complex than the others, and response to fearful faces remained elevated over time. Understanding the brain circuitry involved in responding to emotional facial expressions could have implications for research on developmental disorders, Stamoulis said. © Society for Science & the Public 2000 - 2016.
Kathleen Taylor The global rise in dementia should surprise no one. The figures — such as the 9.9 million new diagnoses each year — have been known for decades. Just as we are slow to accept such vast changes on a personal, societal and political level, so research has been slow to uncover why our brains become fragile with age. Neuroscientist and writer Kathleen Taylor's The Fragile Brain is about that research. But it is much more than a simple reflection on the best published hypotheses. Taylor has crafted a personal, astonishingly coherent review of our current state of knowledge about the causes of Alzheimer's disease and dementia, as well as possible solutions, from lifestyle adjustments to drug developments. Filled with elegant metaphors, her study covers the detail of molecular biology and larger-scale analysis, including epidemiological observations and clinical studies. It extends to dementia due to multiple sclerosis, stroke and encephalitis. For instance, some 5–30% of people who have a first stroke develop dementia. But the book's focus is Alzheimer's disease, and rightly so: it is what up to 80% of people with dementia are diagnosed with. Taylor begins with a shocking juxtaposition, setting the costs of age-related disorders and of dementia alongside the scarcity in funding. In Britain, Australia and the United States, for example, funding for dementia research is a fraction of that for cancer — in the United States, just 18%. She contextualizes with reflections on the history of dementia research, deftly unravelling the roles of pioneering scientists Alois Alzheimer, Franz Nissl and Emil Kraepelin in describing the condition. © 2016 Macmillan Publishers Limited
Elie Dolgin There are not a lot of things that could bring together people as far apart on the US political spectrum as Republican Newt Gingrich and Democrat Bob Kerrey. But in 2007, after leading a three-year commission that looked into the costs of care for elderly people, the political rivals came to full agreement on a common enemy: dementia. At the time, there were fewer than 30 million people worldwide diagnosed with the condition, but it was clear that the numbers were set to explode. By 2050, current predictions suggest, it could reach more than 130 million, at which point the cost to US health care alone from diseases such as Alzheimer’s will probably hit US$1 trillion per year in today’s dollars. “We looked at each other and said, ‘You know, if we don’t get a grip on Alzheimer’s, we can’t get anything done because it’s going to drown the system,’” recalls Gingrich, the former speaker of the US House of Representatives. He still feels that sense of urgency, and for good reason. Funding has not kept pace with the scale of the problem; targets for treatments are thin on the ground and poorly understood; and more than 200 clinical trials for Alzheimer’s therapies have been terminated because the treatments were ineffective. Of the few treatments available, none addresses the underlying disease process. “We’re faced with a tsunami and we’re trying to deal with it with a bucket,” says Gingrich. But this message has begun to reverberate around the world, which gives hope to the clinicians and scientists. Experts say that the coming wave can be calmed with the help of just three things: more money for research, better diagnostics and drugs, and a victory — however small — that would boost morale. © 2016 Macmillan Publishers Limited
Link ID: 22848 - Posted: 11.09.2016
Anesthesia during early childhood surgery poses little risk for intelligence and academics later on, the largest study of its kind suggests. The results were found in research on nearly 200,000 Swedish teens. School grades were only marginally lower in kids who'd had one or more common surgeries with anesthesia before age 4, compared with those who'd had no anesthesia during those early years. Whether the results apply to sicker children who have riskier surgeries with anesthesia is not known. But the researchers from Sweden's Karolinska Institute and doctors elsewhere called the new results reassuring, given experiments in young animals linking anesthesia drugs with brain damage. Previous studies of children have been relatively small, with conflicting results. The new findings, published Monday in JAMA Pediatrics, don't provide a definitive answer and other research is ongoing. The study authors and other doctors say the harms from postponing surgery must be considered when evaluating any potential risks from anesthesia in young children. The most common procedures in the study were hernia repairs; ear, nose or throat surgeries; and abdominal operations. The researchers say the operations likely lasted an hour or less. The study did not include children with other serious health problems and those who had more complex or risky operations, including brain, heart and cancer surgeries. The research involved about 33,500 teens who'd had surgery before age 4 and nearly 160,000 who did not. ©2016 CBC/Radio-Canada.
By Jessica Boddy Glasses may be trendy now, but for centuries they were the stodgy accessories of the elderly worn only for failing eyes. Now, new research suggests that aging bonobos might also benefit from a pair of specs—not for reading, but for grooming. Many older bonobos groom their partners at arm’s length instead of just centimeters away, in the same way that older humans often hold newspapers farther out to read. This made researchers think the apes might also be losing their close-up vision as they age. To see whether their hypothesis held, the researchers took photos of 14 different bonobos of varying ages as they groomed one another (above) and measured the distance between their hands and faces. By analyzing how this so-called grooming distance varied from ape to ape, the researchers found that grooming distance increased exponentially with age, they report today in Current Biology. And because both humans and bonobos show signs of farsightedness around age 40, deterioration in human eyes might not be the mere result of staring at screens and small text, the scientists say. Rather, it might be a deep-rooted natural trait reaching back to a common ancestor. © 2016 American Association for the Advancement of Science.
Link ID: 22841 - Posted: 11.08.2016
By Marian Vidal-Fernandez and Ana Nuevo-Chiquero The title of this article might trigger self-satisfied smiles among first-borns, and some concerns among the rest of us. Many studies show children born earlier in the family enjoy better wages and more education, but until now we didn’t really know why. Our recently published findings are the first to suggest the advantages of first-born siblings start very early in life—around zero to three years old! We observe parents changing their behaviour as new children are born, and offering less cognitive stimulation to children of higher birth order. It now seems clear that for those born and raised in high-income countries such as the United States, the UK and Norway, earlier-born children enjoy higher wages and education as adults—known as the “birth order effect”. Comparing two siblings, the greater the difference in their birth order, the greater the relative benefit to the older child. However, to date we’ve had no evidence that explains where such differences come from. We know it’s not an effect of family size, because the effect remains when comparing siblings within the same family and families with the same number of children. While it makes sense that parents earn more money and gain experience as they get older and have more children, they also need to divide their economic resources and attention among any children that arrive after the first born. We wondered where in childhood these differences began, and what the cause or causes might be. © 2016 Scientific American,
Laura Sanders A protein that can switch shapes and accumulate inside brain cells helps fruit flies form and retrieve memories, a new study finds. Such shape-shifting is the hallmark move of prions — proteins that can alternate between two forms and aggregate under certain conditions. In fruit flies’ brain cells, clumps of the prionlike protein called Orb2 store long-lasting memories, report scientists from the Stowers Institute for Medical Research in Kansas City, Mo. Figuring out how the brain forms and calls up memories may ultimately help scientists devise ways to restore that process in people with diseases such as Alzheimer’s. The new finding, described online November 3 in Current Biology, is “absolutely superb,” says neuroscientist Eric Kandel of Columbia University. “It fills in a lot of missing pieces.” People possess a version of the Orb2 protein called CPEB, a commonality that suggests memory might work in a similar way in people, Kandel says. It’s not yet known whether people rely on the prion to store long-term memories. “We can’t be sure, but it’s very suggestive,” Kandel says. When neuroscientist Kausik Si and colleagues used a genetic trick to inactivate Orb2 protein, male flies were worse at remembering rejection. These lovesick males continued to woo a nonreceptive female long past when they should have learned that courtship was futile. In different tests, these flies also had trouble remembering that a certain odor was tied to food. © Society for Science & the Public 2000 - 2016. All rights reserved.
By Simon Makin Cerebral autopsy specimen of a patient diagnosed as having Alzheimer's disease. In the HE stain numerous plaque formations within the neuropil background are visible. Credit: WIKIPEDIA, CC BY-SA 3.0 On Monday Pres. Barack Obama proclaimed November “National Alzheimer's Disease Awareness Month.” The administration’s ambitious goal is to prevent and treat Alzheimer's by 2025. Although there are currently no approved therapies that slow or stop progression of the disease, several approaches are showing promise. In a study published today in Science Translational Medicine, a team from Merck Research Laboratories reports results of early human and animal trials of a drug called verubecestat, which targets the production of protein plaques associated with the disease. “It's a summary of the discovery and early-stage profiling of what we hope is going to be a new therapeutic for Alzheimer's,” says team leader Matthew Kennedy. “It represents well over a decade of investment in this project by many, many scientists.” Definitive conclusions will have to await the results of larger, ongoing phase III clinical trials to assess their efficacy, effectiveness and safety, but the results are promising, experts say. Verubecestat is a so-called BACE1 inhibitor. BACE1 (for Beta-site Amyloid precursor protein Cleaving Enzyme 1, aka beta-secretase 1) is an enzyme involved in producing amyloid beta (Aβ), a protein that clumps together, eventually forming the plaques surrounding neurons that are the disease's key hallmark. The amyloid hypothesis of Alzheimer's proposes that the accumulation of amyloid beta aggregates in the brain drives a cascade of biological events leading to neurodegeneration. By blocking BACE1, the hope is this approach could prevent the buildup of these clumps in the first place. But until now, development of these drugs has been hindered by problems finding molecules with the right characteristics, and concerns over theoretical and actual side effects.
© 2016 Scientific American
Link ID: 22827 - Posted: 11.03.2016
By Virginia Morell Human hunters may be making birds smarter by inadvertently shooting those with smaller brains. That’s the conclusion of a new study, which finds that hunting may be exerting a powerful evolutionary force on bird populations in Denmark, and likely wherever birds are hunted. But the work also raises a red flag for some researchers who question whether the evolution of brain size can ever be tied to a single factor. The new work “broadens an emerging view that smarts really do matter in the natural, and increasingly human-dominated, world,” says John Marzluff, a wildlife biologist and expert on crow cognition at the University of Washington in Seattle who was not involved with the work. Hunting and fishing are known to affect many animal populations. For instance, the pike-perch in the Finnish Archipelago Sea has become smaller over time thanks to fishing, which typically removes the largest individuals from a population. This pressure also causes fish to reach sexual maturity earlier. On land, natural predators like arctic foxes and polar bears can also drive their prey species to become smarter because predators are most likely to catch those with smaller brains. For instance, a recent study showed that common eiders (maritime ducks) that raise the most chicks also have the largest heads and are better at forming protective neighborhood alliances than ducks with smaller heads—and presumably, brains. © 2016 American Association for the Advancement of Science
Bruce Bower Many preschoolers take a surprisingly long and bumpy mental path to the realization that people can have mistaken beliefs — say, thinking that a ball is in a basket when it has secretly been moved to a toy box. Traditional learning curves, in which kids gradually move from knowing nothing to complete understanding, don’t apply to this landmark social achievement and probably to many other types of learning, a new study concludes. Kids ranging in age from 3 to 5 often go back and forth between passing and failing false-belief tests for several months to more than one year, say psychologist Sara Baker of the University of Cambridge and her colleagues. A small minority of youngsters jump quickly from always failing to always passing these tests, the scientists report October 20 in Cognitive Psychology. “If these results are replicated, it will surprise a lot of researchers that there is such a low level of sudden insight into false beliefs,” says psychologist Malinda Carpenter, currently at the Max Planck Institute for Evolutionary Anthropology in Leipzig. Early childhood researchers generally assume that preschoolers either pass or fail false-belief tests, with a brief transition between the two, explains Carpenter, who did not participate in the new study. Grasping that others sometimes have mistaken beliefs is a key step in social thinking. False-belief understanding may start out as something that can be indicated nonverbally but not described. Human 2-year-olds and even chimpanzees tend to look toward spots where a person would expect to find a hidden item that only the children or apes have seen moved elsewhere (SN Online: 10/6/16). © Society for Science & the Public 2000 - 2016
By Jesse Singal For a long time, the United States’ justice system has been notorious for its proclivity for imprisoning children. Because of laws that grant prosecutors and judges discretion to bump juveniles up to the category of “adult” when they commit crimes deemed serious enough by the authorities, the U.S. is an outlier in locking up kids, with some youthful defendants even getting life sentences. Naturally, this has attracted a great deal of outrage and advocacy from human-rights organizations, who argue that kids, by virtue of lacking certain judgment, foresight, and decision-making abilities, should be treated a bit more leniently. Writing for the Marshall Project and drawing on some interesting brain science, Dana Goldstein takes the argument about youth incarceration even further: We should also rethink our treatment of offenders who are young adults. As Goldstein explains, the more researchers study the brain, the more they realize that it takes decades for the organ to develop fully and to impart to its owners their full, adult capacities for reasoning. “Altogether,” she writes, “the research suggests that brain maturation continues into one’s twenties and even thirties.” Many of these insights come from the newest generation of neuroscience research. “Everyone has always known that there are behavioral changes throughout the lifespan,” Catherine Lebel, an assistant professor of radiology at the University of Calgary who has conducted research into brain development, told Goldstein. “It’s only with new imaging techniques over the last 15 years that we’ve been able to get at some of these more subtle changes.” © 2016, New York Media LLC.
Emily Sohn After a mother killed her four young children and then herself last month in rural China, onlookers quickly pointed to life circumstances. The family lived in extreme poverty, and bloggers speculated that her inability to escape adversity pushed her over the edge. Can poverty really cause mental illness? It's a complex question that is fairly new to science. Despite high rates of both poverty and mental disorders around the world, researchers only started probing the possible links about 25 years ago. Since then, evidence has piled up to make the case that, at the very least, there is a connection. People who live in poverty appear to be at higher risk for mental illnesses. They also report lower levels of happiness. That seems to be true all over the globe. In a 2010 review of 115 studies that spanned 33 countries across the developed and developing worlds, nearly 80 percent of the studies showed that poverty comes with higher rates of mental illness. Among people living in poverty, those studies also found, mental illnesses were more severe, lasted longer and had worse outcomes. And there's growing evidence that levels of depression are higher in poorer countries than in wealthier ones. Those kinds of findings challenge a long-held myth of the "poor but happy African sitting under a palm tree," says Johannes Haushofer, an economist and neurobiologist who studies interactions between poverty and mental health at Princeton University. © 2016 npr
By Ruth Williams Newly made cells in the brains of mice adopt a more complex morphology and connectivity when the animals encounter an unusual environment than if their experiences are run-of-the-mill. Researchers have now figured out just how that happens. According to a study published today (October 27) in Science, a particular type of cell—called an interneuron—in the hippocampus processes the animals’ experiences and subsequently shapes the newly formed neurons. “We knew that experience shapes the maturation of these new neurons, but what this paper does is it lays out the entire circuit through which that happens,” said Heather Cameron, a neuroscientist at the National Institute of Mental Health in Bethesda who was not involved with the work. “It’s a really nicely done piece of work because they go step-by-step and show all of the cells that are involved and how they’re connected.” Most of the cells in the adult mammalian brain are mature and don’t divide, but in a few regions, including an area of the hippocampus called the dentate gyrus, neurogenesis occurs. The dentate gyrus is thought to be involved in the formation of new memories. In mice, for instance, exploring novel surroundings electrically activates the dentate gyrus and can affect the production, maturation, and survival of the newly born cells. Now, Alejandro Schinder and his team at the Leloir Institute in Buenos Aires, Argentina, have investigated the process in detail. © 1986-2016 The Scientist
By Catherine Caruso Babies and children undergo massive brain restructuring as they mature, and for good reason—they have a whole world of information to absorb during their sprint toward adulthood. This mental renovation doesn’t stop there, however. Adult brains continue to produce new cells and restructure themselves throughout life, and a new study in mice reveals more about the details of this process and the important role environmental experience plays. Through a series of experiments, researchers at the Leloir Institute in Buenos Aires showed that when adult mice are exposed to stimulating environments, their brains are able to more quickly integrate new brain cells into existing neural networks through a process that involves new and old cells connecting to one another via special helper cells called interneurons. The adult mammalian brain, long believed to lack the capacity to make new cells, has two main areas that continuously produce new neurons throughout life. One of these areas, the hippocampus (which is involved in memory, navigation, mood regulation and stress response) produces new neurons in a specialized region called the dentate gyrus. Many previous studies have focused on how the dentate gyrus produces new neurons and what happens to these neurons as they mature, but Alejandro Schinder and his colleagues at Leloir wanted to go one step further and understand how new neurons produced by the dentate gyrus are incorporated into the existing neural networks of the brain, and whether environment affects this process. © 2016 Scientific American