Chapter 13. Memory, Learning, and Development
Alison Abbott The fact that Edvard and May-Britt Moser have collaborated for 30 years — and been married for 28 — has done nothing to dull their passion for the brain. They talk about it at breakfast. They discuss its finer points at their morning lab meeting. And at a local restaurant on a recent summer evening, they are still deep into a back-and-forth about how their own brains know where they are and will guide them home. “Just to walk there, we have to understand where we are now, where we want to go, when to turn and when to stop,” says May-Britt. “It's incredible that we are not permanently lost.” If anyone knows how we navigate home, it is the Mosers. They shot to fame in 2005 with their discovery of grid cells deep in the brains of rats. These intriguing cells, which are also present in humans, work much like the Global Positioning System, allowing animals to understand their location. The Mosers have since carved out a niche studying how grid cells interact with other specialized neurons to form what may be a complete navigation system that tells animals where they are going and where they have been. Studies of grid cells could help to explain how memories are formed, and why recalling events so often involves re-envisioning a place, such as a room, street or landscape. While pursuing their studies, the two scientists have become a phenomenon. Tall and good-looking, they operate like a single brain in two athletic bodies in their generously funded lab in Trondheim, Norway — a remote corner of northern Europe just 350 kilometres south of the Arctic Circle. They publish together and receive prizes as a single unit — most recently, the Nobel Prize in Physiology or Medicine, which they won this week with their former supervisor, neuroscientist John O’Keefe at University College London. 
In 2007, while still only in their mid-40s, they won a competition by the Kavli Foundation of Oxnard, California, to build and direct one of only 17 Kavli Institutes around the world. The Mosers are now minor celebrities in their home country, and their institute has become a magnet for other big thinkers in neuroscience. “It is definitely intellectually stimulating to be around them,” says neurobiologist Nachum Ulanovsky from the Weizmann Institute of Science in Rehovot, Israel, who visited the Trondheim institute for the first time in September. © 2014 Nature Publishing Group
Keyword: Learning & Memory
Link ID: 20162 - Posted: 10.06.2014
By Alina Tugend Many workers now feel as if they’re doing the job of three people. They are on call 24 hours a day. They rush their children from tests to tournaments to tutoring. The stress is draining, both mentally and physically. At least that is the standard story about stress. It turns out, though, that many of the common beliefs about stress don’t necessarily give the complete picture. MISCONCEPTION NO. 1 Stress is usually caused by having too much work. While being overworked can be overwhelming, research increasingly shows that being underworked can be just as challenging. In essence, boredom is stressful. “We tend to think of stress in the original engineering way, that too much pressure or too much weight on a bridge causes it to collapse,” said Paul E. Spector, a professor of psychology at the University of South Florida. “It’s more complicated than that.” Professor Spector and others say too little to do — or underload, as he calls it — can cause many of the physical discomforts we associate with being overloaded, like muscle tension, stomachaches and headaches. A study published this year in the journal Experimental Brain Research found that measurements of people’s heart rates, hormonal levels and other factors while watching a boring movie — men hanging laundry — showed greater signs of stress than those watching a sad movie. “We tend to think of boredom as someone lazy, as a couch potato,” said James Danckert, a professor of neuroscience at the University of Waterloo in Ontario, Canada, and a co-author of the paper. “It’s actually when someone is motivated to engage with their environment and all attempts to do so fail. It’s aggressively dissatisfying.” © 2014 The New York Times Company
by Michael Marshall When we search for the seat of humanity, are we looking at the wrong part of the brain? Most neuroscientists assume that the neocortex, the brain's distinctive folded outer layer, is the thing that makes us uniquely human. But a new study suggests that another part of the brain, the cerebellum, grew much faster in our ape ancestors. "Contrary to traditional wisdom, in the human lineage the cerebellum was the part of the brain that accelerated its expansion most rapidly, rather than the neocortex," says Rob Barton of Durham University in the UK. With Chris Venditti of the University of Reading in the UK, Barton examined how the relative sizes of different parts of the brain changed as primates evolved. During the evolution of monkeys, the neocortex and cerebellum grew in tandem, a change in one being swiftly followed by a change in the other. But starting with the first apes around 25 million years ago through to chimpanzees and humans, the cerebellum grew much faster. As a result, the cerebellums of apes and humans contain far more neurons than the cerebellum of a monkey, even if that monkey were scaled up to the size of an ape. "The difference in ape cerebellar volume, relative to a scaled monkey brain, is equal to 16 billion extra neurons," says Barton. "That's the number of neurons in the entire human neocortex." © Copyright Reed Business Information Ltd.
By Daisy Yuhas Do we live in a holographic universe? How green is your coffee? And could drinking too much water actually kill you? Before you click those links you might consider how your knowledge-hungry brain is preparing for the answers. A new study from the University of California, Davis, suggests that when our curiosity is piqued, changes in the brain ready us to learn not only about the subject at hand, but incidental information, too. Neuroscientist Charan Ranganath and his fellow researchers asked 19 participants to review more than 100 questions, rating each in terms of how curious they were about the answer. Next, each subject revisited 112 of the questions—half of which strongly intrigued them whereas the rest they found uninteresting—while the researchers scanned their brain activity using functional magnetic resonance imaging (fMRI). During the scanning session participants would view a question, wait 14 seconds and view a photograph of a face totally unrelated to the trivia before seeing the answer. Afterward the researchers tested participants to see how well they could recall and retain both the trivia answers and the faces they had seen. Ranganath and his colleagues discovered that greater interest in a question would predict not only better memory for the answer but also for the unrelated face that had preceded it. A follow-up test one day later found the same results—people could better remember a face if it had been preceded by an intriguing question. Somehow curiosity could prepare the brain for learning and long-term memory more broadly. The findings are somewhat reminiscent of the work of U.C. Irvine neuroscientist James McGaugh, who has found that emotional arousal can bolster certain memories. But, as the researchers reveal in the October 2 Neuron, curiosity involves very different pathways. © 2014 Scientific American
By John Bohannon The victim peers across the courtroom, points at a man sitting next to a defense lawyer, and confidently says, "That's him!" Such moments have a powerful sway on jurors who decide the fate of thousands of people every day in criminal cases. But how reliable is eyewitness testimony? A new report concludes that the use of eyewitness accounts needs tighter control, and among its recommendations is a call for a more scientific approach to how eyewitnesses identify suspects during the classic police lineup. For decades, researchers have been trying to nail down what influences eyewitness testimony and how much confidence to place in it. After a year of sifting through the scientific evidence, a committee of psychologists and criminologists organized by the U.S. National Research Council (NRC) has now gingerly weighed in. "This is a serious issue with major implications for our justice system," says committee member Elizabeth Phelps, a psychologist at New York University in New York City. Their 2 October report, Identifying the Culprit: Assessing Eyewitness Identification, is likely to change the way that criminal cases are prosecuted, says Elizabeth Loftus, a psychologist at the University of California, Irvine, who was an external reviewer of the report. As Loftus puts it, "just because someone says something confidently doesn't mean it's true." Jurors can't help but find an eyewitness’s confidence compelling, even though experiments have shown that a person's confidence in their own memory is sometimes undiminished even in the face of evidence that their memory of an event is false. © 2014 American Association for the Advancement of Science.
Keyword: Learning & Memory
Link ID: 20157 - Posted: 10.04.2014
Helen Thomson You'll have heard of Pavlov's dogs, conditioned to expect food at the sound of a bell. You might not have heard that a scarier experiment – arguably one of psychology's most unethical – was once performed on a baby. In it, a 9-month-old, at first unfazed by the presence of animals, was conditioned to feel fear at the sight of a rat. The infant was presented with the animal as someone struck a metal pole with a hammer above his head. This was repeated until he cried at merely the sight of any furry object – animate or inanimate. The "Little Albert" experiment, performed in 1919 by John Watson of Johns Hopkins University Hospital in Baltimore, Maryland, was the first to show that a human could be classically conditioned. The fate of Albert B has intrigued researchers ever since. Hall Beck at the Appalachian State University in Boone, North Carolina, has been one of the most tenacious researchers on the case. Watson's papers stated that Albert B was the son of a wet nurse who worked at the hospital. Beck spent seven years exploring potential candidates and used facial analysis to conclude in 2009 that Little Albert was Douglas Merritte, son of hospital employee Arvilla. Douglas was born on the same day as Albert and several other points tallied with Watson's notes. Tragically, medical records showed that Douglas had severe neurological problems and died at an early age of hydrocephalus, or water on the brain. According to his records, this seems to have resulted in vision problems, so much so that at times he was considered blind. © Copyright Reed Business Information Ltd.
By Fredrick Kunkle Years ago, many scientists assumed that a woman’s heart worked pretty much the same as a man’s. But as more women entered the male-dominated field of cardiology, many such assumptions vanished, opening the way for new approaches to research and treatment. A similar shift is underway in the study of Alzheimer’s disease. It has long been known that more women than men get the deadly neurodegenerative disease, and an emerging body of research is challenging the common wisdom as to why. Although the question is by no means settled, recent findings suggest that biological, genetic and even cultural influences may play heavy roles. Of the more than 5 million people in the United States who have been diagnosed with Alzheimer’s, the leading cause of dementia, two-thirds are women. Because advancing age is considered the biggest risk factor for the disease, researchers largely have attributed that disparity to women’s longer life spans. The average life expectancy for women is 81 years, compared with 76 for men. Yet “even after taking age into account, women are more at risk,” said Richard Lipton, a physician who heads the Einstein Aging Study at Albert Einstein College of Medicine in New York. With the number of Alzheimer’s cases in the United States expected to more than triple by 2050, some researchers are urging a greater focus on understanding the underlying reasons women are more prone to the disease and on developing gender-specific treatments.
By Smitha Mundasad Health reporter, BBC News Measuring people's sense of smell in later life could help doctors predict how likely they are to be alive in five years' time, a PLOS One study suggests. A survey of 3,000 adults found 39% with the poorest sense of smell were dead within five years - compared with just 10% of those who identified odours correctly. Scientists say the loss of smell sense does not cause death directly, but may be an early warning sign. They say anyone with long-lasting changes should seek medical advice. Researchers from the University of Chicago asked a representative sample of adults between the ages of 57 and 85 to take part in a quick smell test. The assessment involved identifying distinct odours encased on the tips of felt-tip pens. The smells included peppermint, fish, orange, rose and leather. Five years later some 39% of adults who had the lowest scores (4-5 errors) had passed away, compared with 19% with moderate smell loss and just 10% with a healthy sense of smell (0-1 errors). And despite taking issues such as age, nutrition, smoking habits, poverty and overall health into account, researchers found those with the poorest sense of smell were still at greatest risk. Lead scientist Prof Jayant Pinto said: "We think loss of the sense of smell is like the canary in the coal mine." BBC © 2014
By Fredrick Kunkle Here’s something to worry about: A recent study suggests that middle-age women whose personalities tend toward the neurotic run a higher risk of developing Alzheimer’s disease later in life. The study by researchers at the University of Gothenburg in Sweden followed a group of women in their 40s, whose disposition made them prone to anxiety, moodiness and psychological distress, to see how many developed dementia over the next 38 years. In line with other research, the study suggested that women who were the most easily upset by stress — as determined by a commonly used personality test — were two times more likely to develop Alzheimer’s disease than women who were least prone to neuroticism. In other words, personality really is — in some ways — destiny. “Most Alzheimer’s research has been devoted to factors such as education, heart and blood risk factors, head trauma, family history and genetics,” study author Lena Johansson said in a written statement. “Personality may influence the individual’s risk for dementia through its effect on behavior, lifestyle or reactions to stress.” The researchers cautioned that the results cannot be extrapolated to men because they were not included in the study and that further work is needed to determine possible causes for the link. The study, which appeared Wednesday in the American Academy of Neurology’s journal, Neurology, examined 800 women whose average age in 1968 was 46 years to see whether neuroticism — which involves being easily distressed and subject to excessive worry, jealousy or moodiness — might have a bearing on the risk of dementia.
By Brian Bienkowski and Environmental Health News Babies born to mothers with high levels of perchlorate during their first trimester are more likely to have lower IQs later in life, according to a new study. The research is the first to link pregnant women's perchlorate levels to their babies’ brain development. It adds to evidence that the drinking water contaminant may disrupt thyroid hormones that are crucial for proper brain development. Perchlorate, which is both naturally occurring and manmade, is used in rocket fuel, fireworks and fertilizers. It has been found in 4 percent of U.S. public water systems serving an estimated 5 to 17 million people, largely near military bases and defense contractors in the U.S. West, particularly around Las Vegas and in Southern California. “We would not recommend action on perchlorate levels from this study alone, although our report highlights a pressing need for larger studies of perchlorate levels from the general pregnant population and those with undetected hypothyroidism,” the authors from the United Kingdom, Italy and Boston wrote in the study published in The Journal of Clinical Endocrinology & Metabolism. The Environmental Protection Agency for decades has debated setting a national drinking water standard for perchlorate. The agency in 2011 announced it would start developing a standard, reversing an earlier decision. In the meantime, two states, California and Massachusetts, have set their own standards. © 2014 Scientific American
Wild marmosets in the Brazilian forest can learn quite successfully from video demonstrations featuring other marmosets, Austrian scientists have reported, showing not only that marmosets are even better learners than previously known, but that video can be used successfully in experiments in the wild. Tina Gunhold, a cognitive biologist at the University of Vienna, had worked with a population of marmoset monkeys in a bit of Brazilian forest before this particular experiment. The forest is not wilderness. It lies near some apartment complexes, and the marmosets are somewhat used to human beings. But the monkeys are wild, and each extended family group has its own foraging territory. Dr. Gunhold and her colleagues reported in the journal Biology Letters this month that they had tested 12 family groups, setting up a series of video monitors, each with a kind of complicated box that they called an “artificial fruit.” All the boxes contained food. Six of the monitors showed just an unchanging image of a marmoset near a similar box. Three of them showed a marmoset opening the box by pulling a drawer, and three others a marmoset lifting a lid to get at the food. Marmosets are very territorial and would not tolerate a strange individual on their turf, but the image of a strange marmoset on video didn’t seem to bother them. Individual marmosets “differed in their reactions to the video,” Dr. Gunhold said. “Some were more shy, some more bold. The younger ones were more attracted to the video, perhaps because of greater curiosity.” © 2014 The New York Times Company
By David Z. Hambrick, Fernanda Ferreira, and John M. Henderson A decade ago, Magnus Carlsen, who at the time was only 13 years old, created a sensation in the chess world when he defeated former world champion Anatoly Karpov at a chess tournament in Reykjavik, Iceland, and the next day played then-top-rated Garry Kasparov—who is widely regarded as the best chess player of all time—to a draw. Carlsen’s subsequent rise to chess stardom was meteoric: grandmaster status later in 2004; a share of first place in the Norwegian Chess Championship in 2006; youngest player ever to reach World No. 1 in 2010; and highest-rated player in history in 2012. What explains this sort of spectacular success? What makes someone rise to the top in music, games, sports, business, or science? This question is the subject of one of psychology’s oldest debates. In the late 1800s, Francis Galton—founder of the scientific study of intelligence and a cousin of Charles Darwin—analyzed the genealogical records of hundreds of scholars, artists, musicians, and other professionals and found that greatness tends to run in families. For example, he counted more than 20 eminent musicians in the Bach family. (Johann Sebastian was just the most famous.) Galton concluded that experts are “born.” Nearly half a century later, the behaviorist John Watson countered that experts are “made” when he famously guaranteed that he could take any infant at random and “train him to become any type of specialist [he] might select—doctor, lawyer, artist, merchant-chief and, yes, even beggar-man and thief, regardless of his talents.” Yet practice alone does not settle the question: in one study of chess players, one player needed 22 times more deliberate practice than another to become a master. © 2014 The Slate Group LLC.
Keyword: Learning & Memory
Link ID: 20136 - Posted: 09.30.2014
By Gary Stix If it’s good for the heart, it could also be good for the neurons, astrocytes and oligodendrocytes, cells that make up the main items on the brain’s parts list. The heart-brain adage comes from epidemiological studies that show that people with cardiovascular risk factors such as high blood pressure and elevated cholesterol levels may be more at risk for Alzheimer’s and other dementias. This connection between heart and brain has also led to some disappointments: clinical trials of lipid-lowering statins have not helped patients diagnosed with Alzheimer’s, although epidemiological studies suggest that long-term use of the drugs may help prevent Alzheimer’s and other dementias. The link between head and heart is still being pursued because new Alzheimer’s drugs have failed time and again. One approach that is now drawing some interest looks at the set of proteins that carry around fats in the brain. These lipoproteins could potentially act as molecular sponges that mop up the amyloid-beta peptide that clogs up connections among brain cells in Alzheimer’s. One of these proteins—Apolipoprotein J, also known as clusterin—intrigues researchers because of the way it interacts with amyloid-beta and the status of its gene as a risk factor for Alzheimer’s. A researcher from the University of Minnesota, Ling Li, recently presented preliminary work at the Alzheimer’s Disease Drug Discovery Foundation annual meeting that showed that, at least in a lab dish, a molecule made up of a group of amino acids from APOJ is capable of protecting against the toxicity of the amyloid-beta peptide. It also quelled inflammation and promoted the health of synapses—the junctions where one brain cell encounters another. Earlier work by another group showed that the peptide prevented the development of lesions in the blood vessels of animals.
Link ID: 20135 - Posted: 09.30.2014
by Elijah Wolfson The class was the most difficult of the fall 2013 semester, and J.D. Leadam had missed all but one lecture. His grandfather’s health had worsened, and he left San Jose State, where he was studying for a degree in business, to return home to Los Angeles to help out. Before he knew it, midterm exams had almost arrived. At this point, Leadam had, for a while, been playing around with transcranial direct-current stimulation, or tDCS, an experimental treatment for all sorts of health issues that, at its most basic, involves running a very weak electric current through the brain. When he first came across tDCS, Leadam was immediately intrigued but thought, “There’s no way I’m gonna put electrodes on my head. It’s just not going to happen.” After extensive research, though, he changed his mind. He looked into buying a device online, but there wasn’t much available — just one extremely expensive machine and then a bare-bones $40 device that didn’t even have a switch. So he dug around online and figured he could build one himself. He bought all the pieces he needed and put it together. He tried it a few times, but didn’t notice much, so he put it aside. But now, with the test looming, he picked it back up. The professor had written a book, and Leadam knew all the information he’d be tested on was written in its pages. “But I’m an auditory learner,” he said, “so I knew it wouldn’t work to just read it.” He strapped on the device, turned it on and read the chapters. “Nothing,” he thought. But when he got to the classroom and put pen to paper, he had a revelation. “I could remember concepts down to the exact paragraphs in the textbook,” Leadam said. “I actually ended up getting an A on the test. I couldn’t believe it.”
Keyword: Learning & Memory
Link ID: 20130 - Posted: 09.29.2014
By Smitha Mundasad Health reporter, BBC News A spice commonly found in curries may boost the brain's ability to heal itself, according to a report in the journal Stem Cell Research and Therapy. The German study suggests a compound found in turmeric could encourage the growth of nerve cells thought to be part of the brain's repair kit. Scientists say this work, carried out in rats, may pave the way for future drugs for strokes and Alzheimer's disease. But they say more trials are needed to see whether this applies to humans. Researchers from the Institute of Neuroscience and Medicine in Julich, Germany, studied the effects of aromatic-turmerone - a compound found naturally in turmeric. Rats were injected with the compound and their brains were then scanned. Particular parts of the brain, known to be involved in nerve cell growth, were seen to be more active after the aromatic-turmerone infusion. Scientists say the compound may encourage a proliferation of brain cells. In a separate part of the trial, researchers bathed rodent neural stem cells (NSCs) in different concentrations of aromatic-turmerone extract. NSCs have the ability to transform into any type of brain cell and scientists suggest they could have a role in repair after damage or disease. Dr Maria Adele Rueger, who was part of the research team, said: "In humans and higher developed animals their abilities do not seem to be sufficient to repair the brain but in fish and smaller animals they seem to work well." (Turmeric belongs to the same plant family as ginger.) BBC © 2014
By Dick Miller, CBC News Dan Campbell felt the bullets whiz past his head. The tracer rounds zipped between his legs. It was his first firefight as a Canadian soldier in Afghanistan. "I was completely frightened and scared like I’d never been before in my life,” he says. As the attack continued, the sights, sounds and smells started to form memories inside his brain. The fear he felt released the hormone norepinephrine, and in the complex chemistry of the brain, the memories of the battle became associated with the fear. "I think one day, hopefully in the not-too-distant future, we will be able to delete a memory," says Dr. Sheena Josselyn, a senior scientist at the Hospital for Sick Children Research Institute. Six years later, a sight or sound such as a firecracker or car backfiring can remind him of that night in 2008. The fear comes back and he relives rather than remembers the moments. "It can be hard. Physically, you know, there’s the tapping foot, my heart beating,” he says. Like so many soldiers and victims of assault or people who have experienced horrific accidents, Campbell was diagnosed with post traumatic stress disorder. Now a newspaper reporter in Yellowknife, Campbell thinks one day he may get therapy. But for now he is working on his own to control the fear and anger the memories bring. © CBC 2014
by Sarah Zielinski Chimps may be cute and have mannerisms similar to humans, but they are wild animals. A new study finds that chimps raised as pets or entertainers have behavioral problems as adults. There are plenty of good reasons why chimpanzees should not be pets or performers, no matter how cute or humanlike they appear: They are wild animals. They can be violent with each other. And they can be violent toward humans — even humans that have a long history with the chimp. Plus, there’s evidence that seeing an adorable chimp dressed up like a miniature human actually makes us care less about the plight of their species. Now comes evidence that the way that chimps are raised to become pets or entertainers — taking them away from other chimps at a young age and putting them in the care of humans, who may or may not feed and care for them properly — has long-term, negative effects on their behavior. “We now add empirical evidence of the potentially negative welfare effects on the chimpanzees themselves as important considerations in the discussion of privately owned chimpanzees,” Hani Freeman and Stephen Ross of the Lincoln Park Zoo in Chicago write September 23 in PeerJ. Freeman and Ross compiled life history and behavioral data on 60 captive chimps living in zoos. Some of the animals had always lived in zoos and grew up in groups of chimpanzees. Six were raised solely by humans and were later placed in zoos after they became too big or too old for their owners to care for them. Others had a more mixed background. © Society for Science & the Public 2000 - 2014
Jia You In the future, a nurse could determine whether a baby is likely to develop a reading disorder simply by attaching a few electrodes to its scalp and watching its brain waves respond to human speech. Such is the scenario suggested by a new study, which finds a potential biological indicator of how well preschool children perceive rhythm, an ability linked to language development. “It’s really impressive to work with children this young, who are not often looked at,” says Aniruddh Patel, a cognitive neuroscientist at Tufts University in Medford, Massachusetts, who was not involved with the research. Spoken language consists of sound waves occurring over multiple timescales. A syllable, for example, takes place over a quarter of a second, while a sentence unfolds over a few seconds. To make sense of this complex auditory information, humans use rhythmic cues such as stress and pause to discern words and syllables. Adults and school-aged children with reading disorders, however, struggle to pick up on these rhythmic patterns. Scientists estimate that dyslexia and other reading disabilities plague about 5% to 10% of the population. Detecting such impairments early could lead to more effective intervention, but observing telltale signs in younger children who have not learned to read has proven a challenge. So biologist Nina Kraus of Northwestern University in Evanston, Illinois, and her colleagues looked for automatic brain responses that can track language development in preschoolers, who have not learned to read. © 2014 American Association for the Advancement of Science
By Melinda Wenner Moyer Autism is primarily a disorder of the brain, but research suggests that as many as nine out of 10 individuals with the condition also suffer from gastrointestinal problems such as inflammatory bowel disease and “leaky gut.” The latter condition occurs when the intestines become excessively permeable and leak their contents into the bloodstream. Scientists have long wondered whether the composition of bacteria in the intestines, known as the gut microbiome, might be abnormal in people with autism and drive some of these symptoms. Now a spate of new studies supports this notion and suggests that restoring proper microbial balance could alleviate some of the disorder's behavioral symptoms. At the annual meeting of the American Society for Microbiology held in May in Boston, researchers at Arizona State University reported the results of an experiment in which they measured the levels of various microbial by-products in the feces of children with autism and compared them with those found in healthy children. The levels of 50 of these substances, they found, significantly differed between the two groups. And in a 2013 study published in PLOS ONE, Italian researchers reported that, compared with healthy kids, those with autism had altered levels of several intestinal bacterial species, including fewer Bifidobacterium, a group known to promote good intestinal health. One open question is whether these microbial differences drive the development of the condition or are instead a consequence of it. A study published in December 2013 in Cell supports the former idea. When researchers at the California Institute of Technology incited autismlike symptoms in mice using an established paradigm that involved infecting their mothers with a viruslike molecule during pregnancy, they found that after birth, the mice had altered gut bacteria compared with healthy mice. © 2014 Scientific American
Link ID: 20104 - Posted: 09.23.2014
By Melissa Dahl Recently, I was visiting my family in Seattle, and we were doing that thing families do: retelling old stories. As we talked, a common theme emerged. My brother hardly remembered anything from our childhood, even the stories in which he was the star player. (That time he fell down the basement steps and needed stitches in the ER? Nope. That panicky afternoon when we all thought he’d disappeared, only to discover he’d been hiding in his room, and then fell asleep? Nothing.) “Boys never remember anything,” my mom huffed. She’s right. Researchers are finding some preliminary evidence that women are indeed better at recalling memories, especially autobiographical ones. Girls and women tend to recall these memories faster and with more specific details, and some studies have demonstrated that these memories tend to be more accurate, too, when compared to those of boys and men. And there’s an explanation for this: It could come down to the way parents talk to their daughters, as compared to their sons, when the children are developing memory skills. To understand this apparent gender divide in recalling memories, it helps to start with early childhood—specifically, ages 2 to 6. Whether you knew it or not, during these years, you learned how to form memories, and researchers believe this happens mostly through conversations with others, primarily our parents. These conversations teach us how to tell our own stories, essentially; when a mother asks her child for more details about something that happened that day in school, for example, she is implicitly communicating that these extra details are essential parts to the story. © 2014 The Slate Group LLC