Links for Keyword: Learning & Memory



Links 1 - 20 of 822

By Eric Niiler Has our reliance on iPhones and other instant-info devices harmed our memories? Michael Kahana, a University of Pennsylvania psychology professor who studies memory, says maybe: “We don’t know what the long-lasting impact of this technology will be on our brains and our ability to recall.” Kahana, 45, who has spent the past 20 years looking at how the brain creates memories, is leading an ambitious four-year Pentagon project to build a prosthetic memory device that can be implanted into human brains to help veterans with traumatic brain injuries. He spoke by telephone with The Post about what we can do to preserve or improve memory. Practicing the use of your memory is helpful. The other thing which I find helpful is sleep, which I don’t get enough of. As a general principle, skills that one continues to practice are skills that one will maintain in the face of age-related changes in cognition. [As for all those brain games available], I am not aware of any convincing data that mental exercises have a more general effect other than maintaining the skills for those exercises. I think the jury is out on that. If you practice doing crossword puzzles, you will preserve your ability to do crossword puzzles. If you practice any other cognitive skill, you will get better at that as well.
[Photo caption: Michael Kahana once could name every student in a class of 100. Now, says the University of Pennsylvania psychology professor who studies memory, “I find it too difficult even with a class of 20.” (From Michael Kahana)]

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 20249 - Posted: 10.28.2014

By PAM BELLUCK Science edged closer on Sunday to showing that an antioxidant in chocolate appears to improve some memory skills that people lose with age. In a small study in the journal Nature Neuroscience, healthy people, ages 50 to 69, who drank a mixture high in antioxidants called cocoa flavanols for three months performed better on a memory test than people who drank a low-flavanol mixture. On average, the improvement of high-flavanol drinkers meant they performed like people two to three decades younger on the study’s memory task, said Dr. Scott A. Small, a neurologist at Columbia University Medical Center and the study’s senior author. They performed about 25 percent better than the low-flavanol group. “An exciting result,” said Craig Stark, a neurobiologist at the University of California, Irvine, who was not involved in the research. “It’s an initial study, and I sort of view this as the opening salvo.” He added, “And look, it’s chocolate. Who’s going to complain about chocolate?” The findings support recent research linking flavanols, especially epicatechin, to improved blood circulation, heart health and memory in mice, snails and humans. But experts said the new study, although involving only 37 participants and partly funded by Mars Inc., the chocolate company, goes further and was a well-controlled, randomized trial led by experienced researchers. Besides improvements on the memory test — a pattern recognition test involving the kind of skill used in remembering where you parked the car or recalling the face of someone you just met — researchers found increased function in an area of the brain’s hippocampus called the dentate gyrus, which has been linked to this type of memory. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 20246 - Posted: 10.27.2014

By Gary Stix Scott Small, a professor of neurology at Columbia University’s College of Physicians and Surgeons, researches Alzheimer’s, but he also studies the memory loss that occurs during the normal aging process. Research on the commonplace “senior moments” focuses on the hippocampus, an area of the brain involved with formation of new memories. In particular, one area of the hippocampus, the dentate gyrus, which helps distinguish one object from another, has lured researchers working on age-related memory problems. In a study by Small and colleagues published Oct. 26 in Nature Neuroscience, naturally occurring chemicals in cocoa increased dentate gyrus blood flow. Psychological testing showed that the pattern recognition abilities of a typical 60-year-old on a high dose of the cocoa phytochemicals in the 37-person study matched those of a 30- or 40-year-old after three months. The study received support from the food company Mars, but Small cautions against going out to gorge on Snickers bars, as most of the beneficial chemicals, or flavanols, are removed when processing cocoa. An edited transcript of an interview with Small follows: Can you explain what you found in your study? The main motive of the study was to causally establish an anatomical source of age-related memory loss. A number of labs have shown in the last 10 years that there’s one area of the brain called the dentate gyrus that is linked to the aging process. But no one has tested that concept. Until now the observations have been correlational. There is decreased function in that region and, to prove causation, we were trying to see if we could reverse that. © 2014 Scientific American

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 13: Homeostasis: Active Regulation of the Internal Environment
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 9: Homeostasis: Active Regulation of the Internal Environment
Link ID: 20245 - Posted: 10.27.2014

By PAUL VITELLO Most adults do not remember anything before the age of 3 or 4, a gap that researchers had chalked up to the vagaries of the still-developing infant brain. By some accounts, the infant brain was just not equipped to remember much. Textbooks referred to the deficiency as infant amnesia. Carolyn Rovee-Collier, a developmental psychologist at Rutgers University who died on Oct. 2 at 72, challenged the theory, showing in a series of papers in the early 1980s that babies remember plenty. A 3-month-old can recall what he or she learned yesterday, she found, and a 9-month-old can remember a game for as long as a month and a half. She cited experiments suggesting that memory processes in adults and infants are virtually the same, and argued that infant memories were never lost. They just become increasingly harder to retrieve as the child grows, learns language and loses touch with the visual triggers that had kept those memories sharp — a view from between the bars of a crib, say, or the view of the floor as a crawler, not a toddler, sees it. Not all of Dr. Rovee-Collier’s theories won over the psychology establishment, which still uses the infant amnesia concept to explain why people do not remember life as a baby. But her insights about an infant’s short-term memory and ability to learn have been widely accepted, and have helped recast scientific thinking about the infant mind over the past 30 years. Since the first of her 200 papers was published, the field of infant cognitive studies has undergone a boom in university programs around the country. It was a field that had been largely unexplored in any systematic way by the giants of psychological theory. Freud and Jean Piaget never directly addressed the subject of infant memory. William James, considered the father of American psychology, once hazarded a guess that the human baby’s mind was a place of “blooming, buzzing confusion.” © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 20233 - Posted: 10.23.2014

By Josie Gurney-Read, Online Education Editor Myths about the brain and how it functions are being used to justify and promote teaching methods that are essentially “ineffective”, according to new research. The study, published today in Nature Reviews Neuroscience, began by presenting teachers in the UK, Turkey, Greece, China and the Netherlands with seven myths about the brain and asking them whether they believed the myths to be true. According to the figures, over half of teachers in the UK, the Netherlands and China believe that children are less attentive after sugary drinks and snacks, and over a quarter of teachers in the UK and Turkey believe that a pupil’s brain will shrink if they drink fewer than six to eight glasses of water a day. Furthermore, over 90 per cent of teachers in all countries believe that a student will learn better if they receive information in their preferred learning style – auditory, visual, kinaesthetic. This is despite the fact that there is “no convincing evidence to support this theory”. Dr Paul Howard-Jones, author of the article from Bristol University’s Graduate School of Education, said that many teaching practices are “sold to teachers as based on neuroscience”. However, he added that, in many cases, these ideas have “no educational value and are often associated with poor practice in the classroom.” The prevalence of many of these “neuromyths” in different countries could reflect the absence of any teacher training in neuroscience, the research concludes. Dr Howard-Jones warned that this could mean that many teachers are “ill-prepared to be critical of ideas and educational programmes that claim a neuroscientific basis.” © Copyright of Telegraph Media Group Limited 2014

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 20207 - Posted: 10.16.2014

Ann Robinson Neuroscience research got a huge boost last week with news of Professor John O’Keefe’s Nobel prize for work on the “brain’s internal GPS system”. It is an exciting new part of the giant jigsaw puzzle of our brain and how it functions. But how does cutting-edge neuroscience research translate into practical advice about how to pass exams, remember names, tot up household bills and find where the hell you left the car in a crowded car park? O’Keefe’s prize was awarded jointly with Norwegian husband-and-wife team Edvard and May-Britt Moser for their discovery of “place and grid cells” that allow rats to chart where they are. When rats run through a new environment, these cells show increased activity. The same activity happens much faster while the rats are asleep, as they replay the new route. We already knew that the part of the brain known as the hippocampus was involved in spatial awareness in birds and mammals, and this latest work on place cells sheds more light on how we know where we are and where we’re going. In 2000, researchers at University College London led by Dr Eleanor Maguire showed that London taxi drivers develop a pumped-up hippocampus after years of doing the Knowledge and navigating the backstreets of the city. MRI scans showed that cabbies start off with bigger hippocampuses than average, and that the area gets bigger the longer they do the job. As driver David Cohen said at the time to BBC News: “I never noticed part of my brain growing – it makes you wonder what happened to the rest of it!” © 2014 Guardian News and Media Limited

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 20199 - Posted: 10.13.2014

By Carl T. Hall Even Clayton Kershaw, the Los Angeles Dodgers’ pitching ace, makes mistakes now and then. And although very few of his mistakes seemed to do Giants hitters much good this season, a team of San Francisco scientists found a way to take full advantage. A new study by UCSF researchers revealed a tendency of the brain’s motion-control system to run off track in a predictable way when we try to perform the same practiced movement over and over. The scientists found the phenomenon first in macaque monkeys, then documented exactly the same thing in Kershaw’s game video. Although he struggled in a playoff appearance last week, the left-hander’s pitching performance during the regular season was among the best on record. It included a minuscule 1.77 earned run average, a nearly flawless no-hitter in June, 239 strikeouts and only 31 walks. He led the major leagues with 10.85 strikeouts per nine innings pitched. In what turned out to be an early warm-up to the playoffs, UCSF scientists Kris Chaisanguanthum, Helen Shen and Philip Sabes delved into the motor-control system of the primate brain. Their study, published in the Journal of Neuroscience, could help design better prosthetic limbs — or make robots that move less like robots and more like Kershaw. Unlike most machines, our brains seem to never stop trying to adapt to new information, making subtle adjustments each time we repeat a particular movement no matter how practiced. This trial-by-trial form of learning has obvious advantages in a fast-changing world, but also seems prone to drift away from spot-on accuracy as those small adjustments go too far.

Related chapters from BP7e: Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 5: The Sensorimotor System
Link ID: 20193 - Posted: 10.11.2014

By Tori Rodriguez Imagining your tennis serve or mentally running through an upcoming speech might help you perform better, studies have shown, but the reasons why have been unclear. A common theory is that mental imagery activates some of the same neural pathways involved in the actual experience, and a recent study in Psychological Science lends support to that idea. Scientists at the University of Oslo conducted five experiments investigating whether eye pupils adjust to imagined light as they do to real light, in an attempt to see whether mental imagery can trigger automatic neural processes such as pupil dilation. Using infrared eye-tracking technology, they measured the diameter of participants' pupils as they viewed shapes of varying brightness and as they imagined the shapes they viewed or visualized a sunny sky or a dark room. In response to imagined light, pupils constricted 87 percent as much as they did during actual viewing, on average; in response to imagined darkness, pupils dilated 56 percent as much as they did during real perception. Two other experiments ruled out the possibility that participants were able to adjust their pupil size at will or that pupils were changing in response to mental effort, which can cause dilation. The finding helps to explain why imagined rehearsals can improve your game. The mental picture activates and strengthens the very neural circuits—even subconscious ones that control automated processes like pupil dilation—that you will need to recruit when it is time to perform. © 2014 Scientific American

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 11: Motor Control and Plasticity
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 5: The Sensorimotor System
Link ID: 20176 - Posted: 10.08.2014

By LAWRENCE K. ALTMAN A British-American scientist and a pair of Norwegian researchers were awarded this year’s Nobel Prize in Physiology or Medicine on Monday for discovering “an inner GPS in the brain” that enables virtually all creatures to navigate their surroundings. John O’Keefe, 75, will receive half of the $1.1 million prize; the other half will be shared by May-Britt Moser, 51, and Edvard I. Moser, 52, only the second married couple to win a Nobel in medicine. The three scientists’ discoveries “have solved a problem that has occupied philosophers and scientists for centuries — how does the brain create a map of the space surrounding us and how can we navigate our way through a complex environment?” said the Karolinska Institute in Sweden, which chooses the laureates. The positioning system they discovered helps us know where we are, find our way from place to place and store the information for the next time, said Goran K. Hansson, secretary of the Karolinska’s Nobel Committee. The researchers documented that certain cells are responsible for the higher cognitive function that steers the navigational system. Dr. O’Keefe began using neurophysiological methods in the late 1960s to study how the brain controls behavior and sense of direction. In 1971, he discovered the first component of the inner navigational system in rats. He identified nerve cells in the hippocampus region of the brain that were always activated when a rat was at a certain location. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 20169 - Posted: 10.07.2014

By Gretchen Vogel Research on how the brain knows where it is has bagged the 2014 Nobel Prize in Physiology or Medicine, the Nobel Committee has announced from Stockholm. One half of the prize goes to John O'Keefe, director of the Sainsbury Wellcome Centre for Neural Circuits and Behaviour at University College London. The other half goes to a husband-and-wife team: May-Britt Moser, who is director of the Centre for Neural Computation in Trondheim, and Edvard Moser, director of the Kavli Institute for Systems Neuroscience in Trondheim. "In 1971, John O’Keefe discovered the first component of this positioning system," the Nobel Committee says in a statement that was just released. "He found that a type of nerve cell in an area of the brain called the hippocampus was always activated when a rat was at a certain place in a room. Other nerve cells were activated when the rat was at other places. O’Keefe concluded that these “place cells” formed a map of the room." "More than three decades later, in 2005, May-Britt and Edvard Moser discovered another key component of the brain’s positioning system," the statement goes on to explain. "They identified another type of nerve cell, which they called “grid cells”, that generate a coordinate system and allow for precise positioning and pathfinding. Their subsequent research showed how place and grid cells make it possible to determine position and to navigate." © 2014 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 1: An Introduction to Brain and Behavior
Link ID: 20163 - Posted: 10.06.2014

Alison Abbott The fact that Edvard and May-Britt Moser have collaborated for 30 years — and been married for 28 — has done nothing to dull their passion for the brain. They talk about it at breakfast. They discuss its finer points at their morning lab meeting. And at a local restaurant on a recent summer evening, they are still deep into a back-and-forth about how their own brains know where they are and will guide them home. “Just to walk there, we have to understand where we are now, where we want to go, when to turn and when to stop,” says May-Britt. “It's incredible that we are not permanently lost.” If anyone knows how we navigate home, it is the Mosers. They shot to fame in 2005 with their discovery of grid cells deep in the brains of rats. These intriguing cells, which are also present in humans, work much like the Global Positioning System, allowing animals to understand their location. The Mosers have since carved out a niche studying how grid cells interact with other specialized neurons to form what may be a complete navigation system that tells animals where they are going and where they have been. Studies of grid cells could help to explain how memories are formed, and why recalling events so often involves re-envisioning a place, such as a room, street or landscape. While pursuing their studies, the two scientists have become a phenomenon. Tall and good-looking, they operate like a single brain in two athletic bodies in their generously funded lab in Trondheim, Norway — a remote corner of northern Europe just 350 kilometres south of the Arctic Circle. They publish together and receive prizes as a single unit — most recently, the Nobel Prize in Physiology or Medicine, which they won this week with their former supervisor, neuroscientist John O’Keefe at University College London. In 2007, while still only in their mid-40s, they won a competition by the Kavli Foundation of Oxnard, California, to build and direct one of only 17 Kavli Institutes around the world. The Mosers are now minor celebrities in their home country, and their institute has become a magnet for other big thinkers in neuroscience. “It is definitely intellectually stimulating to be around them,” says neurobiologist Nachum Ulanovsky from the Weizmann Institute of Science in Rehovot, Israel, who visited the Trondheim institute for the first time in September. © 2014 Nature Publishing Group

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 20162 - Posted: 10.06.2014

By Daisy Yuhas Do we live in a holographic universe? How green is your coffee? And could drinking too much water actually kill you? Before you click those links you might consider how your knowledge-hungry brain is preparing for the answers. A new study from the University of California, Davis, suggests that when our curiosity is piqued, changes in the brain ready us to learn not only about the subject at hand, but incidental information, too. Neuroscientist Charan Ranganath and his fellow researchers asked 19 participants to review more than 100 questions, rating each in terms of how curious they were about the answer. Next, each subject revisited 112 of the questions—half of which strongly intrigued them whereas the rest they found uninteresting—while the researchers scanned their brain activity using functional magnetic resonance imaging (fMRI). During the scanning session participants would view a question, then wait 14 seconds and view a photograph of a face totally unrelated to the trivia before seeing the answer. Afterward the researchers tested participants to see how well they could recall and retain both the trivia answers and the faces they had seen. Ranganath and his colleagues discovered that greater interest in a question would predict not only better memory for the answer but also for the unrelated face that had preceded it. A follow-up test one day later found the same results—people could better remember a face if it had been preceded by an intriguing question. Somehow curiosity could prepare the brain for learning and long-term memory more broadly. The findings are somewhat reminiscent of the work of U.C. Irvine neuroscientist James McGaugh, who has found that emotional arousal can bolster certain memories. But, as the researchers reveal in the October 2 issue of Neuron, curiosity involves very different pathways. © 2014 Scientific American

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20159 - Posted: 10.04.2014

By John Bohannon The victim peers across the courtroom, points at a man sitting next to a defense lawyer, and confidently says, "That's him!" Such moments have a powerful sway on jurors who decide the fate of thousands of people every day in criminal cases. But how reliable is eyewitness testimony? A new report concludes that the use of eyewitness accounts needs tighter control, and among its recommendations is a call for a more scientific approach to how eyewitnesses identify suspects during the classic police lineup. For decades, researchers have been trying to nail down what influences eyewitness testimony and how much confidence to place in it. After a year of sifting through the scientific evidence, a committee of psychologists and criminologists organized by the U.S. National Research Council (NRC) has now gingerly weighed in. "This is a serious issue with major implications for our justice system," says committee member Elizabeth Phelps, a psychologist at New York University in New York City. Their 2 October report, Identifying the Culprit: Assessing Eyewitness Identification, is likely to change the way that criminal cases are prosecuted, says Elizabeth Loftus, a psychologist at the University of California, Irvine, who was an external reviewer of the report. As Loftus puts it, "just because someone says something confidently doesn't mean it's true." Jurors can't help but find an eyewitness’s confidence compelling, even though experiments have shown that a person's confidence in their own memory is sometimes undiminished even in the face of evidence that their memory of an event is false. © 2014 American Association for the Advancement of Science.

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 20157 - Posted: 10.04.2014

Wild marmosets in the Brazilian forest can learn quite successfully from video demonstrations featuring other marmosets, Austrian scientists have reported, showing not only that marmosets are even better learners than previously known, but that video can be used successfully in experiments in the wild. Tina Gunhold, a cognitive biologist at the University of Vienna, had worked with a population of marmoset monkeys in a bit of Brazilian forest before this particular experiment. The forest is not wilderness. It lies near some apartment complexes, and the marmosets are somewhat used to human beings. But the monkeys are wild, and each extended family group has its own foraging territory. Dr. Gunhold and her colleagues reported in the journal Biology Letters this month that they had tested 12 family groups, setting up a series of video monitors, each with a kind of complicated box that they called an “artificial fruit.” All the boxes contained food. Six of the monitors showed just an unchanging image of a marmoset near a similar box. Three of them showed a marmoset opening the box by pulling a drawer, and three others a marmoset lifting a lid to get at the food. Marmosets are very territorial and would not tolerate a strange individual on their turf, but the image of a strange marmoset on video didn’t seem to bother them. Individual marmosets “differed in their reactions to the video,” Dr. Gunhold said. “Some were more shy, some more bold. The younger ones were more attracted to the video, perhaps because of greater curiosity.” © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 20138 - Posted: 09.30.2014

By David Z. Hambrick, Fernanda Ferreira, and John M. Henderson A decade ago, Magnus Carlsen, who at the time was only 13 years old, created a sensation in the chess world when he defeated former world champion Anatoly Karpov at a chess tournament in Reykjavik, Iceland, and the next day played then-top-rated Garry Kasparov—who is widely regarded as the best chess player of all time—to a draw. Carlsen’s subsequent rise to chess stardom was meteoric: grandmaster status later in 2004; a share of first place in the Norwegian Chess Championship in 2006; youngest player ever to reach World No. 1 in 2010; and highest-rated player in history in 2012. What explains this sort of spectacular success? What makes someone rise to the top in music, games, sports, business, or science? This question is the subject of one of psychology’s oldest debates. In the late 1800s, Francis Galton—founder of the scientific study of intelligence and a cousin of Charles Darwin—analyzed the genealogical records of hundreds of scholars, artists, musicians, and other professionals and found that greatness tends to run in families. For example, he counted more than 20 eminent musicians in the Bach family. (Johann Sebastian was just the most famous.) Galton concluded that experts are “born.” Nearly half a century later, the behaviorist John Watson countered that experts are “made” when he famously guaranteed that he could take any infant at random and “train him to become any type of specialist [he] might select—doctor, lawyer, artist, merchant-chief and, yes, even beggar-man and thief, regardless of his talents.” In one study, one player needed 22 times more deliberate practice than another player to become a master. © 2014 The Slate Group LLC.

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 20136 - Posted: 09.30.2014

by Elijah Wolfson @elijahwolfson The class was the most difficult of the fall 2013 semester, and J.D. Leadam had missed all but one lecture. His grandfather’s health had worsened, and he left San Jose State, where he was studying for a degree in business, to return home to Los Angeles to help out. Before he knew it, midterm exams had almost arrived. At this point, Leadam had, for a while, been playing around with transcranial direct-current stimulation, or tDCS, an experimental treatment for all sorts of health issues that, at its most basic, involves running a very weak electric current through the brain. When he first came across tDCS, Leadam was immediately intrigued but thought, “There’s no way I’m gonna put electrodes on my head. It’s just not going to happen.” After extensive research, though, he changed his mind. He looked into buying a device online, but there wasn’t much available — just one extremely expensive machine and then a bare-bones $40 device that didn’t even have a switch. So he dug around online and figured he could build one himself. He bought all the pieces he needed and put it together. He tried it a few times, but didn’t notice much, so he put it aside. But now, with the test looming, he picked it back up. The professor had written a book, and Leadam knew all the information he’d be tested on was written in its pages. “But I’m an auditory learner,” he said, “so I knew it wouldn’t work to just read it.” He strapped on the device, turned it on and read the chapters. “Nothing,” he thought. But when he got to the classroom and put pen to paper, he had a revelation. “I could remember concepts down to the exact paragraphs in the textbook,” Leadam said. “I actually ended up getting an A on the test. I couldn’t believe it.”

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 20130 - Posted: 09.29.2014

By Melissa Dahl Recently, I was visiting my family in Seattle, and we were doing that thing families do: retelling old stories. As we talked, a common theme emerged. My brother hardly remembered anything from our childhood, even the stories in which he was the star player. (That time he fell down the basement steps and needed stitches in the ER? Nope. That panicky afternoon when we all thought he’d disappeared, only to discover he’d been hiding in his room, and then fell asleep? Nothing.) “Boys never remember anything,” my mom huffed. She’s right. Researchers are finding some preliminary evidence that women are indeed better at recalling memories, especially autobiographical ones. Girls and women tend to recall these memories faster and with more specific details, and some studies have demonstrated that these memories tend to be more accurate, too, when compared to those of boys and men. And there’s an explanation for this: It could come down to the way parents talk to their daughters, as compared to their sons, when the children are developing memory skills. To understand this apparent gender divide in recalling memories, it helps to start with early childhood—specifically, ages 2 to 6. Whether you knew it or not, during these years, you learned how to form memories, and researchers believe this happens mostly through conversations with others, primarily our parents. These conversations teach us how to tell our own stories, essentially; when a mother asks her child for more details about something that happened that day in school, for example, she is implicitly communicating that these extra details are essential parts to the story. © 2014 The Slate Group LLC

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 12: Sex: Evolutionary, Hormonal, and Neural Bases
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 8: Hormones and Sex
Link ID: 20100 - Posted: 09.22.2014

By Maria Konnikova At the turn of the twentieth century, Ivan Pavlov conducted the experiments that turned his last name into an adjective. By playing a sound just before he presented dogs with a snack, he taught them to salivate upon hearing the tone alone, even when no food was offered. That type of learning is now called classical—or Pavlovian—conditioning. Less well known is an experiment that Pavlov was conducting at around the same time: when some unfortunate canines heard the same sound, they were given acid. Just as their luckier counterparts had learned to salivate at the noise, these animals would respond by doing everything in their power to get the imagined acid out of their mouths, each “shaking its head violently, opening its mouth and making movements with its tongue.” For many years, Pavlov’s classical conditioning findings overshadowed the darker version of the same discovery, but, in the nineteen-eighties, the New York University neuroscientist Joseph LeDoux revived the technique to study the fear reflex in rats. LeDoux first taught the rats to associate a certain tone with an electric shock so that they froze upon hearing the tone alone. In essence, the rat had formed a new memory—that the tone signifies pain. He then blunted that memory by playing the tone repeatedly without following it with a shock. After multiple shock-less tones, the animals ceased to be afraid. Now a new generation of researchers is trying to figure out the next logical step: re-creating the same effects within the brain, without deploying a single tone or shock. Is memory formation now understood well enough that memories can be implanted and then removed absent the environmental stimulus?

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20097 - Posted: 09.19.2014

By Elizabeth Pennisi "What's for dinner?" The words roll off the tongue without even thinking about it—for adults, at least. But how do humans learn to speak as children? Now, a new study in mice shows how a gene called FOXP2, implicated in a language disorder, may have changed between humans and chimps to make learning to speak possible—or at least a little easier. As a uniquely human trait, language has long baffled evolutionary biologists. Not until FOXP2 was linked to a genetic disorder that caused problems in forming words could they even begin to study language’s roots in our genes. Soon after that discovery, a team at the Max Planck Institute for Evolutionary Anthropology in Leipzig, Germany, discovered that just two bases, the letters that make up DNA, distinguished the human and chimp versions of FOXP2. To try to determine how those changes influenced the gene's function, that group put the human version of the gene in mice. In 2009, they observed that these "humanized" mice produced more frequent and complex alarm calls, suggesting the human mutations may have been involved in the evolution of more complex speech. Another study showed that humanized mice have different activity in the part of the brain called the striatum, which is involved in learning, among other tasks. But the details of how the human FOXP2 mutations might affect real-world learning remained murky. To solve the mystery, the Max Planck researchers sent graduate student Christiane Schreiweis to work with Ann Graybiel, a neuroscientist at the Massachusetts Institute of Technology in Cambridge. She's an expert in testing mouse smarts by seeing how quickly they can learn to find rewards in mazes. © 2014 American Association for the Advancement of Science

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 15: Language and Our Divided Brain
Link ID: 20081 - Posted: 09.16.2014

By BENEDICT CAREY Imagine that on Day 1 of a difficult course, before you studied a single thing, you got hold of the final exam. The motherlode itself, full text, right there in your email inbox — attached mistakenly by the teacher, perhaps, or poached by a campus hacker. No answer key, no notes or guidelines. Just the questions. Would that help you study more effectively? Of course it would. You would read the questions carefully. You would know exactly what to focus on in your notes. Your ears would perk up anytime the teacher mentioned something relevant to a specific question. You would search the textbook for its discussion of each question. If you were thorough, you would have memorized the answer to every item before the course ended. On the day of that final, you would be the first to finish, sauntering out with an A+ in your pocket. And you would be cheating. But what if, instead, you took a test on Day 1 that was just as comprehensive as the final but not a replica? You would bomb the thing, for sure. You might not understand a single question. And yet as disorienting as that experience might feel, it would alter how you subsequently tuned into the course itself — and could sharply improve your overall performance. This is the idea behind pretesting, one of the most exciting developments in learning science. Across a variety of experiments, psychologists have found that, in some circumstances, wrong answers on a pretest aren’t merely useless guesses. Rather, the attempts themselves change how we think about and store the information contained in the questions. On some kinds of tests, particularly multiple-choice, we benefit from answering incorrectly by, in effect, priming our brain for what’s coming later. That is: The (bombed) pretest drives home the information in a way that studying as usual does not. We fail, but we fail forward. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 20043 - Posted: 09.08.2014