Links for Keyword: Learning & Memory

Follow us on Facebook and Twitter, or subscribe to our mailing list, to receive news updates. Learn more.


Links 41 - 60 of 886

By PAM BELLUCK What happens to forgotten memories — old computer passwords, friends’ previous phone numbers? Scientists have long held two different theories. One is that memories do not diminish but simply get overshadowed by new memories. The other is that older memories become weaker, that pulling to mind new passwords or phone numbers degrades old recollections so they do not interfere. The difference could be significant. If old memories stay strong and are merely papered over by new ones, they may be easier to recover. That could be positive for someone trying to remember an acquaintance’s name, but difficult for someone trying to lessen memories of abuse. It could suggest different strategies for easing traumatic memories, evaluating witness testimony about crimes, or helping students study for tests. Now, a study claims to provide evidence of memory’s weakening by showing that people’s ability to remember something and the pattern of brain activity that thing generates both appear to diminish when a competing memory gets stronger. Demonstrating sophisticated use of brain scans in memory research, authors of the study, published Monday in the journal Nature Neuroscience, appear to have identified neural fingerprints of specific memories, distinguishing brain activity patterns produced when viewing a picture of a necklace, say, from a picture of binoculars or other objects. The experiment, conducted by scientists in Birmingham and Cambridge, England, involved several stages, with 24 participants first trained to associate words with two unrelated black-and-white pictures from lists of famous people, ordinary objects or scenes. © 2015 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20695 - Posted: 03.17.2015

By Douglas Starr In 1906, Hugo Münsterberg, the chair of the psychology laboratory at Harvard University and the president of the American Psychological Association, wrote in the Times Magazine about a case of false confession. A woman had been found dead in Chicago, garroted with a copper wire and left in a barnyard, and the simpleminded farmer’s son who had discovered her body stood accused. The young man had an alibi, but after questioning by police he admitted to the murder. He did not simply confess, Münsterberg wrote; “he was quite willing to repeat his confession again and again. Each time it became richer in detail.” The young man’s account, he continued, was “absurd and contradictory,” a clear instance of “the involuntary elaboration of a suggestion” from his interrogators. Münsterberg cited the Salem witch trials, in which similarly vulnerable people were coerced into self-incrimination. He shared his opinion in a letter to a Chicago nerve specialist, which made the local press. A week later, the farmer’s son was hanged. Münsterberg was ahead of his time. It would be decades before the legal and psychological communities began to understand how powerfully suggestion can shape memory and, in turn, the course of justice. In the early nineteen-nineties, American society was recuperating from another panic over occult influence; Satanists had replaced witches. One case, the McMartin Preschool trial, hinged on nine young victims’ memories of molestation and ritual abuse—memories that they had supposedly forgotten and then, after being interviewed, recovered. The case fell apart, in 1990, because the prosecution could produce no persuasive evidence of the victims’ claims. A cognitive psychologist named Elizabeth Loftus, who had consulted on the case, wondered whether the children’s memories might have been fabricated—in Münsterberg’s formulation, involuntarily elaborated—rather than actually recovered.

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20679 - Posted: 03.12.2015

Mo Costandi Neuroscientists in France have implanted false memories into the brains of sleeping mice. Using electrodes to directly stimulate and record the activity of nerve cells, they created artificial associative memories that persisted while the animals snoozed and then influenced their behaviour when they awoke. Manipulating memories by tinkering with brain cells is becoming routine in neuroscience labs. Last year, one team of researchers used a technique called optogenetics to label the cells encoding fearful memories in the mouse brain and to switch the memories on and off, and another used it to identify the cells encoding positive and negative emotional memories, so that they could convert positive memories into negative ones, and vice versa. The new work, published today in the journal Nature Neuroscience, shows for the first time that artificial memories can be implanted into the brains of sleeping animals. It also provides more details about how populations of nerve cells encode spatial memories, and about the important role that sleep plays in making such memories stronger. Karim Benchenane of the French National Centre for Scientific Research (CNRS) in Paris and his colleagues implanted electrodes into the brains of 40 mice, targeting the medial forebrain bundle (MFB), a component of the reward circuitry, and the CA1 region of the hippocampus, which contains at least three different cell types that encode the memories needed for spatial navigation. © 2015 Guardian News and Media Limited

Related chapters from BP7e: Chapter 14: Biological Rhythms, Sleep, and Dreaming; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 10: Biological Rhythms and Sleep; Chapter 13: Memory, Learning, and Development
Link ID: 20677 - Posted: 03.10.2015

by Catherine Lawson Over the last six years Adam Gazzaley's research has undergone a transformation. He has moved from studying how the brain works to studying the brain as it ages, and then to applying methods he has developed to improve the brain's functions. At WIRED Health 2015 he'll outline his vision of the future, one where "we're thinking about software and hardware as medicine". In particular, Gazzaley plans to talk to the WIRED Health audience about video games "that are custom-designed to challenge the brain in a very particular way". Gazzaley's team at University of California, San Francisco previously demonstrated that a custom-designed video game can be highly effective in treating a specific cognitive deficit. They developed NeuroRacer, a driving game aimed at improving multi-tasking skills in older people. The success of NeuroRacer propelled Gazzaley into new partnerships, giving him access to resources that further advance his games development program into areas like motion capture and virtual reality. He's excited about coupling his games with mobile devices that will allow them to function outside the lab. Gazzaley will talk about four new games he's working on, in particular a meditation-inspired one. Meditrain is the product of his collaboration with Buddhist author and teacher Jack Kornfield. Developed for the iPad, he hopes to demonstrate part of it at WIRED Health.

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20593 - Posted: 02.19.2015

Carl Zimmer In 2010, a graduate student named Tamar Gefen got to know a remarkable group of older people. They had volunteered for a study of memory at the Feinberg School of Medicine at Northwestern University. Although they were all over age 80, Ms. Gefen and her colleagues found that they scored as well on memory tests as people in their 50s. Some complained that they remembered too much. She and her colleagues referred to them as SuperAgers. Many were also friends. “A couple tried to set me up with their grandsons,” Ms. Gefen said. She was impressed by their resilience and humor: “It takes wisdom to a whole new level.” Recently, Ms. Gefen’s research has taken a sharp turn. At the outset of the study, the volunteers agreed to donate their brains for medical research. Some of them have died, and it has been Ms. Gefen’s job to look for anatomical clues to their extraordinary minds. “I had this enormous privilege I can’t even begin to describe,” she said. “I knew them and tested them in life and in death. At the end, I was the one looking at them through a microscope.” Ms. Gefen and her colleagues are now starting to publish the results of these post-mortem studies. Last month in The Journal of Neuroscience, the scientists reported that one of the biggest differences involves peculiar, oversize brain cells known as von Economo neurons. SuperAgers have almost five times as many of them as other people. Learning what makes these brains special could help point researchers to treatments for Alzheimer’s disease and other kinds of mental decline. But it is hard to say how an abundance of von Economo neurons actually helps the brain. © 2015 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 14: Attention and Consciousness
Link ID: 20577 - Posted: 02.13.2015

By Amy Ellis Nutt When we tell stories about our lives, most of us never have our memories questioned. NBC's Brian Williams, like other high-profile people in the past, is finding out what happens when questions arise. Williams's faux pas – retelling a story of his helicopter coming under fire in Iraq a dozen years ago when it was actually the helicopter flying ahead of him – was much like Hillary Rodham Clinton's during the 2008 presidential campaign. Her story was about coming under fire during a visit to an airfield in Bosnia 12 years earlier. George W. Bush also misremembered when, on several occasions, he told audiences that on 9/11 he watched the first plane fly into the north tower of the World Trade Center on TV, just before entering that classroom in Florida to read a book to school kids. In each case, these were highly emotional moments. Williams's helicopter made an emergency landing in the desert behind the aircraft that was hit; Clinton was made to don a flak jacket and was told her airplane might not be able to land at the airport in Bosnia because of sniper fire in the area; and Bush was told by an aide about the first crash into World Trade Center just before entering the classroom. That each of those memories was false created huge public relations headaches for Clinton and Williams. But the fact is that false memories are not that uncommon, especially when they involve highly emotional events. Scientists have been telling us for years that memory of autobiographical events, also known as episodic memory, is pliable and even unreliable. The consensus from neuroimaging studies and laboratory experiments is that episodic memory is not like replaying a film but more like reconstructing an event from bits and pieces of information. Memories are stored in clusters of neurons called engrams, and the proteins responsible for storing those memories, scientists say, are modified and changed just by the reconstruction process of remembering.

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20566 - Posted: 02.09.2015

By Kate Baggaley Stem cells can help heal long-term brain damage suffered by rats blasted with radiation, researchers report in the Feb. 5 Cell Stem Cell. The treatment allows the brain to rebuild the insulation on its nerve cells so they can start carrying messages again. The researchers directed human stem cells to become a type of brain cell that is destroyed by radiation, a common cancer treatment, then grafted the cells into the brains of irradiated rats. Within a few months, the rats’ performance on learning and memory tests improved. “This technique, translated to humans, could be a major step forward for the treatment of radiation-induced brain … injury,” says Jonathan Glass, a neurologist at Emory University in Atlanta. Steve Goldman, a neurologist at the University of Rochester in New York, agrees that the treatment could repair a lot of the damage caused by radiation. “Radiation therapy … is very effective, but the problem is patients end up with severe disability,” he says. “Fuzzy thinking, a loss in higher intellectual functions, decreases in memory — all those are part and parcel of radiation therapy to the brain.” For children, the damage can be profound. “Those kids have really significant detriments in their adult IQs,” Goldman says. Radiation obliterates cells that mature into oligodendrocytes, a type of cell that coats the message-carrying part of nerve cells with insulation. Without that cover, known as the myelin sheath, nerve cells can’t transmit information, leading to memory and other brain problems. © Society for Science & the Public 2000 - 2015

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 2: Functional Neuroanatomy: The Nervous System and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 2: Cells and Structures: The Anatomy of the Nervous System
Link ID: 20563 - Posted: 02.07.2015

By Maria Konnikova R. T. first heard about the Challenger explosion as she and her roommate sat watching television in their Emory University dorm room. A news flash came across the screen, shocking them both. R. T., visibly upset, raced upstairs to tell another friend the news. Then she called her parents. Two and a half years after the event, she remembered it as if it were yesterday: the TV, the terrible news, the call home. She could say with absolute certainty that that’s precisely how it happened. Except, it turns out, none of what she remembered was accurate. R. T. was a student in a class taught by Ulric Neisser, a cognitive psychologist who had begun studying memory in the seventies. Early in his career, Neisser became fascinated by the concept of flashbulb memories—the times when a shocking, emotional event seems to leave a particularly vivid imprint on the mind. William James had described such impressions, in 1890, as “so exciting emotionally as almost to leave a scar upon the cerebral tissues.” The day following the explosion of the Challenger, in January, 1986, Neisser, then a professor of cognitive psychology at Emory, and his assistant, Nicole Harsch, handed out a questionnaire about the event to the hundred and six students in their ten o’clock psychology 101 class, “Personality Development.” Where were the students when they heard the news? Whom were they with? What were they doing? The professor and his assistant carefully filed the responses away. In the fall of 1988, two and a half years later, the questionnaire was given a second time to the same students. It was then that R. T. recalled, with absolute confidence, her dorm-room experience. But when Neisser and Harsch compared the two sets of answers, they found barely any similarities.

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20548 - Posted: 02.05.2015

By BENEDICT CAREY The surge of emotion that makes memories of embarrassment, triumph and disappointment so vivid can also reach back in time, strengthening recall of seemingly mundane things that happened just beforehand and that, in retrospect, are relevant, a new study has found. The report, published Wednesday in the journal Nature, suggests that the television detective’s standard query — “Do you remember any unusual behavior in the days before the murder?” — is based on solid brain science, at least in some circumstances. The findings fit into the predominant theory of memory: that it is an adaptive process, continually updating itself according to what knowledge may be important in the future. The new study suggests that human memory has, in effect, a just-in-case file, keeping seemingly trivial sights, sounds and observations in cold storage for a time in case they become useful later on. But the experiment said nothing about the effect of trauma, which shapes memory in unpredictable ways. Rather, it aimed to mimic the arousals of daily life: The study used mild electric shocks to create apprehension and measured how the emotion affected memory of previously seen photographs. In earlier work, researchers had found plenty of evidence in animals and humans of this memory effect, called retroactive consolidation. The new study shows that the effect applies selectively to related, relevant information. “The study provides strong evidence for a specific kind of retroactive enhancement,” said Daniel L. Schacter, a professor of psychology at Harvard who was not involved in the research. “The findings go beyond what we’ve found previously in humans.” © 2015 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20507 - Posted: 01.22.2015

Closing your eyes when trying to recall events increases the chances of accuracy, researchers at the University of Surrey suggest. Scientists tested people's ability to remember details of films showing fake crime scenes. They hope the studies will help witnesses recall details more accurately when questioned by police. They say establishing a rapport with the person asking the questions can also help boost memory. Writing in the journal Legal and Criminological Psychology, scientists tested 178 participants in two separate experiments. In the first, they asked volunteers to watch a film showing an electrician entering a property, carrying out work and then stealing a number of items. Volunteers were then questioned in one of four groups. People were either asked questions with their eyes open or closed, and after a sense of rapport had been built with the interviewer or no attempt had been made to create a friendly introduction. People who had some rapport with their interviewer and had their eyes shut throughout questioning answered three-quarters of the 17 questions correctly. But those who did not have a friendly introduction with the interviewer and had their eyes open answered 41% correctly. The analysis showed that eye closing had the strongest impact on remembering details correctly, but that feeling comfortable during the interview also helped. In the second experiment, people were asked to remember details of what they had heard during a mock crime scene. © 2015 BBC

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 18: Attention and Higher Cognition
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 14: Attention and Consciousness
Link ID: 20493 - Posted: 01.17.2015

by Michael Hotchkiss Forget about it. Your brain is a memory powerhouse, constantly recording experiences in long-term memory. Those memories help you find your way through the world: Who works the counter each morning at your favorite coffee shop? How do you turn on the headlights of your car? What color is your best friend's house? But then your barista leaves for law school, you finally buy a new car and your buddy spends the summer with a paint brush in hand. Suddenly, your memories are out of date. What happens next? An experiment conducted by researchers from Princeton University and the University of Texas-Austin shows that the human brain uses memories to make predictions about what it expects to find in familiar contexts. When those subconscious predictions are shown to be wrong, the related memories are weakened and are more likely to be forgotten. And the greater the error, the more likely you are to forget the memory. "This has the benefit ultimately of reducing or eliminating noisy or inaccurate memories and prioritizing those things that are more reliable and that are more accurate in terms of the current state of the world," said Nicholas Turk-Browne, an associate professor of psychology at Princeton and one of the researchers. The research was featured in an article, "Pruning of memories by context-based prediction error," that appeared in 2014 in the Proceedings of the National Academy of Sciences. The other co-authors are Ghootae Kim, a Princeton graduate student; Jarrod Lewis-Peacock, an assistant professor of psychology at the University of Texas-Austin; and Kenneth Norman, a Princeton professor of psychology and the Princeton Neuroscience Institute. © Medical Xpress 2011-2014

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20469 - Posted: 01.10.2015

George Johnson Training a dog to salivate at the sound of a bell would have seemed pretty stupid to Ivan Pavlov. He was after much bigger things. Using instruments like metronomes and harmoniums, he demonstrated that a dog could make astonishingly fine discriminations — distinguishing between a rhythm of 96 and 104 beats a minute or an ascending and a descending musical scale. But what he really wanted to know was what his animals were thinking. His dream was a grand theory of the mind. He couldn’t put his subjects on a couch like his colleague Freud and ask them to free-associate, so he gauged their reactions to a variety of stimuli, meticulously counting their “psychic secretions,” those droplets of drool. He knew he was pricking at the skin of something deeper. “It would be stupid,” he said, “to reject the subjective world.” This is not the Pavlov most people think they know. In an excellent new biography, “Ivan Pavlov: A Russian Life in Science,” Daniel P. Todes, a medical historian, describes a man whose laboratory in pre-Soviet Russia was like an early-20th-century version of the White House Brain Initiative, with its aim “to revolutionize our understanding of the human mind.” That was also Pavlov’s goal: to build a science that would “brightly illuminate our mysterious nature” and “our consciousness and its torments.” He spoke those words 111 years ago and spent his life pursuing his goal. Yet when we hear his name, we reflexively think of a drooling dog and a clanging bell. Our brains have been conditioned with the myth. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20439 - Posted: 12.23.2014

By David Noonan It was the day before Christmas, and the normally busy MIT laboratory on Vassar Street in Cambridge was quiet. But creatures were definitely stirring, including a mouse that would soon be world famous. Steve Ramirez, a 24-year-old doctoral student at the time, placed the mouse in a small metal box with a black plastic floor. Instead of curiously sniffing around, though, the animal instantly froze in terror, recalling the experience of receiving a foot shock in that same box. It was a textbook fear response, and if anything, the mouse’s posture was more rigid than Ramirez had expected. Its memory of the trauma must have been quite vivid. Which was amazing, because the memory was bogus: The mouse had never received an electric shock in that box. Rather, it was reacting to a false memory that Ramirez and his MIT colleague Xu Liu had planted in its brain. “Merry Freaking Christmas,” read the subject line of the email Ramirez shot off to Liu, who was spending the 2012 holiday in Yosemite National Park. The observation culminated more than two years of a long-shot research effort and supported an extraordinary hypothesis: Not only was it possible to identify brain cells involved in the encoding of a single memory, but those specific cells could be manipulated to create a whole new “memory” of an event that never happened. “It’s a fantastic feat,” says Howard Eichenbaum, a leading memory researcher and director of the Center for Neuroscience at Boston University, where Ramirez did his undergraduate work. “It’s a real breakthrough that shows the power of these techniques to address fundamental questions about how the brain works.”

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20418 - Posted: 12.16.2014

By Gary Stix Our site recently ran a great story about how brain training really doesn’t endow you instantly with genius IQ. The games you play just make you better at playing those same games. They aren’t a direct route to a Mensa membership. Just a few days before that story came out, the Proceedings of the National Academy of Sciences published a report suggesting that playing action video games—Call of Duty: Black Ops II and the like—actually lets gamers learn the essentials of a particular visual task (the orientation of a Gabor signal—don’t ask) more rapidly than non-gamers, a skill that has real-world relevance beyond the confines of the artificial reality of the game itself. As psychologists say, it has “transfer effects.” Gamers appear to have learned how to do stuff like home in quickly on a target or multitask better than those who inhabit the non-gaming world. Their skills might, in theory, make them great pilots or laparoscopic surgeons, not just high scorers among their peers. Action video games are not billed as brain training, but both Call of Duty and nominally accredited training programs like Lumosity are structured as computer games. So what’s going on here? Every new finding about brain training as B.S. appears to be contradicted by another that points to the promise of cognitive exercise, if that’s what you call a session with Call of Duty. It may boil down to a realization that the whole story about exercising your neurons to keep the brain supple may be a lot less simple than proponents make it out to be. © 2014 Scientific American

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20409 - Posted: 12.13.2014

By Bret Stetka When University of Bonn psychologist Monika Eckstein designed her latest published study, the goal was simple: administer a hormone into the noses of 62 men in hopes that their fear would go away. And for the most part, it did. The hormone was oxytocin, often called our “love hormone” due to its crucial role in mother-child relationships, social bonding, and intimacy (levels soar during sex). But it also seems to have a significant antianxiety effect. Give oxytocin to people with certain anxiety disorders, and activity in the amygdala—the primary fear center in human and other mammalian brains, two almond-shaped bits of brain tissue sitting deep beneath our temples—falls. The amygdala normally buzzes with activity in response to potentially threatening stimuli. When an organism repeatedly encounters a stimulus that at first seemed frightening but turns out to be benign—like, say, a balloon popping—a brain region called the prefrontal cortex inhibits amygdala activity. But in cases of repeated presentations of an actual threat, or in people with anxiety who continually perceive a stimulus as threatening, amygdala activity doesn’t subside and fear memories are more easily formed. To study the effects of oxytocin on the development of these fear memories, Eckstein and her colleagues first subjected study participants to Pavlovian fear conditioning, in which neutral stimuli (photographs of faces and houses) were sometimes paired with electric shocks. Subjects were then randomly assigned to receive either a single intranasal dose of oxytocin or a placebo. Thirty minutes later they received functional MRI scans while undergoing simultaneous fear extinction therapy, a standard approach to anxiety disorders in which patients are continually exposed to an anxiety-producing stimulus until they no longer find it stressful. In this case they were again exposed to images of faces and houses, but this time minus the electric shocks. © 2014 Scientific American

Related chapters from BP7e: Chapter 15: Emotions, Aggression, and Stress; Chapter 17: Learning and Memory
Related chapters from MM:Chapter 11: Emotions, Aggression, and Stress; Chapter 13: Memory, Learning, and Development
Link ID: 20397 - Posted: 12.06.2014

By CHRISTOPHER F. CHABRIS and DANIEL J. SIMONS NEIL DEGRASSE TYSON, the astrophysicist and host of the TV series “Cosmos,” regularly speaks to audiences on topics ranging from cosmology to climate change to the appalling state of science literacy in America. One of his staple stories hinges on a line from President George W. Bush’s speech to Congress after the 9/11 terrorist attacks. In a 2008 talk, for example, Dr. Tyson said that in order “to distinguish we from they” — meaning to divide Judeo-Christian Americans from fundamentalist Muslims — Mr. Bush uttered the words “Our God is the God who named the stars.” Dr. Tyson implied that President Bush was prejudiced against Islam in order to make a broader point about scientific awareness: Two-thirds of the named stars actually have Arabic names, given to them at a time when Muslims led the world in astronomy — and Mr. Bush might not have said what he did if he had known this fact. This is a powerful example of how our biases can blind us. But not in the way Dr. Tyson thought. Mr. Bush wasn’t blinded by religious bigotry. Instead, Dr. Tyson was fooled by his faith in the accuracy of his own memory. In his post-9/11 speech, Mr. Bush actually said, “The enemy of America is not our many Muslim friends,” and he said nothing about the stars. Mr. Bush had indeed once said something like what Dr. Tyson remembered; in 2003 Mr. Bush said, in tribute to the astronauts lost in the Columbia space shuttle explosion, that “the same creator who names the stars also knows the names of the seven souls we mourn today.” Critics pointed these facts out; some accused Dr. Tyson of lying and argued that the episode should call into question his reliability as a scientist and a public advocate. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20387 - Posted: 12.03.2014

By David Z. Hambrick If you’ve spent more than about 5 minutes surfing the web, listening to the radio, or watching TV in the past few years, you will know that cognitive training—better known as “brain training”—is one of the hottest new trends in self improvement. Lumosity, which offers web-based tasks designed to improve cognitive abilities such as memory and attention, boasts 50 million subscribers and advertises on National Public Radio. Cogmed claims to be “a computer-based solution for attention problems caused by poor working memory,” and BrainHQ will help you “make the most of your unique brain.” The promise of all of these products, implied or explicit, is that brain training can make you smarter—and make your life better. Yet, according to a statement released by the Stanford University Center on Longevity and the Berlin Max Planck Institute for Human Development, there is no solid scientific evidence to back up this promise. Signed by 70 of the world’s leading cognitive psychologists and neuroscientists, the statement minces no words: "The strong consensus of this group is that the scientific literature does not support claims that the use of software-based “brain games” alters neural functioning in ways that improve general cognitive performance in everyday life, or prevent cognitive slowing and brain disease." The statement also cautions that although some brain training companies “present lists of credentialed scientific consultants and keep registries of scientific studies pertinent to cognitive training…the cited research is [often] only tangentially related to the scientific claims of the company, and to the games they sell.” © 2014 Scientific American,

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20380 - Posted: 12.02.2014

by Andy Coghlan What would Stuart Little make of it? Mice have been created whose brains are half human. As a result, the animals are smarter than their siblings. The idea is not to mimic fiction, but to advance our understanding of human brain diseases by studying them in whole mouse brains rather than in dishes. The altered mice still have mouse neurons – the "thinking" cells that make up around half of all their brain cells. But practically all the glial cells in their brains, the ones that support the neurons, are human. "It's still a mouse brain, not a human brain," says Steve Goldman of the University of Rochester Medical Center in New York. "But all the non-neuronal cells are human." Goldman's team extracted immature glial cells from donated human fetuses. They injected them into mouse pups where they developed into astrocytes, a star-shaped type of glial cell. Within a year, the mouse glial cells had been completely usurped by the human interlopers. The 300,000 human cells each mouse received multiplied until they numbered 12 million, displacing the native cells. "We could see the human cells taking over the whole space," says Goldman. "It seemed like the mouse counterparts were fleeing to the margins." Astrocytes are vital for conscious thought, because they help to strengthen the connections between neurons, called synapses. Their tendrils are involved in coordinating the transmission of electrical signals across synapses. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20375 - Posted: 12.01.2014

By BENEDICT CAREY Quick: Which American president served before slavery ended, John Tyler or Rutherford B. Hayes? If you need Google to get the answer, you are not alone. (It is Tyler.) Collective cultural memory — for presidents, for example — works according to the same laws as the individual kind, at least when it comes to recalling historical names and remembering them in a given order, researchers reported on Thursday. The findings suggest that leaders who are well known today, like the elder President George Bush and President Bill Clinton, will be all but lost to public memory in just a few decades. The particulars from the new study, which tested Americans’ ability to recollect the names of past presidents, are hardly jaw-dropping: People tend to recall best the presidents who served recently, as well as the first few in the country’s history. They also remember those who navigated historic events, like the ending of slavery (Abraham Lincoln) and World War II (Franklin D. Roosevelt). But the broader significance of the report — the first to measure forgetfulness over a 40-year period, using a constant list — is that societies collectively forget according to the same formula as, say, a student who has studied a list of words. Culture imitates biology, even though the two systems work in vastly different ways. The new paper was published in the journal Science. “It’s an exciting study, because it mixes history and psychology and finds this one-on-one correspondence” in the way memory functions, said David C. Rubin, a psychologist at Duke University who was not involved in the research. The report is based on four surveys by psychologists now at Washington University in St. Louis, conducted from 1974 to 2014. In the first three, in 1974, 1991 and 2009, Henry L. Roediger III gave college students five minutes to write down as many presidents as they could remember, in order. © 2014 The New York Times Company
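The article says societies "collectively forget according to the same formula" as an individual studying a word list, but does not state the formula. The classic model of individual forgetting (going back to Ebbinghaus) is a smooth decay curve, often approximated as exponential. As a rough illustration only (the recall numbers below are invented for the sketch, not taken from the Science study), fitting such a curve can be done with a log-linear least-squares regression:

```python
import math

# Hypothetical recall fractions (NOT data from the study): proportion of
# respondents naming a president, by decades since he left office.
decades = [1, 2, 3, 4, 5, 6]
recall = [0.85, 0.62, 0.45, 0.33, 0.24, 0.18]

# Exponential forgetting model: recall(t) = a * exp(-b * t).
# Taking logs gives a straight line, ln(recall) = ln(a) - b * t,
# which we fit by ordinary least squares in pure Python.
xs = decades
ys = [math.log(r) for r in recall]
n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - slope * mean_x
a, b = math.exp(intercept), -slope  # b > 0 means recall decays over time


def predicted_recall(t):
    """Fraction still recalled after t decades under the fitted curve."""
    return a * math.exp(-b * t)
```

Under this (assumed) model, the decay rate `b` is the single number that summarizes how fast a name fades from collective memory; the study's point is that the same functional form fits both a society's 40 years of surveys and an individual's word-list experiment.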

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20364 - Posted: 11.29.2014

By Linda Searing

THE QUESTION Keeping your brain active by working is widely believed to protect memory and thinking skills as we age. Does the type of work matter?

THIS STUDY involved 1,066 people who, at an average age of 70, took a battery of tests to measure memory, processing speed and cognitive ability. The jobs they had held were rated by the complexity of dealings with people, data and things. Those whose main jobs required complex work, especially in dealings with people — such as social workers, teachers, managers, graphic designers and musicians — had higher cognitive scores than those who had held jobs requiring less-complex dealings, such as construction workers, food servers and painters. Overall, more-complex occupations were tied to higher cognitive scores, regardless of someone’s IQ, education or environment.

WHO MAY BE AFFECTED? Older adults. Cognitive abilities change with age, so it can take longer to recall information or remember where you placed your keys. That is normal and not the same thing as dementia, which involves severe memory loss as well as declining ability to function day to day. Commonly suggested ways to maintain memory and thinking skills include staying socially active, eating healthfully and getting adequate sleep as well as such things as doing crossword puzzles, learning to play a musical instrument and taking varied routes to common destinations when driving.

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20359 - Posted: 11.26.2014