Links for Keyword: Learning & Memory

Links 21 - 40 of 819

Memory can be boosted by using a magnetic field to stimulate part of the brain, a study has shown. The effect lasts at least 24 hours after the stimulation is given, improving the ability of volunteers to remember words linked to photos of faces. Scientists believe the discovery could lead to new treatments for loss of memory function caused by ageing, strokes, head injuries and early Alzheimer's disease. Dr Joel Voss, from Northwestern University in Chicago, said: "We show for the first time that you can specifically change memory functions of the brain in adults without surgery or drugs, which have not proven effective. This non-invasive stimulation improves the ability to learn new things. It has tremendous potential for treating memory disorders." The scientists focused on associative memory, the ability to learn and remember relationships between unrelated items. An example of associative memory would be linking someone to a particular restaurant where you both once dined. It involves a network of different brain regions working in concert with a key memory structure called the hippocampus, which has been compared to an "orchestra conductor" directing brain activity. Stimulating the hippocampus caused the "musicians" – the brain regions – to "play" more in time, thereby tightening up their performance. A total of 16 volunteers aged 21-40 took part in the study, agreeing to undergo 20 minutes of transcranial magnetic stimulation (TMS) every day for five days. © 2014 Guardian News and Media Limited

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20015 - Posted: 08.30.2014

by Michael Slezak It's odourless, colourless, tasteless and mostly non-reactive – but it may help you forget. Xenon gas has been shown to erase fearful memories in mice, raising the possibility that it could be used to treat post-traumatic stress disorder (PTSD) if the results are replicated in a human trial next year. The method exploits a neurological process known as "reconsolidation". When memories are recalled, they seem to get re-encoded, almost like a new memory. When this process is taking place, the memories become malleable and can be subtly altered. This new research suggests that at least in mice, the reconsolidation process might be partially blocked by xenon, essentially erasing fearful memories. Among other things, xenon is used as an anaesthetic. Edward Meloni and his colleagues at Harvard Medical School in Boston trained mice to be afraid of a sound by placing them in a cage and giving them an electric shock after the sound was played. Thereafter, if the mice heard the noise, they would become frightened and freeze. Later, the team played the sound and then gave the mice either a low dose of xenon gas for an hour or just exposed them to normal air. Mice that were exposed to xenon froze for less time in response to the sound than the other mice. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 20014 - Posted: 08.30.2014

By PAM BELLUCK Memories and the feelings associated with them are not set in stone. You may have happy memories about your family’s annual ski vacation, but if you see a tragic accident on the slopes, those feelings may change. You might even be afraid to ski that mountain again. Now, using a technique in which light is used to switch neurons on and off, neuroscientists at the Massachusetts Institute of Technology appear to have unlocked some secrets about how the brain attaches emotions to memories and how those emotions can be adjusted. Their research, published Wednesday in the journal Nature, was conducted on mice, not humans, so the findings cannot immediately be translated to the treatment of patients. But experts said the experiments may eventually lead to more effective therapies for people with psychological problems such as depression, anxiety or post-traumatic stress disorder. “Imagine you can go in and find a particular traumatic memory and turn it off or change it somehow,” said David Moorman, an assistant professor of psychological and brain sciences at the University of Massachusetts Amherst, who was not involved in the research. “That’s still science fiction, but with this we’re getting a lot closer to it.” The M.I.T. scientists labeled neurons in the brains of mice with a light-sensitive protein and used pulses of light to switch the cells on and off, a technique called optogenetics. Then they identified patterns of neurons activated when mice created a negative memory or a positive one. A negative memory formed when mice received a mild electric shock to their feet; a positive one was formed when the mice, all male, were allowed to spend time with female mice. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20010 - Posted: 08.28.2014

by Penny Sarchet Memory is a fickle beast. A bad experience can turn a once-loved coffee shop or holiday destination into a place to be avoided. Now experiments in mice have shown how such associations can be reversed. When forming a memory of a place, the details of the location and the associated emotions are encoded in different regions of the brain. Memories of the place are formed in the hippocampus, whereas positive or negative associations are encoded in the amygdala. In experiments with mice in 2012, a group led by Susumu Tonegawa of the Massachusetts Institute of Technology managed to trigger the fear part of a memory associated with a location when the animals were in a different location. They used a technique known as optogenetics, which involves genetically engineering mice so that their brains produce a light-sensitive protein in response to a certain cue. In this case, the cue was the formation of the location memory. This meant the team could make the mouse recall the location just by flashing pulses of light down an optical fibre embedded in the skull. The mice were given electric shocks while their memories of the place were being formed, so that the animals learned to associate that location with pain. Once trained, the mice were put in a new place and a pulse of light was flashed into their brains. This activated the neurons associated with the original location memory and the mice froze, terrified of a shock, demonstrating that the emotion associated with the original location could be induced by reactivating the memory of the place. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 11: Emotions, Aggression, and Stress
Link ID: 20009 - Posted: 08.28.2014

Learning is easier when it only requires nerve cells to rearrange existing patterns of activity than when the nerve cells have to generate new patterns, a study of monkeys has found. The scientists explored the brain’s capacity to learn through recordings of electrical activity of brain cell networks. The study was partly funded by the National Institutes of Health. “We looked into the brain and may have seen why it’s so hard to think outside the box,” said Aaron Batista, Ph.D., an assistant professor at the University of Pittsburgh and a senior author of the study published in Nature, with Byron Yu, Ph.D., assistant professor at Carnegie Mellon University, Pittsburgh. The human brain contains nearly 86 billion neurons, which communicate through intricate networks of connections. Understanding how they work together during learning can be challenging. Dr. Batista and his colleagues combined two innovative technologies, brain-computer interfaces and machine learning, to study patterns of activity among neurons in monkey brains as the animals learned to use their thoughts to move a computer cursor. “This is a fundamental advance in understanding the neurobiological patterns that underlie the learning process,” said Theresa Cruz, Ph.D., a program official at the National Center for Medical Rehabilitation Research at NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD). “The findings may eventually lead to new treatments for stroke as well as other neurological disorders.”

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 11: Motor Control and Plasticity
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 5: The Sensorimotor System
Link ID: 20008 - Posted: 08.28.2014

By Roni Jacobson Children are notoriously unreliable witnesses. Conventional wisdom holds that they frequently “remember” things that never happened. Yet a large body of research indicates that adults actually generate more false memories than children. Now a new study finds that children are just as susceptible to false memories as adults, if not more so. Scientists may simply have been using the wrong test. Traditionally, researchers have explored false memories by presenting test subjects with a list of associated words (for instance, “weep,” “sorrow” and “wet”) thematically related to a word not on the list (in this case, “cry”) and then asking them what words they remember. Adults typically mention the missing related word more often than children do—possibly because their life experiences enable them to draw associations between concepts more readily, says Henry Otgaar, a forensic psychologist at Maastricht University in the Netherlands and co-author of the new paper, published in May in the Journal of Experimental Child Psychology. Instead of using word lists to investigate false memories, Otgaar and his colleagues showed participants pictures of scenes, including a classroom, a funeral and a beach. After a short break, they asked those participants whether they remembered seeing certain objects in each picture. Across three experiments, seven- and eight-year-old children consistently reported seeing more objects that were not in the pictures than adults did. © 2014 Scientific American

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 13: Memory, Learning, and Development
Link ID: 19999 - Posted: 08.27.2014

By Jason G. Goldman When you do not know the answer to a question, say, a crossword puzzle hint, you realize your shortcomings and devise a strategy for finding the missing information. The ability to identify the state of your knowledge—thinking about thinking—is known as metacognition. It is hard to tell whether other animals are also capable of metacognition because we cannot ask them; studies of primates and birds have not yet been able to rule out simpler explanations for this complex process. Scientists know, however, that some animals, such as western scrub jays, can plan for the future. Western scrub jays, corvids native to western North America, are a favorite of cognitive scientists because they are not “stuck in time”—that is, they are able to remember past events and are known to cache their food in anticipation of hunger, according to psychologist Arii Watanabe of the University of Cambridge. But the question remained: Are they aware that they are planning? Watanabe devised a way to test them. He let five birds watch two researchers hide food, in this case a wax worm. The first researcher could hide the food in any of four cups lined up in front of him. The second had three covered cups, so he could place the food only in the open one. The trick was that the researchers hid their food at the same time, forcing the birds to choose which one to watch. If the jays were capable of metacognition, Watanabe surmised, the birds should realize that they could easily find the second researcher's food. The wax worm had to be in the single open cup. They should instead prefer keeping their eyes on the setup with four open cups because witnessing where that food went would prove more useful in the future. And that is exactly what happened: the jays spent more time watching the first researcher. The results appeared in the July issue of the journal © 2014 Scientific American

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 19985 - Posted: 08.22.2014

by Sarah Zielinski PRINCETON, N.J. — Learning from others can be a quick shortcut compared with figuring out how to do something on your own. The ability to learn from watching another individual — called social learning — is something that hasn’t been documented in many species outside of primates and birds. But now a lizard can be added to the list of critters that can learn from one another. Young eastern water skinks were able to learn by watching older lizards, Martin Whiting of Macquarie University in Sydney reported August 10 at the Animal Behavior Society meeting at Princeton University. The eastern water skink, which reaches a length of about 30 centimeters, can be found near streams and waterways in eastern Australia. The lizards live up to eight years, and while they don’t live in groups, they often see each other in the wild. That could provide an opportunity for learning from each other. Whiting and his colleagues worked with 18 mature (older than 5 years) and 18 young (1.5 to 2 years) male skinks in the lab. The lizards were placed in bins with a barrier in the middle that was either opaque or transparent. In the first of two experiments, the skinks were given a yellow-lidded container with a mealworm inside. They had to learn to open the lid to get the food. In that task, skinks that could see a demonstrator through a transparent barrier were no better at opening the lid than those who had to figure it out on their own. [Figure caption: After watching a demonstrator lizard (top row), the skink in the other half of the tub was supposed to have learned that a mealworm was beneath the blue lid. The skink in the middle arena, however, failed the task when he opened the white lid first. D.W.A. Noble et al/Biology Letters 2014] © Society for Science & the Public 2000 - 2013.

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 6: Evolution of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 19984 - Posted: 08.22.2014

By Annie Sneed It's easy to recall events of decades past—birthdays, high school graduations, visits to Grandma—yet who can remember being a baby? Researchers have tried for more than a century to identify the cause of “infantile amnesia.” Sigmund Freud blamed it on repression of early sexual experiences, an idea that has been discredited. More recently, researchers have attributed it to a child's lack of self-perception, language or other mental equipment required to encode memories. Neuroscientists Paul Frankland and Sheena Josselyn, both at the Hospital for Sick Children in Toronto, do not think linguistics or a sense of self offers a good explanation, either. It so happens that humans are not the only animals that experience infantile amnesia. Mice and monkeys also forget their early childhood. To account for the similarities, Frankland and Josselyn have another theory: the rapid birth of many new neurons in a young brain blocks access to old memories. In a new experiment, the scientists manipulated the rate at which hippocampal neurons grew in young and adult mice. The hippocampus is the region in the brain that records autobiographical events. The young mice with slowed neuron growth had better long-term memory. Conversely, the older mice with increased rates of neuron formation had memory loss. Based on these results, published in May in the journal Science, Frankland and Josselyn think that rapid neuron growth during early childhood disrupts the brain circuitry that stores old memories, making them inaccessible. Young children also have an underdeveloped prefrontal cortex, another region of the brain that encodes memories, so infantile amnesia may be a combination of these two factors. © 2014 Scientific American

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 13: Memory, Learning, and Development
Link ID: 19901 - Posted: 07.31.2014

By DOUGLAS QUENQUA Like Pavlov’s dogs, most organisms can learn to associate two events that usually occur together. Now, a team of researchers says it has identified a gene that enables such learning. The scientists, at the University of Tokyo, found that worms could learn to avoid unpleasant situations as long as a specific insulin receptor remained intact. Roundworms were exposed to different concentrations of salt; some received food during the initial exposure, others did not. Later, when exposed to various concentrations of salt again, the roundworms that had been fed during the first stage gravitated toward their initial salt concentrations, while those that had been starved avoided them. But the results changed when the researchers repeated the experiment using worms with a defect in a particular receptor for insulin, a protein crucial to metabolism. Those worms could not learn to avoid the salt concentrations associated with starvation. “We looked for different forms of the receptor and found that a new one, which we named DAF-2c, functions in taste-aversion learning,” said Masahiro Tomioka, a geneticist at the University of Tokyo and an author of the study, which was published in the journal Science. “It turned out that only this form of the receptor can support learning” in roundworms. While human insulin receptors bear some resemblance to those of a roundworm, more study is needed to determine whether the human receptor plays a similar role in memory and decision-making, Dr. Tomioka said. But studies have suggested a link between insulin levels and Alzheimer’s disease in humans. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 13: Memory, Learning, and Development
Link ID: 19888 - Posted: 07.28.2014

By HENRY L. ROEDIGER III TESTS have a bad reputation in education circles these days: They take time, the critics say, put students under pressure and, in the case of standardized testing, crowd out other educational priorities. But the truth is that, used properly, testing as part of an educational routine provides an important tool not just to measure learning, but to promote it. In one study I published with Jeffrey D. Karpicke, a psychologist at Purdue, we assessed how well students remembered material they had read. After an initial reading, students were tested on some passages by being given a blank sheet of paper and asked to recall as much as possible. They recalled about 70 percent of the ideas. Other passages were not tested but were reread, and thus 100 percent of the ideas were re-exposed. In final tests given either two days or a week later, the passages that had been tested just after reading were remembered much better than those that had been reread. What’s at work here? When students are tested, they are required to retrieve knowledge from memory. Much educational activity, such as lectures and textbook readings, is aimed at helping students acquire and store knowledge. Various kinds of testing, though, when used appropriately, encourage students to practice the valuable skill of retrieving and using knowledge. The fact of improved retention after a quiz — called the testing effect or the retrieval practice effect — makes the learning stronger and embeds it more securely in memory. This is vital, because many studies reveal that much of what we learn is quickly forgotten. Thus a central challenge to learning is finding a way to stem forgetting. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 19861 - Posted: 07.21.2014

By Emily Anthes The women that come to see Deane Aikins, a clinical psychologist at Wayne State University, in Detroit, are searching for a way to leave their traumas behind them. Veterans in their late 20s and 30s, they served in Iraq and Afghanistan. Technically, they’d been in non-combat positions, but that didn’t eliminate the dangers of warfare. Mortars and rockets were an ever-present threat on their bases, and they learned to sleep lightly so as not to miss alarms signaling late-night attacks. Some of the women drove convoys of supplies across the desert. It was a job that involved worrying about whether a bump in the road was an improvised explosive device, or if civilians in their path were strategic human roadblocks. On top of all that, some of the women had been sexually assaulted by their military colleagues. After one woman was raped, she helped her drunk assailant sneak back into his barracks because she worried that if they were caught, she’d be disciplined or lose her job. These traumas followed the women home. Today, far from the battlefield, they find themselves struggling with vivid flashbacks and nightmares, tucking their guns under their pillows at night. Some have turned to alcohol to manage their symptoms; others have developed exhausting routines to avoid any people or places that might trigger painful memories and cause them to re-live their experiences in excruciating detail. © 2014 Nautilus,

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 16: Psychopathology: Biological Basis of Behavior Disorders
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 12: Psychopathology: Biological Basis of Behavioral Disorders
Link ID: 19853 - Posted: 07.19.2014

Fearful memories can be dampened by imagining past traumas in a safe setting. The "extinction" of fear is fragile, however, and surprising or unexpected events can cause fear memories to return. Inactivating brain areas that detect novelty prevents relapse of unwanted fear memories. Traumatic and emotional experiences often lead to debilitating mental health disorders, including post-traumatic stress disorder (PTSD). In the clinic, it is typical to use behavioral therapies such as exposure therapy to help reduce fear in patients suffering from traumatic memories. Using these approaches, patients are asked to remember the circumstances and stimuli surrounding their traumatic memory in a safe setting in order to "extinguish" their fear response to those events. While effective in many cases, the loss of fear and anxiety achieved by these therapies is often short-lived—fear returns or relapses under a variety of conditions. Many years ago, the famous Russian physiologist Ivan Pavlov noted that simply exposing animals to novel or unexpected events could cause extinguished responses (such as salivary responses to sounds) to return. Might exposure to novelty also cause extinguished fear responses to return? In a recent study (Maren, 2014), rats first learned that an innocuous tone predicted an aversive (but mild) electric shock to their feet. The subsequent fear response to the tone was then extinguished by presenting the stimulus to the animals many times without the shock. After the fear response to the tone was reduced with the extinction procedure, they were then presented with the tone in either a new location (a novel test box) or in a familiar location, but in the presence of an unexpected sound (a noise burst). In both cases, fear to the tone returned as Pavlov predicted: the unexpected places and sounds led to a disinhibition of fear—in other words, fear relapsed. © 2014 Publiscize

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 16: Psychopathology: Biological Basis of Behavior Disorders
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 12: Psychopathology: Biological Basis of Behavioral Disorders
Link ID: 19852 - Posted: 07.19.2014

Kelly Servick If you’re a bird enthusiast, you can pick out the “chick-a-DEE-dee” song of the Carolina chickadee with just a little practice. But if you’re an environmental scientist faced with parsing thousands of hours of recordings of birdsongs in the lab, you might want to enlist some help from your computer. A new approach to automatic classification of birdsong borrows techniques from human voice recognition software to sort through the sounds of hundreds of species and decides on its own which features make each one unique. Collectors of animal sounds are facing a data deluge. Thanks to cheap digital recording devices that can capture sound for days in the field, “it’s really, really easy to collect sound, but it’s really difficult to analyze it,” says Aaron Rice, a bioacoustics researcher at Cornell University, who was not involved in the new work. His lab has collected 6 million hours of underwater recordings, from which they hope to pick out the signature sounds of various marine mammals. Knowing where and when a certain species is vocalizing might help scientists understand habitat preferences, track their movements or population changes, and recognize when a species is disrupted by human development. But to keep these detailed records, researchers rely on software that can reliably sort through the cacophony they capture in the field. Typically, scientists build one computer program to recognize one species, and then start all over for another species, Rice says. Training a computer to recognize lots of species in one pass is “a challenge that we’re all facing.” © 2014 American Association for the Advancement of Science.
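The article describes the approach only at a high level, so the following is a minimal illustrative sketch of multi-species sound classification in general, not the method used in the study it reports. The folder layout (recordings/<species>/*.wav), the hand-picked MFCC features and the random-forest classifier are all assumptions added for illustration; the sketch simply shows how one model can be trained to separate many species in a single pass instead of building one detector per species.

```python
# Illustrative sketch only (not the system described in the article).
# Assumed layout: recordings/<species_name>/<clip>.wav
import glob
import os

import librosa
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split


def clip_features(path, n_mfcc=13):
    """Summarize one recording as a fixed-length vector of MFCC statistics."""
    y, sr = librosa.load(path, sr=22050)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)  # shape: (n_mfcc, n_frames)
    # Pool over time so clips of different lengths become comparable vectors.
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])


features, labels = [], []
for path in glob.glob("recordings/*/*.wav"):
    features.append(clip_features(path))
    labels.append(os.path.basename(os.path.dirname(path)))  # folder name = species label

X_train, X_test, y_train, y_test = train_test_split(
    np.array(features), np.array(labels), test_size=0.25, random_state=0
)

# One classifier covering every species at once,
# rather than one hand-built detector per species.
clf = RandomForestClassifier(n_estimators=300, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

The key simplification is that the features here are chosen by hand, whereas the approach the article describes is notable precisely because it decides for itself which acoustic features distinguish each species.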

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 19849 - Posted: 07.19.2014

By BENEDICT CAREY The 8-year-old juggling a soccer ball and the 48-year-old jogging by, with Japanese lessons ringing from her earbuds, have something fundamental in common: At some level, both are wondering whether their investment of time and effort is worth it. How good can I get? How much time will it take? Is it possible I’m a natural at this (for once)? What’s the percentage in this, exactly? Scientists have long argued over the relative contributions of practice and native talent to the development of elite performance. This debate swings back and forth every century, it seems, but a paper in the current issue of the journal Psychological Science illustrates where the discussion now stands and hints — more tantalizingly, for people who just want to do their best — at where the research will go next. The value-of-practice debate has reached a stalemate. In a landmark 1993 study of musicians, a research team led by K. Anders Ericsson, a psychologist now at Florida State University, found that practice time explained almost all the difference (about 80 percent) between elite performers and committed amateurs. The finding rippled quickly through the popular culture, perhaps most visibly as the apparent inspiration for the “10,000-hour rule” in Malcolm Gladwell’s best-selling “Outliers” — a rough average of the amount of practice time required for expert performance. The new paper, the most comprehensive review of relevant research to date, comes to a different conclusion. Compiling results from 88 studies across a wide range of skills, it estimates that practice time explains about 20 percent to 25 percent of the difference in performance in music, sports and games like chess. In academics, the number is much lower — 4 percent — in part because it’s hard to assess the effect of previous knowledge, the authors wrote. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 19835 - Posted: 07.15.2014

By Gary Stix Popular neuroscience books have made much in recent years of the possibility that the adult brain is capable of restoring lost function or even enhancing cognition through sustained mental or physical activities. One piece of evidence often cited is a 14-year-old study that shows that London taxi drivers have enlarged hippocampi, brain areas that store a mental map of one’s surroundings. Taxi drivers, it is assumed, have better spatial memory because they must constantly distinguish the streets and landmarks of Shepherd’s Bush from those of Brixton. A mini-industry now peddles books with titles like The Brain that Changes Itself or Rewire Your Brain: Think Your Way to a Better Life. Along with self-help guides, the value of games intended to enhance what is known as neuroplasticity is still a topic of heated debate because no one knows for sure whether or not they improve intelligence, memory, reaction times or any other facet of cognition. Beyond the controversy, however, scientists have taken a number of steps in recent years to start to answer the basic biological questions that may ultimately lead to a deeper understanding of neuroplasticity. This type of research does not look at whether psychological tests used to assess cognitive deficits can be refashioned with cartoonlike graphics and marketed as games intended to improve mental skills. Rather, these studies attempt to provide a simple definition of how mutable the brain really is at all life stages, from infancy onward into adulthood. One ongoing question that preoccupies the basic scientists pursuing this line of research is how routine everyday activities—sleep, wakefulness, even any sort of movement—may affect the ability to perceive things in the surrounding environment. One of the leaders in these efforts is Michael Stryker, who researches neuroplasticity at the University of California San Francisco. Stryker headed a group that in 2010 published a study on what happened when mice ran on top of a Styrofoam ball floating on air. They found that neurons in a brain region that processes visual signals—the visual cortex—nearly doubled their firing rate when the mice ran on the ball. © 2014 Scientific American

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 13: Memory, Learning, and Development
Link ID: 19834 - Posted: 07.15.2014

By BENEDICT CAREY PHILADELPHIA — The man in the hospital bed was playing video games on a laptop, absorbed and relaxed despite the bustle of scientists on all sides and the electrodes threaded through his skull and deep into his brain. “O.K., that’s enough,” he told doctors after more than an hour. “All those memory tests, it’s exhausting.” The man, Ralph, a health care worker who asked that his last name be omitted for privacy, has severe epilepsy; and the operation to find the source of his seizures had provided researchers an exquisite opportunity to study the biology of memory. The Department of Defense on Tuesday announced a $40 million investment in what has become the fastest-moving branch of neuroscience: direct brain recording. Two centers, one at the University of Pennsylvania and the other at the University of California, Los Angeles, won contracts to develop brain implants for memory deficits. Their aim is to develop new treatments for traumatic brain injury, the signature wound of the wars in Iraq and in Afghanistan. Its most devastating symptom is the blunting of memory and reasoning. Scientists have found in preliminary studies that they can sharpen some kinds of memory by directly recording, and stimulating, circuits deep in the brain. Unlike brain imaging, direct brain recording allows scientists to conduct experiments while listening to the brain’s internal dialogue in real time, using epilepsy patients like Ralph or people with Parkinson’s disease as active collaborators. The technique has provided the clearest picture yet of how neural circuits function, and raised hopes of new therapies for depression and anxiety as well as cognitive problems. But experts also worry about the possible side effects of directly tampering with memory. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory; Chapter 19: Language and Hemispheric Asymmetry
Related chapters from MM:Chapter 13: Memory, Learning, and Development; Chapter 15: Language and Our Divided Brain
Link ID: 19810 - Posted: 07.09.2014

by Bethany Brookshire One day when I came in to the office, my air conditioning unit was making a weird rattling sound. At first, I was slightly annoyed, but then I chose to ignore it and get to work. In another 30 minutes, I was completely oblivious to the noise. It wasn’t until my cubicle neighbor Meghan Rosen came in and asked about the racket that I realized the rattle was still there. My brain had habituated to the sound. Habituation, the ability to stop noticing or responding to an irrelevant signal, is one of the simplest forms of learning. But it turns out that at the level of a brain cell, it’s a far more complex process than scientists previously thought. In the June 18 Neuron, Mani Ramaswami of Trinity College Dublin proposes a new framework to describe how habituation might occur in our brains. The paper not only offers a new mechanism to help us understand one of our most basic behaviors, it also demonstrates how taking the time to integrate new findings into a novel framework can help push a field forward. Our ability to ignore the irrelevant and familiar has been a long-known feature of human learning. It’s so simple, even a sea slug can do it. Because the ability to habituate is so simple, scientists hypothesized that the mechanism behind it must also be simple. The previous framework for habituation has been synaptic depression, a decrease in chemical release. When one brain cell sends a signal to another, it releases chemical messengers into a synapse, the small gap between neurons. Receptors on the other side pick up this excitatory signal and send the message onward. But in habituation, neurons would release fewer chemicals, making the signal less likely to hit the other side. Fewer chemicals, fewer signals, and you’ve habituated. Simple. But, as David Glanzman, a neurobiologist at the University of California, Los Angeles points out, there are problems with this idea. © Society for Science & the Public 2000 - 2013

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 19772 - Posted: 06.25.2014

By DOUGLAS QUENQUA When it comes to forming memories that involve recalling a personal experience, neuroscientists are of two minds. Some say that each memory is stored in a single neuron in a region of the brain called the hippocampus. But a new study is lending weight to the theory of neuroscientists who believe that every memory is spread out, or distributed, across many neurons in that part of the brain. By watching patients with electrodes in their brains play a memory game, researchers found that each such memory is committed to cells distributed across the hippocampus. Though the proportion of cells responsible for each memory is small (about 2 percent of the hippocampus), the absolute number is in the millions. So the loss of any one cell should not have a noticeable effect on memory or mental acuity, said Peter N. Steinmetz, a research neurologist at the Dignity Health Barrow Neurological Institute in Phoenix and senior author of the study. “The significance of losing one cell is substantially reduced because you’ve got this whole population that’s turning on” when you access a memory, he said. The findings also suggest that memory researchers “need to use techniques that allow us to look at the whole population of neurons” rather than focus on individual cells. The patients in the study, which is published in Proceedings of the National Academy of Sciences, first memorized a list of words on a computer screen, then viewed a second list that included those words and others. When asked to identify words they had seen earlier, the patients displayed cell-firing activity consistent with the distributed model of memory. © 2014 The New York Times Company

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 19763 - Posted: 06.24.2014

by Lauren Hitchings Our brain's ability to rapidly interpret and analyse new information may lie in the musical hum of our brainwaves. We continuously take in information about the world but establishing new neural connections and pathways – the process thought to underlie memory formation – is too slow to account for our ability to learn rapidly. Evan Antzoulatos and Earl Miller at the Massachusetts Institute of Technology decided to see if brainwaves – the surges of electricity produced by individual neurons firing en masse – play a role. They used EEG to observe patterns of electrical activity in the brains of monkeys as they taught the animals to categorise patterns of dots into two distinct groups. At first, the monkeys memorised which dots went where, but as the task became harder, they shifted to learning the rules that defined the categories. The researchers found that, initially, brainwaves of different frequencies were being produced independently by the prefrontal cortex and the striatum – two brain regions involved in learning. But as the monkeys made sense of the game, the waves began to synchronise and "hum" at the same frequency – with each category of dots having its own frequency. Miller says the synchronised brainwaves indicate the formation of a communication circuit between the two brain regions. He believes this happens before anatomical changes in brain connections take place, giving our minds time to think through various options when presented with new information before the right one gets laid down as a memory. Otherwise, the process is too time-consuming to account for the flexibility and speed of the human mind, says Miller. © Copyright Reed Business Information Ltd.

Related chapters from BP7e: Chapter 17: Learning and Memory
Related chapters from MM:Chapter 13: Memory, Learning, and Development
Link ID: 19746 - Posted: 06.19.2014