Chapter 13. Memory, Learning, and Development


By David Noonan It was the day before Christmas, and the normally busy MIT laboratory on Vassar Street in Cambridge was quiet. But creatures were definitely stirring, including a mouse that would soon be world famous. Steve Ramirez, a 24-year-old doctoral student at the time, placed the mouse in a small metal box with a black plastic floor. Instead of curiously sniffing around, though, the animal instantly froze in terror, recalling the experience of receiving a foot shock in that same box. It was a textbook fear response, and if anything, the mouse’s posture was more rigid than Ramirez had expected. Its memory of the trauma must have been quite vivid. Which was amazing, because the memory was bogus: The mouse had never received an electric shock in that box. Rather, it was reacting to a false memory that Ramirez and his MIT colleague Xu Liu had planted in its brain. “Merry Freaking Christmas,” read the subject line of the email Ramirez shot off to Liu, who was spending the 2012 holiday in Yosemite National Park. The observation culminated more than two years of a long-shot research effort and supported an extraordinary hypothesis: Not only was it possible to identify brain cells involved in the encoding of a single memory, but those specific cells could be manipulated to create a whole new “memory” of an event that never happened. “It’s a fantastic feat,” says Howard Eichenbaum, a leading memory researcher and director of the Center for Neuroscience at Boston University, where Ramirez did his undergraduate work. “It’s a real breakthrough that shows the power of these techniques to address fundamental questions about how the brain works.”

Keyword: Learning & Memory; Emotions
Link ID: 20418 - Posted: 12.16.2014

By Gail Sullivan Chemicals found in food and common household products have been linked to lower IQ in kids exposed to high levels during pregnancy. Previous research linked higher exposure to chemicals called "phthalates" to poor mental and motor development in preschoolers. This study was said to be the first to report a link between prenatal exposure to the chemicals and childhood development. Researchers from Columbia University’s Mailman School of Public Health studied exposure to five types of phthalates, which are sometimes referred to as “hormone disruptors” or “endocrine disruptors.” Among these, di-n-butyl phthalate (DnBP) is used in shower curtains, raincoats, hairspray, food wraps, vinyl and pill coating, among other things — but according to the EPA, the largest source of exposure may be seafood. Di-isobutyl phthalate (DiBP) and butylbenzyl phthalate (BBzP) are added to plastics to make them flexible. These chemicals may also be used in makeup, nail polish, lacquer and explosives. The researchers linked prenatal exposure to phthalates to a more than six-point drop in IQ score compared with kids with less exposure. The study, “Persistent Associations between Maternal Prenatal Exposure to Phthalates on Child IQ at Age 7 Years,” was published Wednesday in the journal PLOS ONE. "The magnitude of these IQ differences is troubling," one of the study’s authors, Robin Whyatt, said in a press release. "A six- or seven-point decline in IQ may have substantial consequences for academic achievement and occupational potential."

Keyword: Intelligence; Neurotoxins
Link ID: 20413 - Posted: 12.13.2014

By Gary Stix Our site recently ran a great story about how brain training really doesn’t endow you instantly with genius IQ. The games you play just make you better at playing those same games. They aren’t a direct route to a Mensa membership. Just a few days before that story came out, the Proceedings of the National Academy of Sciences published a report suggesting that playing action video games, Call of Duty: Black Ops II and the like, actually lets gamers learn the essentials of a particular visual task (the orientation of a Gabor signal—don’t ask) more rapidly than non-gamers, a skill that has real-world relevance beyond the confines of the artificial reality of the game itself. As psychologists say, it has “transfer effects.” Gamers appear to have learned how to do stuff like home in quickly on a target or multitask better than those who inhabit the non-gaming world. Their skills might, in theory, make them great pilots or laparoscopic surgeons, not just high scorers among their peers. Action video games are not billed as brain training, but Call of Duty and nominally accredited training programs like Lumosity are both structured as computer games. So that leads to the question: what’s going on here? Every new finding that brain training is B.S. appears to be contradicted by another that points to the promise of cognitive exercise, if that’s what you call a session with Call of Duty. It may boil down to a realization that the whole story about exercising your neurons to keep the brain supple may be a lot less simple than proponents make it out to be. © 2014 Scientific American

Keyword: Learning & Memory
Link ID: 20409 - Posted: 12.13.2014

by Helen Thomson Zapping your brain might make you better at maths tests – or worse. It depends how anxious you are about taking the test in the first place. A recent surge of studies has shown that brain stimulation can make people more creative and better at maths, and can even improve memory, but these studies tend to neglect individual differences. Now, Roi Cohen Kadosh at the University of Oxford and his colleagues have shown that brain stimulation can have completely opposite effects depending on your personality. Previous research has shown that a type of non-invasive brain stimulation called transcranial direct current stimulation (tDCS) – which enhances brain activity using an electric current – can improve mathematical ability when applied to the dorsolateral prefrontal cortex, an area involved in regulating emotion. To test whether personality traits might affect this result, Cohen Kadosh's team tried the technique on 25 people who find mental arithmetic highly stressful, and 20 people who do not. They found that participants with high maths anxiety made correct responses more quickly and, after the test, showed lower levels of cortisol, an indicator of stress. On the other hand, individuals with low maths anxiety performed worse after tDCS. "It is hard to believe that all people would benefit similarly [from] brain stimulation," says Cohen Kadosh. He says that further research could shed light on how to optimise the technology and help to discover who is most likely to benefit from stimulation. © Copyright Reed Business Information Ltd.

Keyword: Brain imaging; Learning & Memory
Link ID: 20406 - Posted: 12.10.2014

Ian Sample, science editor Electrical brain stimulation equipment – which can boost cognitive performance and is easy to buy online – can also have detrimental effects, impairing brain functioning in some people, research from scientists at Oxford University has shown. A steady stream of reports of stimulators being able to boost brain performance, coupled with the simplicity of the devices, has led to a rise in DIY enthusiasts who cobble the equipment together themselves, or buy it assembled on the web, then zap themselves at home. In science laboratories brain stimulators have long been used to explore cognition. The equipment uses electrodes to pass gentle electric pulses through the brain, to stimulate activity in specific regions of the organ. Roi Cohen Kadosh, who led the study, published in the Journal of Neuroscience, said: “It’s not something people should be doing at home at this stage. I do not recommend people buy this equipment. At the moment it’s not therapy, it’s an experimental tool.” The Oxford scientists used a technique called transcranial direct current stimulation (tDCS) to stimulate the dorsolateral prefrontal cortex in students as they did simple sums. The results of the test were surprising. Students who became anxious when confronted with sums became calmer and solved the problems faster than when they had sham stimulation (the stimulation itself lasted only 30 seconds of the half-hour study). The shock was that the students who did not fear maths performed worse with the same stimulation.

Keyword: Brain imaging; Learning & Memory
Link ID: 20405 - Posted: 12.10.2014

Kelly Servick Anesthesiologists and surgeons who operate on children have been dogged by a growing fear—that being under anesthesia can permanently damage the developing brain. Although the few studies of children knocked out for surgeries have been inconclusive, evidence of impaired development in nematodes, zebrafish, rats, guinea pigs, pigs, and monkeys given common anesthetics has piled up in recent years. Now, the alarm is reaching a tipping point. “Anything that goes from [the roundworm] C. elegans to nonhuman primates, I've got to worry about,” Maria Freire, co-chair of the U.S. Food and Drug Administration (FDA) science advisory board, told attendees at a meeting the agency convened here last month to discuss the issue. The gathering came as anesthesia researchers and regulators consider several moves to address the concerns: a clinical trial of anesthetics in children, a consensus statement about their possible risks, and an FDA warning label on certain drugs. But each step stirs debate. Many involved in the issue are reluctant to make recommendations to parents and physicians based on animal data alone. At the same time, more direct studies of anesthesia's risks in children are plagued by confounding factors, lack of funding, and ethical issues. “We have to generate—very quickly—an action item, because I don't think the status quo is acceptable,” Freire said at the 19 November meeting. “Generating an action item without having the data is where things become very, very tricky.” © 2014 American Association for the Advancement of Science

Keyword: Sleep; Development of the Brain
Link ID: 20399 - Posted: 12.06.2014

By Bret Stetka When University of Bonn psychologist Monika Eckstein designed her latest published study, the goal was simple: administer a hormone into the noses of 62 men in hopes that their fear would go away. And for the most part, it did. The hormone was oxytocin, often called our “love hormone” due to its crucial role in mother-child relationships, social bonding, and intimacy (levels soar during sex). But it also seems to have a significant antianxiety effect. Give oxytocin to people with certain anxiety disorders, and activity in the amygdala—the primary fear center in human and other mammalian brains, two almond-shaped bits of brain tissue sitting deep beneath our temples—falls. The amygdala normally buzzes with activity in response to potentially threatening stimuli. When an organism repeatedly encounters a stimulus that at first seemed frightening but turns out to be benign—like, say, a balloon popping—a brain region called the prefrontal cortex inhibits amygdala activity. But in cases of repeated presentations of an actual threat, or in people with anxiety who continually perceive a stimulus as threatening, amygdala activity doesn’t subside and fear memories are more easily formed. To study the effects of oxytocin on the development of these fear memories, Eckstein and her colleagues first subjected study participants to Pavlovian fear conditioning, in which neutral stimuli (photographs of faces and houses) were sometimes paired with electric shocks. Subjects were then randomly assigned to receive either a single intranasal dose of oxytocin or a placebo. Thirty minutes later they received functional MRI scans while undergoing simultaneous fear extinction therapy, a standard approach to anxiety disorders in which patients are continually exposed to an anxiety-producing stimulus until they no longer find it stressful. In this case they were again exposed to images of faces and houses, but this time minus the electric shocks. © 2014 Scientific American

Keyword: Learning & Memory; Emotions
Link ID: 20397 - Posted: 12.06.2014

By recording from the brains of bats as they flew and landed, scientists have found that the animals have a "neural compass" - allowing them to keep track of exactly where and even which way up they are. These head-direction cells track bats in three dimensions as they manoeuvre. The researchers think a similar 3D internal navigation system is likely to be found throughout the animal kingdom. The findings are published in the journal Nature. Lead researcher Arseny Finkelstein, from the Weizmann Institute of Science in Rehovot, Israel, explained that this was the first time measurements had been taken from animals as they had flown around a space in any direction and even carried out their acrobatic upside-down landings. "We're the only lab currently able to conduct wireless recordings in flying animals," he told BBC News. "A tiny device attached to the bats allows us to monitor the activity of single neurons while the animal is freely moving." Decades of study of the brain's internal navigation system garnered three renowned neuroscientists this year's Nobel Prize for physiology and medicine. The research, primarily in rats, revealed how animals had "place" and "grid" cells - essentially building a map in the brain and coding for where on that map an animal was at any time. Mr Finkelstein and his colleagues' work in bats has revealed that their brains also have "pitch" and "roll" cells. These tell the animal whether it is pointing upwards or downwards and whether its head is tilted one way or the other. BBC © 2014

Keyword: Hearing; Learning & Memory
Link ID: 20393 - Posted: 12.04.2014

by Andy Coghlan How does this make you feel? Simply asking people to think about emotion-laden actions as their brains are scanned could become one of the first evidence-based tests for psychiatric illness. Assessing people in this way would be a step towards a more scientific approach to diagnosis, away from that based on how someone behaves or how they describe their symptoms. The US National Institute of Mental Health has had such a goal in mind since 2013. Marcel Just of Carnegie Mellon University in Pittsburgh, Pennsylvania, and his colleagues developed the brain scanning technique and used it to identify people with autism. "This gives us a whole new perspective to understanding psychiatric illnesses and disorders," says Just. "We've discovered a biological thought-marker for autism." The technique builds on work by the group showing that specific thoughts and emotions are represented in the brain by certain patterns of neural activation. The idea is that deviations from these patterns, what Just refers to as thought-markers, can be used to diagnose different psychiatric conditions. The team asked a group of adults to imagine 16 actions, some of which required emotional involvement, such as "hugging", "persuading" or "adoring", while they lay in an fMRI scanner. © Copyright Reed Business Information Ltd.

Keyword: Autism; Emotions
Link ID: 20392 - Posted: 12.04.2014

By CHRISTOPHER F. CHABRIS and DANIEL J. SIMONS NEIL DEGRASSE TYSON, the astrophysicist and host of the TV series “Cosmos,” regularly speaks to audiences on topics ranging from cosmology to climate change to the appalling state of science literacy in America. One of his staple stories hinges on a line from President George W. Bush’s speech to Congress after the 9/11 terrorist attacks. In a 2008 talk, for example, Dr. Tyson said that in order “to distinguish we from they” — meaning to divide Judeo-Christian Americans from fundamentalist Muslims — Mr. Bush uttered the words “Our God is the God who named the stars.” Dr. Tyson implied that President Bush was prejudiced against Islam in order to make a broader point about scientific awareness: Two-thirds of the named stars actually have Arabic names, given to them at a time when Muslims led the world in astronomy — and Mr. Bush might not have said what he did if he had known this fact. This is a powerful example of how our biases can blind us. But not in the way Dr. Tyson thought. Mr. Bush wasn’t blinded by religious bigotry. Instead, Dr. Tyson was fooled by his faith in the accuracy of his own memory. In his post-9/11 speech, Mr. Bush actually said, “The enemy of America is not our many Muslim friends,” and he said nothing about the stars. Mr. Bush had indeed once said something like what Dr. Tyson remembered; in 2003 Mr. Bush said, in tribute to the astronauts lost in the Columbia space shuttle explosion, that “the same creator who names the stars also knows the names of the seven souls we mourn today.” Critics pointed these facts out; some accused Dr. Tyson of lying and argued that the episode should call into question his reliability as a scientist and a public advocate. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20387 - Posted: 12.03.2014

By David Z. Hambrick If you’ve spent more than about 5 minutes surfing the web, listening to the radio, or watching TV in the past few years, you will know that cognitive training—better known as “brain training”—is one of the hottest new trends in self-improvement. Lumosity, which offers web-based tasks designed to improve cognitive abilities such as memory and attention, boasts 50 million subscribers and advertises on National Public Radio. Cogmed claims to be “a computer-based solution for attention problems caused by poor working memory,” and BrainHQ will help you “make the most of your unique brain.” The promise of all of these products, implied or explicit, is that brain training can make you smarter—and make your life better. Yet, according to a statement released by the Stanford University Center on Longevity and the Berlin Max Planck Institute for Human Development, there is no solid scientific evidence to back up this promise. Signed by 70 of the world’s leading cognitive psychologists and neuroscientists, the statement minces no words: "The strong consensus of this group is that the scientific literature does not support claims that the use of software-based “brain games” alters neural functioning in ways that improve general cognitive performance in everyday life, or prevent cognitive slowing and brain disease." The statement also cautions that although some brain training companies “present lists of credentialed scientific consultants and keep registries of scientific studies pertinent to cognitive training…the cited research is [often] only tangentially related to the scientific claims of the company, and to the games they sell.” © 2014 Scientific American,

Keyword: Learning & Memory
Link ID: 20380 - Posted: 12.02.2014

By Nicholas Bakalar Researchers have found that people diagnosed with diabetes in their 50’s are significantly more likely than others to suffer mental decline by their 70’s. The study, published Monday in the Annals of Internal Medicine, started in 1990. Scientists examined 13,351 black and white adults, aged 48 to 67, for diabetes and prediabetes using self-reported physician diagnoses and glucose control tests. They also administered widely used tests of memory, reasoning, problem solving and planning. About 13 percent had diabetes at the start. The researchers followed them with five periodic examinations over the following 20 years. By that time, 5,987 participants were still enrolled. After adjusting for numerous health and behavioral factors, and for the large attrition in the study, the researchers found people with diabetes suffered a 30 percent larger decline in mental acuity than those without the disease. Diabetes can impair blood circulation, and the authors suggest that the association of diabetes with thinking and memory problems may be the result of damage to small blood vessels in the brain. “People may think cognitive decline with age is inevitable, but it’s not,” said the senior author, Elizabeth Selvin, an associate professor of epidemiology at the Johns Hopkins Bloomberg School of Public Health. “Factors like diabetes are potentially modifiable. If we can better control diabetes we can stave off cognitive decline and future dementia.” © 2014 The New York Times Company

Keyword: Obesity; Learning & Memory
Link ID: 20377 - Posted: 12.02.2014

by Andy Coghlan What would Stuart Little make of it? Mice have been created whose brains are half human. As a result, the animals are smarter than their siblings. The idea is not to mimic fiction, but to advance our understanding of human brain diseases by studying them in whole mouse brains rather than in dishes. The altered mice still have mouse neurons – the "thinking" cells that make up around half of all their brain cells. But practically all the glial cells in their brains, the ones that support the neurons, are human. "It's still a mouse brain, not a human brain," says Steve Goldman of the University of Rochester Medical Center in New York. "But all the non-neuronal cells are human." Goldman's team extracted immature glial cells from donated human fetuses. They injected them into mouse pups where they developed into astrocytes, a star-shaped type of glial cell. Within a year, the mouse glial cells had been completely usurped by the human interlopers. The 300,000 human cells each mouse received multiplied until they numbered 12 million, displacing the native cells. "We could see the human cells taking over the whole space," says Goldman. "It seemed like the mouse counterparts were fleeing to the margins." Astrocytes are vital for conscious thought, because they help to strengthen the connections between neurons, called synapses. Their tendrils are involved in coordinating the transmission of electrical signals across synapses. © Copyright Reed Business Information Ltd.

Keyword: Learning & Memory; Glia
Link ID: 20375 - Posted: 12.01.2014

by Aviva Rutkin THERE is only one real rule to conversing with a baby: talking is better than not talking. But that one rule can make a lifetime of difference. That's the message that the US state of Georgia hopes to send with Talk With Me Baby, a public health programme devoted to the art of baby talk. Starting in January, nurses will be trained in the best way to speak to babies to help them learn language, based on what the latest neuroscience says. Then they, along with teachers and nutritionists, will model this good behaviour for the parents they meet. Georgia hopes to expose every child born in 2015 in the Atlanta area to this speaking style; by 2018, the hope is to reach all 130,000 or so newborns across the state. Talk With Me Baby is the latest and largest attempt to provide "language nutrition" to infants in the US – a rich quantity and variety of words supplied at a critical time in the brain's development. Similar initiatives have popped up in Providence, Rhode Island, where children have been wearing high-tech vests that track every word they hear, and Hollywood, where the Clinton Foundation has encouraged television shows like Parenthood and Orange is the New Black to feature scenes demonstrating good baby talk. "The idea is that language is as important to the brain as food is to physical growth," says Arianne Weldon, director of Get Georgia Reading, one of several partner organisations involved in Talk With Me Baby. © Copyright Reed Business Information Ltd.

Keyword: Language; Development of the Brain
Link ID: 20367 - Posted: 11.29.2014

By Amy Ellis Nutt Scientists say the "outdoor effect" on nearsighted children is real: natural light is good for the eyes. It's long been thought kids are more at risk of nearsightedness, or myopia, if they spend hours and hours in front of computer screens or fiddling with tiny hand-held electronic devices. Not true, say scientists. But now there is research that suggests that children who are genetically predisposed to the visual deficit can improve their chances of avoiding eyeglasses just by stepping outside. Yep, sunshine is all they need -- more specifically, the natural light of outdoors -- and 14 hours a week of outdoor light should do it. Why this is the case is not exactly clear. "We don't really know what makes outdoor time so special," said Donald Mutti, the lead researcher of the study from Ohio State University College of Optometry, in a press release. "If we knew, we could change how we approach myopia." What is known is that UVB light (invisible ultraviolet B rays) plays a role in the cellular production of vitamin D, which is believed to help the eyes focus light on the retina. However, the Ohio State researchers think there is another possibility. "Between the ages of five and nine, a child's eye is still growing," said Mutti. "Sometimes this growth causes the distance between the lens and the retina to lengthen, leading to nearsightedness. We think these different types of outdoor light may help preserve the proper shape and length of the eye during that growth period."

Keyword: Vision; Development of the Brain
Link ID: 20365 - Posted: 11.29.2014

By BENEDICT CAREY Quick: Which American president served before slavery ended, John Tyler or Rutherford B. Hayes? If you need Google to get the answer, you are not alone. (It is Tyler.) Collective cultural memory — for presidents, for example — works according to the same laws as the individual kind, at least when it comes to recalling historical names and remembering them in a given order, researchers reported on Thursday. The findings suggest that leaders who are well known today, like the elder President George Bush and President Bill Clinton, will be all but lost to public memory in just a few decades. The particulars from the new study, which tested Americans’ ability to recollect the names of past presidents, are hardly jaw-dropping: People tend to recall best the presidents who served recently, as well as the first few in the country’s history. They also remember those who navigated historic events, like the ending of slavery (Abraham Lincoln) and World War II (Franklin D. Roosevelt). But the broader significance of the report — the first to measure forgetfulness over a 40-year period, using a constant list — is that societies collectively forget according to the same formula as, say, a student who has studied a list of words. Culture imitates biology, even though the two systems work in vastly different ways. The new paper was published in the journal Science. “It’s an exciting study, because it mixes history and psychology and finds this one-on-one correspondence” in the way memory functions, said David C. Rubin, a psychologist at Duke University who was not involved in the research. The report is based on four surveys by psychologists now at Washington University in St. Louis, conducted from 1974 to 2014. In the first three, in 1974, 1991 and 2009, Henry L. Roediger III gave college students five minutes to write down as many presidents as they could remember, in order. © 2014 The New York Times Company

Keyword: Learning & Memory
Link ID: 20364 - Posted: 11.29.2014

By Amy Ellis Nutt In a novel use of video game playing, researchers at Ohio State have found a Pac-Man-like game, when played repetitively, can improve vision in both children and adults who have "lazy eye" or poor depth perception. In the Pac-Man-style game, players wear red-green 3-D glasses that filter images to the right and left eyes. The lazy or weak eye sees two discs containing vertical, horizontal or diagonal lines superimposed on a background of horizontal lines. The dominant eye sees a screen of only horizontal lines. The player controls the larger, Pac-Man-like disc and chases the smaller one. In another game, the player must match discs with rows based on the orientation of their lines. Teng Leng Ooi, professor of optometry at Ohio State University, presented her research findings at last week's annual meeting of the Society for Neuroscience. Only a handful of test subjects were involved in the experimental training, but all saw weak-eye improvement to 20/20 vision or better, lasting for a period of at least eight months. Lazy eye, or amblyopia, affects between 2 and 3 percent of the U.S. population. The disorder usually occurs in infancy when the neural pathway between the brain and one eye (or sometimes both) fails to fully develop. Often the cause of lazy eye is strabismus, in which the eyes are misaligned or "crossed." To prevent double vision, the brain simply blocks the fuzzy images from one eye, thereby causing incomplete visual development. The result: lazy eye.

Keyword: Vision; Development of the Brain
Link ID: 20361 - Posted: 11.26.2014

By Linda Searing THE QUESTION Keeping your brain active by working is widely believed to protect memory and thinking skills as we age. Does the type of work matter? THIS STUDY involved 1,066 people who, at an average age of 70, took a battery of tests to measure memory, processing speed and cognitive ability. The jobs they had held were rated by the complexity of dealings with people, data and things. Those whose main jobs required complex work, especially in dealings with people — such as social workers, teachers, managers, graphic designers and musicians — had higher cognitive scores than those who had held jobs requiring less-complex dealings, such as construction workers, food servers and painters. Overall, more-complex occupations were tied to higher cognitive scores, regardless of someone’s IQ, education or environment. WHO MAY BE AFFECTED? Older adults. Cognitive abilities change with age, so it can take longer to recall information or remember where you placed your keys. That is normal and not the same thing as dementia, which involves severe memory loss as well as declining ability to function day to day. Commonly suggested ways to maintain memory and thinking skills include staying socially active, eating healthfully and getting adequate sleep as well as such things as doing crossword puzzles, learning to play a musical instrument and taking varied routes to common destinations when driving.

Keyword: Learning & Memory
Link ID: 20359 - Posted: 11.26.2014

By Anna North The idea that poverty can change the brain has gotten significant attention recently, and not just from those lay readers (a minority, according to recent research) who spend a lot of time thinking about neuroscience. Policy makers and others have begun to apply neuroscientific principles to their thinking about poverty — and some say this could end up harming poor people rather than helping. At The Conversation, the sociologist Susan Sered takes issue with “news reports with headlines like this one: ‘Can Brain Science Help Lift People Out Of Poverty?’” She’s referring to a June story by Rachel Zimmerman at WBUR, about a nonprofit called Crittenton Women’s Union that aims to use neuroscience to help get people out of poverty. Elisabeth Babcock, Crittenton’s chief executive, tells Ms. Zimmerman: “What the new brain science says is that the stresses created by living in poverty often work against us, make it harder for our brains to find the best solutions to our problems. This is a part of the reason why poverty is so ‘sticky.’” And, she adds: “If we’ve been raised in poverty under all this stress, our executive functioning wiring, the actual neurology of our brains, is built differently than if we’re not raised in poverty. It is built to react quickly to danger and threats and not built as much to plan or execute strategies for how we want things to be in the future because the future is so uncertain and planning is so pointless that this wiring isn’t as called for.” Dr. Sered, however, says that applying neuroscience to problems like poverty can sometimes lead to trouble: “Studies showing that trauma and poverty change people’s brains can too easily be read as scientific proof that poor people (albeit through no fault of their own) have inferior brains or that women who have been raped are now brain-damaged.” © 2014 The New York Times Company

Keyword: Development of the Brain; Brain imaging
Link ID: 20358 - Posted: 11.25.2014

By Gary Stix One area of brain science that has drawn intense interest in recent years is the study of what psychologists call reconsolidation—a ponderous technical term that, once translated, means giving yourself a second chance. Memories of our daily experience are formed, often during sleep, by inscribing—or “consolidating”—a record of what happened into neural tissue. Joy at the birth of a child or terror in response to a violent personal assault. A bad memory, once fixed, may replay again and again, turning toxic and all-consuming. For the traumatized, the desire to forget becomes an impossible dream. Reconsolidation allows for a do-over by drawing attention to the emotional and factual content of traumatic experience. In the safety of a therapist’s office, the patient lets demons return and then the goal is to reshape karma to form a new, more benign memory. The details remain the same, but the power of the old terror to overwhelm and induce psychic paralysis begins to subside. The clinician would say that the memory has undergone a change in “valence”—from negative to neutral and detached. The trick to undertaking successful reconsolidation requires revival of these memories without provoking panic and chaos that can only make things worse. Talk therapies and psycho-pharm may not be enough. One new idea just starting to be explored is the toning down of memories while a patient is fast asleep. © 2014 Scientific American,

Keyword: Learning & Memory; Emotions
Link ID: 20357 - Posted: 11.25.2014