Chapter 17. Learning and Memory
by Laura Sanders Last Sunday, the Giants battled the Redskins in our living room, and there was no bigger fan than 9-month-old Baby V. Unlike her father, she was not interested in RG3’s shortcomings. The tiny, colorful guys running around on a bright green field, the psychedelic special effects and the bursts of noise drew her in like a moth to a 42-inch high-definition flame. My friends with kids have noticed the same screen fascination in their little ones. Like adults, kids love colorful, shiny, moving screens. The problem, of course, is that watching TV probably isn’t the best way for little kids to spend their time. Long bouts in front of the tube have been linked to obesity, weaker attention spans and aggression in kids. Now, a new study of Japanese children has linked TV time with changes in the growing brain, effects that have been harder to spot. And the more television a kid watches, the more profound the brain differences, scientists report November 20 in Cerebral Cortex. Researchers studied kids between ages 5 and 18 who watched between zero and four hours of television a day. On average, the kids watched TV for about two hours a day. Brain scans revealed that the more television a kid watched, the larger certain parts of the brain were. Gray matter volume was higher in regions toward the front and side of the head in kids who watched a lot of TV. Say that again? Watching television boosts brain volume? Before you rejoice and fire up Season 1 of Breaking Bad, keep in mind: Bigger isn’t always better. In this case, higher brain volume in these kids was associated with a lower verbal IQ. Study coauthor Hikaru Takeuchi of Tohoku University in Japan says that these brain areas need to be pruned during childhood to operate efficiently. © Society for Science & the Public 2000 - 2013.
by Bob Holmes Dying cells may play only a small role in the brain decline that accompanies ageing. That is the suggestion from the first computer simulation of brain function that can solve intelligence tests almost as well as university undergraduates. The model promises to reveal how our brains and behaviour are affected by age, and might even offer a way of testing drugs that compensate for cognitive decline. Psychologists have known for many years that our ability to think through novel problems – our "fluid intelligence" – gradually declines with age. However, the reasons for this decline aren't clear, because many features of the brain change as we age: neurons die; connections become sparser between regions of the brain and between individual brain cells; and our mental representation of different concepts becomes less distinct, among other changes. So far, psychologists have been unable to tease apart these possible explanations for cognitive decline. Enter Chris Eliasmith, a theoretical neuroscientist at the University of Waterloo in Ontario, Canada, and his student Daniel Rasmussen. The pair used a computer to simulate the behaviour of about 35,000 individual brain cells wired together in a biologically realistic way. Just like a real brain, their model encoded information as a pattern of electrical activity in particular sets of cells. The researchers set up the system to solve a widely used intelligence test known as Raven's Progressive Matrices, which involves predicting what abstract symbol comes next in a sequence. © Copyright Reed Business Information Ltd.
Ewen Callaway Certain fears can be inherited through the generations, a provocative study of mice reports1. The authors suggest that a similar phenomenon could influence anxiety and addiction in humans. But some researchers are sceptical of the findings because a biological mechanism that explains the phenomenon has not been identified. According to convention, the genetic sequences contained in DNA are the only way to transmit biological information across generations. Random DNA mutations, when beneficial, enable organisms to adapt to changing conditions, but this process typically occurs slowly over many generations. Yet some studies have hinted that environmental factors can influence biology more rapidly through 'epigenetic' modifications, which alter the expression of genes, but not their actual nucleotide sequence. For instance, children who were conceived during a harsh wartime famine in the Netherlands in the 1940s are at increased risk of diabetes, heart disease and other conditions — possibly because of epigenetic alterations to genes involved in these diseases2. Yet although epigenetic modifications are known to be important for processes such as development and the inactivation of one copy of the X-chromosome in females, their role in the inheritance of behaviour is still controversial. Kerry Ressler, a neurobiologist and psychiatrist at Emory University in Atlanta, Georgia, and a co-author of the latest study, became interested in epigenetic inheritance after working with poor people living in inner cities, where cycles of drug addiction, neuropsychiatric illness and other problems often seem to recur in parents and their children. “There are a lot of anecdotes to suggest that there’s intergenerational transfer of risk, and that it’s hard to break that cycle,” he says. © 2013 Nature Publishing Group
By Tanya Lewis To understand the human brain, scientists must start small, and what better place than the mind of a worm? The roundworm Caenorhabditis elegans is one of biology's most widely studied organisms, and it's the first to have the complete wiring diagram, or connectome, of its nervous system mapped out. Knowing the structure of the animal's connectome will help explain its behavior, and could lead to insights about the brains of other organisms, scientists say. "You can't understand the brain without understanding the connectome," Scott Emmons, a molecular geneticist at Albert Einstein College of Medicine of Yeshiva University in New York, said in a talk earlier this month at the annual meeting of the Society for Neuroscience in San Diego. In 1963, South African biologist Sydney Brenner of the University of Cambridge decided to use C. elegans as a model organism for developmental biology. He chose the roundworm because it has a simple nervous system, it's easy to grow in a lab and its genetics are relatively straightforward. C. elegans was the first multicellular organism to have its genome sequenced, in 1998. Brenner knew that to understand how genes affect behavior, "you would have to know the structure of the nervous system," Emmons told LiveScience.
by Bethany Brookshire Most people take it as a given that distraction is bad for — oh, hey, a squirrel! Where was I? … Right. Most people take it as a given that distraction is bad for memory. And most of the time, it is. But under certain conditions, the right kind of distraction might actually help you remember. Nathan Cashdollar of University College London and colleagues were looking at the effects of distraction on memory in memory-impaired patients. They were specifically looking at distractions that were totally off-topic from a particular task, and how those distractions affected memory performance. Their results were published November 27 in the Journal of Neuroscience. The researchers worked with a small group of people with severe epilepsy who had lesions in the hippocampus, and therefore had memory problems. They compared them to groups of people with epilepsy without lesions, young healthy people, and older healthy people who were matched to the epilepsy group. Each of the participants went through a memory task called “delayed match-to-sample.” For this task, participants are given a set of sample pictures, usually things like nature scenes. Then there’s a delay, from one second at the beginning of the test on up to nearly a minute. Then participants are shown another nature scene. Is it one they have seen before? Yes or no? The task starts out simply, with only one nature scene to match, but soon becomes harder, with up to five pictures to remember, and a five-second delay. People with memory impairments did a lot worse when they had more items to remember (called high cognitive load), falling off very steeply in their performance. Normal controls did better, still remaining fairly accurate, but making mistakes once in a while. © Society for Science & the Public 2000 - 2013.
Medical marijuana can alleviate pain and nausea, but it can also cause decreased attention span and memory loss. A new study in mice finds that taking an over-the-counter pain medication like ibuprofen may help curb these side effects. "This is what we call a seminal paper," says Giovanni Marsicano, a neuroscientist at the University of Bordeaux in France who was not involved in the work. If the results hold true in humans, they "could broaden the medical use of marijuana," he says. "Many people in clinical trials are dropping out from treatments, because they say, ‘I cannot work anymore. I am stoned all the time.’ ” People have used marijuana for hundreds of years to treat conditions such as chronic pain, multiple sclerosis, and epilepsy. Studies in mice have shown that it can reduce some of the neural damage seen in Alzheimer's disease. The main psychoactive ingredient, tetrahydrocannabinol (THC), is approved by the Food and Drug Administration to treat anorexia in AIDS patients and the nausea triggered by chemotherapy. Although recreational drug users usually smoke marijuana, patients prescribed THC take it as capsules. Many people find the side effects hard to bear, however. The exact cause of these side effects is unclear. In the brain, THC binds to receptors called CB1 and CB2, which are involved in neural development as well as pain perception and appetite. The receptors are normally activated by similar compounds, called endocannabinoids, that are produced by the human body. When one of these compounds binds to CB1, it suppresses the activity of an enzyme called cyclooxygenase-2 (COX-2). The enzyme has many functions. For instance, painkillers such as ibuprofen and aspirin work by blocking COX-2. Researchers have hypothesized that the suppression of COX-2 could be the cause of THC's side effects, such as memory problems. © 2013 American Association for the Advancement of Science
By BENEDICT CAREY Grading college students on quizzes given at the beginning of every class, rather than on midterms or a final exam, increases both attendance and overall performance, scientists reported Wednesday. The findings — from an experiment in which 901 students in a popular introduction to psychology course at the University of Texas took their laptops to class and were quizzed online — demonstrate that the computers can act as an aid to teaching, not just a distraction. Moreover, the study is the latest to show how tests can be used to enhance learning as well as measure it. The report, appearing in the journal PLoS One, found that this “testing effect” was particularly strong in students from lower-income households. Psychologists have known for almost a century that altering the timing of tests can affect performance. In the past decade, they have shown that taking a test — say, writing down all you can remember from a studied prose passage — can deepen the memory of that passage better than further study. The new findings stand as a large-scale prototype for how such testing effects can be exploited in the digital era, experts said, though they cautioned that it was not yet clear how widely they could be applied. “This study is important because it introduces a new method to implement frequent quizzing with feedback in large classrooms, which can be difficult to do,” said Jeffrey D. Karpicke, a professor of psychology at Purdue, who was not involved in the study. He added, “This is the first large study to show that classroom quizzing can help reduce achievement gaps” due to socioeconomic background. © 2013 The New York Times Company
Keyword: Learning & Memory
Link ID: 18960 - Posted: 11.23.2013
By EMILY ANTHES Humans have no exclusive claim on intelligence. Across the animal kingdom, all sorts of creatures have performed impressive intellectual feats. A bonobo named Kanzi uses an array of symbols to communicate with humans. Chaser the border collie knows the English words for more than 1,000 objects. Crows make sophisticated tools, elephants recognize themselves in the mirror, and dolphins have a rudimentary number sense. And reptiles? Well, at least they have their looks. In the plethora of research over the past few decades on the cognitive capabilities of various species, lizards, turtles and snakes have been left in the back of the class. Few scientists bothered to peer into the reptile mind, and those who did were largely unimpressed. “Reptiles don’t really have great press,” said Gordon M. Burghardt, a comparative psychologist at the University of Tennessee at Knoxville. “Certainly in the past, people didn’t really think too much of their intelligence. They were thought of as instinct machines.” But now that is beginning to change, thanks to a growing interest in “coldblooded cognition” and recent studies revealing that reptile brains are not as primitive as we imagined. In one such study, Anolis evermanni lizards, which normally attack their prey from above, were challenged to find a way to access insects that were kept inside a small hole covered with a tightfitting blue cap. The research could not only redeem reptiles but also shed new light on cognitive evolution. Because reptiles, birds and mammals diverged so long ago, with a common ancestor that lived 280 million years ago, the emerging data suggest that certain sophisticated mental skills may be more ancient than had been assumed — or so adaptive that they evolved multiple times. © 2013 The New York Times Company
When President Obama announced his plan to explore the mysteries of the human brain seven months ago, it was long on ambition and short on details. Now some of the details are being sketched in. They will include efforts to restore lost memories in war veterans, create tools that let scientists study individual brain circuits and map the nervous system of the fruit fly. The Defense Advanced Research Projects Agency, or DARPA, which has committed more than $50 million to the effort, offered the clearest plan. The agency wants to focus on treatments for the sort of brain disorders affecting soldiers who served in Iraq and Afghanistan. "That is our constituency," Ling, a DARPA official, said at a news conference at the Society for Neuroscience meeting in San Diego. So DARPA will be working on problems including PTSD and traumatic brain injuries, Ling says. In particular, the agency wants to help the soldier who has "a terribly damaged brain and has lost a significant amount of declarative memory," Ling said. "We would like to restore that memory." DARPA hopes to do that with an implanted device that will take over some functions of the brain's hippocampus, an area that's important to memory. The agency has already used a device that does this in rodents, Ling said, and the goal is to move on to people quickly. The agency plans to use the same fast-track approach that has produced earlier prototypes in record time, Ling said. "We went from idea to prototype in 18 months," he says. ©2013 NPR
Ed Yong Humanity's success depends on the ability of humans to copy, and build on, the works of their predecessors. Over time, human society has accumulated technologies, skills and knowledge beyond the scope of any single individual. Now, two teams of scientists have independently shown that the strength of this cumulative culture depends on the size and interconnectedness of social groups. Through laboratory experiments, they showed that complex cultural traditions — from making fishing nets to tying knots — last longer and improve faster at the hands of larger, more sociable groups. This helps to explain why some groups, such as Tasmanian aboriginals, lost many valuable skills and technologies as their populations shrank. “For producing fancy tools and complexity, it’s better to be social than smart,” says psychologist Joe Henrich of the University of British Columbia in Vancouver, Canada, the lead author of one of the two studies, published today in Proceedings of the Royal Society B1. “And things that make us social are going to make us seem smarter.” “There were some theoretical models to explain these phenomena but no one had done experiments,” says evolutionary biologist Maxime Derex of the University of Montpellier, France, who led the other study, published online today in Nature2. Derex’s team asked 366 male students to play a virtual game in which they gained points — and eventually money — by building either an arrowhead or a fishing net. The nets offered greater rewards, but were also harder to make. The students watched video demonstrations of the two tasks in groups of 2, 4, 8 or 16, before attempting the tasks individually. Their arrows and nets were tested in simulations and scored. After each trial, they could see how other group members fared, and watch a step-by-step procedure for any one of the designs. © 2013 Nature Publishing Group
Sedentary adults may improve their memory as soon as six weeks after taking up aerobic exercise, a small brain imaging study suggests. Cardiovascular fitness and cognitive performance such as attention seem to improve after six months or more of aerobic exercise in previous aging studies. Now researchers in Texas have found signs of increased regional blood flow in the brains of 37 sedentary adults with an average age of 64 who were randomized to physical training or to a control group that received the training after a waiting period. They found a higher resting cerebral blood flow in the brain's anterior cingulate region in the physical training group compared with controls. The anterior cingulate region is associated with better memory functions. The size of this brain region was also larger in another study of "successful cognitive agers" over the age of 80 compared to middle-aged or elderly controls. "A relatively rapid health benefit across brain, memory and fitness in sedentary adults soon after starting to exercise, some gains starting as early as six weeks, could motivate adults to start exercising regularly," the study's lead author, Sandra Bond Chapman of the Center for BrainHealth in Dallas, and her co-authors concluded in Monday's issue of the journal Frontiers in Aging Neuroscience. "The current findings shed new light on ways exercise promotes cognitive/brain health in aging." The participants all had a physical exam and screening for dementia, early cognitive impairment, depression and IQ before the study began. A noninvasive type of MRI was used to measure brain blood flow before training, halfway through at six weeks, and again at 12 weeks. © CBC 2013
Link ID: 18920 - Posted: 11.13.2013
Jessica Wright A new test of mouse intelligence closely mimics the types of assays used with people and detects a subtle learning deficit reminiscent of one seen in teenagers with autism, according to findings presented Saturday at the 2013 Society for Neuroscience annual meeting in San Diego. Another behavioral test, also presented Saturday, uncovers an unexpected social deficit in an autism mouse model. The test in the first study could be used to screen for drugs that improve cognitive deficits associated with autism, says Jill Silverman, a postdoctoral associate in Jacqueline Crawley’s lab at the University of California, Davis MIND Institute. Silverman presented the work at a poster session. To measure learning in mice, researchers typically place them in a water maze, or see if they learn to anticipate an electric shock. “But you don’t shock people or put them in a pool to swim,” notes Silverman. Silverman instead trained the mice in a human activity: using a touchscreen. In the most basic form of the test, the mice see two graphic images (such as a plane and a spider) and learn that they get “yummy” strawberry milkshake if they touch the spider, Silverman says. (She says she uses milkshakes because the mice work hard for them, even if they aren’t hungry.) BTBR mice, which have many autism-like features, learn to go for the spider just as readily as control mice do. So Silverman made things much more complicated. The complex test follows the logic of transitive inference. For example, if John is taller than Anne and Anne is taller than Jane, we are able to infer that John is taller than Jane. © Copyright 2013 Simons Foundation
by Sarah Zielinski If you put two birds together and gave them a problem, would they be any better at solving it than if they were alone? A study in Animal Behaviour of common mynas finds that not only are they no better at problem solving when in a pair than when on their own, the birds actually get a lot worse when put in a group. Andrea S. Griffin and her research team from the University of Newcastle in Callaghan, Australia, began by using dog food pellets as bait to capture common mynas (a.k.a. the Indian mynah, Acridotheres tristis) from around Newcastle. Then they gave each of the birds an innovation test, consisting of a box containing a couple of drawers and some Petri dishes. To get to the food hidden in spots in the box, the birds would have to get creative and figure out how to open one of the four containers by doing things like levering up a lid or pushing open a drawer. The scientists then ranked the birds by innovative ability before pairing them up. Half the pairs consisted of a high-innovation and a low-innovation myna, and the other half were pairs of medium-innovation birds. Then the pairs each received an innovation test similar to the one with boxes. Another experiment tested the birds in same-sex groups of five. On their own, 29 of 34 birds were able to access at least one container. But in pairs, only 15 of the 34 birds did so, and they took a lot longer. Performance dropped for both high- and medium-innovation birds, and it didn’t improve for the low-ranked ones, which had done so poorly the first time around that their results couldn’t get any worse. In groups of five, birds’ results fell even further: No mynas solved any of those tasks. © Society for Science & the Public 2000 - 2013
Stanley Rachman. “Will these hands ne'er be clean?” In Shakespeare's play Macbeth, Lady Macbeth helps to plot the brutal murder of King Duncan. Afterwards she feels tainted by Duncan's blood and insists that “all the perfumes of Arabia” could not sweeten her polluted hands. Baffled by her compulsive washing, her doctor is forced to admit: “This disease is beyond my practise.” In the 400 years since Macbeth was first performed, other doctors, psychiatrists, neuroscientists and clinical psychologists — myself included — have also found the problem beyond the reach of their own expertise. We see compulsive washing a lot, mostly as a symptom of obsessive–compulsive disorder (OCD), but also in people who have suffered a physical or emotional trauma, for example in women who have suffered sexual assault. The events trigger a deep-seated psychological, and ultimately biological, response. We know that the driving force of compulsive washing is a fear of contamination by dirt and germs. An obsessive fear of contact with sexual fluids, for example, can drive compulsive washing in OCD and force people to restrict sexual activity to a specific room in the house. Compulsive washing fails to relieve the anxiety. Most patients with OCD continue to feel contaminated despite vigorous attempts to clean themselves. Why does repeated washing fail? There is much debate at present about the direction that psychiatric medicine and research should take. We should not underestimate what we can continue to learn from the careful observation of patients. Such observations have led my colleagues and me to diagnose a new cause of OCD and other types of compulsive washing: mental contamination. © 2013 Nature Publishing Group
M. Mitchell Waldrop Kwabena Boahen got his first computer in 1982, when he was a teenager living in Accra. “It was a really cool device,” he recalls. He just had to connect up a cassette player for storage and a television set for a monitor, and he could start writing programs. But Boahen wasn't so impressed when he found out how the guts of his computer worked. “I learned how the central processing unit is constantly shuffling data back and forth. And I thought to myself, 'Man! It really has to work like crazy!'” He instinctively felt that computers needed a little more 'Africa' in their design, “something more distributed, more fluid and less rigid”. Today, as a bioengineer at Stanford University in California, Boahen is among a small band of researchers trying to create this kind of computing by reverse-engineering the brain. The brain is remarkably energy efficient and can carry out computations that challenge the world's largest supercomputers, even though it relies on decidedly imperfect components: neurons that are a slow, variable, organic mess. Comprehending language, conducting abstract reasoning, controlling movement — the brain does all this and more in a package that is smaller than a shoebox, consumes less power than a household light bulb, and contains nothing remotely like a central processor. To achieve similar feats in silicon, researchers are building systems of non-digital chips that function as much as possible like networks of real neurons. Just a few years ago, Boahen completed a device called Neurogrid that emulates a million neurons — about as many as there are in a honeybee's brain. And now, after a quarter-century of development, applications for 'neuromorphic technology' are finally in sight. © 2013 Nature Publishing Group
A mother's level of education has strong implications for a child's development. Northwestern University researchers show in a new study that low maternal education is linked to a noisier nervous system in children, which could affect their learning. "You really can think of it as static on your radio that then will get in the way of hearing the announcer’s voice," says Nina Kraus, senior author of the study and researcher at the Auditory Neuroscience Laboratory at Northwestern University. The study, published in the Journal of Neuroscience, is part of a larger initiative working with children in public high schools in inner-city Chicago. The adolescents are tracked from ninth to 12th grade. An additional group of children in the gang-reduction zones of Los Angeles are also being tracked. Kraus and colleagues are more broadly looking at how music experience, through classroom group-based musical experience, could offset certain negative effects of poverty. But first, they wanted to see what biological effects poverty may have on the adolescents' brain. In this particular study, 66 children - a small sample - in Chicago participated. Those whose mothers had a "lower education" tended to have not graduated from high school. Kraus's study did not directly track income of families, but most children in the study qualified for free lunch (to be eligible, a family of four must have income of about $29,000 or less). Researchers found "children from lower-SES (socioeconomic status) backgrounds are exposed to less complex and linguistically rich input in addition to hearing fewer words per hour from their caregivers," according to the study. © 2012 Cable News Network
Early childhood poverty has been linked to smaller brain size by U.S. researchers who are pointing to the importance of nurturing from caregivers as a protective factor. Children exposed to poverty tend to have poorer cognitive outcomes and school performance. To learn more about the biology of how, researchers started tracking the emotional and brain development of 145 preschoolers in metropolitan St. Louis for 10 years. Household poverty was measured by the income-to-needs ratio. Children were assessed each year for three to six years before they received an MRI and questionnaires. A parent and child were also observed during a lab task that required the child (age four to seven) to wait for eight minutes before opening a brightly wrapped gift within arm's reach while the parent filled in questionnaires. "These study findings demonstrated that exposure to poverty during early childhood is associated with smaller white matter, cortical grey matter, and hippocampal and amygdala volumes," Dr. Joan Luby of the psychiatry department at Washington University School of Medicine in St. Louis and her co-authors concluded in Monday's issue of the journal JAMA Pediatrics. The findings were consistent with an earlier study by the same team that suggested supportive parenting also plays an important role in the development of the hippocampus in childhood independent of income. The brain's hippocampus is important for learning and memory and how we respond to stress. In the study, the effects of poverty on hippocampal volume were influenced by caregiving support or hostility in the brain's left and right hemispheres and by stressful life events on the left. Caregiver education was not a significant mediator. © CBC 2013
When frogs croak, the fringe-lipped bat, Trachops cirrhosus, listens. The bats use the sounds to track and feed on amphibians and to share dining tips with neighbors. In a new study, Patricia Jones of the University of Texas at Austin and colleagues trained a few frog-eating bats to associate a cell phone ringtone with food. Some of the bats reliably got food when they heard the phone ring. Others did not. The bats that failed to get food using their own cues paid more attention to new ones that their fellow mammals shared. Social learning becomes much more important if a bat is unsuccessful at finding food, the scientists report October 22 in the Proceedings of the Royal Society B. Observing how bats forage alone and together may help scientists understand the way new hunting behaviors spread through animal populations. It may also give insight to animals’ potential for cultivating culture, the authors suggest. © Society for Science & the Public 2000 - 2013.
Link ID: 18823 - Posted: 10.23.2013
by Tina Hesman Saey Sleep hoses garbage out of the brain, a study of mice finds. The trash, including pieces of proteins that cause Alzheimer’s disease, piles up while the rodents are awake. Sleep opens spigots that bathe the brain in fluids and wash away the potentially toxic buildup, researchers report in the Oct. 18 Science. The discovery may finally reveal why sleep seems mandatory for every animal. It may also shed new light on the causes of neurodegenerative disorders such as Alzheimer’s and Parkinson’s diseases. “It’s really an eye-opening and intriguing finding,” says Chiara Cirelli, a sleep researcher at the University of Wisconsin–Madison. The results have already led her and other sleep scientists to rethink some of their own findings. Although sleep requirements vary from individual to individual and across species, a complete lack of it is deadly. But no one knows why. One popular idea is that sleep severs weak connections between brain cells and strengthens more robust connections to solidify memories (SN Online: 4/2/09; SN Online: 6/23/11). But a good memory is not a biological imperative. “You don’t die from forgetting what you learned yesterday,” says Maiken Nedergaard, a neuroscientist at the University of Rochester Medical Center in New York who led the study. Researchers in Nedergaard’s lab stumbled upon sleep’s role in garbage clearance while studying a brain drainage system they described last year (SN: 9/22/12, p. 15). This service, called the glymphatic system, flushes fluid from the brain and spinal cord into the space between brain cells. Ultimately, the fluid and any debris it carries washes into the liver for disposal. © Society for Science & the Public 2000 - 2013
By Christopher Wanjek and LiveScience Your liver could be "eating" your brain, new research suggests. People with extra abdominal fat are three times more likely than lean individuals to develop memory loss and dementia later in life, and now scientists say they may know why. It seems that the liver and the hippocampus (the memory center in the brain), share a craving for a certain protein called PPARalpha. The liver uses PPARalpha to burn belly fat; the hippocampus uses PPARalpha to process memory. In people with a large amount of belly fat, the liver needs to work overtime to metabolize the fat, and uses up all the PPARalpha — first depleting local stores and then raiding the rest of the body, including the brain, according to the new study. The process essentially starves the hippocampus of PPARalpha, thus hindering memory and learning, researchers at Rush University Medical Center in Chicago wrote in the study, published in the current issue of Cell Reports. Other news reports were incorrect in stating that the researchers established that obese individuals were 3.6 times more likely than lean individuals to develop dementia. That finding dates back to a 2008 study by researchers at the Kaiser Permanente Division of Research in Oakland, Calif. In another study, described in a 2010 article in the Annals of Neurology, researchers at Boston University School of Medicine found that the greater the amount of belly fat, the greater the brain shrinkage in old age. © 2013 Scientific American