Chapter 17. Learning and Memory
By Thomas MacMillan

“Time” is the most common noun in the English language, Dean Buonomano tells us on the first page of his new book, Your Brain Is a Time Machine: The Neuroscience and Physics of Time. But despite our fixation with time, and its obvious centrality in our lives, we still struggle to fully understand it. From a psychology perspective, for instance, time seems to flow by, sometimes slowly — like when we’re stuck in line at the DMV — and sometimes quickly — like when we’re lost in an engrossing novel. But from a physics perspective, time may be simply another dimension in the universe, like length, height, or width. Buonomano, a professor of neuroscience at UCLA, lays out the latest, best theories about how we understand time, illuminating a fundamental aspect of being human. The human brain, he writes, is a time machine that allows us to mentally travel backward and forward, to plan for the future and agonizingly regret the past like no other animal. And, he argues, our brains are time machines like clocks are time machines: constantly tracking the passage of time, whether it’s circadian rhythms that tell us when to go to sleep, or microsecond calculations that allow us to hear the difference between “They gave her cat-food” and “They gave her cat food.”

In an interview with Science of Us, Buonomano spoke about planning for the future as a basic human activity, the limits of be-here-now mindfulness, and the inherent incompatibility between physicists’ and neuroscientists’ understanding of the nature of time.

I finished reading your book late last night and went to bed sort of planning our interview today, and then woke up at about 3:30 a.m. ready to do the interview, with my head full of insistent thoughts about questions that I should ask you. So was that my brain being a — maybe malfunctioning — time machine?

I think this is consistent with the notion that the brain is an organ that’s future-oriented.
As far as survival goes, the evolutionary value of the brain is to act in the present to ensure survival in the future, whether survival is figuring out a good place to get food, or doing an interview, I suppose. © New York Media LLC
By BENEDICT CAREY Well-timed pulses from electrodes implanted in the brain can enhance memory in some people, scientists reported on Thursday, in the most rigorous demonstration to date of how a pacemaker-like approach might help reduce symptoms of dementia, head injuries and other conditions. The report is the result of decades of work decoding brain signals, helped along in recent years by large Department of Defense grants intended to develop novel treatments for people with traumatic brain injuries, a signature wound of the Iraq and Afghanistan wars. The research, led by a team at the University of Pennsylvania, is published in the journal Current Biology. Previous attempts to stimulate human memory with implanted electrodes had produced mixed results: Some experiments seemed to sharpen memory, but others muddled it. The new paper resolves this confusion by demonstrating that the timing of the stimulation is crucial. Zapping memory areas when they are functioning poorly improves the brain’s encoding of new information. But doing so when those areas are operating well — as they do for stretches of the day in most everyone, including those with deficits — impairs the process. “We all have good days and bad days, times when we’re foggy, or when we’re sharp,” said Michael Kahana, who with Youssef Ezzyat led the research team. “We found that jostling the system when it’s in a low-functioning state can jump it to a high-functioning one.” Researchers cautioned that implantation is a delicate procedure and that the reported improvements may not apply broadly. The study was of epilepsy patients; scientists still have much work to do to determine whether this approach has the same potential in people with other conditions, and if so how best to apply it. But in establishing the importance of timing, the field seems to have turned a corner, experts said. © 2017 The New York Times Company
By Dina Fine Maron A bizarre medical mystery can be added to the list of growing concerns about opioid use in the U.S. Since 2012 more than a dozen illicit drug users have shown up in hospitals across eastern Massachusetts with inexplicable amnesia. In some cases the patients’ memory difficulties had persisted for more than a year. Yet this bewildering condition does not appear to be the result of a simple case of tainted goods: The drug users do not appear to have used the same batch of drugs—or even the same type of substance. To get some answers, the state’s public health officials are rolling out a new requirement that clinicians who come across any patients (not just opioid users) with these types of memory deficits—along with damage to the hippocampus—must report the cases to the state. On April 3 state public health officials received the legal green light from the Massachusetts public health commissioner to make this a required, reportable condition. This technical change, which will last for one year, authorizes public health workers to collect this information and reassures clinicians that they can—and must—share case reports. In the next couple of days workers will notify emergency room personnel as well as addiction counselors and neurology specialists about the new designation via e-mail. The new reporting requirement, state officials hope, will help epidemiologists learn how widespread the issue of potential opioid-linked amnesia may be and whether patients have specific factors in common. The change was first reported by BuzzFeed News. © 2017 Scientific American,
By LISA SANDERS, M.D. “I feel very pain,” the 62-year-old mumbled incoherently as he sat in a wheelchair. He had said almost nothing since arriving at the office of Dr. Joel Geerling, a neurologist at Beth Israel Deaconess Medical Center in Boston. A year ago, he was fine, explained the patient’s sister. He was married, working as an auto mechanic, happy, normal. Then, six or seven months ago, he became forgetful. Little things at first — he couldn’t think of the right word, remember people’s names. But then big things — like forgetting who he was talking to on the phone or how to drive to places he had known for decades. That was fall 2014. By that Christmas, walking became difficult. He fell frequently. He had trouble feeding himself. He slept most of the day and night. Over the course of this illness, he lost almost everything. He was fired from his job; his wife left him. He didn’t even have his car anymore: His daughter took the keys after an accident. He had always been friendly and talkative, but now he was withdrawn and nearly wordless. In a few months, the man went from being completely independent to requiring round-the-clock care. His daughter tried to take care of him, but recently she had to hire someone; she couldn’t miss any more college classes. The patient first saw his regular doctor, but she couldn’t figure out what was wrong and sent him to a neurologist. When the specialist was stumped, she sent the patient to Geerling, a neurologist who focused on dementia and other cognitive diseases. In the exam room, the patient slumped in the wheelchair and held his head tipped back so that he was looking straight at the doctor above him, giving him a childlike appearance. When Geerling examined him, he found out why. The patient could not make his eyes move up. When he tried to walk, his feet remained on the ground — as if there were a magnet holding them down — giving him an odd, shuffling, gliding gait.
He was unable to count down from 10 and didn’t know where he lived. © 2017 The New York Times Company
Keyword: Learning & Memory
Link ID: 23509 - Posted: 04.19.2017
By Simon Makin When the now-famous neurological patient Henry Molaison had his brain’s hippocampus surgically sectioned to treat seizures in 1953, science’s understanding of memory inadvertently received perhaps its biggest boost ever. Molaison lost the ability to form new memories of events, and his recollection of anything that had happened during the preceding year was severely impaired. Other types of memory such as learning physical skills were unaffected, suggesting the hippocampus specifically handles the recall of events—known as “episodic” memories. Further research on other patients with hippocampal damage confirmed recent memories are more impaired than distant ones. It appears the hippocampus provides temporary storage for new information whereas other areas may handle long-term memory. Events that we are later able to remember appear to be channeled for more permanent storage in the cortex (the outer layers of the brain responsible for higher functions such as planning and problem-solving). In the cortex these memories form gradually, becoming integrated with related information to build lasting knowledge about ourselves and the world. Episodic memories that are intended for long-term storage accumulate to form the “autobiographical” memory that is so essential for our sense of identity. Neuroscientists know a lot about how short-term memories are formed in the brain but the processes underlying long-term storage are still not well understood. © 2017 Scientific American,
Keyword: Learning & Memory
Link ID: 23485 - Posted: 04.13.2017
By Ed Yong Octopuses have three hearts, parrot-like beaks, venomous bites, and eight semi-autonomous arms that can taste the world. They squirt ink, contort through the tiniest of spaces, and melt into the world by changing both color and texture. They are incredibly intelligent, capable of wielding tools, solving problems, and sabotaging equipment. As Sy Montgomery once wrote, “no sci-fi alien is so startlingly strange” as an octopus. But their disarming otherness doesn’t end with their bodies. Their genes are also really weird. A team of scientists led by Joshua Rosenthal at the Marine Biological Laboratory and Eli Eisenberg at Tel Aviv University have shown that octopuses and their relatives—the cephalopods—practice a type of genetic alteration called RNA editing that’s very rare in the rest of the animal kingdom. They use it to fine-tune the information encoded by their genes without altering the genes themselves. And they do so extensively, to a far greater degree than any other animal group. “They presented this work at a recent conference, and it was a big surprise to everyone,” says Kazuko Nishikura from the Wistar Institute. “I study RNA editing in mice and humans, where it’s very restricted. The situation is very different here. I wonder if it has to do with their extremely developed brains.” It certainly seems that way. Rosenthal and Eisenberg found that RNA editing is especially rife in the neurons of cephalopods. They use it to re-code genes that are important for their nervous systems—the genes that, as Rosenthal says, “make a nerve cell a nerve cell.” And only the intelligent coleoid cephalopods—octopuses, squid, and cuttlefish—do so. The relatively dumber nautiluses do not. “Humans don’t have this. Monkeys don’t. Nothing has this except the coleoids,” says Rosenthal.
By James Gallagher Health and science reporter, What really happens when we make and store memories has been unravelled in a discovery that surprised even the scientists who made it. The US and Japanese team found that the brain "doubles up" by simultaneously making two memories of events. One is for the here-and-now and the other for a lifetime, they found. It had been thought that all memories start as a short-term memory and are then slowly converted into a long-term one. Experts said the findings were surprising, but also beautiful and convincing. 'Significant advance' Two parts of the brain are heavily involved in remembering our personal experiences. The hippocampus is the place for short-term memories while the cortex is home to long-term memories. This idea became famous after the case of Henry Molaison in the 1950s. His hippocampus was damaged during epilepsy surgery and he was no longer able to make new memories, but his ones from before the operation were still there. So the prevailing idea was that memories are formed in the hippocampus and then moved to the cortex where they are "banked". The team at the Riken-MIT Center for Neural Circuit Genetics have done something mind-bogglingly advanced to show this is not the case. The experiments had to be performed on mice, but are thought to apply to human brains too. They involved watching specific memories form as a cluster of connected brain cells in reaction to a shock. Researchers then used light beamed into the brain to control the activity of individual neurons - they could literally switch memories on or off. The results, published in the journal Science, showed that memories were formed simultaneously in the hippocampus and the cortex. Prof Susumu Tonegawa, the director of the research centre, said: "This was surprising." He told the BBC News website: "This is contrary to the popular hypothesis that has been held for decades. Copyright © 2017
Keyword: Learning & Memory
Link ID: 23460 - Posted: 04.07.2017
By Matt Reynolds Google’s latest take on machine translation could make it easier for people to communicate with those speaking a different language, by translating speech directly into text in a language they understand. Machine translation of speech normally works by first converting it into text, then translating that into text in another language. But any error in speech recognition will lead to an error in transcription and a mistake in the translation. Researchers at Google Brain, the tech giant’s deep learning research arm, have turned to neural networks to cut out the middle step. By skipping transcription, the approach could potentially allow for more accurate and quicker translations. The team trained its system on hundreds of hours of Spanish audio with corresponding English text. In each case, it used several layers of neural networks – computer systems loosely modelled on the human brain – to match sections of the spoken Spanish with the written translation. To do this, it analysed the waveform of the Spanish audio to learn which parts seemed to correspond with which chunks of written English. When it was then asked to translate, each neural layer used this knowledge to manipulate the audio waveform until it was turned into the corresponding section of written English. “It learns to find patterns of correspondence between the waveforms in the source language and the written text,” says Dzmitry Bahdanau at the University of Montreal in Canada, who wasn’t involved with the work. © Copyright Reed Business Information Ltd.
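The compounding-error argument in the piece above can be made concrete with a little arithmetic. The sketch below is purely illustrative — the accuracy figures are invented, and this is in no way Google's actual model — but it shows why a cascaded pipeline, where independent per-stage error rates multiply, can lose to a single end-to-end model with one error term.

```python
# Toy illustration of why skipping the transcription step can help.
# Assumption: errors in the two pipeline stages are independent, so
# per-stage accuracies multiply. All accuracy numbers are made up.

def cascade_accuracy(asr_acc: float, mt_acc: float) -> float:
    """Overall accuracy of a speech-recognition -> machine-translation
    pipeline, assuming independent errors in the two stages."""
    return asr_acc * mt_acc

def direct_accuracy(end_to_end_acc: float) -> float:
    """Accuracy of a single end-to-end speech-to-translated-text model:
    there is only one stage, hence only one error term."""
    return end_to_end_acc

# A 90%-accurate recogniser feeding a 90%-accurate translator
# compounds to 0.90 * 0.90 = 81% overall.
pipeline = cascade_accuracy(0.90, 0.90)

# A hypothetical end-to-end model that is "only" 85% accurate
# still beats the cascade.
direct = direct_accuracy(0.85)

print(f"cascaded pipeline accuracy: {pipeline:.2f}")
print(f"end-to-end model accuracy:  {direct:.2f}")
```

The independence assumption is a simplification — in real pipelines a recognition error almost guarantees a translation error, which is exactly the failure mode the Google Brain approach sidesteps by never producing an intermediate transcript.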
By CHRISTOPHER MELE You were sure you left the keys right there on the counter, and now they are nowhere to be found. Where could they be? Misplacing objects is an everyday occurrence, but finding them can be like going on a treasure hunt without a map. Here are some recommendations from experts to help you recover what is lost. (Consider printing this out and putting it someplace you can easily find it.) Stay calm and search on One of the biggest mistakes people make is becoming panicked or angry, which leads to frantic, unfocused searching, said Michael Solomon, who wrote the book “How to Find Lost Objects.” One of the axioms of his book is: “There are no missing objects. Only unsystematic searchers.” Look for the item where it’s supposed to be. Sometimes objects undergo “domestic drift” in which they were left wherever they were last used, Mr. Solomon said. “Objects are apt to wander,” he wrote in his book. “I have found, though, that they tend to travel no more than 18 inches from their original location.” Be disciplined in your search A common trap is forgetting where you have already searched, Corbin A. Cunningham, a Ph.D. student at the Department of Psychological and Brain Sciences at Johns Hopkins University, said in an email. “Go from one room to another, and only move on if you think you have searched everywhere in that room,” he wrote. Once you have thoroughly searched an area and ruled it out, don’t waste time returning to it. © 2017 The New York Times Company
Keyword: Learning & Memory
Link ID: 23440 - Posted: 04.03.2017
Elle Hunt Inches above the seafloor of Sydney’s Cabbage Tree Bay, with the proximity made possible by several millimetres of neoprene and a scuba diving tank, I’m just about eyeball to eyeball with this creature: an Australian giant cuttlefish. Even allowing for the magnifying effects of the mask snug across my nose, it must be about 60cm (two feet) long, and the peculiarities that abound in the cephalopod family, which includes octopuses and squid, are all the more striking writ so large. Its body – shaped around an internal surfboard-like shell, tailing off into a fistful of tentacles – has the shifting colour of velvet in light, and its W-shaped pupils lend it a stern expression. I don’t think I’m imagining some recognition on its part. The question is, of what? It was an encounter like this one – “at exactly the same place, actually, to the foot” – that first prompted Peter Godfrey-Smith to think about these most other of minds. An Australian academic philosopher, he’d recently been appointed a professor at Harvard. While snorkelling on a visit home to Sydney in about 2007, he came across a giant cuttlefish. The experience had a profound effect on him, establishing an unlikely framework for his own study of philosophy, first at Harvard and then the City University of New York. The cuttlefish hadn’t been afraid – it had seemed as curious about him as he was about it. But to imagine cephalopods’ experience of the world as some iteration of our own may sell them short, given the many millions of years of separation between us – nearly twice as many as with humans and any other vertebrate (mammal, bird or fish).
By C. CLAIBORNE RAY Q. When four of us shared memories of our very young lives, not one of us could recall events before the age of 4 or possibly 3. Is this common? A. Yes. For adults, remembering events only after age 3½ or 4 is typical, studies have found. The phenomenon was named childhood amnesia by Freud and identified late in the 19th century by the pioneering French researcher Victor Henri and his wife, Catherine. The Henris published a questionnaire on early memories in 1895, and the results from 123 people were published in 1897. Most of the participants’ earliest memories came from when they were 2 to 4 years old; the average was age 3. Very few participants recalled events from the first year of life. Many subsequent studies found similar results. Several theories have been offered to explain the timing of laying down permanent memories. One widely studied idea relates the formation of children’s earliest memories to when they start talking about past events with their mothers, suggesting a link between memories and the age of language acquisition. More recent studies, in 2010 and 2014, found discrepancies in the accuracy of young children’s estimates of when things had occurred in their lives. Another 2014 study found a progressive loss of recall as a child ages, with 5-, 6- and 7-year-olds remembering 60 percent or more of some early-life events that were discussed at age 3, while 8- and 9-year-olds remembered only 40 percent of these events. © 2017 The New York Times Company
By Jason G. Goldman In the summer of 2015 University of Oxford zoologists Antone Martinho III and Alex Kacelnik began quite the cute experiment—one involving ducklings and blindfolds. They wanted to see how the baby birds imprinted on their mothers depending on which eye was available. Why? Because birds lack a part of the brain humans take for granted. Suspended between the left and right hemispheres of our brains sits the corpus callosum, a thick bundle of nerves. It acts as an information bridge, allowing the left and right sides to rapidly communicate and act as a coherent whole. Although the hemispheres of a bird's brain are not entirely separated, the animals do not enjoy the benefits of this pathway. This quirk of avian neuroanatomy sets up a natural experiment. “I was in St. James's Park in London, and I saw some ducklings with their parents in the lake,” Martinho says. “It occurred to me that we could look at the instantaneous transfer of information through imprinting.” The researchers covered one eye of each of 64 ducklings and then presented a fake red or blue adult duck. This colored duck became “Mom,” and the ducklings followed it around. But when some of the ducklings' blindfolds were swapped so they could see out of only the other eye, they did not seem to recognize their “parent” anymore. Instead the ducklings in this situation showed equal affinity for both the red and blue ducks. It took three hours before any preferences began to emerge. Meanwhile ducklings with eyes that were each imprinted to a different duck did not show any parental preferences when allowed to use both eyes at once. The study was recently published in the journal Animal Behaviour. © 2017 Scientific American
By David Wiegand I just did something great for my brain and you can do the same, when the documentary “My Love Affair With the Brain: The Life and Science of Dr. Marian Diamond” airs on KQED on Wednesday, March 22. According to the UC Berkeley professor emerita, the five things that contribute to the continued development of the brain at any age are: diet, exercise, newness, challenge and love. You can check off three of those elements for the day by watching the film by Catherine Ryan and Gary Weimberg. No matter how smart you are, even about anatomy and neuroscience, you will find newness in the information about the miraculous human brain, how it works, and how it keeps on working no matter how old you are. That’s one of the fundamentals of modern neuroscience, of which Diamond is one of the founders. You will also be challenged to consider your own brain, to consider how Diamond’s favorite expression — “use it or lose it” — applies to your brain and your life. You will be challenged to consider what Diamond means when she says brain plasticity (its ability to keep developing by forming new connections between its cells) makes us “the masters of our own minds. We literally create our own masterpiece.” Before Diamond and her colleagues proved otherwise, the prevailing thought was that brains developed according to a genetically determined pattern, hit a high point and then essentially began to deteriorate. Bushwa: A brain can grow — i.e., learn — at any age, and you can teach an old dog new tricks. © 2017 Hearst Corporation
Keyword: Learning & Memory
Link ID: 23392 - Posted: 03.23.2017
By Mo Costandi This map of London shows how many other streets are connected to each street, with blue representing simple streets with few connecting streets and red representing complex streets with many connecting streets. Credit: Joao Pinelo Silva The brain contains a built-in GPS that relies on memories of past navigation experiences to simulate future ones. But how does it represent new environments in order to determine how to navigate them successfully? And what happens in the brain when we enter a new space, or use satellite navigation (SatNav) technology to help us find our way around? Research published Tuesday in Nature Communications reveals two distinct brain regions that cooperate to simulate the topology of one’s environment and plan future paths through it when one is actively navigating. In addition, the research suggests both regions become inactive when people follow SatNav instructions instead of using their spatial memories. In a previous study researchers at University College London took participants on a guided tour through the streets of London’s Soho district and then used functional magnetic resonance imaging (fMRI) to scan their brains as they watched 10 different simulations of navigating those streets. Some of the movies required them to decide at intersections which way would be the shortest path to a predetermined destination; others came with instructions about which way to go at each junction. © 2017 Scientific American,
Keyword: Learning & Memory
Link ID: 23391 - Posted: 03.22.2017
Laura Sanders Not too long ago, the internet was stationary. Most often, we’d browse the Web from a desktop computer in our living room or office. If we were feeling really adventurous, maybe we’d cart our laptop to a coffee shop. Looking back, those days seem quaint. Today, the internet moves through our lives with us. We hunt Pokémon as we shuffle down the sidewalk. We text at red lights. We tweet from the bathroom. We sleep with a smartphone within arm’s reach, using the device as both lullaby and alarm clock. Sometimes we put our phones down while we eat, but usually faceup, just in case something important happens. Our iPhones, Androids and other smartphones have led us to effortlessly adjust our behavior. Portable technology has overhauled our driving habits, our dating styles and even our posture. Despite the occasional headlines claiming that digital technology is rotting our brains, not to mention what it’s doing to our children, we’ve welcomed this alluring life partner with open arms and swiping thumbs. Scientists suspect that these near-constant interactions with digital technology influence our brains. Small studies are turning up hints that our devices may change how we remember, how we navigate and how we create happiness — or not. Somewhat limited, occasionally contradictory findings illustrate how science has struggled to pin down this slippery, fast-moving phenomenon. Laboratory studies hint that technology, and its constant interruptions, may change our thinking strategies. Like our husbands and wives, our devices have become “memory partners,” allowing us to dump information there and forget about it — an off-loading that comes with benefits and drawbacks. Navigational strategies may be shifting in the GPS era, a change that might be reflected in how the brain maps its place in the world. Constant interactions with technology may even raise anxiety in certain settings. |© Society for Science & the Public 2000 - 2017
Ian Sample Science editor Researchers have overcome one of the major stumbling blocks in artificial intelligence with a program that can learn one task after another using skills it acquires on the way. Developed by Google’s AI company, DeepMind, the program has taken on a range of different tasks and performed almost as well as a human. Crucially, and uniquely, the AI does not forget how it solved past problems, and uses the knowledge to tackle new ones. The AI is not capable of the general intelligence that humans draw on when they are faced with new challenges; its use of past lessons is more limited. But the work shows a way around a problem that had to be solved if researchers are ever to build so-called artificial general intelligence (AGI) machines that match human intelligence. “If we’re going to have computer programs that are more intelligent and more useful, then they will have to have this ability to learn sequentially,” said James Kirkpatrick at DeepMind. The ability to remember old skills and apply them to new tasks comes naturally to humans. A regular rollerblader might find ice skating a breeze because one skill helps the other. But recreating this ability in computers has proved a huge challenge for AI researchers. AI programs are typically one-trick ponies that excel at one task, and one task only.
Laurel Hamers Mistakes can be learning opportunities, but the brain needs time for lessons to sink in. When facing a fast and furious stream of decisions, even the momentary distraction of noting an error can decrease accuracy on the next choice, researchers report in the March 15 Journal of Neuroscience. “We have a brain region that monitors and says ‘you messed up’ so that we can correct our behavior,” says psychologist George Buzzell, now at the University of Maryland in College Park. But sometimes, that monitoring system can backfire, distracting us from the task at hand and causing us to make another error. “There does seem to be a little bit of time for people, after mistakes, where you're sort of offline,” says Jason Moser, a psychologist at Michigan State University in East Lansing, who wasn’t part of the study. To test people’s response to making mistakes, Buzzell and colleagues at George Mason University in Fairfax, Va., monitored 23 participants’ brain activity while they worked through a challenging task. Concentric circles flashed briefly on a screen, and participants had to respond with one hand if the two circles were the same color and the other hand if the circles were subtly different shades. After making a mistake, participants generally answered the next question correctly if they had a second or so to recover. But when the next challenge came very quickly after an error, as little as 0.2 seconds, accuracy dropped by about 10 percent. Electrical activity recorded from the visual cortex showed that participants paid less attention to the next trial if they had just made a mistake than if they had responded correctly. |© Society for Science & the Public 2000 - 2017
There is widespread interest among teachers in the use of neuroscientific research findings in educational practice. However, there are also misconceptions and myths that are supposedly based on sound neuroscience that are prevalent in our schools. We wish to draw attention to this problem by focusing on an educational practice supposedly based on neuroscience that lacks sufficient evidence and so we believe should not be promoted or supported. Generally known as “learning styles”, it is the belief that individuals can benefit from receiving information in their preferred format, based on a self-report questionnaire. This belief has much intuitive appeal because individuals are better at some things than others and ultimately there may be a brain basis for these differences. Learning styles promises to optimise education by tailoring materials to match the individual’s preferred mode of sensory information processing. There are, however, a number of problems with the learning styles approach. First, there is no coherent framework of preferred learning styles. Usually, individuals are categorised into one of three preferred styles of auditory, visual or kinesthetic learners based on self-reports. One study found that there were more than 70 different models of learning styles including among others, “left v right brain,” “holistic v serialists,” “verbalisers v visualisers” and so on. The second problem is that categorising individuals can lead to the assumption of fixed or rigid learning style, which can impair motivation to apply oneself or adapt. Finally, and most damning, is that there have been systematic studies of the effectiveness of learning styles that have consistently found either no evidence or very weak evidence to support the hypothesis that matching or “meshing” material in the appropriate format to an individual’s learning style is selectively more effective for educational attainment. 
Students will improve if they think about how they learn but not because material is matched to their supposed learning style.
Keyword: Learning & Memory
Link ID: 23352 - Posted: 03.14.2017
By Knvul Sheikh As we get older, we start to think a little bit more slowly, we are less able to multitask and our ability to remember things gets a little wobblier. This cognitive transformation is linked to a steady, widespread thinning of the cortex, the brain's outermost layer. Yet the change is not inevitable. So-called super agers retain their good memory and thicker cortex as they age, a recent study suggests. Researchers believe that studying what makes super agers different could help unlock the secrets to healthy brain aging and improve our understanding of what happens when that process goes awry. “Looking at successful aging could provide us with biomarkers for predicting resilience and for things that might go wrong in people with age-related diseases like Alzheimer's and dementia,” says study co-author Alexandra Touroutoglou, a neuroscientist at Harvard Medical School. Touroutoglou and her team gave standard recall tests to a group of 40 participants between the ages of 60 and 80 and 41 participants aged 18 to 35. Among the older participants, 17 performed as well as or better than adults four to five decades younger. When the researchers looked at MRI scans of the super agers' brains, they found that their brains not only functioned more like young brains, they also looked very similar. Two brain networks in particular seemed to be protected from shrinking: the default mode network, which helps to store and recall new information, and the salience network, which is associated with directing attention and identifying important details. In fact, the thicker these regions were, the better the super agers' memory was. © 2017 Scientific American,
By Torah Kachur A simple, non-invasive, non-medicinal, safe and cheap way to get a better night's sleep is to play some pink noise, according to a study published on Wednesday in the journal Frontiers in Human Neuroscience. Pink noise has more lower octaves than typical white noise and is hardly soothing. For example, it can be one-second pulses of the sound of a rushing waterfall. The short pieces of quick, quiet sounds would be really annoying if you were trying to fall asleep. But the pink noise isn't trying to get you to fall asleep; it's trying to keep you in a very deep sleep where you have slow brainwaves. This is one of our deepest forms of sleep and, in particular, seems to decline in aging adults. "When you play the pulses at particular times during deep sleep, it actually leads to an enhancement of the electrical signal. So it leads to essentially more of a synchronization of the neurons," said Nelly Papalambros, a PhD student at Northwestern University and the first author on the work. The pulses are timed to coincide with your entry into slow wave sleep. They pulse in time with your brainwaves, and they seem to increase the effectiveness of your very valuable and very elusive deep sleep. That slow wave sleep is critical for memory consolidation or, basically, your ability to incorporate new material learned that day with old material and memories. ©2017 CBC/Radio-Canada.