Chapter 17. Learning and Memory


Sarah Webb Four years ago, scientists from Google showed up on neuroscientist Steve Finkbeiner’s doorstep. The researchers were based at Google Accelerated Science, a research division in Mountain View, California, that aims to use Google technologies to speed scientific discovery. They were interested in applying ‘deep-learning’ approaches to the mountains of imaging data generated by Finkbeiner’s team at the Gladstone Institute of Neurological Disease in San Francisco, also in California. Deep-learning algorithms take raw features from an extremely large, annotated data set, such as a collection of images or genomes, and use them to create a predictive tool based on patterns buried inside. Once trained, the algorithms can apply that training to analyse other data, sometimes from wildly different sources. The technique can be used to “tackle really hard, tough, complicated problems, and be able to see structure in data — amounts of data that are just too big and too complex for the human brain to comprehend”, Finkbeiner says. He and his team produce reams of data using a high-throughput imaging strategy known as robotic microscopy, which they had developed for studying brain cells. But the team couldn’t analyse its data as fast as it acquired them, so Finkbeiner welcomed the opportunity to collaborate. “I can’t honestly say at the time that I had a clear grasp of what questions might be addressed with deep learning, but I knew that we were generating data at about twice to three times the rate we could analyse it,” he says. © 2018 Macmillan Publishers Limited
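
For readers who want a concrete picture of the workflow described above, here is a minimal, hypothetical Python sketch of training a small classifier on annotated examples and then applying it to data it has never seen. The scikit-learn model, the synthetic arrays and the label rule are illustrative assumptions, not the Finkbeiner/Google pipeline.

```python
# Sketch only: train a small neural-network classifier on labelled image
# features, then apply it to held-out data. Shapes and labels are invented
# stand-ins for an annotated microscopy data set.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Stand-in for an annotated data set: 2,000 "images" flattened to 256 features,
# each labelled healthy (0) or degenerating (1) by a toy rule.
X = rng.normal(size=(2000, 256))
y = (X[:, :8].sum(axis=1) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Training extracts predictive structure from the annotated examples...
model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
model.fit(X_train, y_train)

# ...which can then be applied to new, unlabelled data.
print("held-out accuracy:", model.score(X_test, y_test))
```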

Keyword: Learning & Memory; Robotics
Link ID: 24684 - Posted: 02.21.2018

By GRETCHEN REYNOLDS Exercise may help the brain to build durable memories, through good times and bad. Stress and adversity weaken the brain’s ability to learn and retain information, earlier research has found. But according to a remarkable new neurological study in mice, regular exercise can counteract those effects by bolstering communication between brain cells. Memory has long been considered a biological enigma, a medley of mental ephemera that has some basis in material existence. Memories are coded into brain cells in the hippocampus, the brain’s memory center. If our memories were not written into those cells, they would not be available for later, long-term recall, and every brain would be like that of Dory, the memory-challenged fish in “Finding Nemo.” But representations of experience are extremely complex, and aspects of most memories must be spread across multiple brain cells, neuroscientists have determined. These cells must be able to connect with one another, so that the memory, as a whole, stays intact. The connections between neurons, known as synapses, are composed of electrical and chemical signals that move from cell to cell, like notes passed in class. The signals can be relatively weak and sporadic or flow with vigor and frequency. In general, the stronger the messages between neurons, the sturdier and more permanent the memories they hold. Neuroscientists have known for some time that the potency of our synapses depends to some degree on how we live our lives. Lack of sleep, alcohol, diet and other aspects of our lifestyles, especially stress, may dampen the flow of messages between brain cells, while practice fortifies it. Repeat an action and the signals between the cells maintaining the memory of that action can strengthen. That is learning. © 2018 The New York Times Company

Keyword: Learning & Memory
Link ID: 24683 - Posted: 02.21.2018

A small group of cells in the brain can have a big effect on seizures and memory in a mouse model of epilepsy. According to a new study in Science, loss of mossy cells may contribute to convulsive seizures in temporal lobe epilepsy (TLE) as well as memory problems often experienced by people with the disease. The study was funded by the National Institute of Neurological Disorders and Stroke (NINDS), part of the National Institutes of Health. “The role of mossy cells in epilepsy has been debated for decades. This study reveals how critical these cells are in the disease, and the findings suggest that preventing loss of mossy cells or finding ways to activate them may be potential therapeutic targets,” said Vicky Whittemore, Ph.D., program director at NINDS. Mossy cells, named for the dense moss-like protrusions that cover their surface, are located in the hippocampus, a brain area that is known to play key roles in memory. Loss of mossy cells is associated with TLE, but it is unknown what role that plays in the disease. Using state-of-the-art tools, Ivan Soltesz, Ph.D., professor of neurosurgery and neurosciences at Stanford University, Palo Alto, California, and his team were able to turn mossy cells on and off to track their effects in a mouse model of epilepsy. “This study would not have been possible without the rapid advancement of technology, thanks in part to the BRAIN Initiative, which has encouraged scientists to develop innovative instruments and new ways to look at the brain,” said Dr. Soltesz. “It’s remarkable that we can manipulate specific brain cells in the hippocampus of a mouse. Using 21st century tools brings us closer than ever to unlocking the mysteries behind this debilitating disease.”

Keyword: Epilepsy; Learning & Memory
Link ID: 24669 - Posted: 02.16.2018

By Ashley Yeager Wandering through a maze with striped gray walls, a mouse searches for turns that will take it to a thirst-quenching reward. Although the maze seems real to the mouse, it is, in fact, a virtual world. Virtual reality (VR) has become a valuable tool to study brains and behaviors because researchers can precisely control sensory cues, correlating nerve-cell activity with specific actions. “It allows experiments that are not possible using real-world approaches,” neurobiologist Christopher Harvey of Harvard Medical School and colleagues wrote in 2016 in a commentary in Nature (533:324–25). Studies of navigation are perfect examples. Extraneous sounds, smells, tastes, and textures, along with internal information about balance and spatial orientation, combine with visual cues to help a mouse move through a maze. In a virtual environment, researchers can add or remove any of these sensory inputs to see how each affects nerve-cell firing and the neural patterns that underlie exploration and other behaviors. But there’s a catch. Many VR setups severely restrict how animals move, which can change nerve cells’ responses to sensory cues. As a result, some researchers have begun to build experimental setups that allow animals to move more freely in their virtual environments, while others have started using robots to aid animals in navigation or to simulate interactions with others of their kind. Here, The Scientist explores recent efforts in both arenas, which aim to develop a more realistic sense of how the brain interprets reality. © 1986-2018 The Scientist

Keyword: Learning & Memory
Link ID: 24667 - Posted: 02.16.2018

By BENEDICT CAREY Decent memory is a matter of livelihood, of independence, most of all of identity. Human memory is the ghost in the neural machine, a widely distributed, continually changing, multidimensional conversation among cells that can reproduce both the capital of Kentucky and the emotional catacombs of that first romance. The news last week that scientists had developed a brain implant that boosts memory — an implantable “cognitive prosthetic,” in the jargon — should be astounding even to the cynical. App developers probably are already plotting yet another brain-exercise product based on the latest science. Screenwriters working on their next amnesia-assassin scripts got some real-life backup for the pitch meeting. The scientists are in discussions to commercialize the technology, and so people in the throes of serious memory loss, and their families, likely feel a sense of hope, thin though it may be. These things take time, and there are still many unknowns. But for those in the worried-well demographic — the 40-is-the-new-30 crowd, and older — reports of a memory breakthrough fall into a different category. What exactly does it mean that scientists are truly beginning to understand the biology of memory well enough to manipulate it? Which reaction is appropriate: the futurist’s, or the curmudgeon’s? The only honest answer at this stage is both. The developers of the new implant, led by scientists at the University of Pennsylvania and Thomas Jefferson University, built on decades of work decoding brain signals, using the most advanced techniques of machine learning. Their implant, in fact, constitutes an array of electrodes embedded deep in the brain that monitor electrical activity and, like a pacemaker, deliver a stimulating pulse only when needed — when the brain is lagging as it tries to store new information. © 2018 The New York Times Company

Keyword: Learning & Memory; Robotics
Link ID: 24656 - Posted: 02.13.2018

By NICHOLAS BAKALAR Increasing blood sugar levels are associated with cognitive decline, a long-term study has found. Researchers assessed cognitive function in 5,189 people, average age 66, and tested their blood sugar using HbA1c, a test that accurately measures blood glucose levels over a period of weeks or months. (The finger-prick blood test, in contrast, gives a reading only at a given moment in time.) They followed the group for up to 10 years, tracking blood glucose levels and periodically testing cognitive ability. The study is in the journal Diabetologia. There was no association between blood sugar levels and cognition at the start of the study. But consistently over time, scores on the tests of memory and executive function declined as HbA1c levels increased, even in people without diabetes. The study controlled for many other variables, among them age, sex, cholesterol, B.M.I., education, marital status, depression, smoking, alcohol consumption, hypertension and cardiovascular disease. This is an observational study that does not prove cause and effect, and the lead author, Wuxiang Xie, a researcher at the Peking University Health Science Center, said that the underlying mechanism is still unknown. Still, he said, “Diabetes-related microvascular complications might be, at least in part, the reason for the subsequent cognitive decline. Future studies are warranted to reveal the precise mechanisms.” © 2018 The New York Times Company
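
The analysis described above tracks repeated cognitive scores against HbA1c while adjusting for covariates. Below is a hedged sketch, in Python with statsmodels, of how such a longitudinal mixed-effects model might be specified; the simulated data, variable names and formula are assumptions for illustration, not the published study's actual model.

```python
# Sketch: repeated cognitive scores modelled against HbA1c, adjusting for
# covariates, with a random intercept per subject. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_subj, n_visits = 200, 5

df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subj), n_visits),
    "years":   np.tile(np.arange(n_visits) * 2.0, n_subj),        # biennial visits
    "hba1c":   np.repeat(rng.normal(5.7, 0.6, n_subj), n_visits),  # baseline HbA1c (%)
    "age":     np.repeat(rng.normal(66, 6, n_subj), n_visits),
})
# Toy outcome: cognition declines faster when baseline HbA1c is higher.
df["cognition"] = (30 - 0.2 * df["years"]
                   - 0.3 * df["years"] * (df["hba1c"] - 5.7)
                   + rng.normal(0, 1, len(df)))

# The years x hba1c interaction asks whether higher blood sugar is
# associated with a steeper cognitive slope over follow-up.
model = smf.mixedlm("cognition ~ years * hba1c + age", df, groups=df["subject"]).fit()
print(model.summary())
```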

Keyword: Learning & Memory; Alzheimers
Link ID: 24649 - Posted: 02.13.2018

Laura Sanders Brain scientists have filmed a first-of-its-kind birth video. It reveals specialized cells in the brains of mice dividing to create newborn nerve cells. The images, published in the Feb. 9 Science, show intricacies of how certain parts of the adult mouse brain can churn out new nerve cells. These details may help lead to a deeper understanding of the role of this nerve cell renewal in such processes as memory. Deep in the brains of mice, a memory-related structure called the hippocampus is known to be flush with new nerve cells. But because this buried neural real estate is hard to study, the circumstances of these births weren’t clear. Using living mice, Sebastian Jessberger, a neuroscientist at the University of Zurich, and colleagues removed the outer layers of brain tissue that obscure the hippocampus. The scientists marked 63 cells called radial stem cells, which can divide to create new nerve cells. Researchers then watched these stem cells for up to two months, taking pictures every 12 or 24 hours. During that time, 42 of these stem cells underwent a spurt of division, churning out two kinds of cells: intermediate cells that would go on to produce nerve cells as well as mature nerve cells themselves. Once this burst of activity ended, the radial stem cells disappeared by dividing themselves into mature nerve cells that could no longer split. © Society for Science & the Public 2000 - 2017.

Keyword: Neurogenesis; Development of the Brain
Link ID: 24635 - Posted: 02.09.2018

By BENEDICT CAREY Scientists have developed a brain implant that noticeably boosted memory in its first serious test run, perhaps offering a promising new strategy to treat dementia, traumatic brain injuries and other conditions that damage memory. The device works like a pacemaker, sending electrical pulses to aid the brain when it is struggling to store new information, but remaining quiet when it senses that the brain is functioning well. In the test, reported Tuesday in the journal Nature Communications, the device improved word recall by 15 percent — roughly the amount that Alzheimer’s disease steals over two and a half years. The implant is still experimental; the researchers are currently in discussions to commercialize the technology. And its broad applicability is unknown, having been tested so far only in people with epilepsy. Experts cautioned that the potential for misuse of any “memory booster” is enormous — A.D.H.D. drugs are widely used as study aids. They also said that a 15 percent improvement is fairly modest. Still, the research marks the arrival of a new kind of device: an autonomous aid that enhances normal, but less than optimal, cognitive function. Doctors have used similar implants for years to block abnormal bursts of activity in the brain, most commonly in people with Parkinson’s disease and epilepsy. “The exciting thing about this is that, if it can be replicated and extended, then we can use the same method to figure out what features of brain activity predict good performance,” said Bradley Voytek, an assistant professor of cognitive and data science at the University of California, San Diego. © 2018 The New York Times Company
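
As a rough illustration of the closed-loop logic described above (monitor activity, predict whether encoding will succeed, stimulate only when it looks likely to fail), here is a minimal Python sketch. The logistic-regression decoder, the feature arrays and the threshold are hypothetical stand-ins, not the Penn/Jefferson device's actual decoder.

```python
# Sketch: a decoder trained on past brain-activity features predicts whether
# the current moment is a "good" or "poor" encoding state; stimulation fires
# only in poor states. All names and values are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)

# Training data: spectral features recorded while words were studied,
# labelled by whether each word was later recalled (1) or forgotten (0).
features = rng.normal(size=(500, 16))
recalled = (features[:, 0] + 0.5 * features[:, 1] + rng.normal(0, 1, 500) > 0).astype(int)

decoder = LogisticRegression(max_iter=1000).fit(features, recalled)

def closed_loop_step(current_features, threshold=0.4):
    """Stimulate only when predicted recall probability falls below threshold."""
    p_recall = decoder.predict_proba(current_features.reshape(1, -1))[0, 1]
    return ("stimulate" if p_recall < threshold else "stay quiet"), p_recall

print(closed_loop_step(rng.normal(size=16)))
```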

Keyword: Learning & Memory; Robotics
Link ID: 24629 - Posted: 02.07.2018

By Helen Shen Noninvasive brain stimulation is having its heyday, as scientists and hobbyists alike look for ways to change the activity of neurons without cutting into the brain and implanting electrodes. One popular set of techniques, called transcranial electrical stimulation (TES), delivers electrical current via electrodes stuck to the scalp, typically above the target brain area. In recent years a number of studies have attributed wide-ranging benefits to TES including enhancing memory, improving math skills, alleviating depression and even speeding recovery from stroke. Such results have also spawned a cottage industry providing commercial TES kits for DIY brain hackers seeking to boost their mind power. But little is known about how TES actually interacts with the brain, and some studies have raised serious doubts about the effectiveness of these techniques. A study published on February 2 in Nature Communications ups the ante, reporting that conventional TES techniques do not deliver enough current to activate brain circuits or modulate brain rhythms. The electrical currents mostly fizzle out as they pass through the scalp and skull. “Anybody who has published a positive effect in this field is probably not going to like our paper,” says György Buzsáki, a neuroscientist at New York University and a senior author of the study. The mechanisms behind TES have remained mysterious, in part because without penetrating the skull, researchers cannot measure neural responses while they apply stimulation. Conventional TES methods produce electrical noise that swamps any brain activity detected on the scalp. © 2018 Scientific American

Keyword: Learning & Memory
Link ID: 24622 - Posted: 02.06.2018

Dana Boebinger Roughly 15 percent of Americans report some sort of hearing difficulty; trouble understanding conversations in noisy environments is one of the most common complaints. Unfortunately, there’s not much doctors or audiologists can do. Hearing aids can amplify things for ears that can’t quite pick up certain sounds, but they don’t distinguish between the voice of a friend at a party and the music in the background. The problem is not only one of technology, but also of brain wiring. Most hearing aid users say that even with their hearing aids, they still have difficulty communicating in noisy environments. As a neuroscientist who studies speech perception, I see this issue throughout my own research, as well as that of many others. The reason isn’t that they can’t hear the sounds; it’s that their brains can’t pick out the conversation from the background chatter. Harvard neuroscientists Dan Polley and Jonathon Whitton may have found a solution, by harnessing the brain’s incredible ability to learn and change itself. They have discovered that it may be possible for the brain to relearn how to distinguish between speech and noise. And the key to learning that skill could be a video game. People with hearing aids often report being frustrated with how their hearing aids handle noisy situations; it’s a key reason many people with hearing loss don’t wear hearing aids, even if they own them. People with untreated hearing loss – including those who don’t wear their hearing aids – are at increased risk of social isolation, depression and even dementia. © 2010–2018, The Conversation US, Inc.

Keyword: Hearing; Learning & Memory
Link ID: 24618 - Posted: 02.06.2018

By C. CLAIBORNE RAY Q. Does an octopus have a brain? Where is it? And just how smart is an octopus? A. In a sense, an octopus has several brains, collections of neurons that control each arm. A famous 2001 study in the journal Science described how the commands that control one arm’s movement continue even when connections to the walnut-sized central processing system in the head are severed. Since then, more has been found about why the octopus is so much smarter than the average seafood. Even the relatively small central brain of an octopus is the largest among all invertebrates — proportionally, that is. A review article in 2015 in the journal Current Opinion in Neurobiology summarized the complexity of learning processes in the octopus and its remarkable adaptability. Some studies have examined the cephalopod’s ability to discern objects of different sizes, shapes, colors, brightnesses and textures; and its problem-solving, including the ability to navigate mazes and open jars. The creature also displays both short-term and long-term memory and recall over periods of weeks and even months. A possible explanation of the advanced abilities of the octopus lies in its very large genome, decoded in 2015 in a study in the journal Nature. The researchers surmised that the vast expansion of certain gene families in the octopus, and the network of linkages among the genes, could account for the development of its neurological complexity. © 2018 The New York Times Company

Keyword: Evolution; Learning & Memory
Link ID: 24602 - Posted: 02.02.2018

By Lauren Aguirre Just over five years ago, a man suffering from amnesia following a suspected drug overdose appeared at Lahey Hospital and Medical Center in Burlington, Massachusetts, a Boston suburb. He was 22, and had injected what he believed to be heroin. When he woke up the next morning, he was extremely confused, repeatedly asking the same questions and telling the same stories. Doctors at Lahey quickly diagnosed the man with anterograde amnesia — the inability to form new memories. His brain scan revealed why. “I thought it was an extremely strange scan — it was almost hard to believe,” said Jed Barash, a neurologist working at Lahey at the time. In the scan, the twin, seahorse-shaped structures of the man’s hippocampi were lit up against the dark background of the rest of the brain — a clear sign of severe injury to just that one region. “It was strange because that was all there was,” Barash said. Memory researchers have known since the late 1950s that the hippocampi are responsible for turning short-term memories into lasting ones, so the amnesia was not surprising. Just how the damage occurred, however, remained a mystery. Lack of oxygen to the brain that would have occurred during the overdose could not be the only explanation. The number of survivors in the state that year could easily have numbered in the thousands, so why was there only one patient with this seemingly unique brain damage? Along with his colleagues, Barash — now medical director at the Soldiers’ Home health care facility in Chelsea, Massachusetts — figured that the opioids must have played a role, and that hunch became only more acute as three more patients — each fitting the same pattern — appeared at Lahey over the next three years. All had the same unique destruction of the hippocampi, all had amnesia, and all were suspected to have overdosed. By that point, the doctors at Lahey faced two fundamental questions: What was causing the strange new syndrome? And precisely how rare was it? Copyright 2018 Undark

Keyword: Learning & Memory; Drug Abuse
Link ID: 24590 - Posted: 01.30.2018

Sara Reardon Superconducting computing chips modelled after neurons can process information faster and more efficiently than the human brain. That achievement, described in Science Advances on 26 January, is a key benchmark in the development of advanced computing devices designed to mimic biological systems. And it could open the door to more natural machine-learning software, although many hurdles remain before it could be used commercially. Artificial intelligence software has increasingly begun to imitate the brain. Algorithms such as Google’s automatic image-classification and language-learning programs use networks of artificial neurons to perform complex tasks. But because conventional computer hardware was not designed to run brain-like algorithms, these machine-learning tasks require orders of magnitude more computing power than the human brain does. “There must be a better way to do this, because nature has figured out a better way to do this,” says Michael Schneider, a physicist at the US National Institute of Standards and Technology (NIST) in Boulder, Colorado, and a co-author of the study. NIST is one of a handful of groups trying to develop ‘neuromorphic’ hardware that mimics the human brain in the hope that it will run brain-like software more efficiently. In conventional electronic systems, transistors process information at regular intervals and in precise amounts — either 1 or 0 bits. But neuromorphic devices can accumulate small amounts of information from multiple sources, alter it to produce a different type of signal and fire a burst of electricity only when needed — just as biological neurons do. As a result, neuromorphic devices require less energy to run. © 2018 Macmillan Publishers Limited
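
The "accumulate inputs, fire only when needed" behaviour attributed to neuromorphic hardware above is conventionally modelled in software as a leaky integrate-and-fire neuron. A minimal sketch follows; the parameter values are arbitrary illustrations and say nothing about the NIST superconducting circuits themselves.

```python
# Sketch: a leaky integrate-and-fire neuron that sums weak inputs from several
# sources and emits a spike only when its accumulated voltage crosses threshold.
import numpy as np

rng = np.random.default_rng(3)

dt, steps = 1e-3, 1000                # 1 ms steps, 1 s of simulated time
tau, v_thresh, v_reset = 0.02, 1.0, 0.0

inputs = rng.poisson(lam=2.0, size=(steps, 5)) * 0.02   # 5 weak input sources
v, spikes = 0.0, []

for t in range(steps):
    v += dt * (-v / tau) + inputs[t].sum()   # leak plus summed input
    if v >= v_thresh:                        # fire only when threshold is crossed
        spikes.append(t * dt)
        v = v_reset

print(f"{len(spikes)} spikes in 1 s of simulated input")
```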

Keyword: Robotics; Learning & Memory
Link ID: 24579 - Posted: 01.27.2018

Laura Sanders People tend to think of memories as deeply personal, ephemeral possessions — snippets of emotions, words, colors and smells stitched into our unique neural tapestries as life goes on. But a strange series of experiments conducted decades ago offered a different, more tangible perspective. The mind-bending results have gained unexpected support from recent studies. In 1959, James Vernon McConnell, a psychologist at the University of Michigan in Ann Arbor, painstakingly trained small flatworms called planarians to associate a shock with a light. The worms remembered this lesson, later contracting their bodies in response to the light. One weird and wonderful thing about planarians is that they can regenerate their bodies — including their brains. When the trained flatworms were cut in half, they regrew either a head or a tail, depending on which piece had been lost. Not surprisingly, worms that kept their heads and regrew tails retained the memory of the shock, McConnell found. Astonishingly, so did the worms that grew replacement heads and brains. Somehow, these fully operational, complex arrangements of brand-spanking-new nerve cells had acquired the memory of the painful shock, McConnell reported. In subsequent experiments, McConnell went even further, attempting to transfer memory from one worm to another. He tried grafting the head of a trained worm onto the tail of an untrained worm, but he couldn’t get the head to stick. He injected trained planarian slurry into untrained worms, but the recipients often exploded. Finally, he ground up bits of the trained planarians and fed them to untrained worms. Sure enough, after their meal, the untrained worms seemed to have traces of the memory — the cannibals recoiled at the light. The implications were bizarre, and potentially profound: Lurking in that pungent planarian puree must be a substance that allowed animals to literally eat one another’s memories. © Society for Science & the Public 2000 - 2017.

Keyword: Learning & Memory
Link ID: 24569 - Posted: 01.25.2018

By Yasemin Saplakoglu In Harry Potter, the young wizard is given a piece of parchment called the Marauder’s map on which is a detailed layout of Hogwarts School of Witchcraft and Wizardry. The magical map reveals the movements of people (and ghosts) through the school with evanescing ink footsteps. In the hippocampus—a small horseshoe-shaped area of mammalian brains—there is a kind of Marauder’s map, which keeps track of other individuals’ movements. Two new studies published last week in Science show the hippocampus is not only responsible for figuring out an animal’s own position in space—something previously known—but also that of others. This finding explains why soccer games do not always end up with a pile of humans in the middle of the field—players are really good at not bumping into one another. In 1971 John O’Keefe, a neuroscientist at University College London (U.C.L.), and his student Jonathan Dostrovsky discovered “place cells”—neurons in the hippocampus that fire when an animal goes to a specific spot in space. This finding earned O’Keefe a Nobel Prize in 2014. But until recently research had only looked at how the brain maps an animal’s own position. Being able to map where others are in space, “is important for any kind of social interactions [such as] courtship, coordinated hunting [and] monitoring the position of predators or prey,” says Nachum Ulanovsky, a neurobiologist at the Weizmann Institute of Science in Israel and senior author of one of the studies. “Here, in one bang, we had two studies in two different species of mammals showing [how the brain does this] for the first time.” © 2018 Scientific American
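
A place cell's defining property — firing when the animal occupies a particular spot — is usually summarised as an occupancy-normalised firing-rate map. The sketch below shows that computation on simulated data; the trajectory, the toy tuning curve and the bin sizes are assumptions for illustration, not data from either Science study.

```python
# Sketch: bin an animal's position, count spikes per bin, and normalise by
# the time spent in each bin to get a firing-rate ("place field") map.
import numpy as np

rng = np.random.default_rng(4)
dt, n = 0.02, 30000                        # 20 ms samples, 10 min session

# Random-walk trajectory in a 1 m x 1 m box.
pos = np.clip(np.cumsum(rng.normal(0, 0.01, size=(n, 2)), axis=0) + 0.5, 0, 1)

# Toy place cell: fires preferentially near (0.3, 0.7), peaking around 15 Hz.
dist = np.linalg.norm(pos - np.array([0.3, 0.7]), axis=1)
rate = 15 * np.exp(-(dist / 0.1) ** 2)
spikes = rng.random(n) < rate * dt

bins = np.linspace(0, 1, 21)
occupancy, _, _ = np.histogram2d(pos[:, 0], pos[:, 1], bins=[bins, bins])
spike_counts, _, _ = np.histogram2d(pos[spikes, 0], pos[spikes, 1], bins=[bins, bins])

rate_map = np.where(occupancy > 0, spike_counts / (occupancy * dt), np.nan)
peak = np.unravel_index(np.nanargmax(rate_map), rate_map.shape)
print("peak firing bin:", peak, "rate (Hz):", round(float(np.nanmax(rate_map)), 1))
```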

Keyword: Learning & Memory
Link ID: 24540 - Posted: 01.19.2018

Two independent teams of scientists from the University of Utah and the University of Massachusetts Medical School have discovered that a gene crucial for learning, called Arc, can send its genetic material from one neuron to another by employing a strategy commonly used by viruses. The studies, both published in Cell, unveil a new way that nervous system cells interact. “This work is a great example of the importance of basic neuroscience research,” said Edmund Talley, Ph.D., a program director at the National Institute of Neurological Disorders and Stroke (NINDS), part of the National Institutes of Health. “What began as an effort to examine the behavior of a gene involved in memory and implicated in neurological disorders such as Alzheimer’s disease has unexpectedly led to the discovery of an entirely new process, which neurons may use to send genetic information to one another.” While Arc is known to play a vital role in the brain’s ability to store new information, little is known about precisely how it works. In addition, previous studies had detailed similarities between the Arc protein and proteins found in certain viruses like HIV, but it was unclear how those commonalities influenced the behavior of the Arc protein. The University of Utah researchers began their examination of the Arc gene by introducing it into bacterial cells. To their surprise, when the cells made the Arc protein, it clumped together into a form that resembled a viral capsid, the shell that contains a virus’ genetic information. The Arc “capsids” appeared to mirror viral capsids in their physical structure as well as their behavior and other properties.

Keyword: Learning & Memory
Link ID: 24535 - Posted: 01.17.2018

Helen Shen For someone who’s not a Sherlock superfan, cognitive neuroscientist Janice Chen knows the BBC’s hit detective drama better than most. With the help of a brain scanner, she spies on what happens inside viewers’ heads when they watch the first episode of the series and then describe the plot. Chen, a researcher at Johns Hopkins University in Baltimore, Maryland, has heard all sorts of variations on an early scene, when a woman flirts with the famously aloof detective in a morgue. Some people find Sherlock Holmes rude while others think he is oblivious to the woman’s nervous advances. But Chen and her colleagues found something odd when they scanned viewers’ brains: as different people retold their own versions of the same scene, their brains produced remarkably similar patterns of activity. Chen is among a growing number of researchers using brain imaging to identify the activity patterns involved in creating and recalling a specific memory. Powerful technological innovations in human and animal neuroscience in the past decade are enabling researchers to uncover fundamental rules about how individual memories form, organize and interact with each other. Using techniques for labelling active neurons, for example, teams have located circuits associated with the memory of a painful stimulus in rodents and successfully reactivated those pathways to trigger the memory. And in humans, studies have identified the signatures of particular recollections, which reveal some of the ways that the brain organizes and links memories to aid recollection. Such findings could one day help to reveal why memories fail in old age or disease, or how false memories creep into eyewitness testimony. These insights might also lead to strategies for improved learning and memory. © 2018 Macmillan Publishers Limited
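
One simple way to see how "remarkably similar patterns of activity" can be quantified is an across-subject pattern-similarity analysis: correlate each person's scene-specific activity pattern with another person's, and compare same-scene with different-scene correlations. The sketch below uses simulated "voxel" patterns as an illustrative assumption, not Chen's actual data or pipeline.

```python
# Sketch: if two people recall the same scene, their spatial activity patterns
# for that scene should correlate more strongly than patterns for different scenes.
import numpy as np

rng = np.random.default_rng(5)
n_scenes, n_voxels = 10, 200

# Shared, scene-specific patterns plus subject-specific noise.
shared = rng.normal(size=(n_scenes, n_voxels))
subj_a = shared + rng.normal(0, 1.0, size=shared.shape)
subj_b = shared + rng.normal(0, 1.0, size=shared.shape)

def pattern_corr(u, v):
    return np.corrcoef(u, v)[0, 1]

same = np.mean([pattern_corr(subj_a[i], subj_b[i]) for i in range(n_scenes)])
diff = np.mean([pattern_corr(subj_a[i], subj_b[j])
                for i in range(n_scenes) for j in range(n_scenes) if i != j])

print(f"same-scene correlation {same:.2f} vs different-scene {diff:.2f}")
```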

Keyword: Learning & Memory
Link ID: 24520 - Posted: 01.11.2018

Brains, beyond their signature achievements in thinking and problem solving, are paragons of energy efficiency. The human brain’s power consumption resembles that of a 20-watt incandescent lightbulb. In contrast, one of the world’s largest and fastest supercomputers, the K computer in Kobe, Japan, consumes as much as 9.89 megawatts of power — an amount roughly equivalent to the power usage of 10,000 households. Yet in 2013, even with that much power, it took the machine 40 minutes to simulate just a single second’s worth of 1 percent of human brain activity. Now engineering researchers at the California NanoSystems Institute at the University of California, Los Angeles, are hoping to match some of the brain’s computational and energy efficiency with systems that mirror the brain’s structure. They are building a device, perhaps the first one, that is “inspired by the brain to generate the properties that enable the brain to do what it does,” according to Adam Stieg, a research scientist and associate director of the institute, who leads the project with Jim Gimzewski, a professor of chemistry at UCLA. The device is a far cry from conventional computers, which are based on minute wires imprinted on silicon chips in highly ordered patterns. The current pilot version is a 2-millimeter-by-2-millimeter mesh of silver nanowires connected by artificial synapses. Unlike silicon circuitry, with its geometric precision, this device is messy, like “a highly interconnected plate of noodles,” Stieg said. And instead of being designed, the fine structure of the UCLA device essentially organized itself out of random chemical and electrical processes. All Rights Reserved © 2018

Keyword: Learning & Memory; Robotics
Link ID: 24500 - Posted: 01.08.2018

By HENRY ALFORD Here in the valley of my mid-50s, I try not to get into a swivet over my occasionally faulty memory: Sometimes the mind has a mind of its own. But when I read this chilling passage — “I am dementing. I am dementing. I am dementing.” — from Gerda Saunders’s recent memoir “Memory’s Last Breath: Field Notes on My Dementia,” I found myself starting to panic. In a world increasingly dominated by the Google/Apple/Facebook/Amazon hegemony, we hear a lot about the threat to privacy. But isn’t memory just as vulnerable? Now that, as the former New Republic editor Franklin Foer writes in “World Without Mind: The Existential Threat of Big Tech,” “our phone is an extension of our memory; we’ve outsourced basic functions to algorithms,” doesn’t the world seem like an ever-larger parking lot that has mysteriously swallowed our Toyota? Don’t we all wish, now more than ever, that acquaintances came equipped with their own “Previously on this series …” trailer? Mr. Foer and Ms. Saunders aren’t the only writers on this beat. Recent books by Robert Sapolsky, Michael Lemonick, Felicia Yap, Emily Barr, Dale Bredesen, Val Emmich, Oliver Sacks and Elizabeth Rosner, among others, have addressed the theme of non-historical memory. Last July alone, more than a dozen books specifically about the topic, most of them self-published, were released. You’d expect the themes of amnesia or powers of recall to be prevalent in thrillers or in memoirs by trauma survivors or over-beveraged rock stars, but even literary fiction is getting in on the act. In Rachel Khong’s sly, diaristic “Goodbye, Vitamin,” a 30-year-old who moves back home learns she has to care for a dementing father who has started leaving his pants in trees. In Alissa Nutting’s outrageous sex comedy “Made for Love,” a woman on the lam from her tech pioneer husband discovers that he has implanted a chip in her brain that allows him to download all her experiences. © 2018 The New York Times Company

Keyword: Learning & Memory
Link ID: 24499 - Posted: 01.08.2018

Jon Hamilton Older brains may forget more because they lose their rhythm at night. During deep sleep, older people have less coordination between two brain waves that are important to saving new memories, a team reports in the journal Neuron. "It's like a drummer that's perhaps just one beat off the rhythm," says Matt Walker, one of the paper's authors and a professor of neuroscience and psychology at the University of California, Berkeley. "The aging brain just doesn't seem to be able to synchronize its brain waves effectively." The finding appears to answer a long-standing question about how aging can affect memory even in people who do not have Alzheimer's or some other brain disease. "This is the first paper that actually found a cellular mechanism that might be affected during aging and therefore be responsible for a lack of memory consolidation during sleep," says Julie Seibt, a lecturer in sleep and plasticity at the University of Surrey in the U.K. Seibt was not involved in the new study. To confirm the finding, though, researchers will have to show that it's possible to cause memory problems in a young brain by disrupting these rhythms, Seibt says. The study was the result of an effort to understand how the sleeping brain turns short-term memories into memories that can last a lifetime, says Walker, the author of the book Why We Sleep. "What is it about sleep that seems to perform this elegant trick of cementing new facts into the neural architecture of the brain?" To find out, Walker and a team of scientists had 20 young adults learn 120 pairs of words. "Then we put electrodes on their head and we had them sleep," he says. The electrodes let researchers monitor the electrical waves produced by the brain during deep sleep. They focused on the interaction between slow waves, which occur every second or so, and faster waves called sleep spindles, which occur more than 12 times a second. © 2017 npr
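
Coordination between roughly 1 Hz slow waves and 12-plus Hz spindles of the kind described above is commonly quantified as phase-amplitude coupling: extract the slow-oscillation phase and the spindle envelope, then ask how tightly the envelope clusters at one phase of the slow wave. Below is a hedged Python sketch on a synthetic signal; the frequency bands, filter choices and coupling index are illustrative assumptions, not the Neuron paper's analysis pipeline.

```python
# Sketch: band-pass a signal into slow-oscillation and spindle bands, then
# measure how strongly spindle amplitude locks to slow-oscillation phase.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs, dur = 250, 60                               # 250 Hz sampling, 60 s of "sleep EEG"
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(6)

# Synthetic signal: a 0.8 Hz slow oscillation with spindles riding on its peaks.
slow = np.sin(2 * np.pi * 0.8 * t)
spindles = 0.3 * (slow > 0.7) * np.sin(2 * np.pi * 13 * t)
eeg = slow + spindles + 0.2 * rng.normal(size=t.size)

def bandpass(x, lo, hi):
    sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

so_phase = np.angle(hilbert(bandpass(eeg, 0.3, 1.5)))    # slow-oscillation phase
sp_amp = np.abs(hilbert(bandpass(eeg, 12, 16)))          # spindle envelope

# Coupling strength: how tightly spindle power clusters at one slow-wave phase
# (values near 0 mean the spindles are "off the beat").
coupling = np.abs(np.mean(sp_amp * np.exp(1j * so_phase))) / np.mean(sp_amp)
print(f"phase-amplitude coupling index: {coupling:.2f}")
```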

Keyword: Sleep; Learning & Memory
Link ID: 24433 - Posted: 12.18.2017