Links for Keyword: Learning & Memory



Links 1 - 20 of 1090

By Caroline Williams I’m not the kind of girl who jumps into a strange man’s car and hopes for the best. Especially when a quick Google stalk reveals him to be recovering from an addiction to methamphetamine. But having been assured by someone I trust that he was “one of the good guys,” I accepted his offer of a ride to the airport and … hoped for the best. In hindsight I’m glad I did. WHAT I LEFT OUT is a recurring feature in which book authors are invited to share anecdotes and narratives that, for whatever reason, did not make it into their final manuscripts. In this installment, Caroline Williams shares a story that was left out of “My Plastic Brain: One Woman’s Yearlong Journey to Discover if Science Can Improve Her Mind,” published by Prometheus Books. Some books make it sound so easy: Change the way you think, and hey presto, you can become a different person. After many months talking to scientists about brain change, it was this journey that prompted me to think more deeply about what that actually meant. I was in Lawrence, Kansas, researching a book that I hoped would apply the latest science to make real, measurable, and lasting changes to my brain. I wanted to learn, among other things, how to concentrate better and to overcome my irrational anxieties about life. I was in Kansas to try to boost my powers of creativity. Copyright 2018 Undark

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 14: Attention and Consciousness
Link ID: 25349 - Posted: 08.18.2018

Kelsey Tyssowski The first dance at my wedding lasted exactly four minutes and 52 seconds, but I’ll probably remember it for decades. Neuroscientists still don’t entirely understand this: How was my brain able to translate this less-than-five-minute experience into a lifelong memory? Part of the puzzle is that there’s a gap between experience and memory: our experiences are fleeting, but it takes hours to form a long-term memory. In recent work published in the journal Neuron, my colleagues and I figured out how the brain keeps temporary molecular records of transient experiences. Our finding not only helps to explain how the brain bridges the gap between experience and memory. It also allows us to read the brain’s short-term records, raising the possibility that we may one day be able to infer a person’s, or at least a laboratory mouse’s, past experience – what they saw, thought, felt – just by looking at the molecules in their brain. To uncover how the brain keeps track of an animal’s experience, we started by asking how the brain records its electrical activity. Every experience you have, from chatting with a friend to smelling french fries, corresponds to its own unique pattern of electrical activity in the nervous system and brain. These activity patterns are defined by which neurons are active and in what way they’re active. For example, say you’re at the gym lifting weights. Which neurons are active is fairly straightforward: If you’re lifting with your right arm, different neurons will be active than if you’re lifting with your left arm because different neurons are connected to the muscles of each arm. © 2010–2018, The Conversation US, Inc.

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 25307 - Posted: 08.08.2018

Allison Aubrey Was it hard to concentrate during that long meeting? Or, does the crossword seem a little tougher? You could be mildly dehydrated. A growing body of evidence finds that being just a little dehydrated is tied to a range of subtle effects — from mood changes to muddled thinking. "We find that when people are mildly dehydrated they really don't do as well on tasks that require complex processing or on tasks that require a lot of their attention," says Mindy Millard-Stafford, director of the Exercise Physiology Laboratory at Georgia Institute of Technology. She published an analysis of the evidence this month, based on 33 studies. How long does it take to become mildly dehydrated in the summer heat? Not long at all, studies show, especially when you exercise outdoors. "If I were hiking at moderate intensity for one hour, I could reach about 1.5 percent to 2 percent dehydration," says Doug Casa, a professor of kinesiology at the University of Connecticut, and CEO of the Korey Stringer Institute. For an average-size person, 2 percent dehydration equates to sweating out about a liter of water. "Most people don't realize how high their sweat rate is in the heat," Casa says. If you're going hard during a run, you can reach that level of dehydration in about 30 minutes. And, at this level of dehydration, the feeling of thirst, for many of us, is only just beginning to kick in. "Most people can't perceive that they're 1.5 percent dehydrated," Casa says. © 2018 npr

Related chapters from BN8e: Chapter 13: Homeostasis: Active Regulation of the Internal Environment; Chapter 18: Attention and Higher Cognition
Related chapters from MM: Chapter 9: Homeostasis: Active Regulation of the Internal Environment; Chapter 14: Attention and Consciousness
Link ID: 25269 - Posted: 07.30.2018

By Sara Chodosh Think back to your earliest memory. What age were you in it? If it's under two, you're not alone. In a recent survey, 40 percent of people say they remember events earlier than age two. But here's the problem: Most memory researchers argue that it's essentially impossible to remember anything before those terrible twos. So what gives? Understanding how and why our brains form memories in the first place might convince you that if you're in that 40 percent, perhaps your memory is a fictional one after all. That number comes courtesy of a recent study out this week in the journal Psychological Science, which sought to understand when most people have their first memories and what they’re about. The researchers asked 6,641 U.K. residents to describe in writing their first recollection and the age they were in that memory. They then used that data to figure out how many of these first impressions were real. Aside from interviewing friends and family (who might also have false memories), it’s difficult to determine whether a memory is real or not. Instead, the psychologists operated on the assumption—albeit an assumption backed by a lot of research—that people can’t remember anything before about age two. Based on that cutoff, 38.6 percent of the first memories in this dataset were fictional. Most of those were dated to somewhere between ages one and two, but 893 people claimed they could remember being less than one year old. Why are researchers so quick to dismiss memories from those first couple years of life? There’s a lot of research that suggests it’s all made up. It might seem dismissive to assume that these memories are false, but memory researchers have good reason to conclude that people aren’t truly remembering being a baby. Research on infantile amnesia, the official term for the phenomenon in which we forget things that happened to us as babies and young children, has shown that it’s close to impossible to retain declarative memories at that young age. Babies can obviously remember other, nondeclarative things because they learn how to walk and talk—both of those are reliant on retaining some kind of information—but a declarative memory happens in a separate part of the brain. Copyright © 2018 Popular Science.

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 13: Memory, Learning, and Development
Link ID: 25225 - Posted: 07.19.2018

By Matthew Hutson Bird populations are plummeting, thanks to logging, agriculture, and climate change. Scientists keep track of species by recording their calls, but even the best computer programs can’t reliably distinguish bird calls from other sounds. Now, thanks to a bit of crowdsourcing and a lot of artificial intelligence (AI), researchers say they have something to crow about. AI algorithms can be as finicky as finches, often requiring manual calibration and retraining for each new location or species. So an interdisciplinary group of researchers launched the Bird Audio Detection challenge, releasing hours of audio from environmental monitoring stations around Chernobyl, Ukraine, to which they happened to have access, as well as crowdsourced recordings, some of which came from an app called Warblr. Humans labeled each 10-second clip as containing a bird call or not. Using so-called machine learning, in which computers learn from data, 30 teams trained their AIs on a set of the recordings for which labels were provided and then tested them on recordings for which they were not. Most relied on neural networks, a type of AI inspired by the brain that connects many small computing elements akin to neurons. At the end of the monthlong contest, the best algorithm scored 89 out of 100 on a statistical measure of performance called AUC. A higher number, in this case, indicates the algorithm managed to avoid labeling nonbird sounds as bird sounds (humans, insects, or rain often threw them off) and avoid missing real bird sounds (usually because of faint recordings), the organizers report in a paper uploaded to the preprint server arXiv. The best previous algorithm they tested had an AUC score of 79. © 2018 American Association for the Advancement of Science
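A note on the metric for readers who want to try it themselves: AUC (area under the ROC curve) measures how reliably a detector ranks clips that do contain a bird call above clips that don't; 1.0 is perfect ranking and 0.5 is chance, and the challenge reports it on a 0–100 scale. The sketch below is purely illustrative and is not code from the challenge; the labels and confidence scores are invented for demonstration.

```python
# Illustrative only: computing an AUC score from ground-truth labels and a
# detector's confidence scores, using scikit-learn. The data here are invented.
from sklearn.metrics import roc_auc_score

labels = [1, 0, 1, 1, 0, 0, 1, 0]   # 1 = clip contains a bird call, 0 = no bird
scores = [0.92, 0.40, 0.75, 0.66, 0.30, 0.70, 0.81, 0.20]  # detector confidence

auc = roc_auc_score(labels, scores)
print(f"AUC = {auc:.2f}")  # prints 0.94 for this toy data; 0.5 would be chance
```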

Related chapters from BN8e: Chapter 19: Language and Lateralization; Chapter 17: Learning and Memory
Related chapters from MM: Chapter 15: Brain Asymmetry, Spatial Cognition, and Language; Chapter 13: Memory, Learning, and Development
Link ID: 25223 - Posted: 07.19.2018

By Niraj Chokshi Six months after a White House physician told reporters that President Trump had aced a well-regarded test of cognitive impairment, a group of doctors is warning that the exam may have been compromised by the resulting news coverage, which revealed some of its questions. Until it’s clear what effect the exposure has had on the effectiveness of the test, known as the Montreal Cognitive Assessment, or MoCA, doctors should consider using alternatives, said Dr. Hourmazd Haghbayan, an internist at the University of Toronto. “When I saw that this test was being disseminated to the mass population, and in some cases individuals were being invited to take it online, I wondered whether there would be an effect,” Dr. Haghbayan and colleagues wrote in a letter published Monday in the medical journal JAMA Neurology. The group collected data to show how widely the test’s questions were publicized after Dr. Ronny L. Jackson, a rear admiral in the Navy and then the White House physician, mentioned it at a news conference in January. Dr. Jackson, who later withdrew as nominee for veterans affairs secretary under a cloud of scandal, told reporters at the time that Mr. Trump was in “excellent” overall health and that he had landed a perfect MoCA score. “The fact that the president got 30 out of 30 on that exam, I think that there’s no indication whatsoever that he has any cognitive issues,” Dr. Jackson said. Mr. Trump has long faced questions about his mental stability and his fitness for office. He has occasionally responded to them directly, as he did in early January when he described himself on Twitter as “a very stable genius.” Using a Google News search, the researchers found 190 articles published in the days after the announcement that mentioned MoCA in reference to the president. © 2018 The New York Times Company

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 13: Memory, Learning, and Development
Link ID: 25216 - Posted: 07.17.2018

By Kate Sheridan On Sunday, neuroscientist Brenda Milner turns 100, and she plans to celebrate in two ways: the World Cup finals, followed by a party. “I tipped France from the beginning of the tournament to win, but I must say that Croatia has really impressed me,” she told STAT recently. In her 100 years of life, our understanding of the nervous system has changed dramatically. For example, only a few decades before Milner was born, some scientists still believed the nervous system was an uninterrupted network throughout the body. Now we know it isn’t, and drugs are created specifically to manipulate the movement of chemicals across the gaps between neurons. Milner’s work in memory and language processing has contributed mightily to that shift in understanding, and her decades-long career has made her both witness to and player in the growth of neuroscience as a field. Yet as she journeyed from Cambridge University to the British defense ministry during World War II to the Montreal Neurological Institute, perhaps no encounter has shaped her career — and the study of human cognition — more than meeting a man in the late 1950s known as Patient H.M. H.M. was a young man who had epilepsy — and until his death in 2008, very few people knew his real name. A few years before Milner met him, surgeons removed parts of his brain, including his hippocampus, where doctors then thought his seizures began. © 2018 STAT

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 25208 - Posted: 07.16.2018

By Sarah Gibbens Can you remember how you felt the first time you rode a bike? What about your first kiss—or your first heartbreak? Memorable moments and the emotions they trigger can resonate in our minds for decades, accumulating and powerfully shaping who we are as individuals. But for those who experience severe trauma, such memories can be haunting, and brutally painful memories can leave people with life-altering mental conditions.  So, what if traumatic memories didn't have to cause so much pain? As our understanding of the human brain evolves, various groups of neuroscientists are inching closer to techniques that manipulate memory to treat conditions such as PTSD or Alzheimer’s.  For now, the work is mainly happening in other animals, such as mice. But as these initial trials show continued success, scientists are looking toward the potential for tests in people, while grappling with the ethical implications of what it means to change a fundamental piece of someone’s identity. Feasibly, we could alter human memory in the not too distant future—but does that mean we should? Neuroscientists usually define a singular memory as an engram—a physical change in brain tissue associated with a particular recollection. Recently, brain scans revealed that an engram isn't isolated to one region of the brain and instead manifests as a colorful splattering across the neural tissue.

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 25206 - Posted: 07.14.2018

Sukanya Charuchandra Like humans, mice experience a period of amnesia when they lose their memories of experiences from infancy. Now, researchers report that these memories are not entirely forgotten by mice but simply difficult to recollect—and can be brought out of storage. These findings were published today (July 5) in Current Biology. According to this study, early life experiences “leave very long-lasting traces even if the memories are not expressed,” writes Cristina Alberini, who studies memory at New York University’s Center for Neural Science and was not involved in the study, in an email to The Scientist. Having encountered patients who couldn’t remember their early years, Sigmund Freud first coined the term infantile amnesia in the late 19th century. Since then, scientists have tried to understand why humans, nonhuman primates, and rodents alike experience this phenomenon. Whether these lost memories were due to improper storage or inefficient recollection was unknown. In this latest study, Paul Frankland, a psychologist at The Hospital for Sick Children in Toronto, and his colleagues sought to establish which of these possibilities was operating in mice. To first induce memory formation in the animals, the scientists placed the mice in a box and gave them a mild foot shock. While young adult mice retained this memory and froze when put in the box a second time, infant mice forgot this fear-related memory after a day and behaved normally when they encountered the box again. © 1986 - 2018 The Scientist.

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 15: Emotions, Aggression, and Stress
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 11: Emotions, Aggression, and Stress
Link ID: 25182 - Posted: 07.07.2018

By Lisa Feldman Barrett Jasanoff’s big message in “The Biological Mind” is you are not your brain. Or rather, you are not merely your brain — your body and the broader circumstances of your life also make you who you are. Jasanoff reminds us that the brain is not some mystical machine — it’s a gooey, bloody tangle of cells, dripping with chemicals. But we mythologize brains, creating false boundaries that divorce them from bodies and the outside world, blinding us to the biological nature of the mind. These divisions, Jasanoff contends, are why neuroscience has failed to make a real difference in anyone’s life. Unfortunately, the book’s own divisions between body versus brain, and nature versus nurture, reinforce the very dualisms that Jasanoff indicts. He gives examples of the ways our bodies and the world around us affect our thoughts, feelings and actions, but not how body and world become biologically embedded to constitute a mind. Missing is a discussion of how the workings of your body necessarily and irrevocably shape your brain’s structure and function, and vice versa. The artificial boundary between brain and world also goes largely unmentioned. In real life, the experiences we have from infancy onward impact the brain’s wiring. For example, childhood poverty and adversity fundamentally alter brain development, leaving an indelible mark that increases people’s risk of illness in adulthood. This is fascinating and profound stuff, but it mostly goes unexamined in Jasanoff’s book. Still, “The Biological Mind” is chock-full of fun facts that entertain. And best of all, it makes you think. I found myself debating with Jasanoff in my head as I read — surely a sign of a worthy book. © 2018 The New York Times Company

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 1: Biological Psychology: Scope and Outlook
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 1: An Introduction to Brain and Behavior
Link ID: 25143 - Posted: 06.26.2018

By Sarah DiGiulio Why is it that you can perfectly recite the words to *NSYNC’s “Bye Bye Bye,” but can’t remember the title of the new TV show you started watching on Netflix and wanted to tell your coworker about? We remember things because they either stand out, they relate to and can easily be integrated in our existing knowledge base, or it’s something we retrieve, recount or use repeatedly over time, explains Sean Kang, PhD, assistant professor in the Department of Education at Dartmouth College, whose research focuses on the cognitive psychology of learning and memory. “The average layperson trying to learn nuclear physics for the first time, for example, will probably find it very difficult to retain that information." That's because he or she likely doesn’t have existing knowledge in their brain to connect that new information to. And on a molecular level, neuroscientists suspect that there’s actually a physical process that needs to be completed to form a memory — and our not remembering something is a result of that not happening, explains Blake Richards, DPhil, assistant professor in the Department of Biological Sciences and Fellow at the Canadian Institute for Advanced Research. In the same way that writing a grocery list makes a physical change to the paper, and saving a file to a computer makes a physical change somewhere in the magnetization of part of your hard drive, a physical change happens in your brain when you store a memory or new information. © 2018 NBC UNIVERSAL

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 25085 - Posted: 06.14.2018

By Ruth Williams The sun’s ultraviolet (UV) radiation is a major cause of skin cancer, but it offers some health benefits too, such as boosting production of essential vitamin D and improving mood. Today (May 17), a report in Cell adds enhanced learning and memory to UV’s unexpected benefits. Researchers have discovered that, in mice, exposure to UV light activates a molecular pathway that increases production of the brain chemical glutamate, heightening the animals’ ability to learn and remember. “The subject is of strong interest, because it provides additional support for the recently proposed theory of ultraviolet light’s regulation of the brain and central neuroendocrine system,” dermatologist Andrzej Slominski of the University of Alabama, who was not involved in the research, writes in an email to The Scientist. “It’s an interesting and timely paper investigating the skin-brain connection,” notes skin scientist Martin Steinhoff of University College Dublin’s Center for Biomedical Engineering, who also did not participate in the research. “The authors make an interesting observation linking moderate UV exposure to . . . [production of] the molecule urocanic acid. They hypothesize that this molecule enters the brain, activates glutaminergic neurons through glutamate release, and that memory and learning are increased.” © 1986-2018 The Scientist

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 14: Biological Rhythms, Sleep, and Dreaming
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 10: Biological Rhythms and Sleep
Link ID: 25052 - Posted: 06.02.2018

By Matthew Hutson It's a Saturday morning in February, and Chloe, a curious 3-year-old in a striped shirt and leggings, is exploring the possibilities of a new toy. Her father, Gary Marcus, a developmental cognitive scientist at New York University (NYU) in New York City, has brought home some strips of tape designed to adhere Lego bricks to surfaces. Chloe, well-versed in Lego, is intrigued. But she has always built upward. Could she use the tape to build sideways or upside down? Marcus suggests building out from the side of a table. Ten minutes later, Chloe starts sticking the tape to the wall. "We better do it before Mama comes back," Marcus says in a singsong voice. "She won't be happy." (Spoiler: The wall paint suffers.) Implicit in Marcus's endeavor is an experiment. Could Chloe apply what she had learned about an activity to a new context? Within minutes, she has a Lego sculpture sticking out from the wall. "Papa, I did it!" she exclaims. In her adaptability, Chloe is demonstrating common sense, a kind of intelligence that, so far, computer scientists have struggled to reproduce. Marcus believes the field of artificial intelligence (AI) would do well to learn lessons from young thinkers like her. Researchers in machine learning argue that computers trained on mountains of data can learn just about anything—including common sense—with few, if any, programmed rules. These experts "have a blind spot, in my opinion," Marcus says. "It's a sociological thing, a form of physics envy, where people think that simpler is better." He says computer scientists are ignoring decades of work in the cognitive sciences and developmental psychology showing that humans have innate abilities—programmed instincts that appear at birth or in early childhood—that help us think abstractly and flexibly, like Chloe. He believes AI researchers ought to include such instincts in their programs. © 2018 American Association for the Advancement of Science.

Related chapters from BN8e: Chapter 17: Learning and Memory; Chapter 7: Life-Span Development of the Brain and Behavior
Related chapters from MM: Chapter 13: Memory, Learning, and Development; Chapter 13: Memory, Learning, and Development
Link ID: 25026 - Posted: 05.26.2018

By Shawn Hayward Brenda Milner has collected her share of awards, prizes, honorary degrees and other recognitions throughout her amazing career, but there is something special about being recognized with top honours from the city and province she has called home since 1944, all within one week. On May 8, the Speaker of the National Assembly of Quebec, Jacques Chagnon, presented Milner with its Medal of Honour, along with seven other Quebecers, including McGill alumna Dr. Joanne Liu. The Medal of Honour is awarded to public figures from all walks of life who, through their career, their work or their social commitment, have earned the recognition of the Members of the National Assembly and the people of Quebec. Milner added to that recognition the title of Commander of the Order of Montreal, given to her by Mayor Valérie Plante during a ceremony at City Hall on May 14. The Order of Montreal was created on the city’s 375th anniversary to recognize women and men who have contributed in a remarkable way to the city’s development and reputation. There are three ranks in the Order, Commander being the highest. A celebrated researcher at the Montreal Neurological Institute and Hospital (The Neuro), Milner turns 100 years old on July 15. She is the Dorothy J. Killam Professor at The Neuro, and a professor in the Department of Neurology and Neurosurgery at McGill University. © 2018 McGill Reporter

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 24984 - Posted: 05.17.2018

By Usha Lee McFarling, STAT. UCLA neuroscientists reported Monday that they have transferred a memory from one animal to another via injections of RNA, a startling result that challenges the widely held view of where and how memories are stored in the brain. The finding from the lab of David Glanzman hints at the potential for new RNA-based treatments to one day restore lost memories and, if correct, could shake up the field of memory and learning. “It’s pretty shocking,” said Dr. Todd Sacktor, a neurologist and memory researcher at SUNY Downstate Medical Center in Brooklyn, N.Y. “The big picture is we’re working out the basic alphabet of how memories are stored for the first time.” He was not involved in the research, which was published in eNeuro, the online journal of the Society for Neuroscience. Many scientists are expected to view the research more cautiously. The work is in snails, animals that have proven a powerful model organism for neuroscience but whose simple brains work far differently than those of humans. The experiments will need to be replicated, including in animals with more complex brains. And the results fly in the face of a massive amount of evidence supporting the deeply entrenched idea that memories are stored through changes in the strength of connections, or synapses, between neurons. © 2018 Scientific American

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 24979 - Posted: 05.15.2018

Laurel Hamers Sluggish memories might be captured via RNA. The molecule, when taken from one sea slug and injected into another, appeared to transfer a rudimentary memory between the two, a new study suggests. Most neuroscientists believe long-term memories are stored by strengthening connections between nerve cells in the brain (SN: 2/3/18, p. 22). But these results, reported May 14 in eNeuro, buoy a competing argument: that some types of RNA molecules, and not linkages between nerve cells, are key to long-term memory storage. “It’s a very controversial idea,” admits study coauthor David Glanzman, a neuroscientist at UCLA. When poked or prodded, some sea slugs (Aplysia californica) will reflexively pull their siphon, a water-filtering appendage, into their bodies. Using electric shocks, Glanzman and his colleagues sensitized sea slugs to have a longer-lasting siphon-withdrawal response — a very basic form of memory. The team extracted RNA from those slugs and injected it into slugs that hadn’t been sensitized. These critters then showed the same long-lasting response to touch as their shocked companions. RNA molecules come in a variety of flavors that carry out specialized jobs, so it’s not yet clear what kind of RNA may be responsible for the effect, Glanzman says. But he suspects that it’s one of the handful of RNA varieties that don’t carry instructions to make proteins, the typical job of most RNA. (Called noncoding RNAs, these molecules are often involved in manipulating genes’ activity.) © Society for Science & the Public 2000 - 2018.

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 24978 - Posted: 05.15.2018

By Gretchen Reynolds Call them tip-of-the-tongue moments: those times we can’t quite call up the name or word that we know we know. These frustrating lapses are thought to be caused by a brief disruption in the brain’s ability to access a word’s sounds. We haven’t forgotten the word, and we know its meaning, but its formulation dances teasingly just beyond our grasp. Though these mental glitches are common throughout life, they become more frequent with age. Whether this is an inevitable part of growing older or somehow lifestyle-dependent is unknown. But because evidence already shows that physically fit older people have reduced risks for a variety of cognitive deficits, researchers recently looked into the relationship between aerobic fitness and word recall. For the study, whose results appeared last month in Scientific Reports, researchers at the University of Birmingham tested the lungs and tongues, figuratively speaking, of 28 older men and women at the school’s human-performance lab. Volunteers were between 60 and 80 and healthy, with no clinical signs of cognitive problems. Their aerobic capacities were measured by having them ride a specialized stationary bicycle to exhaustion; fitness levels among the subjects varied greatly. This group and a second set of volunteers in their 20s then sat at computers as word definitions flashed on the screens, prompting them to indicate whether they knew and could say the implied word. The vocabulary tended to be obscure — “decanter,” for example — because words rarely used are the hardest to summon quickly. To no one’s surprise, the young subjects experienced far fewer tip-of-the-tongue failures than the seniors, even though they had smaller vocabularies over all, according to other tests. Within the older group, the inability to identify and say the right words was strongly linked to fitness. The more fit someone was, the less likely he or she was to go through a “what’s that word again?” moment of mental choking. © 2018 The New York Times Company

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 24974 - Posted: 05.15.2018

By Neuroskeptic A new paper in ACS Chemical Neuroscience pulls no punches in claiming that most of what we know about the neuroscience of learning is wrong: “Dendritic Learning as a Paradigm Shift in Brain Learning.” According to authors Shira Sardi and colleagues, the prevailing view, which is that learning takes place in the synapses, is mistaken. Instead, they say, ‘dendritic learning’ is how brain cells really store information. If a neuron is a tree, the dendrites are the branches, while the synapses are the leaves on the ends of those branches. Here’s how Sardi et al. explain their new theory: synaptic learning proposes that each synapse can independently adjust its strength, whereas dendritic learning is the idea that each neuron has only a small number of adjustable units, corresponding to the main dendritic branches. The evidence for dendritic learning, Sardi et al. say, comes from experiments using cultured neurons in which they found that a) some neurons are more likely to fire when stimulated in certain places, suggesting that dendrites can vary in their excitability, and b) that these (presumed) dendritic excitability levels are plastic (they can ‘learn’).

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 24968 - Posted: 05.12.2018

Maria Temming An artificial intelligence that navigates its environment much like mammals do could help solve a mystery about our own internal GPS. Equipped with virtual versions of specialized brain nerve cells called grid cells, the AI could easily solve and plan new routes through virtual mazes. That performance, described online May 9 in Nature, suggests the grid cells in animal brains play a critical role in path planning. “This is a big step forward” in understanding our own navigational neural circuitry, says Ingmar Kanitscheider, a computational neuroscientist at the University of Texas at Austin not involved in the work. The discovery that rats track their location with the help of grid cells, which project an imaginary hexagonal lattice onto an animal’s surroundings, earned a Norwegian research team the 2014 Nobel Prize in physiology or medicine (SN Online: 10/6/14). Neuroscientists suspected these cells, which have also been found in humans, might help not only give mammals an internal coordinate system, but also plan direct paths between points (SN Online: 8/5/13). To test that idea, neuroscientist Caswell Barry at University College London, along with colleagues at Google DeepMind, created an AI that contained virtual nerve cells, or neurons, whose activity resembled that of real grid cells. The researchers trained this AI to navigate virtual mazes by giving the system reward signals when it reached its destination. © Society for Science & the Public 2000 - 2018

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 24958 - Posted: 05.10.2018

Alison Abbott Scientists have used artificial intelligence (AI) to recreate the complex neural codes that the brain uses to navigate through space. The feat demonstrates how powerful AI algorithms can assist conventional neuroscience research to test theories about the brain’s workings — but the approach is not going to put neuroscientists out of work just yet, say the researchers. “It really was a very striking and remarkable convergence of form and function.” The computer program, details of which were published in Nature on 9 May, was developed by neuroscientists at University College London (UCL) and AI researchers at the London-based Google company DeepMind. It used a technique called deep learning — a type of AI inspired by the structures in the brain — to train a computer-simulated rat to track its position in a virtual environment. The program surprised the scientists by spontaneously generating hexagonal-shaped patterns of activity akin to those generated by navigational cells in the mammalian brain called grid cells. Grid cells have been shown in experiments with real rats to be fundamental to how an animal tracks its own position in space. What’s more, the simulated rat was able to use the grid-cell-like coding to navigate a virtual maze so well that it even learnt to take shortcuts. © 2018 Macmillan Publishers Limited,
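For readers curious how such a system is set up, here is a rough, self-contained sketch of the general approach described above: a recurrent network receives the simulated animal's velocity at each time step and is trained to report position-related targets, a task known as path integration. This is my own illustration, not the UCL/DeepMind code; the layer sizes, the random targets, and the loss function are all assumptions made for demonstration only.

```python
# Illustrative sketch only: a recurrent network trained to path-integrate,
# i.e. to predict position-related targets from a stream of velocity inputs.
# Sizes, targets, and loss are invented; the published model differs in detail.
import torch
import torch.nn as nn

class PathIntegrator(nn.Module):
    def __init__(self, hidden_size=128, n_place_cells=64):
        super().__init__()
        self.rnn = nn.LSTM(input_size=2, hidden_size=hidden_size, batch_first=True)
        self.readout = nn.Linear(hidden_size, n_place_cells)  # place-cell-like outputs

    def forward(self, velocities):
        # velocities: (batch, time, 2) -- x and y speed at each time step
        hidden, _ = self.rnn(velocities)
        return self.readout(hidden)

model = PathIntegrator()
velocities = torch.randn(8, 100, 2)   # fake batch of 8 trajectories, 100 steps each
targets = torch.randn(8, 100, 64)     # fake position-coding targets
loss = nn.MSELoss()(model(velocities), targets)
loss.backward()  # in real training, an optimizer step would follow
```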

Related chapters from BN8e: Chapter 17: Learning and Memory
Related chapters from MM: Chapter 13: Memory, Learning, and Development
Link ID: 24957 - Posted: 05.10.2018