Chapter 17. Learning and Memory



By Laura Sanders Some big scientific discoveries aren’t actually discovered. They are borrowed. That’s what happened when scientists enlisted proteins from an unlikely lender: green algae. Cells of the algal species Chlamydomonas reinhardtii are decorated with proteins that can sense light. That ability, first noticed in 2002, quickly caught the attention of brain scientists. A light-sensing protein promised the power to control neurons — the brain’s nerve cells — by providing a way to turn them on and off, in exactly the right place and time. Nerve cells genetically engineered to produce the algal proteins become light-controlled puppets. A flash of light could induce a quiet neuron to fire off signals or force an active neuron to fall silent. “This molecule is the light sensor that we needed,” says vision neuroscientist Zhuo-Hua Pan, who had been searching for a way to control vision cells in mice’s retinas. The method enabled by these loaner proteins is now called optogenetics, for its combination of light (opto) and genes. In less than two decades, optogenetics has led to big insights into how memories are stored, what creates perceptions and what goes wrong in the brain during depression and addiction. Using light to drive the activity of certain nerve cells, scientists have toyed with mouse hallucinations: Mice have seen lines that aren’t there and have remembered a room they had never been inside. Scientists have used optogenetics to make mice fight, mate and eat, and even given blind mice sight. In a big first, optogenetics recently restored aspects of a blind man’s vision. © Society for Science & the Public 2000–2021.

Keyword: Brain imaging; Learning & Memory
Link ID: 27861 - Posted: 06.19.2021

By Carolyn Wilke Scientists have long sought to prevent sharp memories from dulling with age, but the problem remains stubborn. Now research published in Scientific Reports suggests virtual reality might help older people recall facts and events based on specific details. The study involved 42 healthy older adults from the San Francisco Bay Area. Half spent a dozen hours over four weeks playing a virtual-reality game called Labyrinth; they strapped on headsets and walked in place, roaming virtual neighborhoods while completing errands. The other half, in the control group, used electronic tablets to play games that did not require navigating or recalling details. After 15 sessions, the latter performed roughly the same as before on a long-term memory test based on picking out objects they had seen about an hour earlier. But the Labyrinth players' scores rose, and they were less frequently tricked by objects that resembled ones they had viewed. Those improvements “brought them back up to the level of another group of younger adults who did the same memory tests,” says cognitive neuroscientist Peter Wais of the University of California, San Francisco. He and his colleagues designed the VR game, which they say likely stimulates the hippocampus—a brain area important for long-term memory. The team did not observe improvement on two other tests, which measured autobiographical memory and spatial memory capability. © 2021 Scientific American,

Keyword: Learning & Memory; Alzheimers
Link ID: 27853 - Posted: 06.16.2021

By Emily Underwood In the 1930s, neurosurgeon Wilder Penfield pioneered a daring new kind of cartography. As a stenographer took notes, he delicately touched an electrode to the exposed brains of his awake, consenting patients and asked what they felt as electrical current hit different areas. Penfield wanted to better predict which brain functions would be threatened when surgeons had to remove tumors or chunks of tissue that were triggering epileptic seizures. Stimulating adjacent brain regions, he found, produced sensations in corresponding body parts: hand, forearm, elbow. The result of his mapping was the iconic “homunculus”: a map on the brain’s wrinkled outer layer representing the surface of the body. Penfield then ventured into more mysterious territory. When he probed the insula, a deep fold of cortex, some patients felt nauseated or gassy; others belched or vomited. “My stomach is upset and I smell something like medicine,” one said. Penfield found those visceral signals harder to decipher than the brain’s map of the body’s surface. Brain regions responsible for different internal sensations seemed to overlap. Sensory regions were hard to distinguish from those that sent motor instructions such as telling the intestines to contract. Penfield once asked participants to swallow an electrode to detect changes in gut contractions while he stimulated their brains. But his map of the inner organs was blurry and ambiguous—and stayed that way for most of the next century. Decades later, scientists are starting to unravel how our wet, spongy, slippery organs talk to the brain and how the brain talks back. That two-way communication, known as interoception, encompasses a complex, bodywide system of nerves and hormones. Much recent exploration has focused on the vagus nerve: a massive, meandering network of more than 100,000 fibers that travel from nearly every internal organ to the base of the brain and back again. 
© 2021 American Association for the Advancement of Science.

Keyword: Learning & Memory; Obesity
Link ID: 27850 - Posted: 06.11.2021

By Ben Guarino and Frances Stead Sellers In the coronavirus pandemic’s early weeks, in neuropathology departments around the world, scientists wrestled with a question: Should they cut open the skulls of patients who died of covid-19 and extract their brains? Autopsy staff at Columbia University in New York were hesitant. Sawing into bone creates dust, and the Centers for Disease Control and Prevention had issued a warning about the bodies of covid patients — airborne debris from autopsies could be an infectious hazard. But as more patients were admitted and more began to die, researchers decided to “make all the efforts we could to start collecting the brain tissue,” Columbia neuropathologist Peter D. Canoll said. In March 2020, in an isolation room, the Columbia team extracted a brain from a patient who had died of severe covid-19, the illness caused by the coronavirus. Over the next months, they would examine dozens more. Saw met skull elsewhere, too. In Germany, scientists autopsied brains — even though medical authorities recommended against doing so. Researchers were searching the brain for damage — and for the virus itself. At the pandemic’s start, how the virus affected the nervous system was largely a mystery. S. Andrew Josephson, chair of neurology at the University of California at San Francisco and editor in chief of the academic journal JAMA Neurology, said, “We had hundreds of submissions of ‘I saw one case of X.’” It was difficult to understand whether single cases had any relationship to covid at all. Patients reported visual and auditory disturbances, vertigo and tingling sensations, among other perplexing symptoms. Some lost their sense of smell, or their vision became distorted. Weeks or months after the initial onset of symptoms, some who had experienced even a mild bout of the coronavirus remained convinced of persistent “brain fog.”

Keyword: Learning & Memory; Attention
Link ID: 27845 - Posted: 06.08.2021

By Jason S. Tsukahara, Alexander P. Burgoyne, Randall W. Engle It has been said that “the eyes are the window to the soul,” but new research suggests that they may be a window to the brain as well. Our pupils respond to more than just light. They indicate arousal, interest or mental exhaustion. Pupil dilation is even used by the FBI to detect deception. Now work conducted in our laboratory at the Georgia Institute of Technology suggests that baseline pupil size is closely related to individual differences in intelligence. The larger the pupils, the higher the intelligence, as measured by tests of reasoning, attention and memory. In fact, across three studies, we found that the difference in baseline pupil size between people who scored the highest on the cognitive tests and those who scored the lowest was large enough to be detected by the unaided eye. We first uncovered this surprising relationship while studying differences in the amount of mental effort people used to complete memory tasks. We used pupil dilations as an indicator of effort, a technique psychologist Daniel Kahneman popularized in the 1960s and 1970s. When we discovered a relationship between baseline pupil size and intelligence, we weren’t sure if it was real or what it meant. Intrigued, we conducted several large-scale studies in which we recruited more than 500 people aged 18 to 35 from the Atlanta community. We measured participants’ pupil size using an eye tracker, a device that captures the reflection of light off the pupil and cornea using a high-powered camera and computer. We measured participants’ pupils at rest while they stared at a blank computer screen for up to four minutes. All the while, the eye tracker was recording. Using the tracker, we then calculated each participant’s average pupil size. © 2021 Scientific American

Keyword: Learning & Memory; Vision
Link ID: 27844 - Posted: 06.08.2021

Abby Olena Leptin is a hormone released by fat cells in adult organisms, and researchers have largely focused on how it controls appetite. In a study published May 18 in Science Signaling, the authors show that leptin promotes synapse formation, or synaptogenesis, in developing rodent neurons in culture. “This paper does a really wonderful job [breaking] down the mechanisms” of leptin signaling, and the authors look at changes in synaptic function, not just at the protein level, but also on a physiological level, says Laura Cocas, a neuroscientist at Santa Clara University who was not involved in the study. “Because all of the work on the paper is done in vitro, they can do very careful analysis . . . to break down each step in the signaling pathway.” When Washington State University neuroscientist Gary Wayman and his group started working on leptin about 10 years ago, most of the research had examined the hormone’s function in regulating satiety. But “we and others knew that leptin surged during a critical period of neuronal—and in particular synaptic—development in the brain,” he says. In people, this surge happens during the third trimester of fetal development and, in rodents, over the first few weeks of life. “This surge in leptin is independent of the amount of adipose tissue that’s present. And it does not control feeding during this period because feeding circuits have not developed, so we really wanted to understand what the developmental role was.” Wayman and colleagues focused on the hippocampus because, despite being one of the best-characterized regions in the brain, there wasn’t a lot of information out there about what the leptin receptors present were doing—particularly during development. Multiple groups had also shown that leptin injected in this brain region can improve cognition and act as an antidepressant. © 1986–2021 The Scientist.

Keyword: Obesity; Learning & Memory
Link ID: 27837 - Posted: 05.29.2021

By Jackie Rocheleau It’s an attractive idea: By playing online problem-solving, matching and other games for a few minutes a day, people can improve such mental abilities as reasoning, verbal skills and memory. But whether these games deliver on those promises is up for debate. “For every study that finds some evidence, there’s an equal number of papers that find no evidence,” says Bobby Stojanoski, a cognitive neuroscientist at Western University in Ontario (SN: 3/8/17; SN: 5/9/17). Now, in perhaps the biggest real-world test of these programs, Stojanoski and colleagues pitted more than 1,000 people who regularly use brain trainers against around 7,500 people who don’t do the mini brain workouts. There was little difference between how both groups performed on a series of tests of their thinking abilities, suggesting that brain training doesn’t live up to its name, the scientists report in the April Journal of Experimental Psychology: General. “They put brain training to the test,” says Elizabeth Stine-Morrow, a cognitive aging scientist at the University of Illinois at Urbana-Champaign. While the study doesn’t show why brain trainers aren’t seeing benefits, it does show there is no link “between the amount of time spent with the brain training programs and cognition,” Stine-Morrow says. “That was pretty cool.” © Society for Science & the Public 2000–2021

Keyword: Learning & Memory
Link ID: 27830 - Posted: 05.27.2021

R. Douglas Fields The raging bull locked its legs mid-charge. Digging its hooves into the ground, the beast came to a halt just before it would have gored the man. Not a matador, the man in the bullring standing eye-to-eye with the panting toro was the Spanish neuroscientist José Manuel Rodriguez Delgado, in a death-defying public demonstration in 1963 of how violent behavior could be squelched by a radio-controlled brain implant. Delgado had pressed a switch on a hand-held radio transmitter to energize electrodes implanted in the bull’s brain. Remote-controlled brain implants, Delgado argued, could suppress deviant behavior to achieve a “psychocivilized society.” Unsurprisingly, the prospect of manipulating the human mind with brain implants and radio beams ignited public fears that curtailed this line of research for decades. But now there is a resurgence using even more advanced technology. Laser beams, ultrasound, electromagnetic pulses, mild alternating and direct current stimulation and other methods now allow access to, and manipulation of, electrical activity in the brain with far more sophistication than the needlelike electrodes Delgado stabbed into brains. Billionaires Elon Musk of Tesla and Mark Zuckerberg of Facebook are leading the charge, pouring millions of dollars into developing brain-computer interface (BCI) technology. Musk says he wants to provide a “superintelligence layer” in the human brain to help protect us from artificial intelligence, and Zuckerberg reportedly wants users to upload their thoughts and emotions over the internet without the bother of typing. But fact and fiction are easily blurred in these deliberations. How does this technology actually work, and what is it capable of? All Rights Reserved © 2021

Keyword: Robotics; Attention
Link ID: 27827 - Posted: 05.19.2021

By Nicholas Bakalar Long-term exposure to air pollution has many health consequences, including accelerating brain aging and increasing the risk for dementia. Now new research suggests that short-term exposure to polluted air, even at levels generally considered “acceptable,” may impair mental ability in the elderly. Scientists studied 954 men, average age 69, living in the greater Boston area. The men were tested at the start of the study and several times over the next 28 days using the Mini-Mental State Examination, or MMSE, a widely used test of cognitive ability. The test includes simple questions like “What year is this?” and “What season is it?,” and requires tasks like counting backward by sevens from 100. Correctly answering fewer than 25 of its 30 questions suggests mild dementia. Over the month, the researchers measured air levels of what’s known as PM 2.5, particles of soot and other fine particulate matter with a diameter of up to 2.5 microns, small enough to enter the lungs and move into the bloodstream. There is no safe level of PM 2.5, but the Environmental Protection Agency considers air acceptable when it is under 12 micrograms per cubic meter. During the testing period, PM 2.5 levels in Boston averaged 10.77. Higher PM 2.5 was consistently associated with lower test scores. In weeks with the highest levels of air pollution, the men were 63 percent more likely to score below 25 on the MMSE than in weeks with the lowest levels. The study, in Nature Aging, adjusted for age, B.M.I., coronary heart disease, diabetes, alcohol consumption, smoking, high blood pressure and other factors. Dr. Andrea A. Baccarelli, the senior author and a professor of environmental science at the Columbia Mailman School of Public Health, said that these short-term effects may be reversible. “When air pollution goes down,” he said, “the brain reboots and goes back to normal. 
However, if repeated, these episodes produce long-term damage to the brain.” © 2021 The New York Times Company

Keyword: Learning & Memory; Neurotoxins
Link ID: 27823 - Posted: 05.19.2021

By Noah Hutton Twelve years ago, when I graduated college, I was well aware of the Silicon Valley hype machine, but I considered the salesmanship of private tech companies a world away from objective truths about human biology I had been taught in neuroscience classes. At the time, I saw the neuroscientist Henry Markram proclaim in a TED talk that he had figured out a way to simulate an entire human brain on supercomputers within 10 years. This computer-simulated organ would allow scientists to instantly and noninvasively test new treatments for disorders and diseases, moving us from research that depends on animal experimentation and delicate interventions on living people to an “in silico” approach to neuroscience. My 22-year-old mind didn’t clock this as an overhyped proposal. Instead, it felt exciting and daring, the kind of moment that transforms a distant scientific pipe dream into a suddenly tangible goal and motivates funders and fellow researchers to think bigger. And so I began a 10-year documentary project following Markram and his Blue Brain Project, with the start of the film coinciding with the beginning of an era of big neuroscience where the humming black boxes produced by Silicon Valley came to be seen as the great new hope for making sense of the black boxes between our ears. My decade-long journey documenting Markram’s vision has no clear answers except perhaps one: that flashy presentations and sheer ambition are poor indicators of success when it comes to understanding the complex biological mechanisms of brains. Today, as we bear witness to a game of Pong being mind-controlled by a monkey as part of a typically bombastic demonstration by Elon Musk’s start-up Neuralink, there is more of a need than ever to unwind the cycles of hype in order to grapple with what the future of brain technology and neuroscience have in store for humanity. © 2021 Scientific American

Keyword: Brain imaging; Robotics
Link ID: 27794 - Posted: 05.01.2021

Jordana Cepelewicz During every waking moment, we humans and other animals have to balance on the edge of our awareness of past and present. We must absorb new sensory information about the world around us while holding on to short-term memories of earlier observations or events. Our ability to make sense of our surroundings, to learn, to act and to think all depend on constant, nimble interactions between perception and memory. But to accomplish this, the brain has to keep the two distinct; otherwise, incoming data streams could interfere with representations of previous stimuli and cause us to overwrite or misinterpret important contextual information. Compounding that challenge, a body of research hints that the brain does not neatly partition short-term memory function exclusively into higher cognitive areas like the prefrontal cortex. Instead, the sensory regions and other lower cortical centers that detect and represent experiences may also encode and store memories of them. And yet those memories can’t be allowed to intrude on our perception of the present, or to be randomly rewritten by new experiences. A paper published recently in Nature Neuroscience may finally explain how the brain’s protective buffer works. A pair of researchers showed that, to represent current and past stimuli simultaneously without mutual interference, the brain essentially “rotates” sensory information to encode it as a memory. The two orthogonal representations can then draw from overlapping neural activity without intruding on each other. The details of this mechanism may help to resolve several long-standing debates about memory processing. To figure out how the brain prevents new information and short-term memories from blurring together, Timothy Buschman, a neuroscientist at Princeton University, and Alexandra Libby, a graduate student in his lab, decided to focus on auditory perception in mice. 
They had the animals passively listen to sequences of four chords over and over again, in what Buschman dubbed “the worst concert ever.” All Rights Reserved © 2021
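The idea of "rotating" a stimulus into an orthogonal subspace can be made concrete with a toy model. The sketch below is not the study's analysis; it is a minimal illustration, with made-up numbers, of how one neural population can carry a current input and a memory on orthogonal coding axes so that reading out one does not disturb the other.

```python
import numpy as np

rng = np.random.default_rng(0)
n_neurons = 100

# Two orthogonal "coding axes" spanning the same population of neurons.
sensory_axis = rng.standard_normal(n_neurons)
sensory_axis /= np.linalg.norm(sensory_axis)
memory_axis = rng.standard_normal(n_neurons)
memory_axis -= (memory_axis @ sensory_axis) * sensory_axis  # orthogonalize
memory_axis /= np.linalg.norm(memory_axis)

# A past chord has been "rotated" onto the memory axis while a new chord
# drives the sensory axis; both signals share the very same neurons.
past_chord, current_chord = 0.8, -0.3
population_activity = past_chord * memory_axis + current_chord * sensory_axis

# Because the axes are orthogonal, each signal is recovered exactly,
# with no interference from the other.
decoded_memory = population_activity @ memory_axis
decoded_current = population_activity @ sensory_axis
print(round(decoded_memory, 3), round(decoded_current, 3))  # 0.8 -0.3
```

The same dot-product readout would blur the two signals together if the axes overlapped; orthogonality is what serves as the "protective buffer."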

Keyword: Learning & Memory
Link ID: 27778 - Posted: 04.17.2021

Natalie Grover Few species in the animal kingdom can change the size of their brain. Fewer still can change it back to its original size. Now researchers have found the first insect species with that ability: Indian jumping ants. They are like catnip to researchers in the field. In contrast to their cousins, Indian jumping ant colonies do not perish once their queen dies. Instead, “chosen” workers take her place – with expanded ovaries and shrunken brains – to produce offspring. But, if a worker’s “pseudo-queen” status is somehow revoked, their bodies can bounce back, the research suggests. Typically, whether an ant will be a worker or a queen is decided at the larval stage. If fed generously and given the right hormones, the ant has the chance to become a big queen. If not, then it is stuck with a career as a sterile worker deprived of the opportunity to switch – unless it’s part of a species such as the Indian jumping ant. “They have this ability to completely transform themselves at the adult stage, and that makes them interesting to try to understand,” said lead author Dr Clint Penick from US-based Kennesaw State University. Social insects such as ants typically inhabit a caste-based society – the queen reigns as the sole reproducer by secreting pheromones that thwart female worker ants from laying eggs. The other ants work hard: foraging and hunting for food, cleaning, caring for the young and defending the nest. But unlike typical colonies that wither away on the death of their queen, Indian jumping ant colonies are functionally immortal. © 2021 Guardian News & Media Limited

Keyword: Learning & Memory
Link ID: 27772 - Posted: 04.14.2021

Sarah DeGenova Ackerman The human brain is made up of billions of neurons that form complex connections with one another. Flexibility at these connections is a major driver of learning and memory, but things can go wrong if it isn’t tightly regulated. For example, in people, too much plasticity at the wrong time is linked to brain disorders such as epilepsy and Alzheimer’s disease. Additionally, reduced levels of the two neuroplasticity-controlling proteins we identified are linked to increased susceptibility to autism and schizophrenia. Similarly, in our fruit flies, removing the cellular brakes on plasticity permanently impaired their crawling behavior. While fruit flies are of course different from humans, their brains work in ways very similar to the human brain and can offer valuable insight. One obvious benefit of discovering the effect of these proteins is the potential to treat some neurological diseases. But since a neuron’s flexibility is closely tied to learning and memory, in theory, researchers might be able to boost plasticity in a controlled way to enhance cognition in adults. This could, for example, allow people to more easily learn a new language or musical instrument. (Figure caption: In an image showing a developing fruit fly brain on the right and the attached nerve cord on the left, astrocytes are labeled in different colors, showing their wide distribution among neurons. Sarah DeGenova Ackerman, CC BY-ND) © 2010–2021, The Conversation US, Inc.

Keyword: Learning & Memory; Glia
Link ID: 27770 - Posted: 04.14.2021

By Alla Katsnelson New monthly payments in the pandemic relief package have the potential to lift millions of American children out of poverty. Some scientists believe the payments could change children’s lives even more fundamentally — via their brains. It’s well established that growing up in poverty correlates with disparities in educational achievement, health and employment. But an emerging branch of neuroscience asks how poverty affects the developing brain. Over the past 15 years, dozens of studies have found that children raised in meager circumstances have subtle brain differences compared with children from families of higher means. On average, the surface area of the brain’s outer layer of cells is smaller, especially in areas relating to language and impulse control, as is the volume of a structure called the hippocampus, which is responsible for learning and memory. These differences don’t reflect inherited or inborn traits, research suggests, but rather the circumstances in which the children grew up. Researchers have speculated that specific aspects of poverty — subpar nutrition, elevated stress levels, low-quality education — might influence brain and cognitive development. But almost all the work to date is correlational. And although those factors may be at play to various degrees for different families, poverty is their common root. A continuing study called Baby’s First Years, started in 2018, aims to determine whether reducing poverty can itself promote healthy brain development. © 2021 The New York Times Company

Keyword: Development of the Brain; Learning & Memory
Link ID: 27763 - Posted: 04.08.2021

By Rachel Aviv Elizabeth Loftus was in Argentina, giving talks about the malleability of memory, in October, 2018, when she learned that Harvey Weinstein, who had recently been indicted for rape and sexual assault, wanted to speak with her. She couldn’t figure out how to receive international calls in her hotel room, so she asked if they could talk in three days, once she was home, in California. In response, she got a series of frantic e-mails saying that the conversation couldn’t wait. But, when Weinstein finally got through, she said, “basically he just wanted to ask, ‘How can something that seems so consensual be turned into something so wrong?’ ” Loftus, a professor at the University of California, Irvine, is the most influential female psychologist of the twentieth century, according to a list compiled by the Review of General Psychology. Her work helped usher in a paradigm shift, rendering obsolete the archival model of memory—the idea, dominant for much of the twentieth century, that our memories exist in some sort of mental library, as literal representations of past events. According to Loftus, who has published twenty-four books and more than six hundred papers, memories are reconstructed, not replayed. “Our representation of the past takes on a living, shifting reality,” she has written. “It is not fixed and immutable, not a place way back there that is preserved in stone, but a living thing that changes shape, expands, shrinks, and expands again, an amoeba-like creature.” George A. Miller, one of the founders of cognitive psychology, once said in a speech to the American Psychological Association that the way to advance the field was “to give psychology away.” Loftus, who is seventy-six, adopts a similar view, seizing any opportunity to elaborate on what she calls the “flimsy curtain that separates our imagination and our memory.” In the past forty-five years, she has testified or consulted in more than three hundred cases, on behalf of people wrongly accused of robbery and murder, as well as for high-profile defendants like Bill Cosby, Jerry Sandusky, and the Duke lacrosse players accused of rape, in 2006. “If the MeToo movement had an office, Beth’s picture would be on the ten-most-wanted list,” her brother Robert told me. © 2021 Condé Nast.

Keyword: Learning & Memory
Link ID: 27754 - Posted: 03.31.2021

James Doubek By being able to wait for better food, cuttlefish — the squishy sea creatures similar to octopuses and squids — showed self-control that's linked to the higher intelligence of primates. It was part of an experiment by Alex Schnell from the University of Cambridge and colleagues. "What surprised me the most was that the level of self-control shown by our cuttlefish was quite advanced," she tells Lulu Garcia-Navarro on Weekend Edition. The experiment was essentially a take on the classic "marshmallow" experiment from the 1960s. In that experiment, young children were presented with one marshmallow and told that if they can resist eating it, unsupervised, for several minutes, they will get two marshmallows. But if they eat it that's all they get. The conventional wisdom has been that children who are able to delay gratification do better on tests and are more successful later in life. (There are of course many caveats when talking about the human experiments.) To adapt the experiment for cuttlefish, the researchers first figured out the cuttlefish's favorite food: live grass shrimp; and their second-favorite food: a piece of king prawn. Instead of choosing one or two marshmallows, the cuttlefish had to choose either their favorite food or second-favorite food. "Each of the food items were placed in clear chambers within their tank," Schnell says. "One chamber would open immediately, whereas the other chamber would only open after a delay." © 2021 npr

Keyword: Evolution; Learning & Memory
Link ID: 27724 - Posted: 03.11.2021

By Laura Sanders A century ago, science’s understanding of the brain was primitive, like astronomy before telescopes. Certain brain injuries were known to cause specific problems, like loss of speech or vision, but those findings offered a fuzzy view. Anatomists had identified nerve cells, or neurons, as key components of the brain and nervous system. But nobody knew how these cells collectively manage the brain’s sophisticated control of behavior, memory or emotions. And nobody knew how neurons communicate, or the intricacies of their connections. For that matter, the research field known as neuroscience — the science of the nervous system — did not exist, becoming known as such only in the 1960s. Over the last 100 years, brain scientists have built their telescopes. Powerful tools for peering inward have revealed cellular constellations. It’s likely that over 100 different kinds of brain cells communicate with dozens of distinct chemicals. A single neuron, scientists have discovered, can connect to tens of thousands of other cells. Yet neuroscience, though no longer in its infancy, is far from mature. Today, making sense of the brain’s vexing complexity is harder than ever. Advanced technologies and expanded computing capacity churn out torrents of information. “We have vastly more data … than we ever had before, period,” says Christof Koch, a neuroscientist at the Allen Institute in Seattle. Yet we still don’t have a satisfying explanation of how the brain operates. We may never understand brains in the way we understand rainbows, or black holes, or DNA. © Society for Science & the Public 2000–2021.

Keyword: Brain imaging; Learning & Memory
Link ID: 27722 - Posted: 03.06.2021

The Physics arXiv Blog One of the best-studied networks in neuroscience is the brain of a fruit fly, in particular, a part called the mushroom body. This analyzes sensory inputs such as odors, temperature, humidity and visual data so that the fly can learn to distinguish friendly stimuli from dangerous ones. Neuroscientists have long known how this section of the brain is wired. It consists of a set of cells called projection neurons that transmit the sensory information to a population of 2,000 neurons called Kenyon cells. The Kenyon cells are wired together to form a neural network capable of learning. This is how fruit flies learn to avoid potentially hazardous sensory inputs — such as dangerous smells and temperatures — while learning to approach foodstuffs, potential mates, and so on. But the power and flexibility of this relatively small network has long raised a curious question for neuroscientists: could it be re-programmed to tackle other tasks? Now they have an answer, thanks to the work of Yuchan Liang at the Rensselaer Polytechnic Institute, the MIT-IBM Watson AI Lab, and colleagues. This team has hacked the fruit fly brain network to perform other tasks, such as natural language processing. It's the first time a naturally occurring network has been commandeered in this way. And this biological brain network is no slouch. Liang and the team say it matches the performance of artificial learning networks while using far fewer computational resources. © 2021 Kalmbach Media Co.
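The projection-neuron-to-Kenyon-cell circuit is often modeled in the computing literature as a sparse random projection followed by winner-take-all inhibition (sometimes called "FlyHash"). The sketch below follows that standard model with illustrative parameters; it is not the code or the exact architecture from Liang and colleagues' paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n_inputs = 50    # projection neurons carrying sensory features
n_kenyon = 2000  # Kenyon cells, matching the count cited for the fly
fan_in = 6       # each Kenyon cell samples a handful of projection neurons
top_k = 100      # ~5% of Kenyon cells stay active after global inhibition

# Sparse binary connectivity from projection neurons to Kenyon cells.
weights = np.zeros((n_kenyon, n_inputs))
for row in weights:
    row[rng.choice(n_inputs, size=fan_in, replace=False)] = 1.0

def fly_hash(stimulus):
    """Map a dense sensory vector to a sparse binary Kenyon-cell code."""
    activation = weights @ stimulus
    code = np.zeros(n_kenyon, dtype=bool)
    code[np.argsort(activation)[-top_k:]] = True  # winner-take-all
    return code

# Similar inputs yield heavily overlapping codes; unrelated inputs do not,
# which is what lets the network separate stimuli it must learn about.
odor = rng.random(n_inputs)
similar = odor + 0.01 * rng.random(n_inputs)
different = rng.random(n_inputs)
overlap = lambda a, b: int(np.sum(a & b))
print(overlap(fly_hash(odor), fly_hash(similar)),
      overlap(fly_hash(odor), fly_hash(different)))
```

Repurposing the network for a task like language processing amounts to feeding word-context vectors, rather than odors, through the same sparse projection.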

Keyword: Learning & Memory
Link ID: 27671 - Posted: 01.30.2021

By Clay Risen In 1978, James R. Flynn, a political philosopher at the University of Otago, in New Zealand, was writing a book about what constituted a “humane” society. He considered “inhumane” societies as well — dictatorships, apartheid states — and, in his reading, came across the work of Arthur R. Jensen, a psychologist at the University of California, Berkeley. Dr. Jensen was best known for an article he published in 1969 claiming that the differences between Black and white Americans on I.Q. tests resulted from genetic differences between the races — and that programs that tried to improve Black educational outcomes, like Head Start, were bound to fail. Dr. Flynn, a committed leftist who had once been a civil rights organizer in Kentucky, felt instinctively that Dr. Jensen was wrong, and he set out to prove it. In 1980 he published a thorough, devastating critique of Dr. Jensen’s work — showing, for example, that many groups of whites scored as low on I.Q. tests as Black Americans. But he didn’t stop there. Like most researchers in his field, Dr. Jensen had assumed that intelligence was constant across generations, pointing to the relative stability of I.Q. tests over time as evidence. But Dr. Flynn noticed something that no one else had: Those tests were recalibrated every decade or so. When he looked at the raw, uncalibrated data over nearly 100 years, he found that I.Q. scores had gone up, dramatically. “If you scored people 100 years ago against our norms, they would score a 70,” or borderline mentally disabled, he said later. “If you scored us against their norms, we would score 130” — borderline gifted. Just as groundbreaking was his explanation for why. The rise was too fast to be genetic, nor could it be that our recent ancestors were less intelligent than we are. 
Rather, he argued, the last century has seen a revolution in abstract thinking, what he called “scientific spectacles,” brought on by the demands of a technologically robust industrial society. This new order, he maintained, required greater educational attainment and an ability to think in terms of symbols, analogies and complex logic — exactly what many I.Q. tests measure. © 2021 The New York Times Company
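Flynn's "70 versus 130" comparison is simple norm arithmetic: raw performance rose steadily while each recalibration reset the population mean back to 100. A minimal sketch, assuming the commonly cited round figure of about 3 IQ points of drift per decade:

```python
# Flynn-effect arithmetic: raw scores have risen roughly 3 IQ points per
# decade, but each recalibration resets the population mean to 100.
DRIFT_PER_DECADE = 3  # approximate; varies by test and country

def old_score_on_modern_norms(score_then, decades_back):
    """Re-express an earlier cohort's score against today's (higher) norms."""
    return score_then - DRIFT_PER_DECADE * decades_back

def modern_score_on_old_norms(score_now, decades_back):
    """Re-express a present-day score against an earlier cohort's norms."""
    return score_now + DRIFT_PER_DECADE * decades_back

# Over ~10 decades, an average scorer (100) shifts ~30 points either way,
# matching Flynn's illustration:
print(old_score_on_modern_norms(100, 10))  # 70  (their average, our norms)
print(modern_score_on_old_norms(100, 10))  # 130 (our average, their norms)
```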

Keyword: Learning & Memory; Development of the Brain
Link ID: 27664 - Posted: 01.27.2021

By Stephani Sutherland Patrick Thornton, a 40-year-old math teacher in Houston, Tex., relies on his voice to clearly communicate with his high school students. So when he began to feel he was recovering from COVID, he was relieved to get his voice back a month after losing it. Thornton got sick in mid-August and had symptoms typical of a moderate case: a sore throat, headaches, trouble breathing. By the end of September, “I was more or less counting myself as on the mend and healing,” Thornton says. “But on September 25, I took a nap, and then my mom called.” As the two spoke, Thornton’s mother remarked that it was great that his voice was returning. Something was wrong, however. “I realized that some of the words didn’t feel right in my mouth, you know?” he says. They felt jumbled, stuck inside. Thornton had suddenly developed a severe stutter for the first time in his life. “I got my voice back, but it broke my mouth,” he says. After relaying the story over several minutes, Thornton sighs heavily with exhaustion. The thought of going back to teaching with his stutter, “that was terrifying,” he says. In November Thornton still struggled with low energy, chest pain and headaches. And “sometimes my heart rate [would] just decide that we’re being chased by a tiger out of nowhere," he adds. His stutter only worsened by that time, Thornton says, and he worried that it reflected some more insidious condition in his brain, despite doctors’ insistence that the speech disruption was simply a product of stress. © 2021 Scientific American,

Keyword: Learning & Memory; Schizophrenia
Link ID: 27661 - Posted: 01.23.2021