Chapter 17. Learning and Memory



By Laura Sanders A century ago, science’s understanding of the brain was primitive, like astronomy before telescopes. Certain brain injuries were known to cause specific problems, like loss of speech or vision, but those findings offered a fuzzy view. Anatomists had identified nerve cells, or neurons, as key components of the brain and nervous system. But nobody knew how these cells collectively manage the brain’s sophisticated control of behavior, memory or emotions. And nobody knew how neurons communicate, or the intricacies of their connections. For that matter, the research field known as neuroscience — the science of the nervous system — did not exist, becoming known as such only in the 1960s. Over the last 100 years, brain scientists have built their telescopes. Powerful tools for peering inward have revealed cellular constellations. It’s likely that over 100 different kinds of brain cells communicate with dozens of distinct chemicals. A single neuron, scientists have discovered, can connect to tens of thousands of other cells. Yet neuroscience, though no longer in its infancy, is far from mature. Today, making sense of the brain’s vexing complexity is harder than ever. Advanced technologies and expanded computing capacity churn out torrents of information. “We have vastly more data … than we ever had before, period,” says Christof Koch, a neuroscientist at the Allen Institute in Seattle. Yet we still don’t have a satisfying explanation of how the brain operates. We may never understand brains in the way we understand rainbows, or black holes, or DNA. © Society for Science & the Public 2000–2021.

Keyword: Brain imaging; Learning & Memory
Link ID: 27722 - Posted: 03.06.2021

The Physics arXiv Blog One of the best-studied networks in neuroscience is the brain of a fruit fly, in particular, a part called the mushroom body. This analyzes sensory inputs such as odors, temperature, humidity and visual data so that the fly can learn to distinguish friendly stimuli from dangerous ones. Neuroscientists have long known how this section of the brain is wired. It consists of a set of cells called projection neurons that transmit the sensory information to a population of 2,000 neurons called Kenyon cells. The Kenyon cells are wired together to form a neural network capable of learning. This is how fruit flies learn to avoid potentially hazardous sensory inputs — such as dangerous smells and temperatures — while learning to approach foodstuffs, potential mates, and so on. But the power and flexibility of this relatively small network have long raised a curious question for neuroscientists: could it be re-programmed to tackle other tasks? Now they have an answer thanks to the work of Yuchan Liang at the Rensselaer Polytechnic Institute, the MIT-IBM Watson AI Lab, and colleagues. This team has hacked the fruit fly brain network to perform other tasks, such as natural language processing. It's the first time a naturally occurring network has been commandeered in this way. And this biological brain network is no slouch. Liang and the team say it matches the performance of artificial learning networks while using far fewer computational resources. © 2021 Kalmbach Media Co.
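The mushroom-body wiring described above — a small bank of projection neurons fanning out through sparse random connections to roughly 2,000 Kenyon cells, with inhibition keeping only the most active cells — behaves like a locality-sensitive hash, which is what makes the circuit reusable for other tasks. Here is a minimal sketch of that architecture; the input size, connection density, and top-k fraction are illustrative assumptions, not the study's actual parameters or code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: a few dozen projection neurons feeding ~2,000 Kenyon
# cells. Density and top-k fraction are assumptions for illustration.
N_INPUT = 50      # projection neurons (sensory channels)
N_KENYON = 2000   # Kenyon cells
TOP_K = 100       # ~5% of Kenyon cells survive winner-take-all inhibition

# Sparse random binary connectivity from projection neurons to Kenyon cells.
projection = (rng.random((N_KENYON, N_INPUT)) < 0.1).astype(float)

def fly_hash(x):
    """Map an input vector to a sparse binary tag of active Kenyon cells."""
    activity = projection @ x
    tag = np.zeros(N_KENYON)
    tag[np.argsort(activity)[-TOP_K:]] = 1.0  # keep only the top-k cells
    return tag

# Similar inputs get heavily overlapping tags; unrelated inputs overlap less.
a = rng.random(N_INPUT)
b = a + 0.01 * rng.random(N_INPUT)  # slightly perturbed copy of a
c = rng.random(N_INPUT)             # unrelated input
overlap_ab = int(fly_hash(a) @ fly_hash(b))
overlap_ac = int(fly_hash(a) @ fly_hash(c))
```

Because similar inputs land on heavily overlapping sparse tags, the same circuit that separates smells can, in principle, separate word contexts — plausibly the property the natural-language repurposing exploits.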

Keyword: Learning & Memory
Link ID: 27671 - Posted: 01.30.2021

By Clay Risen In 1978, James R. Flynn, a political philosopher at the University of Otago, in New Zealand, was writing a book about what constituted a “humane” society. He considered “inhumane” societies as well — dictatorships, apartheid states — and, in his reading, came across the work of Arthur R. Jensen, a psychologist at the University of California, Berkeley. Dr. Jensen was best known for an article he published in 1969 claiming that the differences between Black and white Americans on I.Q. tests resulted from genetic differences between the races — and that programs that tried to improve Black educational outcomes, like Head Start, were bound to fail. Dr. Flynn, a committed leftist who had once been a civil rights organizer in Kentucky, felt instinctively that Dr. Jensen was wrong, and he set out to prove it. In 1980 he published a thorough, devastating critique of Dr. Jensen’s work — showing, for example, that many groups of whites scored as low on I.Q. tests as Black Americans. But he didn’t stop there. Like most researchers in his field, Dr. Jensen had assumed that intelligence was constant across generations, pointing to the relative stability of I.Q. tests over time as evidence. But Dr. Flynn noticed something that no one else had: Those tests were recalibrated every decade or so. When he looked at the raw, uncalibrated data over nearly 100 years, he found that I.Q. scores had gone up, dramatically. “If you scored people 100 years ago against our norms, they would score a 70,” or borderline mentally disabled, he said later. “If you scored us against their norms, we would score 130” — borderline gifted. Just as groundbreaking was his explanation for why. The rise was too fast to be genetic, nor could it be that our recent ancestors were less intelligent than we are. 
Rather, he argued, the last century has seen a revolution in abstract thinking, what he called “scientific spectacles,” brought on by the demands of a technologically robust industrial society. This new order, he maintained, required greater educational attainment and an ability to think in terms of symbols, analogies and complex logic — exactly what many I.Q. tests measure. © 2021 The New York Times Company

Keyword: Learning & Memory; Development of the Brain
Link ID: 27664 - Posted: 01.27.2021

By Stephani Sutherland Patrick Thornton, a 40-year-old math teacher in Houston, Tex., relies on his voice to clearly communicate with his high school students. So when he began to feel he was recovering from COVID, he was relieved to get his voice back a month after losing it. Thornton got sick in mid-August and had symptoms typical of a moderate case: a sore throat, headaches, trouble breathing. By the end of September, “I was more or less counting myself as on the mend and healing,” Thornton says. “But on September 25, I took a nap, and then my mom called.” As the two spoke, Thornton’s mother remarked that it was great that his voice was returning. Something was wrong, however. “I realized that some of the words didn’t feel right in my mouth, you know?” he says. They felt jumbled, stuck inside. Thornton had suddenly developed a severe stutter for the first time in his life. “I got my voice back, but it broke my mouth,” he says. After relaying the story over several minutes, Thornton sighs heavily with exhaustion. The thought of going back to teaching with his stutter, “that was terrifying,” he says. In November Thornton still struggled with low energy, chest pain and headaches. And “sometimes my heart rate [would] just decide that we’re being chased by a tiger out of nowhere,” he adds. His stutter had only worsened by that time, Thornton says, and he worried that it reflected some more insidious condition in his brain, despite doctors’ insistence that the speech disruption was simply a product of stress. © 2021 Scientific American.

Keyword: Learning & Memory; Schizophrenia
Link ID: 27661 - Posted: 01.23.2021

by Sarah DeWeerdt A drug that has been tested in clinical trials as a treatment for depression restores social memory in a mouse model of 22q11.2 deletion syndrome, according to a new study. The findings hint that the drug might also be useful to treat social cognitive difficulties in people with conditions such as autism, experts say. People who are missing one copy of a chromosomal region known as 22q11.2 have heart abnormalities, distinctive facial features and an increased risk of schizophrenia and other psychiatric conditions. About 16 percent have autism. People with the syndrome also have a smaller-than-average hippocampus, a structure that functions as the brain’s memory hub. The findings extend what researchers know about the role of the hippocampus in social behavior by suggesting that a small region of the hippocampus known as CA2 springs to life when an animal encounters an individual it hasn’t met before. A strength of the study is that it describes the basic biology of a brain circuit, shows how that circuit is disrupted in a mouse model and identifies a therapeutic target to reverse those disruptions, says Anthony LaMantia, professor of developmental disorders and genetics at Virginia Polytechnic Institute and State University in Blacksburg, who was not involved in the work. “This is one of the best papers sort of going from soup to nuts that has come out.” Previous studies showed that CA2 is crucial for social memory, the ability to recognize and remember others. “But we really didn’t have a good handle on what type of information CA2 was providing to the rest of the brain,” says study leader Steven Siegelbaum, professor of neuroscience and pharmacology at Columbia University. © 2020 Simons Foundation

Keyword: Autism; Genes & Behavior
Link ID: 27635 - Posted: 12.22.2020

By Cara Giaimo The rooms that make up the Bloomington Drosophila Stock Center at Indiana University are lined wall to wall with identical shelves. Each shelf is filled with uniform racks, and each rack with indistinguishable glass vials. The tens of thousands of fruit fly types within the vials, though, are each magnificently different. Some have eyes that fluoresce pink. Some jump when you shine a red light on them. Some have short bodies and iridescent curly wings, and look “like little ballerinas,” said Carol Sylvester, who helps care for them. Each variety doubles as a unique research tool, and it has taken decades to introduce the traits that make them useful. If left unattended, the flies would die in a matter of weeks, marooning entire scientific disciplines. Throughout the Covid-19 pandemic, workers across industries have held the world together, taking on great personal risk to care for sick patients, maintain supply chains and keep people fed. But other essential jobs are less well-known. At the Stock Center dozens of employees have come to work each day, through a lockdown and afterward, to minister to the flies that underpin scientific research.

Tiny Bug, Huge Impact

To most casual observers, fruit flies are little dots with wings that hang out near old bananas. But over the course of the last century, researchers have turned the insect — known to science as Drosophila melanogaster — into a sort of genetic switchboard. Biologists regularly develop new “strains” of flies, in which particular genes are turned on or off. Studying these slight mutants can reveal how those genes function — including in humans, because we share over half of our genes with Drosophila. For instance, researchers discovered what is now called the hippo gene — which helps regulate organ size in both fruit flies and vertebrates — after flies with a defect in it grew up to be unusually large and wrinkly.
Further work with the gene has indicated that such defects may contribute to the unchecked cell growth that leads to cancer in people. © 2020 The New York Times Company

Keyword: Genes & Behavior; Development of the Brain
Link ID: 27624 - Posted: 12.15.2020

Sarah Sloat Patience, you might have heard, is a virtue. That’s why so many Puritans named their daughters “Patience” in the 1600s. It is the ability to wait calmly in the face of frustration or adversity. Like Penelope weaving while waiting for Odysseus, patient people wait for their partners to finish a Netflix show they’re binging. Impatient people do not. But despite the societal framing of patience as a measurement of character, in its purest sense, patience is a chemically induced output of the brain. However, exactly what goes on in the brain that leads to patience isn’t well understood. A new study involving mice takes a step toward understanding patience by pointing to the role of serotonin, and how it interacts with different brain structures. Serotonin is a chemical and a neurotransmitter, meaning it sends messages throughout the brain. It influences many behaviors, including mood and sleep. In a paper recently released in the journal Science Advances, scientists argue that serotonin influences specific areas of the brain to promote patient behavior. But critically, this process only occurs if there’s already “high expectation or confidence” that being patient will lead to future rewards. First author Katsuhiko Miyazaki is a scientist at the Okinawa Institute of Science and Technology in Japan who researches the relationship between serotonergic neural activity and animal behavior. He tells me this study originated from an interest in revealing how projections of serotonin promote waiting for future rewards.

Keyword: Attention; Learning & Memory
Link ID: 27615 - Posted: 12.09.2020

Elena Renken More than a century ago, the zoologist Richard Semon coined the term “engram” to designate the physical trace a memory must leave in the brain, like a footprint. Since then, neuroscientists have made progress in their hunt for exactly how our brains form memories. They have learned that specific brain cells activate as we form a memory and reactivate as we remember it, strengthening the connections among the neurons involved. That change ingrains the memory and lets us keep memories we recall more often, while others fade. But the precise physical alterations within our neurons that bring about these changes have been hard to pin down — until now. In a study published last month, researchers at the Massachusetts Institute of Technology tracked an important part of the memory-making process at the molecular scale in engram cells’ chromosomes. Neuroscientists already knew that memory formation is not instantaneous, and that the act of remembering is crucial to locking a memory into the brain. These researchers have now discovered some of the physical embodiment of that mechanism. The MIT group worked with mice that had a fluorescent marker spliced into their genome to make their cells glow whenever they expressed the gene Arc, which is associated with memory formation. The scientists placed these mice in a novel location and trained them to fear a specific noise, then returned them to this location several days later to reactivate the memory. In the brain area called the hippocampus, the engram cells that formed and recalled this memory lit up with color, which made it easy to sort them out from other brain cells under the microscope during a postmortem examination. All Rights Reserved © 2020

Keyword: Learning & Memory; Stress
Link ID: 27567 - Posted: 11.04.2020

Anil Ananthaswamy In the winter of 2011, Daniel Yamins, a postdoctoral researcher in computational neuroscience at the Massachusetts Institute of Technology, would at times toil past midnight on his machine vision project. He was painstakingly designing a system that could recognize objects in pictures, regardless of variations in size, position and other properties — something that humans do with ease. The system was a deep neural network, a type of computational device inspired by the neurological wiring of living brains. “I remember very distinctly the time when we found a neural network that actually solved the task,” he said. It was 2 a.m., a tad too early to wake up his adviser, James DiCarlo, or other colleagues, so an excited Yamins took a walk in the cold Cambridge air. “I was really pumped,” he said. It would have counted as a noteworthy accomplishment in artificial intelligence alone, one of many that would make neural networks the darlings of AI technology over the next few years. But that wasn’t the main goal for Yamins and his colleagues. To them and other neuroscientists, this was a pivotal moment in the development of computational models for brain functions. DiCarlo and Yamins, who now runs his own lab at Stanford University, are part of a coterie of neuroscientists using deep neural networks to make sense of the brain’s architecture. In particular, scientists have struggled to understand the reasons behind the specializations within the brain for various tasks. They have wondered not just why different parts of the brain do different things, but also why the differences can be so specific: Why, for example, does the brain have an area for recognizing objects in general but also for faces in particular? Deep neural networks are showing that such specializations may be the most efficient way to solve problems. All Rights Reserved © 2020

Keyword: Learning & Memory
Link ID: 27562 - Posted: 10.31.2020

Jon Hamilton If you fall off a bike, you'll probably end up with a cinematic memory of the experience: the wind in your hair, the pebble on the road, then the pain. That's known as an episodic memory. And now researchers have identified cells in the human brain that make this sort of memory possible, a team reports in the journal Proceedings of the National Academy of Sciences. The cells are called time cells, and they place a sort of time stamp on memories as they are being formed. That allows us to recall sequences of events or experiences in the right order. "By having time cells create this indexing across time, you can put everything together in a way that makes sense," says Dr. Bradley Lega, the study's senior author and a neurosurgeon at the University of Texas Southwestern Medical Center in Dallas. Time cells were discovered in rodents decades ago. But the new study is critical because "the final arbitrator is always the human brain," says Dr. György Buzsáki, Biggs Professor of Neuroscience at New York University. Buzsáki is not an author of the study but did edit the manuscript. Lega and his team found the time cells by studying the brains of 27 people who were awaiting surgery for severe epilepsy. As part of their pre-surgical preparation, these patients had electrodes placed in the hippocampus and another area of the brain involved in navigation, memory and time perception. In the experiment, the patients studied sequences of 12 or 15 words that appeared on a laptop screen during a period of about 30 seconds. Then, after a break, they were asked to recall the words they had seen. © 2020 npr

Keyword: Learning & Memory
Link ID: 27561 - Posted: 10.31.2020

By Abby Goodnough PHILADELPHIA — Steven Kelty had been addicted to crack cocaine for 32 years when he tried a different kind of treatment last year, one so basic in concept that he was skeptical. He would come to a clinic twice a week to provide a urine sample, and if it was free of drugs, he would get to draw a slip of paper out of a fishbowl. Half contained encouraging messages — typically, “Good job!” — but the other half were vouchers for prizes worth between $1 and $100. “I’ve been to a lot of rehabs, and there were no incentives except for the idea of being clean after you finished,” said Mr. Kelty, 61, of Winfield, Pa. “Some of us need something to motivate us — even if it’s a small thing — to live a better life.” The treatment is called contingency management, because the rewards are contingent on staying abstinent. A number of clinical trials have found it highly effective in getting people addicted to stimulants like cocaine and methamphetamine to stay in treatment and to stop using the drugs. But outside the research arena and the Department of Veterans Affairs, where Mr. Kelty is a patient, it is nearly impossible to find programs that offer such treatment — even as overdose deaths involving meth, in particular, have soared. There were more than 16,500 such deaths last year, according to preliminary data, more than twice as many as in 2016. Early data suggests that overdoses have increased even more during the coronavirus pandemic, which has forced most treatment programs to move online. Researchers say that one of the biggest obstacles to contingency management is a moral objection to the idea of rewarding someone for staying off drugs. That is one reason publicly funded programs like Medicaid, which provides health coverage for the poor, do not cover the treatment. Some treatment providers are also wary of giving prizes that they say patients could sell or trade for drugs. 
Greg Delaney, a pastor and the outreach coordinator at Woodhaven, a residential treatment center in Ohio, said, “Until you’re at the point where you can say, ‘I can make a good decision with this $50,’ it’s counterproductive.” © 2020 The New York Times Company

Keyword: Drug Abuse; Learning & Memory
Link ID: 27556 - Posted: 10.28.2020

By Stephani Sutherland Many of the symptoms experienced by people infected with SARS-CoV-2 involve the nervous system. Patients complain of headaches, muscle and joint pain, fatigue and “brain fog,” or loss of taste and smell—all of which can last from weeks to months after infection. In severe cases, COVID-19 can also lead to encephalitis or stroke. The virus has undeniable neurological effects. But the way it actually affects nerve cells still remains a bit of a mystery. Can immune system activation alone produce symptoms? Or does the novel coronavirus directly attack the nervous system? Some studies—including a recent preprint paper examining mouse and human brain tissue—show evidence that SARS-CoV-2 can get into nerve cells and the brain. The question remains as to whether it does so routinely or only in the most severe cases. Once the immune system kicks into overdrive, the effects can be far-ranging, even leading immune cells to invade the brain, where they can wreak havoc. Some neurological symptoms are far less serious yet seem, if anything, more perplexing. One symptom—or set of symptoms—that illustrates this puzzle and has gained increasing attention is an imprecise diagnosis called “brain fog.” Even after their main symptoms have abated, it is not uncommon for COVID-19 patients to experience memory loss, confusion and other mental fuzziness. What underlies these experiences is still unclear, although they may also stem from the body-wide inflammation that can go along with COVID-19. Many people, however, develop fatigue and brain fog that lasts for months even after a mild case that does not spur the immune system to rage out of control. Another widespread symptom called anosmia, or loss of smell, might also originate from changes that happen without nerves themselves getting infected. Olfactory neurons, the cells that transmit odors to the brain, lack the primary docking site, or receptor, for SARS-CoV-2, and they do not seem to get infected. 
Researchers are still investigating how loss of smell might result from an interaction between the virus and another receptor on the olfactory neurons or from its contact with non-nerve cells that line the nose. © 2020 Scientific American.

Keyword: Learning & Memory; Chemical Senses (Smell & Taste)
Link ID: 27547 - Posted: 10.24.2020

The plant compound apigenin improved the cognitive and memory deficits usually seen in a mouse model of Down syndrome, according to a study by researchers at the National Institutes of Health and other institutions. Apigenin is found in chamomile flowers, parsley, celery, peppermint and citrus fruits. The researchers fed the compound to pregnant mice carrying fetuses with Down syndrome characteristics and then to the animals after they were born and as they matured. The findings raise the possibility that a treatment to lessen the cognitive deficits seen in Down syndrome could one day be offered to pregnant women whose fetuses have been diagnosed with Down syndrome through prenatal testing. The study appears in the American Journal of Human Genetics. Down syndrome is a set of symptoms resulting from an extra copy or piece of chromosome 21. The intellectual and developmental disabilities accompanying the condition are believed to result from decreased brain growth caused by increased inflammation in the fetal brain. Apigenin is not known to have any toxic effects, and previous studies have indicated that it is an antioxidant that reduces inflammation. Unlike many compounds, it is absorbed through the placenta and the blood-brain barrier, the cellular layer that prevents potentially harmful substances from entering the brain. Compared to mice with Down syndrome characteristics whose mothers were not fed apigenin, those exposed to the compound showed improvements in tests of developmental milestones and in spatial and olfactory memory. Tests of gene activity and protein levels showed the apigenin-treated mice had less inflammation and increased blood vessel and nervous system growth. Guedj, F. et al. Apigenin as a candidate prenatal treatment for Trisomy 21: effects in human amniocytes and the Ts1Cje mouse model. American Journal of Human Genetics. 2020.

Keyword: Development of the Brain; Genes & Behavior
Link ID: 27546 - Posted: 10.24.2020

By Bruce Bower A type of bone tool generally thought to have been invented by Stone Age humans got its start among hominids that lived hundreds of thousands of years before Homo sapiens evolved, a new study concludes. A set of 52 previously excavated but little-studied animal bones from East Africa’s Olduvai Gorge includes the world’s oldest known barbed bone point, an implement probably crafted by now-extinct Homo erectus at least 800,000 years ago, researchers say. Made from a piece of a large animal’s rib, the artifact features three curved barbs and a carved tip, the team reports in the November Journal of Human Evolution. Among the Olduvai bones, biological anthropologist Michael Pante of Colorado State University in Fort Collins and colleagues identified five other tools from more than 800,000 years ago as probable choppers, hammering tools or hammering platforms. The previous oldest barbed bone points were from a central African site and dated to around 90,000 years ago (SN: 4/29/95), and were assumed to reflect a toolmaking ingenuity exclusive to Homo sapiens. Those implements include carved rings around the base of the tools where wooden shafts were presumably attached. Barbed bone points found at H. sapiens sites were likely used to catch fish and perhaps to hunt large land prey. The Olduvai Gorge barbed bone point, which had not been completed, shows no signs of having been attached to a handle or shaft. Ways in which H. erectus used the implement are unclear, Pante and his colleagues say. © Society for Science & the Public 2000–2020.

Keyword: Evolution; Learning & Memory
Link ID: 27543 - Posted: 10.24.2020

By Meagan Cantwell Although bird brains are tiny, they’re packed with neurons, especially in areas responsible for higher level thinking. Two studies published last month in Science explore the structure and function of avian brains—revealing they are organized similarly to mammals’ and are capable of conscious thought. © 2020 American Association for the Advancement of Science.

Keyword: Evolution; Learning & Memory
Link ID: 27541 - Posted: 10.24.2020

By Benedict Carey Scott Lilienfeld, an expert in personality disorders who repeatedly disturbed the order in his own field, questioning the science behind many of psychology’s conceits, popular therapies and prized tools, died on Sept. 30 at his home in Atlanta. He was 59. The cause was pancreatic cancer, his wife, Candice Basterfield, said. Dr. Lilienfeld’s career, most of it spent at Emory University in Atlanta, proceeded on two tracks: one that sought to deepen the understanding of so-called psychopathic behavior, the other to expose the many faces of pseudoscience in psychology. Psychopathy is characterized by superficial charm, grandiosity, pathological lying and a lack of empathy. Descriptions of the syndrome were rooted in research in the criminal justice system, where psychopaths often end up. In the early 1990s, Dr. Lilienfeld worked to deepen and clarify the definition. In a series of papers, he anchored a team of psychologists who identified three underlying personality features that psychopaths share, whether they commit illegal acts or not: fearless dominance, meanness and impulsivity. The psychopath does what he or she wants, without anxiety, regret or regard for the suffering of others. “When you have these three systems interacting, it’s a bad brew, and it creates the substrate for what can become psychopathy,” said Mark F. Lenzenweger, a professor of psychology at the State University of New York at Binghamton. “This was Scott’s great contribution: He helped change the thinking about psychopathy, in a profound way, by focusing on aspects of personality, rather than on a list of bad behaviors.” Dr. Lilienfeld’s parallel career encompassed clinical psychology and aimed to shake it free of empty theorizing, softheadedness and bad practice. 
In the late 1990s and early 2000s, he led a loose group of researchers who began to question the validity of some of the field’s favored constructs, like repressed memories of abuse and multiple personality disorder. The Rorschach inkblot test took a direct hit as largely unreliable. The group also attacked treatments including psychological debriefing and eye movement desensitization and reprocessing, or E.M.D.R., both of which are used for trauma victims. © 2020 The New York Times Company

Keyword: Aggression; Learning & Memory
Link ID: 27529 - Posted: 10.19.2020

Keith A. Trujillo, Alfredo Quiñones-Hinojosa, Kenira J. Thompson Joe Louis Martinez Jr. died on 29 August at the age of 76. In addition to making extraordinary contributions to the fields of neurobiology and Chicano psychology, Joe was a tireless advocate of diversity, equity, and inclusion in the sciences. He established professional development programs for individuals from underrepresented groups and provided lifelong mentoring as they pursued careers in science and academia. Joe was passionately devoted to expanding opportunities in the sciences well before diversity became a visible goal for scientific organizations and academic institutions. Born in Albuquerque, New Mexico, on 1 August 1944, Joe received his bachelor's degree in psychology from the University of San Diego in 1966; his master's in experimental psychology from New Mexico Highlands University in 1968; and his Ph.D. in physiological psychology from the University of Delaware in 1971. His faculty career began in 1972 at California State University, San Bernardino (CSUSB), shortly after the campus was established. He later completed postdocs in the laboratory of neurobiologist James McGaugh at the University of California, Irvine, and with neurobiologist Floyd Bloom at the Salk Institute for Biological Studies in San Diego, California. The University of California, Berkeley, recruited Joe in 1982, and he served as a professor as well as the area head of biopsychology and faculty assistant to the vice chancellor for affirmative action. As the highest-ranking Hispanic faculty member in the University of California system, Joe used his voice to help others from underrepresented groups. However, he felt that he could have a greater impact on diversity in the sciences by helping to build a university with a high concentration of Hispanic students, so in 1995 he moved to the University of Texas, San Antonio (UTSA).
He began as a professor of biology and went on to assume a range of leadership roles, including director of the Cajal Neuroscience Institute. At UTSA, he worked with colleagues to obtain nearly $18 million in funding for neuroscience research and education. In 2012, he moved to the University of Illinois at Chicago where he served as professor and psychology department head until his retirement in 2016. At each institution, he embraced the opportunity to provide guidance and mentoring to innumerable students, faculty, and staff. © 2020 American Association for the Advancement of Science.

Keyword: Learning & Memory
Link ID: 27523 - Posted: 10.16.2020

By Pam Belluck After contracting the coronavirus in March, Michael Reagan lost all memory of his 12-day vacation in Paris, even though the trip was just a few weeks earlier. Several weeks after Erica Taylor recovered from her Covid-19 symptoms of nausea and cough, she became confused and forgetful, failing to even recognize her own car, the only Toyota Prius in her apartment complex’s parking lot. Lisa Mizelle, a veteran nurse practitioner at an urgent care clinic who fell ill with the virus in July, finds herself forgetting routine treatments and lab tests, and has to ask colleagues about terminology she used to know automatically. “I leave the room and I can’t remember what the patient just said,” she said, adding that if she hadn’t exhausted her medical leave she’d take more time off. “It scares me to think I’m working,” Ms. Mizelle, 53, said. “I feel like I have dementia.” It’s becoming known as Covid brain fog: troubling cognitive symptoms that can include memory loss, confusion, difficulty focusing, dizziness and grasping for everyday words. Increasingly, Covid survivors say brain fog is impairing their ability to work and function normally. “There are thousands of people who have that,” said Dr. Igor Koralnik, chief of neuro-infectious disease at Northwestern Medicine in Chicago, who has already seen hundreds of survivors at a post-Covid clinic he leads. “The impact on the work force that’s affected is going to be significant.” Scientists aren’t sure what causes brain fog, which varies widely and affects even people who became only mildly physically ill from Covid-19 and had no previous medical conditions. Leading theories are that it arises when the body’s immune response to the virus doesn’t shut down or from inflammation in blood vessels leading to the brain. © 2020 The New York Times Company

Keyword: Alzheimers; Learning & Memory
Link ID: 27522 - Posted: 10.12.2020

By Bret Stetka The human brain is hardwired to map our surroundings. This trait is called spatial memory—our ability to remember certain locations and where objects are in relation to one another. New findings published today in Scientific Reports suggest that one major feature of our spatial recall is efficiently locating high-calorie, energy-rich food. The study’s authors believe human spatial memory ensured that our hunter-gatherer ancestors could prioritize the location of reliable nutrition, giving them an evolutionary leg up. In the study, researchers at Wageningen University & Research in the Netherlands observed 512 participants follow a fixed path through a room where either eight food samples or eight food-scented cotton pads were placed in different locations. When they arrived at a sample, the participants would taste the food or smell the cotton and rate how much they liked it. Four of the food samples were high-calorie, including brownies and potato chips, and the other four, including cherry tomatoes and apples, were low in calories—diet foods, you might call them. After the taste test, the participants were asked to identify the location of each sample on a map of the room. They were nearly 30 percent more accurate at mapping the high-calorie samples versus the low-calorie ones, regardless of how much they liked those foods or odors. They were also 243 percent more accurate when presented with actual foods, as opposed to the food scents. “Our main takeaway message is that human minds seem to be designed for efficiently locating high-calorie foods in our environment,” says Rachelle de Vries, a Ph.D. candidate in human nutrition and health at Wageningen University and lead author of the new paper. De Vries feels her team’s findings support the idea that locating valuable caloric resources was an important and regularly occurring problem for early humans weathering the climate shifts of the Pleistocene epoch. 
“Those with a better memory for where and when high-calorie food resources would be available were likely to have a survival—or fitness—advantage,” she explains. © 2020 Scientific American

Keyword: Learning & Memory; Obesity
Link ID: 27518 - Posted: 10.10.2020

R. Stanley Williams For the first time, my colleagues and I have built a single electronic device that is capable of copying the functions of neuron cells in a brain. We then connected 20 of them together to perform a complicated calculation. This work shows that it is scientifically possible to make an advanced computer that does not rely on transistors to calculate and that uses much less electrical power than today’s data centers. Our research, which I began in 2004, was motivated by two questions. Can we build a single electronic element – the equivalent of a transistor or switch – that performs most of the known functions of neurons in a brain? If so, can we use it as a building block to build useful computers? Neurons are very finely tuned, and so are electronic elements that emulate them. I co-authored a research paper in 2013 that laid out in principle what needed to be done. It took my colleague Suhas Kumar and others five years of careful exploration to get exactly the right material composition and structure to produce the necessary property predicted from theory. Kumar then went a major step further and built a circuit with 20 of these elements connected to one another through a network of devices that can be programmed to have particular capacitances, or abilities to store electric charge. He then mapped a mathematical problem to the capacitances in the network, which allowed him to use the device to find the solution to a small version of a problem that is important in a wide range of modern analytics. © 2010–2020, The Conversation US, Inc.
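The mapping the article describes, encoding a problem in the couplings of a network whose local update dynamics drive it toward lower energy, is the classic principle behind Hopfield networks. The sketch below illustrates that principle in software only; it is not the authors' analog hardware, and the tiny 4-node max-cut instance is invented for the example.

```python
import numpy as np

# Adjacency matrix of a 4-cycle graph: edges 0-1, 1-2, 2-3, 3-0.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])
W = -A  # couplings: connected nodes "prefer" opposite states (+1 / -1)

def energy(s):
    # Hopfield energy E = -1/2 s^T W s; lower energy = more edges cut
    return -0.5 * s @ W @ s

def run(s, sweeps=5):
    # Asynchronous updates: each neuron aligns with its local field,
    # which never increases the energy of the network.
    s = s.copy()
    for _ in range(sweeps):
        for i in range(len(s)):
            h = W[i] @ s  # local field at neuron i
            if h != 0:
                s[i] = 1 if h > 0 else -1
    return s

s = run(np.array([1, 1, 1, 1]))
print(s, energy(s))  # settles into the partition {0, 2} vs {1, 3}
```

In the hardware version described above, the "neurons" are the authors' analog elements and the couplings are the programmable capacitances; the physics performs the energy descent that the loop here simulates.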

Keyword: Learning & Memory; Robotics
Link ID: 27512 - Posted: 10.07.2020