Chapter 17. Learning and Memory



Elena Renken More than a century ago, the zoologist Richard Semon coined the term “engram” to designate the physical trace a memory must leave in the brain, like a footprint. Since then, neuroscientists have made progress in their hunt for exactly how our brains form memories. They have learned that specific brain cells activate as we form a memory and reactivate as we remember it, strengthening the connections among the neurons involved. That change ingrains the memory and lets us keep memories we recall more often, while others fade. But the precise physical alterations within our neurons that bring about these changes have been hard to pin down — until now. In a study published last month, researchers at the Massachusetts Institute of Technology tracked an important part of the memory-making process at the molecular scale in engram cells’ chromosomes. Neuroscientists already knew that memory formation is not instantaneous, and that the act of remembering is crucial to locking a memory into the brain. These researchers have now discovered some of the physical embodiment of that mechanism. The MIT group worked with mice that had a fluorescent marker spliced into their genome to make their cells glow whenever they expressed the gene Arc, which is associated with memory formation. The scientists placed these mice in a novel location and trained them to fear a specific noise, then returned them to this location several days later to reactivate the memory. In the brain area called the hippocampus, the engram cells that formed and recalled this memory lit up with color, which made it easy to sort them out from other brain cells under the microscope during a postmortem examination. All Rights Reserved © 2020

Keyword: Learning & Memory; Stress
Link ID: 27567 - Posted: 11.04.2020

Anil Ananthaswamy In the winter of 2011, Daniel Yamins, a postdoctoral researcher in computational neuroscience at the Massachusetts Institute of Technology, would at times toil past midnight on his machine vision project. He was painstakingly designing a system that could recognize objects in pictures, regardless of variations in size, position and other properties — something that humans do with ease. The system was a deep neural network, a type of computational device inspired by the neurological wiring of living brains. “I remember very distinctly the time when we found a neural network that actually solved the task,” he said. It was 2 a.m., a tad too early to wake up his adviser, James DiCarlo, or other colleagues, so an excited Yamins took a walk in the cold Cambridge air. “I was really pumped,” he said. It would have counted as a noteworthy accomplishment in artificial intelligence alone, one of many that would make neural networks the darlings of AI technology over the next few years. But that wasn’t the main goal for Yamins and his colleagues. To them and other neuroscientists, this was a pivotal moment in the development of computational models for brain functions. DiCarlo and Yamins, who now runs his own lab at Stanford University, are part of a coterie of neuroscientists using deep neural networks to make sense of the brain’s architecture. In particular, scientists have struggled to understand the reasons behind the specializations within the brain for various tasks. They have wondered not just why different parts of the brain do different things, but also why the differences can be so specific: Why, for example, does the brain have an area for recognizing objects in general but also for faces in particular? Deep neural networks are showing that such specializations may be the most efficient way to solve problems. All Rights Reserved © 2020

Keyword: Learning & Memory
Link ID: 27562 - Posted: 10.31.2020

Jon Hamilton If you fall off a bike, you'll probably end up with a cinematic memory of the experience: the wind in your hair, the pebble on the road, then the pain. That's known as an episodic memory. And now researchers have identified cells in the human brain that make this sort of memory possible, a team reports in the journal Proceedings of the National Academy of Sciences. The cells are called time cells, and they place a sort of time stamp on memories as they are being formed. That allows us to recall sequences of events or experiences in the right order. "By having time cells create this indexing across time, you can put everything together in a way that makes sense," says Dr. Bradley Lega, the study's senior author and a neurosurgeon at the University of Texas Southwestern Medical Center in Dallas. Time cells were discovered in rodents decades ago. But the new study is critical because "the final arbitrator is always the human brain," says Dr. György Buzsáki, Biggs Professor of Neuroscience at New York University. Buzsáki is not an author of the study but did edit the manuscript. Lega and his team found the time cells by studying the brains of 27 people who were awaiting surgery for severe epilepsy. As part of their pre-surgical preparation, these patients had electrodes placed in the hippocampus and another area of the brain involved in navigation, memory and time perception. In the experiment, the patients studied sequences of 12 or 15 words that appeared on a laptop screen during a period of about 30 seconds. Then, after a break, they were asked to recall the words they had seen. © 2020 npr

Keyword: Learning & Memory
Link ID: 27561 - Posted: 10.31.2020

By Abby Goodnough PHILADELPHIA — Steven Kelty had been addicted to crack cocaine for 32 years when he tried a different kind of treatment last year, one so basic in concept that he was skeptical. He would come to a clinic twice a week to provide a urine sample, and if it was free of drugs, he would get to draw a slip of paper out of a fishbowl. Half contained encouraging messages — typically, “Good job!” — but the other half were vouchers for prizes worth between $1 and $100. “I’ve been to a lot of rehabs, and there were no incentives except for the idea of being clean after you finished,” said Mr. Kelty, 61, of Winfield, Pa. “Some of us need something to motivate us — even if it’s a small thing — to live a better life.” The treatment is called contingency management, because the rewards are contingent on staying abstinent. A number of clinical trials have found it highly effective in getting people addicted to stimulants like cocaine and methamphetamine to stay in treatment and to stop using the drugs. But outside the research arena and the Department of Veterans Affairs, where Mr. Kelty is a patient, it is nearly impossible to find programs that offer such treatment — even as overdose deaths involving meth, in particular, have soared. There were more than 16,500 such deaths last year, according to preliminary data, more than twice as many as in 2016. Early data suggests that overdoses have increased even more during the coronavirus pandemic, which has forced most treatment programs to move online. Researchers say that one of the biggest obstacles to contingency management is a moral objection to the idea of rewarding someone for staying off drugs. That is one reason publicly funded programs like Medicaid, which provides health coverage for the poor, do not cover the treatment. Some treatment providers are also wary of giving prizes that they say patients could sell or trade for drugs. 
Greg Delaney, a pastor and the outreach coordinator at Woodhaven, a residential treatment center in Ohio, said, “Until you’re at the point where you can say, ‘I can make a good decision with this $50,’ it’s counterproductive.” © 2020 The New York Times Company

Keyword: Drug Abuse; Learning & Memory
Link ID: 27556 - Posted: 10.28.2020

By Stephani Sutherland Many of the symptoms experienced by people infected with SARS-CoV-2 involve the nervous system. Patients complain of headaches, muscle and joint pain, fatigue and “brain fog,” or loss of taste and smell—all of which can last from weeks to months after infection. In severe cases, COVID-19 can also lead to encephalitis or stroke. The virus has undeniable neurological effects. But the way it actually affects nerve cells still remains a bit of a mystery. Can immune system activation alone produce symptoms? Or does the novel coronavirus directly attack the nervous system? Some studies—including a recent preprint paper examining mouse and human brain tissue—show evidence that SARS-CoV-2 can get into nerve cells and the brain. The question remains as to whether it does so routinely or only in the most severe cases. Once the immune system kicks into overdrive, the effects can be far-ranging, even leading immune cells to invade the brain, where they can wreak havoc. Some neurological symptoms are far less serious yet seem, if anything, more perplexing. One symptom—or set of symptoms—that illustrates this puzzle and has gained increasing attention is an imprecise diagnosis called “brain fog.” Even after their main symptoms have abated, it is not uncommon for COVID-19 patients to experience memory loss, confusion and other mental fuzziness. What underlies these experiences is still unclear, although they may also stem from the body-wide inflammation that can go along with COVID-19. Many people, however, develop fatigue and brain fog that lasts for months even after a mild case that does not spur the immune system to rage out of control. Another widespread symptom called anosmia, or loss of smell, might also originate from changes that happen without nerves themselves getting infected. Olfactory neurons, the cells that transmit odors to the brain, lack the primary docking site, or receptor, for SARS-CoV-2, and they do not seem to get infected. 
Researchers are still investigating how loss of smell might result from an interaction between the virus and another receptor on the olfactory neurons or from its contact with nonnerve cells that line the nose. © 2020 Scientific American

Keyword: Learning & Memory; Chemical Senses (Smell & Taste)
Link ID: 27547 - Posted: 10.24.2020

The plant compound apigenin improved the cognitive and memory deficits usually seen in a mouse model of Down syndrome, according to a study by researchers at the National Institutes of Health and other institutions. Apigenin is found in chamomile flowers, parsley, celery, peppermint and citrus fruits. The researchers fed the compound to pregnant mice carrying fetuses with Down syndrome characteristics and then to the animals after they were born and as they matured. The findings raise the possibility that a treatment to lessen the cognitive deficits seen in Down syndrome could one day be offered to pregnant women whose fetuses have been diagnosed with Down syndrome through prenatal testing. The study appears in the American Journal of Human Genetics. Down syndrome is a set of symptoms resulting from an extra copy or piece of chromosome 21. The intellectual and developmental disabilities accompanying the condition are believed to result from decreased brain growth caused by increased inflammation in the fetal brain. Apigenin is not known to have any toxic effects, and previous studies have indicated that it is an antioxidant that reduces inflammation. Unlike many compounds, it is absorbed through the placenta and the blood-brain barrier, the cellular layer that prevents potentially harmful substances from entering the brain. Compared to mice with Down syndrome characteristics whose mothers were not fed apigenin, those exposed to the compound showed improvements in tests of developmental milestones and had improvements in spatial and olfactory memory. Tests of gene activity and protein levels showed the apigenin-treated mice had less inflammation and increased blood vessel and nervous system growth. Guedj, F. et al. Apigenin as a candidate prenatal treatment for Trisomy 21: effects in human amniocytes and the Ts1Cje mouse model. American Journal of Human Genetics. 2020.

Keyword: Development of the Brain; Genes & Behavior
Link ID: 27546 - Posted: 10.24.2020

By Bruce Bower A type of bone tool generally thought to have been invented by Stone Age humans got its start among hominids that lived hundreds of thousands of years before Homo sapiens evolved, a new study concludes. A set of 52 previously excavated but little-studied animal bones from East Africa’s Olduvai Gorge includes the world’s oldest known barbed bone point, an implement probably crafted by now-extinct Homo erectus at least 800,000 years ago, researchers say. Made from a piece of a large animal’s rib, the artifact features three curved barbs and a carved tip, the team reports in the November Journal of Human Evolution. Among the Olduvai bones, biological anthropologist Michael Pante of Colorado State University in Fort Collins and colleagues identified five other tools from more than 800,000 years ago as probable choppers, hammering tools or hammering platforms. The previous oldest barbed bone points were from a central African site and dated to around 90,000 years ago (SN: 4/29/95), and were assumed to reflect a toolmaking ingenuity exclusive to Homo sapiens. Those implements include carved rings around the base of the tools where wooden shafts were presumably attached. Barbed bone points found at H. sapiens sites were likely used to catch fish and perhaps to hunt large land prey. The Olduvai Gorge barbed bone point, which had not been completed, shows no signs of having been attached to a handle or shaft. Ways in which H. erectus used the implement are unclear, Pante and his colleagues say. © Society for Science & the Public 2000–2020.

Keyword: Evolution; Learning & Memory
Link ID: 27543 - Posted: 10.24.2020

By Meagan Cantwell Although bird brains are tiny, they’re packed with neurons, especially in areas responsible for higher level thinking. Two studies published last month in Science explore the structure and function of avian brains—revealing they are organized similarly to mammals’ and are capable of conscious thought. © 2020 American Association for the Advancement of Science.

Keyword: Evolution; Learning & Memory
Link ID: 27541 - Posted: 10.24.2020

By Benedict Carey Scott Lilienfeld, an expert in personality disorders who repeatedly disturbed the order in his own field, questioning the science behind many of psychology’s conceits, popular therapies and prized tools, died on Sept. 30 at his home in Atlanta. He was 59. The cause was pancreatic cancer, his wife, Candice Basterfield, said. Dr. Lilienfeld’s career, most of it spent at Emory University in Atlanta, proceeded on two tracks: one that sought to deepen the understanding of so-called psychopathic behavior, the other to expose the many faces of pseudoscience in psychology. Psychopathy is characterized by superficial charm, grandiosity, pathological lying and a lack of empathy. Descriptions of the syndrome were rooted in research in the criminal justice system, where psychopaths often end up. In the early 1990s, Dr. Lilienfeld worked to deepen and clarify the definition. In a series of papers, he anchored a team of psychologists who identified three underlying personality features that psychopaths share, whether they commit illegal acts or not: fearless dominance, meanness and impulsivity. The psychopath does what he or she wants, without anxiety, regret or regard for the suffering of others. “When you have these three systems interacting, it’s a bad brew, and it creates the substrate for what can become psychopathy,” said Mark F. Lenzenweger, a professor of psychology at the State University of New York at Binghamton. “This was Scott’s great contribution: He helped change the thinking about psychopathy, in a profound way, by focusing on aspects of personality, rather than on a list of bad behaviors.” Dr. Lilienfeld’s parallel career encompassed clinical psychology and aimed to shake it free of empty theorizing, softheadedness and bad practice. 
In the late 1990s and early 2000s, he led a loose group of researchers who began to question the validity of some of the field’s favored constructs, like repressed memories of abuse and multiple personality disorder. The Rorschach inkblot test took a direct hit as largely unreliable. The group also attacked treatments including psychological debriefing and eye movement desensitization and reprocessing, or E.M.D.R., both of which are used for trauma victims. © 2020 The New York Times Company

Keyword: Aggression; Learning & Memory
Link ID: 27529 - Posted: 10.19.2020

Keith A. Trujillo, Alfredo Quiñones-Hinojosa, Kenira J. Thompson Joe Louis Martinez Jr. died on 29 August at the age of 76. In addition to making extraordinary contributions to the fields of neurobiology and Chicano psychology, Joe was a tireless advocate of diversity, equity, and inclusion in the sciences. He established professional development programs for individuals from underrepresented groups and provided lifelong mentoring as they pursued careers in science and academia. Joe was passionately devoted to expanding opportunities in the sciences well before diversity became a visible goal for scientific organizations and academic institutions. Born in Albuquerque, New Mexico, on 1 August 1944, Joe received his bachelor's degree in psychology from the University of San Diego in 1966; his master's in experimental psychology from New Mexico Highlands University in 1968; and his Ph.D. in physiological psychology from the University of Delaware in 1971. His faculty career began in 1972 at California State University, San Bernardino (CSUSB), shortly after the campus was established. He later completed postdocs in the laboratory of neurobiologist James McGaugh at the University of California, Irvine, and with neurobiologist Floyd Bloom at the Salk Institute for Biological Studies in San Diego, California. The University of California, Berkeley, recruited Joe in 1982, and he served as a professor as well as the area head of biopsychology and faculty assistant to the vice chancellor for affirmative action. As the highest-ranking Hispanic faculty member in the University of California system, Joe used his voice to help others from underrepresented groups. However, he felt that he could have a greater impact on diversity in the sciences by helping to build a university with a high concentration of Hispanic students, so in 1995 he moved to the University of Texas, San Antonio (UTSA). 
He began as a professor of biology and went on to assume a range of leadership roles, including director of the Cajal Neuroscience Institute. At UTSA, he worked with colleagues to obtain nearly $18 million in funding for neuroscience research and education. In 2012, he moved to the University of Illinois at Chicago where he served as professor and psychology department head until his retirement in 2016. At each institution, he embraced the opportunity to provide guidance and mentoring to innumerable students, faculty, and staff. © 2020 American Association for the Advancement of Science.

Keyword: Learning & Memory
Link ID: 27523 - Posted: 10.16.2020

By Pam Belluck After contracting the coronavirus in March, Michael Reagan lost all memory of his 12-day vacation in Paris, even though the trip was just a few weeks earlier. Several weeks after Erica Taylor recovered from her Covid-19 symptoms of nausea and cough, she became confused and forgetful, failing to even recognize her own car, the only Toyota Prius in her apartment complex’s parking lot. Lisa Mizelle, a veteran nurse practitioner at an urgent care clinic who fell ill with the virus in July, finds herself forgetting routine treatments and lab tests, and has to ask colleagues about terminology she used to know automatically. “I leave the room and I can’t remember what the patient just said,” she said, adding that if she hadn’t exhausted her medical leave she’d take more time off. “It scares me to think I’m working,” Ms. Mizelle, 53, said. “I feel like I have dementia.” It’s becoming known as Covid brain fog: troubling cognitive symptoms that can include memory loss, confusion, difficulty focusing, dizziness and grasping for everyday words. Increasingly, Covid survivors say brain fog is impairing their ability to work and function normally. “There are thousands of people who have that,” said Dr. Igor Koralnik, chief of neuro-infectious disease at Northwestern Medicine in Chicago, who has already seen hundreds of survivors at a post-Covid clinic he leads. “The impact on the work force that’s affected is going to be significant.” Scientists aren’t sure what causes brain fog, which varies widely and affects even people who became only mildly physically ill from Covid-19 and had no previous medical conditions. Leading theories are that it arises when the body’s immune response to the virus doesn’t shut down or from inflammation in blood vessels leading to the brain. © 2020 The New York Times Company

Keyword: Alzheimers; Learning & Memory
Link ID: 27522 - Posted: 10.12.2020

By Bret Stetka The human brain is hardwired to map our surroundings. This trait is called spatial memory—our ability to remember certain locations and where objects are in relation to one another. New findings published today in Scientific Reports suggest that one major feature of our spatial recall is efficiently locating high-calorie, energy-rich food. The study’s authors believe human spatial memory ensured that our hunter-gatherer ancestors could prioritize the location of reliable nutrition, giving them an evolutionary leg up. In the study, researchers at Wageningen University & Research in the Netherlands observed 512 participants follow a fixed path through a room where either eight food samples or eight food-scented cotton pads were placed in different locations. When they arrived at a sample, the participants would taste the food or smell the cotton and rate how much they liked it. Four of the food samples were high-calorie, including brownies and potato chips, and the other four, including cherry tomatoes and apples, were low in calories—diet foods, you might call them. After the taste test, the participants were asked to identify the location of each sample on a map of the room. They were nearly 30 percent more accurate at mapping the high-calorie samples versus the low-calorie ones, regardless of how much they liked those foods or odors. They were also 243 percent more accurate when presented with actual foods, as opposed to the food scents. “Our main takeaway message is that human minds seem to be designed for efficiently locating high-calorie foods in our environment,” says Rachelle de Vries, a Ph.D. candidate in human nutrition and health at Wageningen University and lead author of the new paper. De Vries feels her team’s findings support the idea that locating valuable caloric resources was an important and regularly occurring problem for early humans weathering the climate shifts of the Pleistocene epoch. 
“Those with a better memory for where and when high-calorie food resources would be available were likely to have a survival—or fitness—advantage,” she explains. © 2020 Scientific American

Keyword: Learning & Memory; Obesity
Link ID: 27518 - Posted: 10.10.2020

R. Stanley Williams For the first time, my colleagues and I have built a single electronic device that is capable of copying the functions of neuron cells in a brain. We then connected 20 of them together to perform a complicated calculation. This work shows that it is scientifically possible to make an advanced computer that does not rely on transistors to calculate and that uses much less electrical power than today’s data centers. Our research, which I began in 2004, was motivated by two questions. Can we build a single electronic element – the equivalent of a transistor or switch – that performs most of the known functions of neurons in a brain? If so, can we use it as a building block to build useful computers? Neurons are very finely tuned, and so are electronic elements that emulate them. I co-authored a research paper in 2013 that laid out in principle what needed to be done. It took my colleague Suhas Kumar and others five years of careful exploration to get exactly the right material composition and structure to produce the necessary property predicted from theory. Kumar then went a major step further and built a circuit with 20 of these elements connected to one another through a network of devices that can be programmed to have particular capacitances, or abilities to store electric charge. He then mapped a mathematical problem to the capacitances in the network, which allowed him to use the device to find the solution to a small version of a problem that is important in a wide range of modern analytics. © 2010–2020, The Conversation US, Inc.

Keyword: Learning & Memory; Robotics
Link ID: 27512 - Posted: 10.07.2020

By Bret Stetka With enough training, pigeons can distinguish between the works of Picasso and Monet. Ravens can identify themselves in a mirror. And on a university campus in Japan, crows are known to intentionally leave walnuts in a crosswalk and let passing traffic do their nut cracking. Many bird species are incredibly smart. Yet among intelligent animals, the “bird brain” often doesn’t get much respect. Two papers published today in Science find birds actually have a brain that is much more similar to our complex primate organ than previously thought. For years it was assumed that the avian brain was limited in function because it lacked a neocortex. In mammals, the neocortex is the hulking, evolutionarily modern outer layer of the brain that allows for complex cognition and creativity and that makes up most of what, in vertebrates as a whole, is called the pallium. The new findings show that birds do, in fact, have a brain structure that is comparable to the neocortex despite taking a different shape. It turns out that at a cellular level, the brain region is laid out much like the mammal cortex, explaining why many birds exhibit advanced behaviors and abilities that have long befuddled scientists. The new work even suggests that certain birds demonstrate some degree of consciousness. The mammalian cortex is organized into six layers containing vertical columns of neurons that communicate with one another both horizontally and vertically. The avian brain, on the other hand, was thought to be arranged into discrete collections of neurons called nuclei, including a region called the dorsal ventricular ridge, or DVR, and a single nucleus named the wulst. In one of the new papers, senior author Onur Güntürkün, a neuroscientist at Ruhr University Bochum in Germany, and his colleagues analyzed regions of the DVR and wulst involved in sound and vision processing. 
To do so, they used a technology called three-dimensional polarized light imaging, or 3D-PLI—a light-based microscopy technique that can be employed to visualize nerve fibers in brain samples. The researchers found that in both pigeons and barn owls, these brain regions are constructed much like our neocortex, with both layerlike and columnar organization—and with both horizontal and vertical circuitry. They confirmed the 3D-PLI findings using biocytin tracing, a technique for staining nerve cells. © 2020 Scientific American

Keyword: Evolution; Learning & Memory
Link ID: 27487 - Posted: 09.25.2020

By Elizabeth Preston This is Panurgus banksianus, the large shaggy bee. It lives alone, burrowed into sandy grasslands across Europe. It prefers to feed on yellow-flowered members of the aster family. The large shaggy bee also has a very large brain. Just like mammals or birds, insect species of the same size may have different endowments inside their heads. Researchers have discovered some factors linked to brain size in back-boned animals. But in insects, the drivers of brain size have been more of a mystery. In a study published Wednesday in Proceedings of the Royal Society B, scientists scrutinized hundreds of bee brains for patterns. Bees with specialized diets seem to have larger brains, while social behavior appears unrelated to brain size. That means when it comes to insects, the rules that have guided brain evolution in other animals may not apply. “Most bee brains are smaller than a grain of rice,” said Elizabeth Tibbetts, an evolutionary biologist at the University of Michigan who was not involved in the research. But, she said, “Bees manage surprisingly complex behavior with tiny brains,” making the evolution of bee brains an especially interesting subject. Ferran Sayol, an evolutionary biologist at University College London, and his co-authors studied those tiny brains from 395 female bees belonging to 93 species from across the United States, Spain and the Netherlands. Researchers beheaded each insect and used forceps to remove its brain, a curled structure that’s widest in the center. “It reminds me a little bit of a croissant,” Dr. Sayol said. One pattern that emerged was a connection between brain size and how long each bee generation lasted. Bees that only go through one generation each year have larger brains, relative to their body size, than bees with multiple generations a year. © 2020 The New York Times Company

Keyword: Evolution; Learning & Memory
Link ID: 27476 - Posted: 09.16.2020

David Cox Gérard Karsenty was a young scientist trying to make a name for himself in the early 1990s when he first stumbled upon a finding that would go on to transform our understanding of bone, and the role it plays in our body. Karsenty had become interested in osteocalcin, one of the most abundant proteins in bone. He suspected that it played a crucial role in bone remodelling – the process by which our bones continuously remove and create new tissue – which enables us to grow during childhood and adolescence, and also recover from injuries. Intending to study this, he conducted a genetic knockout experiment, removing the gene responsible for osteocalcin from mice. However to his dismay, his mutant mice did not appear to have any obvious bone defects at all. “For him, it was initially a total failure,” says Mathieu Ferron, a former colleague of Karsenty who now heads a research lab studying bone biology at IRCM in Montreal. “In those days it was super-expensive to do modification in the mouse genome.” But then Karsenty noticed something unexpected. While their bones had developed normally, the mice appeared to be both noticeably fat and cognitively impaired. “Mice that don’t have osteocalcin have increased circulating glucose, and they tend to look a bit stupid,” says Ferron. “It may sound silly to say this, but they don’t learn very well, they appear kind of depressed.” But it took Karsenty and his team some time to understand how a protein in bone could be affecting these functions. “They were initially a bit surprised and terrified as it didn’t really make any sense to them.” © 2020 Read It Later, Inc.

Keyword: Hormones & Behavior; Obesity
Link ID: 27473 - Posted: 09.16.2020

Ian Sample Science editor Brain scans of cosmonauts have revealed the first clear evidence of how the organ adapts to the weird and often sickness-inducing challenge of moving around in space. Analysis of scans taken from 11 cosmonauts, who spent about six months each in orbit, found increases in white and grey matter in three brain regions that are intimately involved in physical movement. The changes reflect the “neuroplasticity” of the brain whereby neural tissue, in this case the cells that govern movement or motor activity, reconfigures itself to cope with the fresh demands of life in orbit. “With the techniques we used, we can clearly see there are microstructural changes in three major areas of the brain that are involved in motor processing,” said Steven Jillings, a neuroscientist at the University of Antwerp in Belgium. Visitors to the International Space Station face a dramatic shock to the system for a whole host of reasons, but one of the most striking is weightlessness. While the space station and its occupants are firmly in the grip of gravity – they are constantly falling around the planet – the body must recalibrate its senses to cope with the extreme environment. Images of the cosmonauts’ brains, taken before and after missions lasting on average 171 days, and again seven months later, confirmed that the cerebrospinal fluid that bathes the brain redistributes itself in orbit, pushing the brain up towards the top of the skull. This also expands fluid-filled cavities called ventricles, which may be linked to a loss of sharpness in the cosmonauts’ vision, a condition called spaceflight-associated neuro-ocular syndrome or Sans. © 2020 Guardian News & Media Limited

Keyword: Learning & Memory
Link ID: 27453 - Posted: 09.05.2020

Joe Louis Martinez Jr., founder and former director of UTSA’s Neurosciences Institute, passed away on August 29 after a long battle with liver cancer. He was 76. Martinez was born in Albuquerque, New Mexico, on August 1, 1944. He received his B.A. from the University of California, San Diego; graduated with his M.S. in experimental psychology from New Mexico Highlands University in 1968; and earned his Ph.D. in physiological psychology in 1971 from the University of Delaware. He completed his postdoctoral training at the University of California, Irvine, and the Salk Institute in San Diego. Martinez served as a professor in the Department of Psychology at the University of California, Berkeley, from 1982 to 1995. During this time he led an internationally recognized research laboratory and departed as professor emeritus. In 1995 he joined UTSA as the Ewing Halsell Distinguished Chair in psychology. From 1995 to 2012 he was a beloved professor who founded and directed the Cajal Neuroscience Research Center, now known as the UTSA Neurosciences Institute. He oversaw the design and construction of the Biosciences Building, UTSA’s first research building. Each floor in the BSB contains tiles representing the neuroanatomical drawings of Santiago Ramón y Cajal. During his tenure at UTSA, Martinez brought over $15 million in grant funding to the university. In 2013 he moved to the University of Illinois at Chicago to become the chair of the department of psychology. He retired in 2016. © 2020 The University of Texas at San Antonio

Keyword: Learning & Memory
Link ID: 27450 - Posted: 09.05.2020

Alison Abbott Two years ago, Jennifer Li and Drew Robson were trawling through terabytes of data from a zebrafish-brain experiment when they came across a handful of cells that seemed to be psychic. The two neuroscientists had planned to map brain activity while zebrafish larvae were hunting for food, and to see how the neural chatter changed. It was their first major test of a technological platform they had built at Harvard University in Cambridge, Massachusetts. The platform allowed them to view every cell in the larvae’s brains while the creatures — barely the size of an eyelash — swam freely in a 35-millimetre-diameter dish of water, snacking on their microscopic prey. Out of the scientists’ mountain of data emerged a handful of neurons that predicted when a larva was next going to catch and swallow a morsel. Some of these neurons even became activated many seconds before the larva fixed its eyes on the prey [1]. Something else was strange. Looking in more detail at the data, the researchers realized that the ‘psychic’ cells were active for an unusually long time — not seconds, as is typical for most neurons, but many minutes. In fact, more or less the duration of the larvae’s hunting bouts. “It was spooky,” says Li. “None of it made sense.” Li and Robson turned to the literature and slowly realized that the cells must be setting an overall ‘brain state’ — a pattern of prolonged brain activity that primed the larvae to engage with the food in front of them. The pair learnt that, in the past few years, other scientists using various approaches and different species had also found internal brain states that alter how an animal behaves, even when nothing has changed in its external environment. © 2020 Springer Nature Limited

Keyword: Attention; Learning & Memory
Link ID: 27417 - Posted: 08.12.2020

By Chimamanda Ngozi Adichie My daughter and I were playing tag, or a kind of tag. Before that, we traced the letter P and we danced to James Brown’s “I feel good,” a song she selected from the iPod. We laughed as we danced, she with a natural rhythm striking for a 4-year-old, and I with my irretrievable gracelessness. Next on our plan was “Sesame Street.” It was about 2 p.m. on May 28. A day complacent with the promise of no surprises, like all the other days of the lockdown, shrunken days with shriveled routines. “When coronavirus is over,” my daughter often said, words filled with yearning for her preschool, her friends, her swimming lessons. And I, amid snatches of joy and discovery, often felt bored, and then guilty for feeling boredom, in this expanded boundless role of parent-playmate. My daughter picked up a green balloon pump, squirted the air at me, and ran off, around the kitchen counter. When I caught her, squealing, it was her turn to chase me. I was wearing white slippers, from some hotel somewhere, back when international travel was normal. They felt soft and thin-soled. I recall all these clearly, because of all the things I will be unable to recall later. I turned away from the kitchen to make the chase longer and something happened. I slipped or I tripped or my destiny thinned and I fell and hit my head on the hardwood floor. At the beginning of the stay-at-home order, plagued by amorphous anxieties, I taught my daughter how to call my doctor husband at work. Just in case. My daughter says that after I fell I told her, “Call Papa.” My husband says I spoke coherently. I told him that I fell and that the pain in my head was “excruciating,” and when I said “excruciating,” I seemed to wince. He says he asked my daughter to get me the ice pack in the freezer and that I said, “Thank you, baby,” when she gave it to me. I do not remember any of this.

Keyword: Learning & Memory; Brain Injury/Concussion
Link ID: 27412 - Posted: 08.11.2020