Most Recent Links




In a 2009 TED Talk, Israeli neuroscientist Henry Markram made a shocking claim: he was going to create a machine version of the human brain within 10 years. The project was catnip to filmmaker Noah Hutton, who began documenting Markram's quest. Ultimately, Hutton followed Markram for a decade — but the scientist's lofty goal remains conspicuously incomplete. The resulting film, In Silico, finally makes its world premiere as part of the online version of the DOC NYC film festival on November 11. The film traces Markram’s journey with the Human Brain Project, from the project’s inception to its $1.4 billion in funding from the European Commission — and how it failed to meet its 10-year goal by 2019. Following a neuroscientist for a decade reveals a lot of highs and lows. Hutton presents the controversies by interviewing both the Human Brain Project team and its critics, including Princeton neuroscientist Sebastian Seung, researcher Zach Mainen at the Champalimaud Centre for the Unknown in Portugal, and experimental cognitive psychologist Stanislas Dehaene, a professor at the Collège de France in Paris. The film also features candid interviews with neuroscientists Christof Koch, who heads up the Allen Institute's MindScope Program, Harvard University's Jeremy R. Knowles Professor of Molecular and Cellular Biology Jeff W. Lichtman, and Stanford University neuroscience adjunct professor David Eagleman, as well as Idan Segev of Hebrew University in Israel, Cori Bargmann, Torsten N. Wiesel Professor of Genetics and Genomics and Neuroscience and Behavior at Rockefeller University, and Cold Spring Harbor Laboratory professor Anne Churchland.

Keyword: Brain imaging
Link ID: 27577 - Posted: 11.14.2020

By Benedict Carey Merriam-Webster’s defines a time warp as a “discontinuity, suspension or anomaly” in the otherwise normal passage of time; this year all three terms could apply. It seems like March happened 10 years ago; every day may as well be Wednesday, and still, somehow, here come the holidays — fast, just like every year. Some bard or novelist may yet come forth to help explain the paradoxes of pandemic time, both its Groundhog Days and the blurs of stress and fear for those on the front lines, or who had infectious people in their household. But brain science also has something to say about the relationship between perceived time and the Greenwich Mean variety, and why the two may slip out of sync. In a new study, a research team based in Dallas reported the first strong evidence to date of so-called “time cells” in the human brain. The finding, posted by the journal PNAS, was not unexpected: In recent years, several research groups have isolated neurons in rodents that track time intervals. It’s where the scientists looked for these cells, and how they identified them, that provides some insight into the subjective experiences of time. “The first thing to say is that, strictly speaking, there is no such thing as ‘time cells’ in the brain,” said György Buzsáki, a neuroscientist at New York University who was not involved in the new research. “There is no neural clock. What happens in the brain is neurons change in response to other neurons.” He added, “Having said that, it’s a useful concept to talk about how this neural substrate represents the passage of what we call time.” In the new study, a team led by Dr. Bradley Lega, a neurosurgeon at UT Southwestern Medical Center, analyzed the firing of cells in the medial temporal area, a region deep in the brain that is essential for memory formation and retrieval. It’s a natural place to look: Memories must be somehow “time-stamped” to retain some semblance of sequence, or chronological order. 
© 2020 The New York Times Company

Keyword: Attention
Link ID: 27576 - Posted: 11.10.2020

by Angie Voyles Askham Editing DNA in embryonic and newborn mice by using CRISPR technology can override mutations underlying Angelman syndrome and prevent many of the condition’s traits, according to a new study1. The effects last for at least 17 months and may be permanent, the researchers say. “It’s very exciting,” says Steven Kushner, professor of psychiatry at Columbia University, who was not involved in the study. Angelman syndrome usually stems from a mutation in or deletion of the UBE3A gene. People have two copies of the gene — one from each parent — but typically only the one passed down from the mother is active in neurons. Mutations that stymie that copy can lead to a lack of UBE3A protein in the brain, causing the syndrome’s core traits: developmental delays, motor dysfunction, speech impairments, seizures and, often, autism. These traits improve in response to treatments that activate the silent yet intact paternal copy of UBE3A and boost production of the protein in Angelman syndrome model mice2,3. But these treatments wear off over time, requiring repeated injections into the spinal fluid or brain. The new therapy is effective after only two doses, says lead researcher Mark Zylka, professor of cell biology and physiology at the University of North Carolina at Chapel Hill. The strategy uses the enzyme CRISPR-Cas9 to cut and edit DNA encoding an ‘antisense RNA’ molecule that ordinarily serves to block production of UBE3A protein from the paternal copy of the gene. The technique also rouses the silent paternal copy of the gene in cultured human neurons, suggesting that it might work in people. © 2020 Simons Foundation

Keyword: Autism; Genes & Behavior
Link ID: 27575 - Posted: 11.10.2020

By James Gorman Dogs go through stages in their life, just as people do, as is obvious to anyone who has watched their stiff-legged, white-muzzled companion rouse themselves to go for one more walk. Poets from Homer to Pablo Neruda have taken notice. As have folk singers and storytellers. Now science is taking a turn, in the hope that research on how dogs grow and age will help us understand how humans age. And, like the poets before them, scientists are finding parallels between the two species. Their research so far shows that dogs are similar to us in important ways, like how they act during adolescence and old age, and what happens in their DNA as they get older. They may be what scientists call a “model” for human aging, a species that we can study to learn more about how we age and perhaps how to age better. Most recently, researchers in Vienna have found that dogs’ personalities change over time. They seem to mellow in the same way that most humans do. The most intriguing part of this study is that like people, some dogs are just born old, which is to say, relatively steady and mature, the kind of pup that just seems ready for a Mr. Rogers cardigan. “That’s Professor Spot, to you, thank you, and could we be a little neater when we pour kibble into my dish?” Mind you, the Vienna study dogs were all Border collies, so I’m a little surprised that any of them were mature. That would suggest a certain calm, a willingness to tilt the head and muse that doesn’t seem to fit the breed, with its desperate desire to be constantly chasing sheep, geese, children or Frisbees. Another recent paper came to the disturbing conclusion that the calculus of seven dog years for every human year isn’t accurate. To calculate dog years, you must now multiply the natural logarithm of a dog’s age in human years by 16 and then add 31. Is that clear? It’s actually not as hard as it sounds, as long as you have a calculator or internet access. 
For example, the natural log of 6 is 1.8, roughly, which, multiplied by 16, is about 29, which, plus 31, is 60. OK, it’s not that easy, even with the internet. © 2020 The New York Times Company
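For readers who would rather not wrangle a calculator, the conversion described above is a one-line formula. A minimal sketch in Python follows; the function name is my own invention, and note that the underlying study derived the formula from a single breed, so it is an approximation at best:

```python
import math

def dog_to_human_years(dog_age: float) -> float:
    """Human-equivalent age from the article's formula:
    human_age = 16 * ln(dog_age) + 31.

    Only sensible for dogs at least ~1 year old, since the natural
    log of a value below 1 is negative.
    """
    if dog_age <= 0:
        raise ValueError("dog_age must be a positive number of years")
    return 16 * math.log(dog_age) + 31

# The article's worked example, a 6-year-old dog:
# ln(6) ≈ 1.79; 16 × 1.79 ≈ 29; 29 + 31 ≈ 60
print(round(dog_to_human_years(6)))  # → 60
```

By this reckoning a 1-year-old dog is already about 31 in human years, which matches the paper's point that dogs mature quickly early on and then age more slowly than the old seven-to-one rule suggests.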

Keyword: Development of the Brain; Evolution
Link ID: 27574 - Posted: 11.10.2020

By Linda Searing The “baby blues” that women can experience after giving birth usually go away within a week or two, but it now appears that more severe depressive symptoms, known as postpartum depression, may affect some new mothers for at least three years. Research from the National Institutes of Health, which tracked 4,866 women for three years after childbirth, found that about 25 percent of the women reported moderate to high levels of depressive symptoms at some point and that the remaining 75 percent experienced low-level depressive symptoms throughout the study. The “baby blues” typically include such symptoms as mood swings, anxiety and trouble sleeping, whereas postpartum depression symptoms — generally more intense and longer lasting — may include excessive crying, overwhelming fatigue, loss of appetite, difficulty bonding with the baby, feelings of inadequacy, hopelessness and more. The NIH research, published in the journal Pediatrics, encourages pediatricians to screen their tiny patients’ mothers for depressive symptoms during the children’s regular checkups, noting that “mothers’ mental health is critical to children’s well-being and development.” The researchers note that maternal depression increases a child’s risk for cognitive, emotional and behavioral problems. Getting treatment, however, should not only ease a mother’s symptoms but also improve her child’s odds for a favorable developmental outcome.

Keyword: Depression; Development of the Brain
Link ID: 27573 - Posted: 11.10.2020

By Giorgia Guglielmi, Spectrum A small clinical trial of a gene therapy for Angelman syndrome—a rare genetic condition related to autism—is on hold after two participants temporarily lost the ability to walk. The safety issue is important to resolve, experts say, given that the therapy otherwise appears to be effective, and the trial could guide treatment strategies for similar brain conditions. Biopharmaceutical company Ultragenyx in Novato, California, in collaboration with Florida-based biotech startup GeneTx, launched the trial in February to assess the safety of a therapy for Angelman syndrome, a neurodevelopmental condition characterized by intellectual disability, balance and motor problems, seizures, sleep problems and, in some cases, autism. Angelman syndrome results from the mutation or absence of a gene called UBE3A. People inherit two copies of UBE3A. Typically, only the maternal copy is active in neurons and the paternal copy is silent. But in people with Angelman syndrome, the maternal copy is mutated or missing, so their brain cells express no active UBE3A protein. The drug developed by Ultragenyx and GeneTx, called GTX-102, is a short snippet of RNA called an antisense oligonucleotide that activates the paternal copy of UBE3A and aims to restore the protein to typical levels. Three other companies—Roche, Biogen, and Ionis—are pursuing similar therapies for the syndrome. On 26 October, Ultragenyx and GeneTx reported that the clinical trial had enrolled five individuals with Angelman syndrome, aged 5 to 15. The plan had been to administer to each participant a dose of GTX-102 once a month over four months. Researchers injected the drug directly into the nutrient-rich solution that envelops the brain and spinal cord through a site in the lower back. © 2020 American Association for the Advancement of Science

Keyword: Autism
Link ID: 27572 - Posted: 11.07.2020

Alison Abbott Two years ago, immunologist and medical-publishing entrepreneur Leslie Norins offered to award US$1 million of his own money to any scientist who could prove that Alzheimer’s disease was caused by a germ. The theory that an infection might cause this form of dementia has been rumbling for decades on the fringes of neuroscience research. The majority of Alzheimer’s researchers, backed by a huge volume of evidence, think instead that the key culprits are sticky molecules in the brain called amyloids, which clump into plaques and cause inflammation, killing neurons. Norins wanted to reward work that would make the infection idea more persuasive. The amyloid hypothesis has become “the one acceptable and supportable belief of the Established Church of Conventional Wisdom”, says Norins. “The few pioneers who did look at microbes and published papers were ridiculed or ignored.” In large part, this was because some early proponents of the infection theory saw it as a replacement for the amyloid hypothesis. But some recent research has provided intriguing hints that the two ideas could fit together — that infection could seed some cases of Alzheimer’s disease by triggering the production of amyloid clumps. The data hint at a radical role for amyloid in neurons. Instead of just being a toxic waste product, amyloid might have an important job of its own: helping to protect the brain from infection. But age or genetics can interrupt the checks and balances in the system, turning amyloid from defender into villain. And that idea suggests new avenues to explore for potential therapies. To test the theory further, scientists are now developing animal models that mimic Alzheimer’s disease more closely. “We are taking the ideas seriously,” says neuroscientist Bart de Strooper, director of the UK Dementia Research Institute at University College London. © 2020 Springer Nature Limited

Keyword: Alzheimers; Neuroimmunology
Link ID: 27571 - Posted: 11.07.2020

By Laura Sanders The fate of a potential new Alzheimer’s drug is still uncertain. Evidence that the drug works isn’t convincing enough for it to be approved, outside experts told the U.S. Food and Drug Administration during a Nov. 6 virtual meeting that at times became contentious. The scientists and clinicians were convened at the request of the FDA to review the evidence for aducanumab, a drug that targets a protein called amyloid-beta that accumulates in the brains of people with Alzheimer’s. The drug is designed to stick to A-beta and stop it from forming larger, more dangerous clumps. That could slow the disease’s progression but not stop or reverse it. When asked whether a key clinical study provided strong evidence that the drug effectively treated Alzheimer’s, eight of 11 experts voted no. One expert voted yes, and two were uncertain. The FDA is not bound to follow the recommendations of the guidance committee, though it has historically done so. If ultimately approved, the drug would be a milestone, says neurologist and neuroscientist Arjun Masurkar of New York University Langone’s Alzheimer’s Disease Research Center. Aducanumab “would be the first therapy that actually targets the underlying disease itself and slows progression.” Developed by the pharmaceutical company Biogen, which is based in Cambridge, Mass., the drug is controversial. That’s because two large clinical trials of aducanumab have yielded different outcomes, one positive and one negative (SN: 12/5/19). The trials were also paused at one point, based on analyses that suggested the drug didn’t work. © Society for Science & the Public 2000–2020.

Keyword: Alzheimers
Link ID: 27570 - Posted: 11.07.2020

The membranes surrounding our brains are in a never-ending battle against deadly infections, as germs constantly try to elude watchful immune cells and sneak past a special protective barrier called the meninges. In a study involving mice and human autopsy tissue, researchers at the National Institutes of Health and Cambridge University have shown that some of these immune cells are trained to fight these infections by first spending time in the gut. “This finding opens a new area of neuroimmunology, showing that gut-educated antibody-producing cells inhabit and defend regions that surround the central nervous system,” said Dorian McGavern, Ph.D., senior investigator at NINDS and co-senior author of the study, which was published in Nature. The central nervous system (CNS) is protected from pathogens both by a three-membrane barrier called the meninges and by immune cells within those membranes. The CNS is also walled off from the rest of the body by specialized blood vessels that are tightly sealed by the blood-brain barrier. This is not the case, however, in the dura mater, the outermost layer of the meninges. Blood vessels in this compartment are not sealed, and large venous structures, referred to as the sinuses, carry slow-moving blood back to the heart. The combination of slow blood flow and proximity to the brain requires strong immune protection to stop potential infections in their tracks. “The immune system has invested heavily in the dura mater,” said Dr. McGavern. “The venous sinuses within the dura act like drainage bins, and, consequently, are a place where pathogens can accumulate and potentially enter the brain. It makes sense that the immune system would set up camp in this vulnerable area.”

Keyword: Neuroimmunology
Link ID: 27569 - Posted: 11.07.2020

By Gretchen Reynolds Roiled by concerns about the pandemic and politics? Lifting weights might help, according to a timely new study of anxiety and resistance training. The study, which involved healthy young adults, barbells and lunges, indicates that regular weight training substantially reduces anxiety, a finding with particular relevance during these unsettling, bumpy days. We already have plenty of evidence that exercise helps stave off depression and other mental ills, and that exercise can elevate feelings of happiness and contentment. But most past studies of exercise and moods have looked at the effects of aerobic exercise, like running on a treadmill or riding a stationary bike. Scientists only recently have begun to investigate whether and how weight training might also affect mental health. A 2018 review of studies, for instance, concluded that adults who lift weights are less likely to develop depression than those who never lift. In another study, women with clinical anxiety disorders reported fewer symptoms after taking up either aerobic or weight training. But many of these studies involved frequent and complicated sessions of resistance exercise performed under the eyes of researchers, which is not how most of us are likely to work out. They also often focused on somewhat narrow groups, such as men or women with a diagnosed mental health condition like depression or an anxiety disorder, limiting their applicability. So for the new study, which was published in October in Scientific Reports, researchers at the University of Limerick in Ireland and other institutions decided to see if a simple version of weight training could have benefits for mood in people who already were in generally good mental health. © 2020 The New York Times Company

Keyword: Stress
Link ID: 27568 - Posted: 11.07.2020

Elena Renken More than a century ago, the zoologist Richard Semon coined the term “engram” to designate the physical trace a memory must leave in the brain, like a footprint. Since then, neuroscientists have made progress in their hunt for exactly how our brains form memories. They have learned that specific brain cells activate as we form a memory and reactivate as we remember it, strengthening the connections among the neurons involved. That change ingrains the memory and lets us keep memories we recall more often, while others fade. But the precise physical alterations within our neurons that bring about these changes have been hard to pin down — until now. In a study published last month, researchers at the Massachusetts Institute of Technology tracked an important part of the memory-making process at the molecular scale in engram cells’ chromosomes. Neuroscientists already knew that memory formation is not instantaneous, and that the act of remembering is crucial to locking a memory into the brain. These researchers have now discovered some of the physical embodiment of that mechanism. The MIT group worked with mice that had a fluorescent marker spliced into their genome to make their cells glow whenever they expressed the gene Arc, which is associated with memory formation. The scientists placed these mice in a novel location and trained them to fear a specific noise, then returned them to this location several days later to reactivate the memory. In the brain area called the hippocampus, the engram cells that formed and recalled this memory lit up with color, which made it easy to sort them out from other brain cells under the microscope during a postmortem examination. All Rights Reserved © 2020

Keyword: Learning & Memory; Stress
Link ID: 27567 - Posted: 11.04.2020

By Veronique Greenwood Some 230 million years ago, in the forests of what humans would eventually call Brazil, a small bipedal dinosaur zipped after its prey. It had a slender head, a long tail and sharp teeth, and it was about the size of a basset hound. Buriolestes schultzi, as paleontologists have named the creature, is one of the earliest known relatives of more famous dinosaurs that emerged 100 million years later: the lumbering brachiosaurus, up to 80 feet long and weighing up to 80 metric tons, the likewise massive diplodocus, as well as other sauropod dinosaurs. By the time the Jurassic period rolled around and the time of Buriolestes had passed, these quadrupedal cousins had reached tremendous size. They also had tiny brains around the size of a tennis ball. Buriolestes’s brain was markedly different, scientists who built a 3-D reconstruction of the inside of its skull report in a paper published Tuesday in the Journal of Anatomy. The brain was larger relative to its body size, and it had structures that were much more like those of predatory animals. The findings suggest that the enormous herbivores of later eras, whose ancestors probably looked a lot like Buriolestes, lost these features as they transitioned to their ponderous new lifestyle. It’s also a rare glimpse into dinosaurs’ neural anatomy at a very early moment in their evolution. In 2009, Rodrigo Müller of the Universidade Federal de Santa Maria and colleagues discovered the first partial Buriolestes fossil in southern Brazil. In 2015, they uncovered another Buriolestes nearby — and this time, to their excitement, the dinosaur’s skull was nearly all there. They used computed tomography scanning to get a peek inside, drawing inferences about the brain from the contours of the cavity left behind. They found that one portion of the cerebellum, the floccular lobe, was particularly large in Buriolestes. © 2020 The New York Times Company

Keyword: Evolution
Link ID: 27566 - Posted: 11.04.2020

Amber Dance Gerald Maguire has stuttered since childhood, but you might not guess it from talking to him. For the past 25 years, he has been treating his disorder with antipsychotic medications not officially approved for the condition. Only with careful attention might you discern his occasional stumble on multisyllabic words like "statistically" and "pharmaceutical." Maguire has plenty of company: More than 70 million people worldwide, including about 3 million Americans, stutter — they have difficulty with the starting and timing of speech, resulting in halting and repetition. That number includes approximately 5 percent of children (many of whom outgrow the condition) and 1 percent of adults. Their numbers include presidential candidate Joe Biden, deep-voiced actor James Earl Jones, and actress Emily Blunt. Though they and many others, including Maguire, have achieved career success, stuttering can contribute to social anxiety and draw ridicule or discrimination. Maguire, a psychiatrist at the University of California, Riverside, has been treating people who stutter, and researching potential treatments, for decades. He's now embarking on a clinical trial of a new medication, ecopipam, that streamlined speech and improved quality of life in a small pilot study in 2019. Others, meanwhile, are delving into the root causes of stuttering. In past decades, therapists mistakenly attributed stuttering to defects of the tongue and voice box, to anxiety, trauma, or even poor parenting — and some still do. Yet others have long suspected that neurological problems might underlie stuttering, says J. Scott Yaruss, a speech-language pathologist at Michigan State University. The first data to back up that hunch came in 1991, when researchers reported altered blood flow in the brains of people who stuttered. Since then research has made it more apparent that stuttering is all in the brain. 
"We are in the middle of an absolute explosion of knowledge being developed about stuttering," Yaruss says. © 2020 The Week Publications Inc.

Keyword: Language
Link ID: 27565 - Posted: 11.04.2020

By Carolyn Wilke Fish fins aren’t just for swimming. They’re feelers, too. The fins of round gobies can detect textures with a sensitivity similar to that of the pads on monkeys’ fingers, researchers report November 3 in the Journal of Experimental Biology. Compared with landlubbers, little is known about aquatic animals’ sense of touch. And for fish, “we used to only think of fins as motor structures,” says Adam Hardy, a neuroscientist at the University of Chicago. “But it’s really becoming increasingly clear that fins play important sensory roles.” Studying those sensory roles can hint at ways to mimic nature for robotics and provide a window into the evolution of touch. The newfound parallels between primates and fish suggest that limbs that sense physical forces emerged early, before splits in the vertebrate evolutionary tree led to animals with fins, arms and legs, says Melina Hale, a neurobiologist and biomechanist also at the University of Chicago. “These capabilities arose incredibly early and maybe set the stage for what we can do with our hands now and what fish can do with their fins in terms of touch.” Hardy and Hale measured the activity of nerves in the fins of bottom-dwelling round gobies (Neogobius melanostomus) to get a sense of what fish learn about texture from their fins. In the wild, round gobies brush against the bottom surface and rest there on their large pectoral fins. “They’re really well suited to testing these sorts of questions,” Hardy says. Working with fins from six euthanized gobies, the researchers recorded electrical spikes from their nerves as a bumpy plastic ring attached to a motor rolled lightly above each fin. A salt solution keeps the nerves functioning as they would if the nerves were in a live fish, Hardy says. © Society for Science & the Public 2000–2020

Keyword: Pain & Touch; Evolution
Link ID: 27564 - Posted: 11.04.2020

By Nicholas Bakalar Some studies have suggested that older people who consistently engage in leisure activities are less likely to develop dementia than those who do not, suggesting that failure to participate in such pastimes could spur cognitive deterioration. A new study suggests another explanation: Failure to participate in leisure activities may be a consequence of dementia, not a cause. Researchers studied 8,280 people, average age 56, who were free of dementia at the start of the analysis. Over the next 18 years, the participants underwent periodic physical and psychological examinations, while researchers tracked their involvement in 13 leisure activities — listening to music, gardening, attending cultural events, playing cards, using a home computer and others. By the end of the project, 360 had developed dementia. The study, in Neurology, controlled for smoking, physical activity, education, coronary heart disease and other health and behavioral characteristics that are tied to dementia risk. The researchers found no association between engagement in leisure activities at age 56 and the incidence of dementia over the following 18 years. They concluded that actively pursuing leisure activities may not provide protection against developing dementia. “Dementia develops over a long period of time, so it’s possible that some changes happen before the diagnosis of dementia,” said the lead author, Andrew Sommerlad, a researcher at University College London. “Elderly people withdrawing from activities that they previously enjoyed may be developing early signs of dementia.” © 2020 The New York Times Company

Keyword: Alzheimers
Link ID: 27563 - Posted: 11.04.2020

Anil Ananthaswamy In the winter of 2011, Daniel Yamins, a postdoctoral researcher in computational neuroscience at the Massachusetts Institute of Technology, would at times toil past midnight on his machine vision project. He was painstakingly designing a system that could recognize objects in pictures, regardless of variations in size, position and other properties — something that humans do with ease. The system was a deep neural network, a type of computational device inspired by the neurological wiring of living brains. “I remember very distinctly the time when we found a neural network that actually solved the task,” he said. It was 2 a.m., a tad too early to wake up his adviser, James DiCarlo, or other colleagues, so an excited Yamins took a walk in the cold Cambridge air. “I was really pumped,” he said. It would have counted as a noteworthy accomplishment in artificial intelligence alone, one of many that would make neural networks the darlings of AI technology over the next few years. But that wasn’t the main goal for Yamins and his colleagues. To them and other neuroscientists, this was a pivotal moment in the development of computational models for brain functions. DiCarlo and Yamins, who now runs his own lab at Stanford University, are part of a coterie of neuroscientists using deep neural networks to make sense of the brain’s architecture. In particular, scientists have struggled to understand the reasons behind the specializations within the brain for various tasks. They have wondered not just why different parts of the brain do different things, but also why the differences can be so specific: Why, for example, does the brain have an area for recognizing objects in general but also for faces in particular? Deep neural networks are showing that such specializations may be the most efficient way to solve problems. All Rights Reserved © 2020

Keyword: Learning & Memory
Link ID: 27562 - Posted: 10.31.2020

Jon Hamilton If you fall off a bike, you'll probably end up with a cinematic memory of the experience: the wind in your hair, the pebble on the road, then the pain. That's known as an episodic memory. And now researchers have identified cells in the human brain that make this sort of memory possible, a team reports in the journal Proceedings of the National Academy of Sciences. The cells are called time cells, and they place a sort of time stamp on memories as they are being formed. That allows us to recall sequences of events or experiences in the right order. "By having time cells create this indexing across time, you can put everything together in a way that makes sense," says Dr. Bradley Lega, the study's senior author and a neurosurgeon at the University of Texas Southwestern Medical Center in Dallas. Time cells were discovered in rodents decades ago. But the new study is critical because "the final arbitrator is always the human brain," says Dr. György Buzsáki, Biggs Professor of Neuroscience at New York University. Buzsáki is not an author of the study but did edit the manuscript. Lega and his team found the time cells by studying the brains of 27 people who were awaiting surgery for severe epilepsy. As part of their pre-surgical preparation, these patients had electrodes placed in the hippocampus and another area of the brain involved in navigation, memory and time perception. In the experiment, the patients studied sequences of 12 or 15 words that appeared on a laptop screen during a period of about 30 seconds. Then, after a break, they were asked to recall the words they had seen. © 2020 npr

Keyword: Learning & Memory
Link ID: 27561 - Posted: 10.31.2020

By Jonathan Lambert Octopus arms have minds of their own. Each of these eight supple yet powerful limbs can explore the seafloor in search of prey, snatching crabs from hiding spots without direction from the octopus’ brain. But how each arm can tell what it’s grasping has remained a mystery. Now, researchers have identified specialized cells not seen in other animals that allow octopuses to “taste” with their arms. Embedded in the suckers, these cells enable the arms to do double duty of touch and taste by detecting chemicals produced by many aquatic creatures. This may help an arm quickly distinguish food from rocks or poisonous prey, Harvard University molecular biologist Nicholas Bellono and his colleagues report online October 29 in Cell. The findings provide another clue about the unique evolutionary path octopuses have taken toward intelligence. Instead of being concentrated in the brain, two-thirds of the nerve cells in an octopus are distributed among the arms, allowing the flexible appendages to operate semi-independently (SN: 4/16/15). “There was a huge gap in knowledge of how octopus [arms] actually collect information about their environment,” says Tamar Gutnick, a neurobiologist who studies octopuses at Hebrew University of Jerusalem who was not involved in the study. “We’ve known that [octopuses] taste by touch, but knowing it and understanding how it’s actually working is a very different thing.” Working out the specifics of how arms sense and process information is crucial for understanding octopus intelligence, she says. “It’s really exciting to see someone taking a comprehensive look at the cell types involved,” and how they work. © Society for Science & the Public 2000–2020

Keyword: Chemical Senses (Smell & Taste); Evolution
Link ID: 27560 - Posted: 10.31.2020

By Lucy Hicks Ogre-faced spiders might be an arachnophobe’s worst nightmare. The enormous eyes that give them their name allow them to see 2000 times better than we can at night. And these creepy crawlers are lightning-fast predators, snatching prey in a fraction of a second with mini, mobile nets. Now, new research suggests these arachnids use their legs not only to scuttle around, but also to hear. In light of their excellent eyesight, this auditory skill “is a surprise,” says George Uetz, who studies the behavioral ecology of spiders at the University of Cincinnati and wasn’t involved in the new research. Spiders don’t have ears—generally a prerequisite for hearing. So, despite the vibration-sensing hairs and receptors on most arachnids’ legs, scientists long thought spiders couldn’t hear sound as it traveled through the air, but instead felt vibrations through surfaces. The first clue they might be wrong was a 2016 study that found that a species of jumping spider can sense vibrations in the air from sound waves. Enter the ogre-faced spider. Rather than build a web and wait for their prey, these fearsome hunters “take a much more active role,” says Jay Stafstrom, a sensory ecologist at Cornell University. The palm-size spiders hang upside down from small plants on a silk line and create a miniweb across their four front legs, which they use as a net to catch their next meal. The spiders either lunge at bugs wandering below or flip backward to ensnare flying insects in midair. © 2020 American Association for the Advancement of Science.

Keyword: Hearing; Evolution
Link ID: 27559 - Posted: 10.31.2020

By Laura Sanders Nearly 2,000 years ago, a cloud of scorching ash from Mount Vesuvius buried a young man as he lay on a wooden bed. That burning ash quickly cooled, turning some of his brain to glass. This confluence of events in A.D. 79 in the town of Herculaneum, which lay at the western base of the volcano, preserved the usually delicate neural tissue in a durable, glassy form. New scrutiny of this tissue has revealed signs of nerve cells with elaborate tendrils for sending and receiving messages, scientists report October 6 in PLOS ONE. That the young man once possessed these nerve cells, or neurons, is no surprise; human brains are packed with roughly 86 billion neurons (SN: 8/7/19). But samples from ancient brains are sparse. Those that do exist have become a soaplike substance or mummified, says Pier Paolo Petrone, a biologist and forensic anthropologist at the University of Naples Federico II in Italy. But while studying the Herculaneum site, Petrone noticed something dark and shiny inside this man’s skull. He realized that those glassy, black fragments “had to be the remains of the brain.” Petrone and colleagues used scanning electron microscopy to study glassy remains from both the man’s brain and spinal cord. The researchers saw tubular structures as well as cell bodies that were the right sizes and shapes to be neurons. In further analyses, the team found layers of tissue wrapped around tendrils in the brain tissue. This layering appears to be myelin, a fatty substance that speeds signals along nerve fibers. The preserved tissue was “something really astonishing and incredible,” Petrone says, because the conversion of objects to glass, a process called vitrification, is relatively rare in nature. “This is the first ever discovery of ancient human brain remains vitrified by hot ash during a volcanic eruption.” © Society for Science & the Public 2000–2020.

Keyword: Development of the Brain
Link ID: 27558 - Posted: 10.31.2020