Chapter 13. Memory, Learning, and Development


BREANNE SEARING, Evergreen reporter

As more states have legalized recreational cannabis, it has grown more popular among pregnant women. A WSU undergraduate researcher thinks the scientific community needs to take a closer look at the behavior of adults who were exposed to cannabis while in the womb. Neuroscience senior Collin Warrick has conducted research on the effects of prenatal cannabis vapor on the cognitive flexibility of rats. Warrick said cannabis is the most common illicit drug among pregnant women. Despite this, little to no research has been done on its effects on offspring cognition as they mature. Warrick said the lack of medical warnings on legal cannabis products comes from this lack of research. “I have not come across anything that has been black labeled,” Warrick said. “There is no Surgeon General’s warning against cannabis, I think primarily because there hasn’t been enough studies looking at the negative effects.” Ryan McLaughlin, an assistant professor of integrative physiology and neuroscience who worked with Warrick, said long-term ramifications of cannabis vapor on developing offspring are unknown. “Now that cannabis is legal for the next generation of mothers,” McLaughlin said, “they may see less of a stigma and less of a perceived harm associated with smoking a joint during pregnancy, as they would maybe from having a drink of wine or smoking cigarettes.” © 2018

Keyword: Development of the Brain; Drug Abuse
Link ID: 24888 - Posted: 04.21.2018

By Matt Warren

There is no one gene that, when mutated, causes autism. But over the past decade, researchers have identified hundreds of gene variations that seem to affect brain development in ways that increase the risk of autism. However, these scientists mainly searched for variants in the DNA that directly encodes the building blocks of proteins. Now, a new study probing so-called noncoding DNA has found that alterations in regions that regulate gene activity may also contribute to autism. And surprisingly, these variations tended to be inherited from fathers who aren’t autistic. “This is a really good article—it’s somewhat provocative and it makes us think about [autism genetics in a] different way,” says Lucia Peixoto, a neuroscientist and computational biologist at Washington State University in Spokane, who was not involved in the research. “I think it’s a great contribution to the field.” Research into the genetic risk for autism has mainly focused on how mutations that arise spontaneously in an individual’s genome—rather than being inherited from a parent—disrupt protein-coding regions and lead to the condition. That’s because these sporadic mutations have relatively large effects and studies have shown that such mutations, although individually rare, together contribute to about 25% to 30% of cases, says Jonathan Sebat, a geneticist at the University of California, San Diego. But only about 2% of the genome consists of protein-coding areas. Sebat says the large noncoding portion of our DNA—often previously referred to as “junk DNA”—has so far been ignored in autism research. © 2018 American Association for the Advancement of Science

Keyword: Autism; Epigenetics
Link ID: 24886 - Posted: 04.21.2018

By CEYLAN YEGINSU

A new study has shed more light on the revelations that Hans Asperger, the Austrian pediatrician for whom a form of autism is named, had collaborated with the Nazis and actively assisted in the killing of disabled children. Published on Wednesday in the journal Molecular Autism by the medical historian Herwig Czech, the report relies on eight years of research that included the examination of previously unseen Nazi-era documents. The study concludes that though Dr. Asperger was not a member of the Nazi Party, he had participated in the Third Reich’s child-euthanasia program, which aimed to establish a “pure” society by eliminating those deemed a “burden.” Dr. Asperger referred disabled children to the notorious Am Spiegelgrund clinic in Vienna, where hundreds were either drugged or gassed to death from 1940 to 1945. “The picture that emerges is that of a man who managed to further his career under the Nazi regime, despite his apparent political and ideological distance from it,” Mr. Czech, of the University of Vienna, wrote in his study. Asperger syndrome is a lifelong developmental disability associated with autism that affects perception and social interaction. About one in 68 children in the United States has been identified with autism spectrum disorder. © 2018 The New York Times Company

Keyword: Autism
Link ID: 24885 - Posted: 04.21.2018

Jaclyn was diagnosed with myopia, or nearsightedness, at the age of four. "I was surprised to learn that she needed glasses," recalled her mother, Ellen Rosenberg, in Toronto. Jaclyn wears glasses all the time at school, where they help her to read and write, she said. Her vision isn't so poor that she trips on things when she takes them off to play sports, Rosenberg said. But in a recent study, more than 30 per cent of young Canadian children walked around with fuzzy vision because of myopia that, unlike Jaclyn's, went undiagnosed. Now experts are exploring a simple way to turn the tide on the worsening problem. Myopia is "increasing globally at an alarming rate," according to the World Health Organization. It affects an estimated 1.89 billion people worldwide, and if rates don't change, that could rise to 2.56 billion by 2020 — a third of the population. Research suggests spending time outdoors protects against myopia. In what they call the first study of its kind in Canada, optometrists in Waterloo, Ont., found the rate of myopia was six per cent in children aged 6 to 8. That soared to 28.9 per cent in children aged 11 to 13. In myopia, the eyeball elongates — a change thought to be linked to too little time in daylight — so that light focuses in front of the retina and distant objects look blurry. The condition isn't innocuous, said study author Debbie Jones, a clinical professor of optometry at the University of Waterloo and a scientist at the Centre for Ocular Research & Education. ©2018 CBC/Radio-Canada.

Keyword: Vision; Development of the Brain
Link ID: 24884 - Posted: 04.21.2018

By Ashley Yeager

Brain organoids, also known as mini-brains, are tiny clumps of brain cells grown from stem cells that researchers are using to investigate the neural underpinnings of autism and other neurological disorders. But the organoids typically grow in culture for only a few months before they die, limiting their usefulness as models of real brains. Transplanting the three-dimensional clumps of human brain tissue into the brains of mice allows the organoids to continue to develop, sprouting life-sustaining blood vessels as well as new neuronal connections, a new study reports. The work takes a step toward using brain organoids to study complexities of human brain development and disease that can’t be investigated with current techniques. Brain organoid transplantation may even one day offer a treatment option for traumatic brain injury or stroke. “Although organoids are a great advance in human neuroscience, they are not perfect. They are missing blood vessels, immune cells and functional connections to other areas of the nervous system,” Jürgen Knoblich, a molecular biologist at the Institute of Molecular Biotechnology in Vienna who was not involved in the study, tells The Scientist by email. “The goal of the transplantation experiments is to show that integration with those other tissues is possible.” Study coauthor Fred “Rusty” Gage, a neuroscientist at the Salk Institute for Biological Studies in La Jolla, California, and his colleagues first started thinking about the health of brain organoids a few years ago when they began working with the structures. Many cells in the center of the 3-D clump of tissue would die, Gage tells The Scientist. “Those cells weren’t getting the blood and nutrients they needed to survive.” © 1986-2018 The Scientist

Keyword: Development of the Brain
Link ID: 24876 - Posted: 04.18.2018

By Lauren Neergaard

It’s pretty extraordinary for people in their 80s and 90s to keep the same sharp memory as someone several decades younger, so scientists are peeking into the brains of “superagers” who do to uncover their secret. The work is the flip side of the disappointing hunt for new drugs to fight or prevent Alzheimer’s disease. Instead of tackling that problem, “why don’t we figure out what it is we might need to do to maximize our memory?” said neuroscientist Emily Rogalski, who leads the SuperAging study at Northwestern University in Chicago. Parts of the brain shrink with age, one of the reasons that most people experience a gradual slowing of at least some types of memory late in life. But it turns out that superagers’ brains aren’t shrinking nearly as fast as their peers’. And autopsies of the first superagers to die during the study show they harbor a lot more of a special kind of nerve cell in a deep-brain region that’s important for attention, Rogalski told a recent meeting of the American Association for the Advancement of Science. These elite elders are “more than just an oddity or a rarity,” said neuroscientist Molly Wagster of the National Institute on Aging, which helps fund the research. “There’s the potential for learning an enormous amount and applying it to the rest of us, and even to those who may be on a trajectory for some type of neurodegenerative disease.” © 1996-2018 The Washington Post

Keyword: Alzheimers; Learning & Memory
Link ID: 24874 - Posted: 04.17.2018

Mariarosaria Taddeo and Luciano Floridi

Cyberattacks are becoming more frequent, sophisticated and destructive. Each day in 2017, the United States suffered, on average, more than 4,000 ransomware attacks, which encrypt computer files until the owner pays to release them [1]. In 2015, the daily average was just 1,000. In May last year, when the WannaCry virus crippled hundreds of IT systems across the UK National Health Service, more than 19,000 appointments were cancelled. A month later, the NotPetya ransomware cost pharmaceutical giant Merck, shipping firm Maersk and logistics company FedEx around US$300 million each. Global damages from cyberattacks totalled $5 billion in 2017 and may reach $6 trillion a year by 2021 (see go.nature.com/2gncsyg). Countries are partly behind this rise. They use cyberattacks both offensively and defensively. For example, North Korea has been linked to WannaCry, and Russia to NotPetya. As the threats escalate, so do defence tactics. Since 2012, the United States has used ‘active’ cyberdefence strategies, in which computer experts neutralize or distract viruses with decoy targets, or break into a hacker’s computer to delete data or destroy the system. In 2016, the United Kingdom announced a 5-year, £1.9-billion (US$2.7-billion) plan to combat cyber threats. NATO also began drafting principles for active cyberdefence, to be agreed by 2019. The United States and the United Kingdom are leading this initiative. Denmark, Germany, the Netherlands, Norway and Spain are also involved (see go.nature.com/2hebxnt). © 2018 Macmillan Publishers Limited.

Keyword: Intelligence; Robotics
Link ID: 24873 - Posted: 04.17.2018

In 2007, I spent the summer before my junior year of college removing little bits of brain from rats, growing them in tiny plastic dishes, and poring over the neurons in each one. For three months, I spent three or four hours a day, five or six days a week, in a small room, peering through a microscope and snapping photos of the brain cells. The room was pitch black, save for the green glow emitted by the neurons. I was looking to see whether a certain growth factor could protect the neurons from degenerating the way they do in patients with Parkinson's disease. This kind of work, which is common in neuroscience research, requires time and a borderline pathological attention to detail. Which is precisely why my PI trained me, a lowly undergrad, to do it—just as, decades earlier, someone had trained him. Now, researchers think they can train machines to do that grunt work. In a study described in the latest issue of the journal Cell, scientists led by Gladstone Institutes and UC San Francisco neuroscientist Steven Finkbeiner collaborated with researchers at Google to train a machine learning algorithm to analyze neuronal cells in culture. The researchers used a method called deep learning, the machine learning technique driving advancements not just at Google, but Amazon, Facebook, Microsoft. You know, the usual suspects. It relies on pattern recognition: Feed the system enough training data—whether it's pictures of animals, moves from expert players of the board game Go, or photographs of cultured brain cells—and it can learn to identify cats, trounce the world's best board-game players, or suss out the morphological features of neurons.
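The training recipe the article describes — show the system enough labeled examples and it learns the distinguishing features on its own — can be illustrated with a deliberately tiny stand-in for deep learning: a single-layer perceptron trained on two invented morphological features. The feature names, numbers, and labels below are made up for illustration; the pipeline in the Cell study works on raw microscope images with far deeper networks, but the train-on-labeled-examples loop is the same idea.

```python
# Toy sketch of supervised pattern recognition: feed a model labeled
# examples and it learns a decision rule. The "features" here are
# invented stand-ins (e.g. cell-body size, branch count) for what a
# deep network would extract from microscope images automatically.

def train_perceptron(examples, labels, epochs=20, lr=0.1):
    """Learn weights and bias for a linear classifier from labeled data."""
    n = len(examples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in zip(examples, labels):  # y is 0 or 1
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred                  # -1, 0, or +1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    """Apply the learned linear rule to a new example."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Synthetic training set: [soma_size, branch_count], label 1 = "neuron"
data = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]]
labels = [1, 1, 0, 0]
w, b = train_perceptron(data, labels)
```

A perceptron can only learn a linear boundary; deep networks stack many such units to learn much richer features, which is what makes them suited to tasks like spotting neurons in cluttered microscope images.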

Keyword: Brain imaging; Learning & Memory
Link ID: 24862 - Posted: 04.13.2018

By Alex Beard

Deb Roy and Rupal Patel pulled into their driveway on a fine July day in 2005 with the beaming smiles and sleep-deprived glow common to all first-time parents. Pausing in the hallway of their Boston home for Grandpa to snap a photo, they chattered happily over the precious newborn son swaddled between them. This normal-looking suburban couple weren’t exactly like other parents. Roy was an AI and robotics expert at MIT, Patel an eminent speech and language specialist at nearby Northeastern University. For years, they had been planning to amass the most extensive home-video collection ever. From the ceiling in the hallway blinked two discreet black dots, each the size of a coin. Further dots were located over the open-plan living area and the dining room. There were 25 in total throughout the house – 14 microphones and 11 fish-eye cameras, part of a system primed to launch on their return from hospital, intended to record the newborn’s every move. It had begun a decade earlier in Canada – but in fact Roy had built his first robots when he was just six years old, back in Winnipeg in the 1970s, and he’d never really stopped. As his interest turned into a career, he wondered about android brains. What would it take for the machines he made to think and talk? “I thought I could just read the literature on how kids do it, and that would give me a blueprint for building my language and learning robots,” Roy told me. Over dinner one night, he boasted to Patel, who was then completing her PhD in human speech pathology, that he had already created a robot that was learning the same way kids learn. He was convinced that if it got the sort of input children get, the robot could learn from it. © 2018 Guardian News and Media Limited

Keyword: Development of the Brain; Learning & Memory
Link ID: 24860 - Posted: 04.12.2018

By Ashley Yeager

Mothers who take selective serotonin reuptake inhibitors (SSRIs), a class of commonly used antidepressants, while pregnant have babies with distinct structural changes to their brains, researchers report today (April 9) in JAMA Pediatrics. MRI scans of the babies’ brains revealed exposure to the drugs in the womb increased the volumes of the babies’ amygdalae and insular cortices—regions that play a role in processing emotions. “Hopefully these results highlight the fact that something could be going on here,” study coauthor Claudia Lugo-Candelas, a postdoctoral research fellow at Columbia University, tells Time. “They point to the fact that there is a signal—we don’t know what it means, or don’t know how long it might last. But we know it’s worth studying.” The number of women using SSRIs while pregnant is increasing, but not much is known about how the medication might affect the brains of developing babies. Studies in animals suggest exposure to SSRIs can change the offspring’s brain circuitry and lead to depressive-like behaviors and anxiety later in life. In the new study, Lugo-Candelas and her colleagues studied the brains of 98 human infants: 16 whose mothers were treated for depression with SSRIs during pregnancy, 21 whose mothers had depression but were not treated, and 61 whose mothers had no history of depression. © 1986-2018 The Scientist

Keyword: Depression; Development of the Brain
Link ID: 24850 - Posted: 04.11.2018

By Alex Therrien, Health reporter, BBC News

People who suffer brain injuries are at increased risk of dementia later in life, a large study suggests. An analysis of 2.8 million people found those who had one or more traumatic brain injuries were 24% more likely to get dementia than those who had not. The risk was greatest in people who had the injuries in their 20s, who were 63% more likely to get the condition at some point in their life. But independent experts said other lifestyle factors were more important. Dementia, a category of brain diseases that includes Alzheimer's, affects some 47 million people worldwide - a number expected to double in the next 20 years. Previous research has suggested a link between brain injuries - leading causes of which include falls, motor vehicle accidents, and assaults - and subsequent dementia, but evidence has been mixed. This new study, which followed people in Denmark over a 36-year period, found those who had experienced even one mild TBI (concussion) were 17% more likely to get dementia, with the risk increasing with the number of TBIs and the severity of injury. Sustaining the injury at a younger age appeared to further increase the risk of getting the condition, the research found. Those who suffered a TBI in their 30s were 37% more likely to develop dementia later in life, while those who had the injury in their 50s were only 2% more likely to get the condition. © 2018 BBC

Keyword: Brain Injury/Concussion; Alzheimers
Link ID: 24847 - Posted: 04.11.2018

Jon Hamilton

An international coalition of brain researchers is suggesting a new way of looking at Alzheimer's. Instead of defining the disease through symptoms like memory problems or fuzzy thinking, the scientists want to focus on biological changes in the brain associated with Alzheimer's. These include the plaques and tangles that build up in the brains of people with the disease. But they say the new approach is intended only for research studies, and isn't yet ready for use by most doctors who treat Alzheimer's patients. If the new approach is widely adopted, it would help researchers study patients whose brain function is still normal, but are likely to develop dementia caused by Alzheimer's. "There is a stage of the disease where there are no symptoms and we need to have some sort of a marker," says Eliezer Masliah, who directs the Division of Neuroscience at the National Institute on Aging. The new approach would be a dramatic departure from the traditional way of looking at Alzheimer's, says Clifford Jack, an Alzheimer's researcher at Mayo Clinic Rochester. In the past, "a person displayed a certain set of signs and symptoms and it was expected that they had Alzheimer's pathology," says Jack, who is the first author of the central paper describing the proposed new "research framework." © 2018 npr

Keyword: Alzheimers
Link ID: 24846 - Posted: 04.10.2018

By Edith Sheffer

PALO ALTO, Calif. — My son’s school, David Starr Jordan Middle School, is being renamed. A seventh grader exposed the honoree, Stanford University’s first president, as a prominent eugenicist of the early 20th century who championed sterilization of the “unfit.” This sort of debate is happening all over the country, as communities fight over whether to tear down Confederate monuments and whether Andrew Jackson deserves to remain on the $20 bill. How do we decide whom to honor and whom to disavow? There are some straightforward cases: Hitler Squares were renamed after World War II; Lenin statues were hauled away after the collapse of the Soviet Union. But other, less famous monsters of the past continue to define our landscape and language. I have spent the past seven years researching the Nazi past of Dr. Hans Asperger. Asperger is credited with shaping our ideas of autism and Asperger syndrome, diagnoses given to people believed to have limited social skills and narrow interests. The official diagnosis of Asperger disorder has recently been dropped from the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders because clinicians largely agreed it wasn’t a separate condition from autism. But Asperger syndrome is still included in the World Health Organization’s International Classification of Diseases, which is used around the globe. Moreover, the name remains in common usage. It is an archetype in popular culture, a term we apply to loved ones and an identity many people with autism adopt for themselves. Most of us never think about the man behind the name. But we should. © 2018 The New York Times Company

Keyword: Autism
Link ID: 24840 - Posted: 04.09.2018

By Melissa Healy

Despite years of effort, researchers have so far failed to find a pill you could take or a food you could eat to harden your brain against the injury that could be caused by a stroke. But new research offers the prospect of limiting a stroke's long-term damage in a different way: with a drug that enhances the brain's ability to rewire itself and promote recovery in the weeks and months after injury. In experiments, both mice and macaque monkeys that suffered strokes regained more movement and dexterity when their rehabilitative regimen included an experimental medication called edonerpic maleate. The drug, which has already run a gauntlet of safety trials as a possible medication for Alzheimer's disease, appears to have enhanced the effectiveness of rehab by strengthening the connections between brain cells and nourishing the chemical soup in which those cells forge those new connections. A report on the experiments appears in Friday's edition of the journal Science. The work was conducted by researchers at Yokohama City University School of Medicine and employees of Toyama Chemical Co., Ltd., a Japanese pharmaceutical firm that owns intellectual property rights to edonerpic maleate. Toyama provided funding for Yokohama City University to study the drug in macaque monkeys. The findings from the mice shed important light on how edonerpic maleate may work in an injured brain.

Keyword: Stroke; Learning & Memory
Link ID: 24835 - Posted: 04.07.2018

By Knvul Sheikh

In the bare winter woods across North America, you can hear the clear whistles of Black-capped and Carolina Chickadees as they forage for food. The insects they normally love to eat are gone, so the birds must find seeds and stash them among the trees for later. The Black-capped Chickadee and its southern lookalike, the Carolina Chickadee, are like squirrels in this sense: well-known for their food-caching behavior. They’ve evolved sharp brains, with some parts that grow bigger in the winter, specifically so they can remember the location of hundreds to thousands of seeds. But in the narrow strip of land from Kansas to New Jersey where the two species overlap and mate, their offspring have a weaker memory, according to a new study published in Evolution last week. In a set of experiments, only 62.5 percent of hybrid chickadees were able to solve a puzzle to uncover their food, as opposed to 95 percent of normal chickadees. More importantly, the hybrids’ poor recall could hurt their ability to survive harsh winters. “These birds don’t migrate; they stay in their regions throughout the year, so winter survival is pretty important,” says Michael McQuillan, a biologist at Lehigh University who was the lead author of the research. “If the hybrids are less able to do this, or if they have worse memories, that could be really bad for them.” The trend could also explain why the blended birds haven’t evolved into a distinct species over time. Black-capped and Carolina Chickadees hybridize extensively—often to the chagrin of birders, who already have a hard time telling them apart. In general, hybridization is common: It occurs in about 10 percent of animal and 25 percent of plant species, McQuillan says. Many hybrids thrive, and in rare cases like the Golden-crowned Manakin and the Galapagos “Big Bird” finch, they can form stable new lineages.

Keyword: Learning & Memory; Evolution
Link ID: 24832 - Posted: 04.07.2018

Laurel Hamers

Your brain might make new nerve cells well into old age. Healthy people in their 70s have just as many young nerve cells, or neurons, in a memory-related part of the brain as do teenagers and young adults, researchers report in the April 5 Cell Stem Cell. The discovery suggests that the hippocampus keeps generating new neurons throughout a person’s life. The finding contradicts a study published in March, which suggested that neurogenesis in the hippocampus stops in childhood (SN Online: 3/8/18). But the new research fits with a larger pile of evidence showing that adult human brains can, to some extent, make new neurons. While those studies indicate that the process tapers off over time, the new study proposes almost no decline at all. Understanding how healthy brains change over time is important for researchers untangling the ways that conditions like depression, stress and memory loss affect older brains. When it comes to studying neurogenesis in humans, “the devil is in the details,” says Jonas Frisén, a neuroscientist at the Karolinska Institute in Stockholm who was not involved in the new research. Small differences in methodology — such as the way brains are preserved or how neurons are counted — can have a big impact on the results, which could explain the conflicting findings. The new paper “is the most rigorous study yet,” he says. Researchers studied hippocampi from the autopsied brains of 17 men and 11 women ranging in age from 14 to 79. In contrast to past studies that have often relied on donations from patients without a detailed medical history, the researchers knew that none of the donors had a history of psychiatric illness or chronic illness. © Society for Science & the Public 2000 - 2018.

Keyword: Neurogenesis; Development of the Brain
Link ID: 24830 - Posted: 04.06.2018

By Dina Fine Maron

A new report raises questions about whether contracting Zika virus in the months after birth may damage an infected newborn’s brain. Researchers at Emory University injected a small number of infant rhesus macaques with the virus five weeks after birth—an age which roughly correlates with Zika exposure in three-month-old human babies—and found that although the monkeys cleared the infection from their blood as expected, the animals developed brain damage and behavioral problems. “This is the first time [infant infection] has been studied in a controlled fashion. And while it is with a small group of animals, it does make us more concerned about what the long-term behavioral or cognitive issues may be in human infants that might have been similarly exposed,” says Ann Chahroudi, senior author on the study and an assistant professor of pediatrics at Emory University School of Medicine. The good news is infected monkeys did not develop the most severe problems seen in humans exposed to Zika prenatally, which include limb deformities, hearing and vision loss, and a small-headed condition that can form in the womb called microcephaly. Yet certain brain areas typically responsible for vision as well as emotional and behavioral responses did not develop normally in infant macaques exposed to Zika, and the animals acted strangely in behavioral tests compared with control animals not exposed to the pathogen. Connections between the amygdala and hippocampus were also weak in the infected macaques, which suggests signals sent between those two areas—ones that would help the infants recognize and respond to stressful situations—would be slow or spotty. © 2018 Scientific American.

Keyword: Development of the Brain
Link ID: 24825 - Posted: 04.06.2018

By Daniela Carulli

In 1898, Camillo Golgi, an eminent Italian physician and pathologist, published a landmark paper on the structure of “nervous cells.” In addition to the organelle that still bears his name, the Golgi apparatus, he described “a delicate covering” surrounding neurons’ cell bodies and extending along their dendrites. That same year, another Italian researcher, Arturo Donaggio, observed that these coverings, now known as perineuronal nets (PNNs), had openings in them, through which, he correctly surmised, axon terminals from neighboring neurons make synapses. Since then, however, PNNs have been largely neglected by the scientific community—especially after Santiago Ramón y Cajal, a fierce rival of Golgi (who would later share the Nobel Prize with him), dismissed them as a histological artifact. It wasn’t until the 1970s, thanks to the improvement of histological techniques and the development of immunohistochemistry, that researchers confirmed the existence of PNNs around some types of neurons in the brain and spinal cord of many vertebrate species, including humans. Composed of extracellular matrix (ECM) molecules, PNNs form during postnatal development, marking the end of what’s known as the “critical period” of heightened brain plasticity. For a while after birth, the external environment has a profound effect on the wiring of neuronal circuits and, in turn, on the development of an organism’s skills and behaviors, such as language, sensory processing, and emotional traits. But during childhood and adolescence, neuronal networks become more fixed, allowing the individual to retain the acquired functions. Evidence gathered over the past 15 years suggests that PNNs contribute to this fixation in many brain areas, by stabilizing the existing contacts between neurons and repelling incoming axons. © 1986-2018 The Scientist

Keyword: Learning & Memory; Brain imaging
Link ID: 24815 - Posted: 04.03.2018

By Nicholette Zeliadt

Folic acid, a B vitamin, may lower autism risk and ease features of the condition, according to findings from five unrelated studies published over the past few months. Three of the studies suggest that prenatal supplements of folic acid offset autism risk associated with in utero exposure to epilepsy drugs or toxic chemicals. The supplements are also known to prevent birth defects. Another study found that people with autism and their immediate family members are more likely than those in a control group to carry immune molecules that could block folate’s passage into the brain. “These studies are particularly of interest because they suggest that people could potentially modify their risk of having a child with autism, even in the face of certain adverse exposures or conditions,” says Kristen Lyall, an assistant professor in the Modifiable Risk Factors Program at the A.J. Drexel Autism Institute in Philadelphia, who was not involved in any of the studies. A fifth study reported results from a small clinical trial suggesting that folinic acid — a form of folic acid — can ease language and communication difficulties in people with autism. “It isn’t enough to say that kids with [autism] should be taking folinic acid, necessarily, but it is enough to motivate a larger study,” says Jeremy Veenstra-VanderWeele, a professor of psychiatry at Columbia University, who was not involved in the trial. © 1996-2018 The Washington Post

Keyword: Autism; Development of the Brain
Link ID: 24813 - Posted: 04.03.2018

Philip Ball

Last week, I was told my other brain is fully grown. It doesn’t look like much. A blob of pale flesh about the size of a small pea, it floats in a bath of blood-red nutrient. It would fit into the cranium of a foetus barely a month old. Still, it’s a “brain” after a fashion and it’s made from me. From a piece of my arm, to be precise. I’m not going to pretend this isn’t strange. But neither is it an exercise in gratuitously ghoulish biological engineering, a piece of Frankensteinian scientific hubris 200 years after Mary Shelley’s tale. The researchers who made my mini-brain are trying to understand how neurodegenerative diseases develop. With mini-brains grown from the tissues of people who have a genetic susceptibility to the early onset of conditions such as Alzheimer’s, they hope to unravel what goes awry in the mature adult brain. It’s this link to studies of dementia that led me to the little room in the Dementia Research Centre of University College London last July, where neuroscientist Ross Paterson anaesthetised my upper arm and then sliced a small plug of flesh from it. This biopsy was going to be the seed for growing brain cells – neurons – that would organise themselves into mini-brains. The Brains in a Dish project is one of many strands of Created Out of Mind, an initiative hosted at the Wellcome Collection in London and funded by the Wellcome Trust for two years to explore, challenge and shape perceptions and understanding of dementias through science and the creative arts. Neuroscientist Selina Wray at UCL is studying the genetics of Alzheimer’s and other neurodegenerative diseases and she and her PhD student Christopher Lovejoy gamely agreed to culture mini-brains from cells taken from four of the Created Out of Mind team: artist Charlie Murphy, who is leading Brains in a Dish, BBC journalist Fergus Walsh, neurologist Nick Fox and me. © 2018 Guardian News and Media Limited

Keyword: Development of the Brain
Link ID: 24809 - Posted: 04.02.2018