Chapter 7. Life-Span Development of the Brain and Behavior



Diana Kwon The effects of antidepressant exposure during early development can pass down through three generations of offspring—at least in zebrafish. A new study, published today (December 10) in PNAS, reveals that fluoxetine, a commonly used antidepressant that goes by the brand name Prozac, can alter hormone levels and blunt stress responses in an exposed embryo and its descendants. “The paper is very intriguing,” says Tim Oberlander, a developmental pediatrician at the British Columbia Children’s Hospital who was not involved in this work. The question of whether these medications have a transgenerational effect is “a really important one that requires further study in other animal models, and ultimately, when we have the data, we need to figure out whether it’s also true in humans.” Fluoxetine is a selective serotonin reuptake inhibitor (SSRI), a class of drugs widely used to treat depression as well as other conditions such as obsessive-compulsive disorder and anxiety disorders. Recent data from the US National Health and Nutrition Examination Survey show increasing antidepressant use, from approximately 7.7 percent of the population in 1999–2002 to 12.7 percent in 2011–2014. SSRIs are often prescribed as the first-line treatment for pregnant women with depression, and prior studies in humans suggest infants exposed to SSRIs while in the womb may experience developmental disturbances such as delayed motor development and increased levels of anxiety later in childhood. Oberlander, whose research is focused on the influence of prenatal exposure to these medications, notes that it has been unclear whether those correlations represent a direct result of the drugs or if other factors, such as a genetic propensity for those outcomes or growing up with a parent with a mood disorder, may also play a part. © 1986 - 2018 The Scientist

Keyword: Epigenetics; Depression
Link ID: 25781 - Posted: 12.12.2018

By Benedict Carey A generation ago, parents worried about the effects of TV; before that, it was radio. Now, the concern is “screen time,” a catchall term for the amount of time that children, especially preteens and teenagers, spend interacting with TVs, computers, smartphones, digital pads, and video games. This age group draws particular attention because screen immersion rises sharply during adolescence, and because brain development accelerates then, too, as neural networks are pruned and consolidated in the transition to adulthood. On Sunday evening, CBS’s “60 Minutes” reported on early results from the A.B.C.D. Study (for Adolescent Brain Cognitive Development), a $300 million project financed by the National Institutes of Health. The study aims to reveal how brain development is affected by a range of experiences, including substance use, concussions, and screen time. As part of an exposé on screen time, “60 Minutes” reported that heavy screen use was associated with lower scores on some aptitude tests, and with accelerated “cortical thinning” — a natural process — in some children. But the data is preliminary, and it’s unclear whether the effects are lasting or even meaningful. Does screen addiction change the brain? Yes, but so does every other activity that children engage in: sleep, homework, playing soccer, arguing, growing up in poverty, reading, vaping behind the school. The adolescent brain continually changes, or “rewires” itself, in response to daily experience, and that adaptation continues into the early to mid 20s. What scientists want to learn is whether screen time, at some threshold, causes any measurable differences in adolescent brain structure or function, and whether those differences are meaningful. Do they cause attention deficits, mood problems, or delays in reading or problem-solving ability? © 2018 The New York Times Company

Keyword: Development of the Brain; Learning & Memory
Link ID: 25772 - Posted: 12.11.2018

Doing crossword puzzles and Sudoku does not appear to protect against mental decline, according to a new study. The idea of "use it or lose it" when it comes to our brains in later life has previously been widely accepted. The new Scottish study showed that people who regularly do intellectual activities throughout life have higher mental abilities. This provides a "higher cognitive point" from which to decline, say the researchers. But the study did not show that they decline any slower. The work, published in the BMJ, was undertaken by Dr Roger Staff at Aberdeen Royal Infirmary and the University of Aberdeen. It looked at 498 people born in 1936 who had taken part in a group intelligence test at the age of 11. This current study started when they were about 64 years old and they were recalled for memory and mental-processing-speed testing up to five times over a 15-year period. It found engagement in problem solving did not protect an individual from decline. However, engaging in intellectually stimulating activities on a regular basis was linked to level of mental ability in old age. The study uses modelling to look at associations and cannot prove any causal link. Also, many of the participants were unable to complete the whole study - some dropped out, others died. Previously, some studies have found that cognitive training can improve some aspects of memory and thinking, particularly for people who are middle-aged or older. They found so-called brain training may help older people to manage their daily tasks better. No studies have shown that brain training prevents dementia. And last year a report from the Global Council on Brain Health recommended that people should take part in stimulating activities such as learning a musical instrument, designing a quilt or gardening rather than brain training to help their brain function in later life. © 2018 BBC

Keyword: Alzheimers; Learning & Memory
Link ID: 25771 - Posted: 12.11.2018

By Benedict Carey In mid-October, researchers in California published a study of Civil War prisoners that came to a remarkable conclusion. Male children of abused war prisoners were about 10 percent more likely to die than their peers were in any given year after middle age, the study reported. The findings, the authors concluded, supported an “epigenetic explanation.” The idea is that trauma can leave a chemical mark on a person’s genes, which then is passed down to subsequent generations. The mark doesn’t directly damage the gene; there’s no mutation. Instead it alters the mechanism by which the gene is converted into functioning proteins, or expressed. The alteration isn’t genetic. It’s epigenetic. The field of epigenetics gained momentum about a decade ago, when scientists reported that children who were exposed in the womb to the Dutch Hunger Winter, a period of famine toward the end of World War II, carried a particular chemical mark, or epigenetic signature, on one of their genes. The researchers later linked that finding to differences in the children’s health later in life, including higher-than-average body mass. The excitement since then has only intensified, generating more studies — of the descendants of Holocaust survivors, of victims of poverty — that hint at the heritability of trauma. If these studies hold up, they would suggest that we genetically inherit some trace of our parents’ and even grandparents’ experience, particularly their suffering, which in turn modifies our own day-to-day health — and perhaps our children’s, too. But behind the scenes, the work has touched off a bitter dispute among researchers that could stunt the enterprise in its infancy. Critics contend that the biology implied by such studies simply is not plausible. Epigenetics researchers counter that their evidence is solid, even if the biology is not worked out. © 2018 The New York Times Company

Keyword: Epigenetics; Stress
Link ID: 25768 - Posted: 12.10.2018

By Ramin Skibba Even when you’re fluent in two languages, it can be a challenge to switch back and forth smoothly between them. It’s common to mangle a split verb in Spanish, use the wrong preposition in English or lose sight of the connection between the beginning and end of a long German sentence. So, does mastering a second language hone our multitasking skills or merely muddle us up? This debate has been pitting linguists and psychologists against one another since the 1920s, when many experts thought that bilingual children were fated to suffer cognitive impairments later in life. But the science has marched on. Psycholinguist Mark Antoniou of Western Sydney University in Australia argues that bilingualism — as he defines it, using at least two languages in your daily life — may benefit our brains, especially as we age. In a recent article, he addressed how best to teach languages to children and laid out evidence that multiple-language use on a regular basis may help delay the onset of Alzheimer’s disease. This conversation has been edited for length and clarity. Q: What are the benefits of bilingualism? A: The first main advantage involves what’s loosely referred to as executive function. This describes skills that allow you to control, direct and manage your attention, as well as your ability to plan. It also helps you ignore irrelevant information and focus on what’s important. Because a bilingual person has mastery of two languages, and the languages are activated automatically and subconsciously, the person is constantly managing the interference of the languages so that she or he doesn’t say the wrong word in the wrong language at the wrong time. The brain areas responsible for that are also used when you’re trying to complete a task while there are distractions. The task could have nothing to do with language; it could be trying to listen to something in a noisy environment or doing some visual task. 
The muscle memory developed from using two languages also can apply to different skills. © 1996-2018 The Washington Post

Keyword: Language; Alzheimers
Link ID: 25767 - Posted: 12.10.2018

Rhitu Chatterjee Researchers have traced a connection between some infections and mental illnesses like schizophrenia, depression and bipolar disorder. New research from Denmark bolsters that connection. The study, published Thursday in JAMA Psychiatry, shows that a wide variety of infections, even common ones like bronchitis, are linked to a higher risk of many mental illnesses in children and adolescents. The findings support the idea that infections affect mental health, possibly by influencing the immune system. "This idea that activation of the body's immune inflammatory system as a causative factor in ... select mental illnesses is one that has really caught on," says Dr. Roger McIntyre, a professor of psychiatry and pharmacology at the University of Toronto, who wasn't involved in the study. "This study adds to that generally, but builds the case further in a compelling way." In the new study, the researchers gathered data on hospitalizations and prescription medications for the 1.1 million children born in Denmark between Jan. 1, 1995, and June 30, 2012. "We could follow individuals from birth, so there was no missing information during the study period," says Dr. Ole Köhler-Forsberg of Aarhus University Hospital, a neuroscientist and one of the authors of the study. Köhler-Forsberg and his colleagues used two national registries — one to get data on hospitalizations because of severe infections like pneumonia and another for data on antimicrobial or antiparasitic medications prescribed to children for less severe infections. "Most of them are those infections that you and I and all others have experienced," says Köhler-Forsberg. © 2018 npr

Keyword: Development of the Brain; Schizophrenia
Link ID: 25756 - Posted: 12.06.2018

By Pam Belluck CHARLOTTE, N.C. — Steve Singer, who has bipolar and borderline personality disorders, knows when he’s on the verge of a mental health crisis. The female voice he hears incessantly in his head suddenly shuts up, and the hula hoop he gyrates while walking to the grocery store stops easing his anxieties. That’s when he gets to a hospital. Usually, talking briefly with a nurse or social worker calms him enough to return home. But this year a hospital placed him on a locked ward, took his phone, and had an armed guard watch him for 20 hours before a social worker spoke with him and released him. “I get the heebie-jeebies thinking about it,” said Mr. Singer, 60. “They didn’t help me, they hurt me.” Deeply upset, he turned to something he’d never known existed: He completed a psychiatric advance directive, a legal document declaring what treatment he does and doesn’t want. Increasingly, patients, advocates and doctors believe such directives (called PADs) could help transform the mental health system by allowing patients to shape their care even when they lose touch with reality. Hospitals must put them in patients’ medical records and doctors are expected to follow them unless they document that specific preferences aren’t in the patients’ best medical interest. As the pendulum has swung from institutionalization to outpatient care, psychiatric directives also offer a middle path by allowing patients to designate family members to speak for them when they’re too sick to do so themselves. But some doctors and hospitals are wary that the documents could tie their hands and discourage treatment they consider warranted. Some worry the directives won’t be updated to reflect medical advances. Others question whether people with serious psychiatric conditions are ever capable of lucidly completing such directives. “A decision based on erroneous information on a PAD, that can happen,” said Dr. Katayoun Tabrizi, a forensic psychiatrist at Duke. 
“This is not a cookbook.” © 2018 The New York Times Company

Keyword: Schizophrenia; Alzheimers
Link ID: 25749 - Posted: 12.04.2018

By Philip S. Gutis With a slow moving disease like Alzheimer’s, there’s still time for doubt. Perhaps the diagnosis is wrong and the memory holes and struggle for words are just normal aging. Deep in your psyche, there’s still a little spark of hope. But there comes a moment when denial is no longer an option. Like Alzheimer’s itself, the moment creeps up slowly, taking care to not give away too much too soon. My moment came recently, as I was walking past the Bucks County Playhouse in downtown New Hope, Pa. I correctly remembered that my husband, Tim, and I recently saw a show there. I even remembered who went with us. But I had no recollection of what show I had seen. Tim reminded me that it was “Guys and Dolls,” but the memory wasn’t there. No songs, no story, no scenes. Nothing at all. The next morning, I sat quietly on my bed. “Tim,” I said, “It’s coming, isn’t it?” Without asking what I meant, Tim gently said, “Yes, it’s coming.” I cried, of course, but just a little. I’ve known, obviously, that change is coming. I’ve been tested, prodded, injected and studied for well over two years as part of a clinical trial. But looking back, I realize that I’ve still harbored a shadow of doubt. The shadow is gone. The spark of hope has been extinguished. Now we have to seriously plan for the future. Alzheimer’s will continue to steal from me, and, unless there’s an unlikely medical miracle, nothing is going to stop the creeping loss. Loss of memory. Loss of mobility. Loss of freedom. Despite this, I haven’t thrown in the towel. Deep down, I know there’s much more life to live, much more time to fight and to love. The years since my diagnosis haven’t been all bad. A few months after we learned the news, my partner of 12 years and I went to the county courthouse to get married. My sister and my nieces and nephew joined us and took pictures as we kissed for the first time as a married couple and fulfilled the Jewish tradition of breaking a glass for good luck. 
© 2018 The New York Times Company

Keyword: Alzheimers
Link ID: 25748 - Posted: 12.04.2018

By Kara Manke A single season of high school football may be enough to cause microscopic changes in the structure of the brain, according to a new study by researchers at UC Berkeley, Duke University and the University of North Carolina at Chapel Hill. The researchers used a new type of magnetic resonance imaging (MRI) to take brain scans of 16 high school players, ages 15 to 17, before and after a season of football. They found significant changes in the structure of the grey matter in the front and rear of the brain, where impacts are most likely to occur, as well as changes to structures deep inside the brain. All participants wore helmets, and none received head impacts severe enough to constitute a concussion. The study, which is the cover story of the November issue of Neurobiology of Disease, is one of the first to look at how impact sports affect the brains of children at this critical age. © 2018 UC Regents

Keyword: Brain Injury/Concussion; Development of the Brain
Link ID: 25740 - Posted: 12.01.2018

Ashley Westerman A single season playing football might be all it takes to change a young athlete's brain. Those are the preliminary findings of research presented this week in Chicago at the annual meeting of the Radiological Society of North America. Researchers used special MRI methods to look at nerve bundles in the brains of 26 young male football players, average age 12, before and after one season. Twenty-six more young males who didn't play football also got MRI scans at the same time to be used as a control group. In the youths who played football, the researchers found that nerve fibers in their corpus callosum — the band that connects the two halves of the brain — changed over the season, says lead study author Jeongchul Kim, a research associate in the Radiology Informatics and Imaging Laboratory at Wake Forest School of Medicine in Winston-Salem, N.C. "We applied here two different imaging approaches," he says. One analyzed the shape of the nerve fibers and the other focused on the integrity of the nerves. Kim says the researchers found some nerve bundles grew longer and other bundles became shorter, or contracted, after the players' initial MRI scans at the beginning of the season. He says they saw no changes in the integrity of the bundles. The team says these results suggest that repeated blows to the head could lead to changes in the shape of the corpus callosum, which is critical to integrating cognitive, motor and sensory functions between the two hemispheres of the brain, during a critical time for brain development in young people. © 2018 npr

Keyword: Brain Injury/Concussion; Development of the Brain
Link ID: 25739 - Posted: 12.01.2018

By Mitch Leslie Unlike most cells in our bodies, the neurons in our brain can scramble their genes, scientists have discovered. This genome tampering may expand the brain’s protein repertoire, but it may also promote Alzheimer’s disease, their study suggests. “It’s potentially one of the biggest discoveries in molecular biology in years,” says Geoffrey Faulkner, a molecular biologist at the University of Queensland in Brisbane, Australia, who wasn’t connected to the research. “It is a landmark study,” agrees clinical neurologist Christos Proukakis of University College London. Scientists first discovered that certain cells could shuffle and edit DNA in the 1970s. Some immune cells snip out sections of genes that code for proteins that detect or fight pathogens and splice the remaining pieces together to create new varieties. Our B cells, for example, can potentially spawn about 1 quadrillion types of antibodies, enough to fend off an enormous range of bacteria, viruses, and other attackers. Scientists have seen hints that such genomic reshuffling—known as somatic recombination—happens in our brain. Neurons there often differ dramatically from one another. They often have more DNA or different genetic sequences than the cells around them. To look for definitive evidence of somatic recombination in the brain, neuroscientist Jerold Chun of the Sanford Burnham Prebys Medical Discovery Institute in San Diego, California, and colleagues analyzed neurons from the donated brains of six healthy elderly people and seven patients who had the noninherited form of Alzheimer’s disease, which accounts for most cases. The researchers tested whether the cells harbored different versions of the gene for the amyloid precursor protein (APP), the source of the plaques in the brains of people with Alzheimer’s disease. 
APP’s gene was a good candidate to examine, the researchers thought, because one of their previous studies suggested neurons from patients with Alzheimer’s disease can harbor extra copies of the gene, an increase that could arise from somatic recombination. © 2018 American Association for the Advancement of Science

Keyword: Alzheimers
Link ID: 25716 - Posted: 11.24.2018

Sara Reardon Drug companies have spent billions of dollars searching for therapies to reverse or significantly slow Alzheimer’s disease, to no avail. Some researchers argue that the best way to make progress is to create better animal models for research, and several teams are now developing mice that more closely simulate how the disease devastates people’s brains. The US National Institutes of Health (NIH), the UK Dementia Research Institute and Jackson Laboratory (JAX) — one of the world’s biggest suppliers of lab mice — are among the groups trying to genetically engineer more sophisticated rodents. Scientists are also probing the complex web of mutations that influences neurological decline in mice and people. “We appreciate that the models we had were insufficient,” says Bruce Lamb, a neuroscientist at Indiana University in Indianapolis who directs the NIH-funded programme. “I think it’s sort of at a critical juncture right now.” Alzheimer’s is marked by cognitive impairment and the build-up of amyloid-protein plaques in the brains of people, but the disease does not occur naturally in mice. Scientists get around this by studying mice that have been genetically modified to produce high levels of human amyloid protein. These mice develop plaques in their brains, but they still do not display the memory problems seen in people. Many experimental drugs that have successfully removed plaques from mouse brains have not lessened the symptoms of Alzheimer’s disease in people. One high-profile stumble came last month, when three companies reported that their Alzheimer’s drugs — from a class called BACE inhibitors — had failed in large, late-stage clinical trials. Although the drugs successfully blocked the accumulation of amyloid protein in mice, they seemed to worsen cognitive decline and brain shrinkage in people. © 2018 Springer Nature Limited.

Keyword: Alzheimers
Link ID: 25715 - Posted: 11.24.2018

Shawna Williams In 1987, political scientist James Flynn of the University of Otago in New Zealand documented a curious phenomenon: broad intelligence gains in multiple human populations over time. Across 14 countries where decades’ worth of average IQ scores of large swaths of the population were available, all had upward swings—some of them dramatic. Children in Japan, for example, gained an average of 20 points on a test known as the Wechsler Intelligence Scale for Children between 1951 and 1975. In France, the average 18-year-old man performed 25 points better on a reasoning test in 1974 than did his 1949 counterpart.1 Flynn initially suspected the trend reflected faulty tests. Yet in the ensuing years, more data and analyses supported the idea that human intelligence was increasing over time. Proposed explanations for the phenomenon, now known as the Flynn effect, include increasing education, better nutrition, greater use of technology, and reduced lead exposure, to name but four. Beginning with people born in the 1970s, the trend has reversed in some Western European countries, deepening the mystery of what’s behind the generational fluctuations. But no consensus has emerged on the underlying cause of these trends. A fundamental challenge in understanding the Flynn effect is defining intelligence. At the dawn of the 20th century, English psychologist Charles Spearman first observed that people’s average performance on a variety of seemingly unrelated mental tasks—judging whether one weight is heavier than another, for example, or pushing a button quickly after a light comes on—predicts our average performance on a completely different set of tasks. Spearman proposed that a single measure of general intelligence, g, was responsible for that commonality. © 1986 - 2018 The Scientist

Keyword: Intelligence; Learning & Memory
Link ID: 25714 - Posted: 11.24.2018

By Pam Belluck It’s a rare person in America who doesn’t know of someone with Alzheimer’s disease. The most common type of dementia, it afflicts about 44 million people worldwide, including 5.5 million in the United States. Experts predict those numbers could triple by 2050 as the older population increases. So why is there still no effective treatment for it, and no proven way to prevent or delay its effects? Why is there still no comprehensive understanding of what causes the disease or who is destined to develop it? The answer, you could say, is: “It’s complicated.” And that is certainly part of it. For nearly two decades, researchers, funding agencies and clinical trials have largely focused on one strategy: trying to clear the brain of the clumps of beta amyloid protein that form the plaques integrally linked to the disease. But while some drugs have reduced the accumulation of amyloid, none have yet succeeded in stopping or reversing dementia. And amyloid doesn’t explain everything about Alzheimer’s — not everyone with amyloid plaques has the disease. “It’s not that amyloid is not an important factor,” said Dr. John Morris, director of the Knight Alzheimer’s Disease Research Center at the Washington University School of Medicine in St. Louis. “On the other hand, we’ve had some 200-plus trials since 2001 that have been negative.” Not all trials have targeted amyloid. Some have focused on tau, a protein that, in Alzheimer’s, forms threads that stick together in tangles inside neurons, sandbagging their communications with one another. Tau tangles seem to spread after amyloid accumulates into plaques between neurons. But so far, anti-tau drugs haven’t successfully attacked Alzheimer’s itself. Only five drugs have been approved to treat this dementia, but they address early symptoms and none have been shown to work very well for very long. It’s been 15 years since the last one was approved. © 2018 The New York Times Company

Keyword: Alzheimers
Link ID: 25704 - Posted: 11.20.2018

By Perri Klass, M.D. More than 30 years ago, I went to a parent meeting at my oldest child’s day care center, when he was in the 2-year-old room, and it turned out that many of the children in the room were not reliably sleeping through the night. It felt like a revelation, discovering that mine was not the only child who occasionally — or regularly — woke in the night and needed some attention. In our family, we had come to terms with this, and we had managed to make — and generally keep — some rules: no food, no drink, no coming out of the crib, but yes, once a night one of your parents is willing to stagger down the hall, look in on you, rub your back and say something like, “We haven’t moved away and left you, now go back to sleep.” (Or maybe sometimes it was, “Go back to sleep or we will move away and leave you,” but that is lost in the mists of history.) It wasn’t ideal, but we were managing. In the current issue of the journal Pediatrics, researchers describe a study of almost 400 mothers in Canada who were asked to report: “During the night, how many consecutive hours does your child sleep without waking up?” The researchers took six or eight hours of uninterrupted sleep as definitions of “sleeping through the night.” They found that at 6 months of age, 62.4 percent of mothers reported that their infants slept for 6 hours or more at a stretch, and only 43 percent of the mothers reported eight-hour blocks of consecutive sleep. At 12 months, 72.1 percent of the mothers reported six hours of consecutive sleep, and 56.6 percent reported eight hours; since all infants wake several times a night, those who were reported as sleeping consecutively presumably awoke and went back to sleep by themselves without the mothers knowing it. So by these criteria, a significant number of the babies were not “sleeping through the night” at 6 months, and even at 12 months. 
At some time points, girls were more likely to sleep for longer periods than boys, but at other times there was no significant difference. © 2018 The New York Times Company

Keyword: Sleep; Development of the Brain
Link ID: 25698 - Posted: 11.19.2018

Sara Reardon ‘Mini brains’ grown in a dish have spontaneously produced human-like brain waves for the first time — and the electrical patterns look similar to those seen in premature babies. The advancement could help scientists to study early brain development. Research in this area has been slow, partly because it is difficult to obtain fetal-tissue samples for analysis and nearly impossible to examine a fetus in utero. Many researchers are excited about the promise of these ‘organoids’, which, when grown as 3D cultures, can develop some of the complex structures seen in brains. But the technology also raises questions about the ethics of creating miniature organs that could develop consciousness. A team of researchers led by neuroscientist Alysson Muotri of the University of California, San Diego, coaxed human stem cells to form tissue from the cortex — a brain region that controls cognition and interprets sensory information. They grew hundreds of brain organoids in culture for 10 months, and tested individual cells to confirm that they expressed the same collection of genes seen in typical developing human brains1. The group presented the work at the Society for Neuroscience meeting in San Diego this month. Muotri and his colleagues continuously recorded electrical patterns, or electroencephalogram (EEG) activity, across the surface of the mini brains. By six months, the organoids were firing at a higher rate than other brain organoids previously created, which surprised the team. © 2018 Springer Nature Limited.

Keyword: Development of the Brain; Epilepsy
Link ID: 25694 - Posted: 11.16.2018

Tina Hesman Saey Whether people prefer coffee or tea may boil down to a matter of taste genetics. People with a version of a gene that increases sensitivity to the bitter flavor of caffeine tend to be coffee drinkers, researchers report online November 15 in Scientific Reports. Tea drinkers tended to be less sensitive to caffeine’s bitter taste, but have versions of genes that increase sensitivity to the bitterness of other chemicals, the researchers found. It’s long been thought that people avoid eating bitter foods because bitterness is an indicator of poison, says John Hayes, a taste researcher at Penn State who was not involved in the study. The coffee and tea findings help challenge that “overly simplistic ‘bitter is always bad, let’s avoid it’” view, he says. In the new study, researchers examined DNA variants of genes involved in detecting the bitter taste of the chemicals caffeine, quinine — that bitter taste in tonic water — and propylthiouracil (PROP), a synthetic chemical not naturally found in food or drink. Other bitter components naturally in coffee and tea may trigger the same taste responses as quinine and PROP do, Hayes says. Researchers in Australia, the United States and England examined DNA from more than 400,000 participants in the UK Biobank, a repository of genetic data for medical research. Participants also reported other information about their health and lifestyle, including how much tea or coffee they drink each day. © Society for Science & the Public 2000 - 2018

Keyword: Chemical Senses (Smell & Taste); Genes & Behavior
Link ID: 25693 - Posted: 11.16.2018

By Kelly Servick SAN DIEGO, CALIFORNIA—If a diseased or injured brain has lost neurons, why not ask other cells to change jobs and pick up the slack? Several research teams have taken a first step by "reprogramming" abundant nonneuronal cells called astrocytes into neurons in the brains of living mice. "Everybody is astonished, at the moment, that it works," says Nicola Mattugini, a neurobiologist at Ludwig Maximilian University in Munich, Germany, who presented the results of one such experiment here at the annual meeting of the Society for Neuroscience last week. Now, labs are turning to the next questions: Do these neurons function like the lost ones, and does creating neurons at the expense of astrocytes do brain-damaged animals any good? Many researchers remain skeptical on both counts. But Mattugini's team, led by neuroscientist Magdalena Götz, and two other groups presented evidence at the meeting that reprogrammed astrocytes do, at least in some respects, impersonate the neurons they're meant to replace. The two other groups also shared evidence that reprogrammed astrocytes help mice recover movement lost after a stroke. Some see the approach as a potential alternative to transplanting stem cells (or stem cell–derived neurons) into the damaged brain or spinal cord. Clinical trials of that strategy are already underway for conditions including Parkinson's disease and spinal cord injury. But Gong Chen, a neuroscientist at Pennsylvania State University in State College, says he got disillusioned with the idea after finding in his rodent experiments that transplanted cells produced relatively few neurons, and those few weren't fully functional. The recent discovery that mature cells can be nudged toward new fates pointed to a better approach, he says. His group and others took aim at the brain's most abundant cell, the star-shaped astrocyte. © 2018 American Association for the Advancement of Science

Keyword: Stem Cells; Glia
Link ID: 25687 - Posted: 11.15.2018

David Cyranoski Japanese neurosurgeons have implanted ‘reprogrammed’ stem cells into the brain of a patient with Parkinson’s disease for the first time. The condition is only the second for which a therapy has been trialled using induced pluripotent stem (iPS) cells, which are developed by reprogramming the cells of body tissues such as skin so that they revert to an embryonic-like state, from which they can morph into other cell types. Scientists at Kyoto University used the technique to transform iPS cells into precursors to the neurons that produce the neurotransmitter dopamine. A shortage of dopamine-producing neurons in people with Parkinson’s disease can lead to tremors and difficulty walking. In October, neurosurgeon Takayuki Kikuchi at Kyoto University Hospital implanted 2.4 million dopamine precursor cells into the brain of a patient in his 50s. In the three-hour procedure, Kikuchi’s team deposited the cells into 12 sites known to be centres of dopamine activity. Dopamine precursor cells have been shown to improve symptoms of Parkinson’s disease in monkeys. Stem-cell scientist Jun Takahashi and colleagues at Kyoto University derived the dopamine precursor cells from a stock of iPS cells stored at the university. These were developed by reprogramming skin cells taken from an anonymous donor. “The patient is doing well and there have been no major adverse reactions so far,” says Takahashi. The team will observe him for six months and, if no complications arise, will implant another 2.4 million dopamine precursor cells into his brain. © 2018 Springer Nature Limited

Keyword: Parkinsons; Stem Cells
Link ID: 25682 - Posted: 11.14.2018

Adriana Galván Healthy sleep leads to healthy brains. Neuroscientists have gotten that message out. But parents, doctors and educators alike have struggled to identify what to do to improve sleep. Some have called for delaying school start times or limiting screen time before bed to achieve academic, health and even economic gains. Still, recent estimates suggest that roughly half of adolescents in the United States are sleep-deprived. These numbers are alarming because sleep is particularly important during adolescence, a time of significant brain changes that affect learning, self-control and emotional systems. And sleep deficits are even greater in economically disadvantaged youth than in their more affluent counterparts. Research from my developmental neuroscience lab shows one solution to the sleep deprivation problem that is deceptively simple: provide teens with a good pillow. Because getting comfortable bedding does not involve technology, expensive interventions or lots of time, it may be particularly beneficial for improving sleep among under-resourced adolescents. Studies in my lab have shown that seemingly small differences in the quality and duration of sleep make a difference in how the brain processes information. Sleep acts like a glue that helps the brain encode recently learned information into long-term knowledge. It also improves focus in school because sleep helps dampen hyperactive behavior, strong emotional reactions and squirminess. This means that students who are normally dismissed from the classroom for disruptive behavior are more likely to stay in class if they’re not sleep-deprived. More time in the class leads to more learning. © 2010–2018, The Conversation US, Inc.

Keyword: Sleep; Development of the Brain
Link ID: 25681 - Posted: 11.14.2018