Chapter 7. Life-Span Development of the Brain and Behavior




Joseph Jebelli The terror of Alzheimer’s is that it acts by degrees, and can therefore bewilder family members as much as its victims. Those who first notice the onset of Alzheimer’s in a loved one tell of forgotten names and unsettling behaviour, of car keys found in the fridge and clothing in the kitchen cabinet, of aimless wanderings. Naturally, they want to understand the boundaries of normal ageing and whether these are being crossed. Often, the answer arrives when they’re greeted as complete strangers, when the patient’s mind becomes irrevocably unmoored from its past. The disease is terrifying for its insidiousness as well as its long-term manifestations. Fear partly explains why Alzheimer’s has been ignored for so long. Yet it is now the leading cause of death among the oldest people, and according to Professor Sir Michael Marmot, an expert in health inequalities, it could be an “important part” of the stagnation in increases in life expectancy since 2010 that he has identified. As a researcher, I have been struck by how many patients speak openly about their condition only after receiving a diagnosis. “I knew something wasn’t right. Sometimes I don’t know what day of the week it is or what I have to do,” one newly diagnosed patient told me. “I look in my calendar but then I think: why am I looking at this? My husband was the one who made me see a GP. I was too frightened. I thought I might have it but I didn’t want to hear it.” © 2017 Guardian News and Media Limited

Keyword: Alzheimers
Link ID: 23862 - Posted: 07.22.2017

By Becca Cudmore A mother rat’s care for her pup reaches all the way into her offspring’s DNA. A young rat that gets licked and groomed a lot early on in life exhibits diminished responses to stress thanks to epigenetic changes in the hippocampus, a brain region that helps transform emotional information into memory. Specifically, maternal solicitude reduces DNA methylation and changes the structure of DNA-packaging proteins, triggering an uptick in the recycling of the neurotransmitter serotonin and the upregulation of the glucocorticoid receptor. These changes make the nurtured rat’s brain quicker to sense and tamp down the production of stress hormones in response to jarring experiences such as unexpected sound and light. That pup will likely grow into a calm adult, and two studies have shown that female rats who exhibit a dampened stress response are more likely to generously lick, groom, and nurse their own young. Caring for pups is one example of what casual observers of behavior might call an animal’s instinct—generally considered to be an innate, genetically encoded phenomenon. But could such epigenetic changes, when encoded as ancestral learning, also be at the root of maternal care and other seemingly instinctual behaviors we see across the animal kingdom? “We don’t have a general theory for the mechanics of instinct as we do for learning, and this is something that has troubled me for a very long time,” says University of Illinois entomologist Gene Robinson. He studies social evolution in the Western honey bee and recently coauthored a perspective piece in Science together with neurobiologist Andrew Barron of Macquarie University in Sydney, Australia, suggesting methylation as a possible mechanism for the transgenerational transmission of instinctual behavior, rather than those behaviors being hardwired in the genome (356:26-27, 2017). 
Robinson and Barron suggest that instinctual traits, such as honey bees’ well-known waggle dance or a bird’s in-born ability to sing its species’ songs, are the result of traits first learned by their ancestors and inherited across generations by the process of methylation. This differs from classical thoughts on animal learning, which say that if a behavior is learned, it is not innate, and will not be inherited. © 1986-2017 The Scientist

Keyword: Epigenetics; Evolution
Link ID: 23861 - Posted: 07.22.2017

By Fergus Walsh, Medical correspondent

One in three cases of dementia could be prevented if more people looked after their brain health throughout life, according to an international study in the Lancet. It lists nine key risk factors, including lack of education, hearing loss, smoking and physical inactivity. The study is being presented at the Alzheimer's Association International Conference in London. By 2050, 131 million people could be living with dementia globally. There are estimated to be 47 million people with the condition at the moment.

Nine factors that contribute to the risk of dementia:
- Mid-life hearing loss - responsible for 9% of the risk
- Failing to complete secondary education - 8%
- Smoking - 5%
- Failing to seek early treatment for depression - 4%
- Physical inactivity - 3%
- Social isolation - 2%
- High blood pressure - 2%
- Obesity - 1%
- Type 2 diabetes - 1%

These risk factors - which are described as potentially modifiable - add up to 35%. The other 65% of dementia risk is thought to be potentially non-modifiable. (Source: Lancet Commission on dementia prevention, intervention and care.)

"Although dementia is diagnosed in later life, the brain changes usually begin to develop years before," said lead author Prof Gill Livingston, from University College London. "Acting now will vastly improve life for people with dementia and their families and, in doing so, will transform the future of society." The report, which combines the work of 24 international experts, says lifestyle factors can play a major role in increasing or reducing an individual's dementia risk. It examines the benefits of building a "cognitive reserve", which means strengthening the brain's networks so it can continue to function in later life despite damage.

[Image caption: Eve Laird is taking part in a study on how to prevent dementia] © 2017 BBC
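As a quick arithmetic check on the figures quoted above, the nine attributable-risk percentages do sum to the 35% "potentially modifiable" total reported by the Lancet Commission; a minimal Python tally (labels paraphrased from the list above):

```python
# Attributable-risk percentages for dementia, as quoted from the
# Lancet Commission figures above.
risk_factors = {
    "Mid-life hearing loss": 9,
    "Incomplete secondary education": 8,
    "Smoking": 5,
    "Untreated depression": 4,
    "Physical inactivity": 3,
    "Social isolation": 2,
    "High blood pressure": 2,
    "Obesity": 1,
    "Type 2 diabetes": 1,
}

modifiable = sum(risk_factors.values())
print(f"Potentially modifiable: {modifiable}%")        # 35%
print(f"Potentially non-modifiable: {100 - modifiable}%")  # 65%
```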

Keyword: Alzheimers
Link ID: 23856 - Posted: 07.21.2017

Laurel Hamers A common blood sugar medication or an extra dose of a thyroid hormone can reverse signs of cognitive damage in rats exposed in utero to alcohol. Both affect an enzyme that controls memory-related genes in the hippocampus, researchers report July 18 in Molecular Psychiatry. That insight might someday help scientists find an effective human treatment for fetal alcohol spectrum disorders, which can cause lifelong problems with concentration, learning and memory. “At this moment, there’s really no pharmaceutical therapy,” says R. Thomas Zoeller, a neurobiologist at the University of Massachusetts Amherst. Fetal alcohol spectrum disorders may affect up to 5 percent of U.S. kids, according to estimates from the U.S. Centers for Disease Control and Prevention. Scientists don’t know exactly why alcohol has such a strong effect on developing brains. But the lower thyroid hormone levels commonly induced by alcohol exposure might be one explanation, suggests study coauthor Eva Redei, a psychiatrist at Northwestern University Feinberg School of Medicine in Chicago. “The mother has to supply the thyroid hormones for brain development,” says Redei. So, pregnant women who drink might not be providing their fetuses with enough hormones for normal brain development. That could disrupt the developing hippocampus, a brain region involved in learning and memory. To counter alcohol’s effects, Redei and her colleagues gave doses of thyroxine, a thyroid hormone, to newborn rats that had been exposed to alcohol before birth. (That timing coincides developmentally with the third trimester of pregnancy in humans.) The amount of alcohol fed to the rat moms corresponded roughly to a woman drinking a glass or two of wine a day. |© Society for Science & the Public 2000 - 2017

Keyword: Drug Abuse; Development of the Brain
Link ID: 23849 - Posted: 07.19.2017

By Tara Bahrampour A significant portion of people with mild cognitive impairment or dementia who are taking medication for Alzheimer’s may not actually have the disease, according to interim results of a major study currently underway to see how PET scans could change the nature of Alzheimer’s diagnosis and treatment. The findings, presented Wednesday at the Alzheimer’s Association International Conference in London, come from a four-year study launched in 2016 that is testing over 18,000 Medicare beneficiaries with MCI or dementia to see if their brains contain the amyloid plaques that are one of the two hallmarks of the disease. So far, the results have been dramatic. Among 4,000 people tested so far in the Imaging Dementia-Evidence for Amyloid Scanning (IDEAS) study, researchers from the Memory and Aging Center at the University of California, San Francisco found that just 54.3 percent of MCI patients and 70.5 percent of dementia patients had the plaques. A positive test for amyloid does not mean someone has Alzheimer’s, though its presence precedes the disease and increases the risk of progression. But a negative test definitively means a person does not have it. The findings could change the way doctors treat people in these hard-to-diagnose groups and save money currently being spent on inappropriate medication. “If someone had a putative diagnosis of Alzheimer’s disease, they might be on an Alzheimer’s drug like Aricept or Namenda,” said James Hendrix, the Alzheimer’s Association’s director of global science initiatives, who co-presented the findings. “What if they had a PET scan and it showed that they didn’t have amyloid in their brain? Their physician would take them off that drug and look for something else.” © 1996-2017 The Washington Post

Keyword: Alzheimers; Brain imaging
Link ID: 23848 - Posted: 07.19.2017

Shirley S. Wang In nursing homes and residential facilities around the world, health care workers are increasingly asking dementia patients questions: What are your interests? How do you want to address us? What should we do to celebrate the life of a friend who has passed away? The questions are part of an approach to care aimed at giving people with memory loss and other cognitive problems a greater sense of control and independence. At its core is the idea that an individual with dementia should be treated as a whole person and not "just" a patient. Scientists sometimes call this approach an ecopsychosocial intervention. The goal is to create environments that better meet patients' psychological and emotional needs through strategies other than medication. At the Alzheimer's Association International Conference this week in London, researchers from the U.S., the U.K. and Israel presented data from four trials demonstrating that such interventions significantly improve residents' mood and quality of life. The interventions can also reduce their use of antipsychotic drugs and improve their ability to care for themselves. Taken together, these studies and others suggest that relatively simple and potentially cost-effective interventions can yield significant benefits for people with dementia, even those in residential facilities in the later stages of disease. As the population continues to age, and the number of people with dementia continues to rise, these interventions are likely to increase in importance as well. © 2017 npr

Keyword: Alzheimers; Attention
Link ID: 23847 - Posted: 07.19.2017

Jon Hamilton Harsh life experiences appear to leave African-Americans vulnerable to Alzheimer's and other forms of dementia, researchers reported Sunday at the Alzheimer's Association International Conference in London. Several teams presented evidence that poverty, disadvantage and stressful life events are strongly associated with cognitive problems in middle age and dementia later in life among African-Americans. The findings could help explain why African-Americans are twice as likely as white Americans to develop dementia. And the research suggests genetic factors are not a major contributor. "The increased risk seems to be a matter of experience rather than ancestry," says Megan Zuelsdorff, a postdoctoral fellow in the Health Disparities Research Scholars Program at the University of Wisconsin-Madison. Scientists have struggled to understand why African-Americans are so likely to develop dementia. They are more likely to have conditions like high blood pressure and diabetes, which can affect the brain. And previous research has found some evidence that African-Americans are more likely to carry genes that raise the risk. But more recent studies suggest those explanations are incomplete, says Rachel Whitmer, an epidemiologist with Kaiser Permanente's Division of Research in Northern California. Whitmer has been involved in several studies that accounted for genetic and disease risks when comparing dementia in white and black Americans. "And we still saw these [racial] differences," she says. "So there is still something there that we are trying to get at." © 2017 npr

Keyword: Alzheimers; Stress
Link ID: 23841 - Posted: 07.17.2017

Nicola Davis People who experience hearing loss could be at greater risk of memory and thinking problems later in life than those without auditory issues, research suggests. The study focused on people who were at risk of Alzheimer’s disease, revealing that those who were diagnosed with hearing loss had a higher risk of “mild cognitive impairment” four years later. “It’s really not mild,” said Clive Ballard, professor of age-related disease at the University of Exeter. “They are in the lowest 5% of cognitive performance and about 50% of those individuals will go on to develop dementia.” Presented at the Alzheimer’s Association International Conference in London, researchers from the US looked at the memory and thinking skills of 783 cognitively healthy participants in late middle age, more than two-thirds of whom had at least one parent who had been diagnosed with Alzheimer’s disease. The team carried out a range of cognitive tests on the participants over a four-year period, aimed at probing memory and mental processing, revealing that those who had hearing loss at the start of the study were more than twice as likely to be found to have mild cognitive impairment four years later than those with no auditory problems, once a variety of other risk factors were taken into account. Taylor Fields, a PhD student at the University of Wisconsin who led the research, said that the findings suggest hearing loss could be an early warning sign that an individual might be at greater risk of future cognitive impairment - but added more research was necessary to unpick the link. “There is something here and it should be looked into,” she said. © 2017 Guardian News and Media Limited

Keyword: Hearing; Alzheimers
Link ID: 23840 - Posted: 07.17.2017

Pagan Kennedy In 2011, Ben Trumble emerged from the Bolivian jungle with a backpack containing hundreds of vials of saliva. He had spent six weeks following indigenous men as they tramped through the wilderness, shooting arrows at wild pigs. The men belonged to the Tsimane people, who live as our ancestors did thousands of years ago — hunting, foraging and farming small plots of land. Dr. Trumble had asked the men to spit into vials a few times a day so that he could map their testosterone levels. In return, he carried their kills and helped them field-dress their meat — a sort of roadie to the hunters. Dr. Trumble wanted to find out whether the hunters who successfully shot an animal would be rewarded with a spike in testosterone. (They were.) As a researcher with the Tsimane Health and Life History Project, he had joined a long-running investigation into human well-being and aging in the absence of industrialization. That day when he left the jungle, he stumbled across a new and more urgent question about human health. He dropped his backpack, called his mom and heard some terrible news: His 64-year-old uncle had learned he had dementia, probably Alzheimer’s. In just a few short years, his uncle, a vibrant former lawyer, would stop speaking, stop eating and die. “I couldn’t help my uncle,” Dr. Trumble said, but he was driven to understand the disease that killed him. He wondered: Do the Tsimane suffer from Alzheimer’s disease like we do? And if not, what can we learn from them about treating or preventing dementia? “There is really no cure yet for Alzheimer’s,” Dr. Trumble told me. “We have nothing that can undo the damage already done.” Why, he wondered, had billions of dollars and decades of research yielded so little? Perhaps major clues were being missed. Dr. Trumble was trained as an anthropologist, and his field — evolutionary medicine — taught him to see our surroundings as a blip in the timeline of human history. 
He thinks it’s a problem that medical research focuses almost exclusively on “people who live in cities like New York or L.A.” Scientists often refer to these places as “Weird” — Western, educated, industrialized, rich and democratic — and point out that our bodies are still designed for the not-Weird environment in which our species evolved. Yet we know almost nothing about how dementia affected humans during the 50,000 years before developments like antibiotics and mechanized farming. Studying the Tsimane, Dr. Trumble believes, could shed light on this modern plague. © 2017 The New York Times Company

Keyword: Alzheimers
Link ID: 23839 - Posted: 07.14.2017

By PAM BELLUCK How we look at other people’s faces is strongly influenced by our genes, scientists have found in new research that may be especially important for understanding autism because it suggests that people are born with neurological differences that affect how they develop socially. The study, published on Wednesday in the journal Nature, adds new pieces to the nature-versus-nurture puzzle, suggesting that genetics underlie how children seek out formative social experiences like making eye contact or observing facial expressions. Experts said the study may also provide a road map for scientists searching for genes linked to autism. “These are very convincing findings, novel findings,” said Charles A. Nelson III, a professor of pediatrics and neuroscience at Harvard Medical School and Boston Children’s Hospital, who was not involved in the research. “They seem to suggest that there’s a genetic underpinning that leads to different patterns of brain development, that leads some kids to develop autism.” Dr. Nelson, an expert in child development and autism who was an independent reviewer of the study for Nature, said that while autism is known to have a genetic basis, how specific genes influence autism’s development remains undetermined. The study provides detailed data on how children look at faces, including which features they focus on and when they move their eyes from one place to another. The information, Dr. Nelson said, could help scientists “work out the circuitry that controls these eye movements, and then we ought to be able to work out which genes are being expressed in that circuit.” “That would be a big advance in autism,” he said. In the study, scientists tracked the eye movements of 338 toddlers while they watched videos of motherly women as well as of children playing in a day care center. 
The toddlers, 18 months to 24 months old, included 250 children who were developing normally (41 pairs of identical twins, 42 pairs of nonidentical twins and 84 children unrelated to each other). There were also 88 children with autism. © 2017 The New York Times Company

Keyword: Autism; Vision
Link ID: 23832 - Posted: 07.13.2017

Carina Storrs In the late 1960s, a team of researchers began doling out a nutritional supplement to families with young children in rural Guatemala. They were testing the assumption that providing enough protein in the first few years of life would reduce the incidence of stunted growth. It did. Children who got supplements grew 1 to 2 centimetres taller than those in a control group. But the benefits didn't stop there. The children who received added nutrition went on to score higher on reading and knowledge tests as adolescents, and when researchers returned in the early 2000s, women who had received the supplements in the first three years of life completed more years of schooling and men had higher incomes1. “Had there not been these follow-ups, this study probably would have been largely forgotten,” says Reynaldo Martorell, a specialist in maternal and child nutrition at Emory University in Atlanta, Georgia, who led the follow-up studies. Instead, he says, the findings made financial institutions such as the World Bank think of early nutritional interventions as long-term investments in human health. Since the Guatemalan research, studies around the world — in Brazil, Peru, Jamaica, the Philippines, Kenya and Zimbabwe — have all associated poor or stunted growth in young children with lower cognitive test scores and worse school achievement2. A picture slowly emerged that being too short early in life is a sign of adverse conditions — such as poor diet and regular bouts of diarrhoeal disease — and a predictor for intellectual deficits and mortality. But not all stunted growth, which affects an estimated 160 million children worldwide, is connected with these bad outcomes. Now, researchers are trying to untangle the links between growth and neurological development. Is bad nutrition alone the culprit? What about emotional neglect, infectious disease or other challenges? © 2017 Macmillan Publishers Limited

Keyword: Development of the Brain
Link ID: 23831 - Posted: 07.13.2017

By Jane C. Hu In English the sky is blue, and the grass is green. But in Vietnamese there is just one color category for both sky and grass: xanh. For decades cognitive scientists have pointed to such examples as evidence that language largely determines how we see color. But new research with four- to six-month-old infants indicates that long before we learn language, we see up to five basic categories of hue—a finding that suggests a stronger biological component to color perception than previously thought. The study, published recently in the Proceedings of the National Academy of Sciences USA, tested the color-discrimination abilities of more than 170 British infants. Researchers at the University of Sussex in England measured how long babies spent gazing at color swatches, a metric known as looking time. First the team showed infants one swatch repeatedly until their looking time decreased—a sign they had grown bored with it. Then the researchers showed them a different swatch and noted their reaction. Longer looking times were interpreted to mean the babies considered the second swatch to be a new hue. Their cumulative responses showed that they distinguished among five colors: red, green, blue, purple and yellow. The finding “suggests we’re all working from the same template,” explains lead author Alice Skelton, a doctoral student at Sussex. “You come prepackaged to make [color] distinctions, but given your culture and language, certain distinctions may or may not be used.” For instance, infants learning Vietnamese most likely see green and blue, even if their native language does not use distinct words for the two colors. © 2017 Scientific American
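The habituation-based looking-time procedure described above can be sketched as a toy decision rule. This is an illustrative sketch only: the three-trial baseline and the 0.5 drop-off criterion are assumptions chosen for the example, not parameters reported by the Sussex team.

```python
def habituated(looking_times, criterion=0.5):
    """Illustrative habituation rule: the infant's latest look at the
    repeated swatch has fallen below `criterion` times the mean of the
    first three looks. (Both values are assumptions for this sketch,
    not the study's actual analysis.)"""
    baseline = sum(looking_times[:3]) / 3
    return looking_times[-1] < criterion * baseline

# A baby looks at the same swatch for 12s, 11s, 10s, then only 4s:
# looking time has dropped past the criterion, so the infant counts
# as habituated and a novel test swatch would be shown next.
print(habituated([12, 11, 10, 4]))   # True
# Still looking 10s on the latest trial: not yet habituated.
print(habituated([12, 11, 10, 10]))  # False
```

A longer look at the subsequent test swatch than at the habituated one is then interpreted as the infant treating it as a new hue category.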

Keyword: Vision; Development of the Brain
Link ID: 23830 - Posted: 07.13.2017

By Giorgia Guglielmi Semen has something in common with the brains of Alzheimer’s sufferers: Both contain bundles of protein filaments called amyloid fibrils. But although amyloid accumulation appears to damage brain cells, these fibrils may be critical for reproduction. A new study suggests that semen fibrils immobilize subpar sperm, ensuring that only the fittest ones make it to the egg. “I’m sure that from the very first time scientists described semen fibrils, they must have been speculating what their natural function was,” says Daniel Otzen, an expert in protein aggregates at Aarhus University in Denmark, who did not participate in the research. “This seems to be the smoking gun.” Researchers discovered semen fibrils in 2007. At first, they seemed like mostly bad news. Scientists showed that the fibrils, found in the seminal fluid together with sperm cells and other components, can bind to HIV, helping it get inside cells. But the fibrils are found in most primates, notes Nadia Roan, a mucosal biologist at the University of California, San Francisco. “If fibrils didn’t serve some beneficial purpose, they would have been eliminated over evolutionary time.” Because the way HIV fuses to cells is reminiscent of the way a sperm fuses to the egg, she wondered whether the fibrils facilitated fertilization. © 2017 American Association for the Advancement of Science.

Keyword: Alzheimers; Sexual Behavior
Link ID: 23828 - Posted: 07.12.2017

By Linda Geddes Many dangers stalk the bushlands of Tanzania while members of the Hadza people sleep, yet no one keeps watch. There is no need because it seems that natural variation in sleep means there’s rarely a moment when someone isn’t alert enough to raise the alarm. That’s the conclusion of a study that sheds new light on why teenagers sleep late while grandparents are often up at the crack of dawn. Fifty years ago, psychologist Frederick Snyder proposed that animals who live in groups stay vigilant during sleep, by having some stay awake while others rest. However, no one had tested this sentinel hypothesis in humans until now. One way of maintaining this constant vigilance might be by the evolution of different chronotypes – individual differences in when we tend to sleep. This changes as we age, with teenagers shifting towards later bedtimes, and older people towards earlier bedtimes. Would such variability be enough to keep a community safe at night? To investigate, David Samson, then at the University of Toronto in Canada, and his colleagues turned to the Hadza, a group of hunter-gatherers in northern Tanzania. The Hadza sleep in grass huts, each containing one or two adults and often several children. They live in camps of around 30 adults, although several other camps may be close by. Samson recruited 33 adults from two nearby groups of 22 huts and asked them to wear motion-sensors on their wrists to monitor sleep, for 20 days. “It turned out that it was extremely rare for there to be synchronous sleep,” says Samson, now at Duke University in Durham, North Carolina. © Copyright New Scientist Ltd.

Keyword: Sleep; Development of the Brain
Link ID: 23827 - Posted: 07.12.2017

By Jessica Wright, Spectrum on July 11, 2017 Treatment with the hormone oxytocin improves social skills in some children with autism, suggest results from a small clinical trial. The results appeared today in the Proceedings of the National Academy of Sciences1. Oxytocin, dubbed the ‘love hormone,’ enhances social behavior in animals. This effect makes it attractive as a potential autism treatment. But studies in people have been inconsistent: Some small trials have shown that the hormone improves social skills in people with autism, and others have shown no benefit. This may be because only a subset of people with autism respond to the treatment. In the new study, researchers tried to identify this subset. The same team showed in 2014 that children with relatively high blood levels of oxytocin have better social skills than do those with low levels2. In their new work, the researchers examined whether oxytocin levels in children with autism alter the children’s response to treatment with the hormone. They found that low levels of the hormone prior to treatment are associated with the most improvement in social skills. “We need to be thinking about a precision-medicine approach for autism,” says Karen Parker, associate professor of psychiatry at Stanford University in California, who co-led the study. “There’s been a reasonable number of failed [oxytocin] trials, and the question is: Could they have failed because all of the kids, by blind, dumb luck, had really high baseline oxytocin levels?” The study marks the first successful attempt to find a biological marker that predicts response to the therapy. © 2017 Scientific American,

Keyword: Autism; Hormones & Behavior
Link ID: 23826 - Posted: 07.12.2017

Tina Hesman Saey How well, not how much, people sleep may affect Alzheimer’s disease risk. Healthy adults built up Alzheimer’s-associated proteins in their cerebrospinal fluid when prevented from getting slow-wave sleep, the deepest stage of sleep, researchers report July 10 in Brain. Just one night of deep-sleep disruption was enough to increase the amount of amyloid-beta, a protein that clumps into brain cell‒killing plaques in people with Alzheimer’s. People in the study who slept poorly for a week also had more of a protein called tau in their spinal fluid than they did when well rested. Tau snarls itself into tangles inside brain cells of people with the disease. These findings support a growing body of evidence that lack of Zs is linked to Alzheimer’s and other neurodegenerative diseases. Specifically, “this suggests that there’s something special about deep, slow-wave sleep,” says Kristine Yaffe, a neurologist and psychiatrist at the University of California, San Francisco who was not involved in the study. People with Alzheimer’s are notoriously poor sleepers, but scientists aren’t sure if that is a cause or a consequence of the disease. Evidence from recent animal and human studies suggests the problem goes both ways, Yaffe says. Lack of sleep may make people more prone to brain disorders. And once a person has the disease, disruptions in the brain may make it hard to sleep. Still, it wasn’t clear why not getting enough shut-eye promotes Alzheimer’s disease.

Keyword: Sleep; Alzheimers
Link ID: 23824 - Posted: 07.11.2017

By Jennifer Ouellette Are brain-training games any better at improving your ability to think, remember and focus than regular computer games? Possibly not, if the latest study is anything to go by. Joseph Kable at the University of Pennsylvania and his colleagues have tested the popular Lumosity brain-training program from Lumos Labs in San Francisco, California, against other computer games and found no evidence that it is any better at improving your thinking skills. Brain-training is a booming market. It’s based on the premise that our brains change in response to learning challenges. Unlike computer games designed purely for entertainment, brain-training games are meant to be adaptive, adjusting challenge levels in response to a player’s changing performance. The thinking is that this should improve a player’s memory, attention, focus and multitasking skills. But there are questions over whether brain-training platforms can enhance cognitive function in a way that is meaningful for wider life. Last year, Lumos Labs paid $2 million to settle a charge from the US Federal Trade Commission for false advertising. Advertising campaigns had claimed that the company’s memory and attention games could reduce the effects of age-related dementia, and stave off Alzheimer’s disease. Most studies on the effects of brain-training games have been small and had mixed results. For this study, Kable and his colleagues recruited 128 young healthy adults for a randomised controlled trial. © Copyright New Scientist Ltd.

Keyword: Learning & Memory; Alzheimers
Link ID: 23821 - Posted: 07.11.2017

Aimee Cunningham An expectant mom might want to think twice about quenching her thirst with soda. The more sugary beverages a mom drank during mid-pregnancy, the heavier her kids were in elementary school compared with kids whose mothers consumed less of the drinks, a new study finds. At age 8, boys and girls weighed approximately 0.25 kilograms more — about half a pound — with each serving mom added per day while pregnant, researchers report online July 10 in Pediatrics. “What happens in early development really has a long-term impact,” says Meghan Azad, an epidemiologist at the University of Manitoba in Canada, who was not involved in the study. A fetus’s metabolism develops in response to the surrounding environment, including the maternal diet, she says. The new findings come out of a larger project that studies the impact of pregnant moms’ diets on their kids’ health. “We know that what mothers eat during pregnancy may affect their children’s health and later obesity,” says biostatistician Sheryl Rifas-Shiman of Harvard Medical School and Harvard Pilgrim Health Care Institute in Boston. “We decided to look at sugar-sweetened beverages as one of these factors.” Sugary drinks are associated with excessive weight gain and obesity in studies of adults and children. Rifas-Shiman and colleagues included 1,078 mother-child pairs in the study. Moms filled out a questionnaire in the first and second trimesters of their pregnancy about what they were drinking — soda, fruit drinks, 100 percent fruit juice, diet soda or water — and how often. Soda and fruit drinks were considered sugar-sweetened beverages. A serving was defined as a can, glass or bottle of a beverage. |© Society for Science & the Public 2000 - 2017

Keyword: Obesity; Development of the Brain
Link ID: 23820 - Posted: 07.11.2017

Ewen Callaway For 18 months in the early 1980s, John Sulston spent his days watching worms grow. Working in twin 4-hour shifts each day, Sulston would train a light microscope on a single Caenorhabditis elegans embryo and sketch what he saw at 5-minute intervals, as a fertilized egg morphed into two cells, then four, eight and so on. He worked alone and in silence in a tiny room at the Medical Research Council Laboratory of Molecular Biology in Cambridge, UK, solving a Rubik’s cube between turns at the microscope. “I did find myself little distractions,” the retired Nobel prize-winning biologist once recalled. His hundreds of drawings revealed the rigid choreography of early worm development, encompassing the births of precisely 671 cells, and the deaths of 111 (or 113, depending on the worm’s sex). Every cell could be traced to its immediate forebear and then to the one before that in a series of invariant steps. From these maps and others, Sulston and his collaborators were able to draw up the first, and so far the only, complete ‘cell-lineage tree’ of a multicellular organism [1]. Although the desire to record an organism’s development in such exquisite detail preceded Sulston by at least a century, the ability to do so in more-complex animals has been limited. No one could ever track the fates of billions of cells in a mouse or a human with just a microscope and a Rubik’s cube to pass the time. But there are other ways. Revolutions in biologists’ ability to edit genomes and sequence them at the level of a single cell have sparked a renaissance in cell-lineage tracing. © 2017 Macmillan Publishers Limited

Keyword: Development of the Brain
Link ID: 23812 - Posted: 07.07.2017

By James Gallagher Abnormal deposits that build up in the brain during Alzheimer's have been pictured in unprecedented detail by UK scientists. The team at the MRC Laboratory of Molecular Biology says its findings "open up a whole new era" in neurodegenerative disease. Their work should make it easier to design drugs to stop brain cells dying. The researchers used brain tissue from a 74-year-old woman who died after having Alzheimer's disease. The form of dementia leads to tangles of a protein called tau spreading throughout the brain. The more tau tangles there are, the worse the symptoms tend to be. Doctors have known this happens for decades, but what has been missing is a detailed understanding of what the tangles look like. The team took advantage of the "resolution revolution" in microscopy to take thousands of highly detailed images of the tau inside the woman's brain tissues. Using computer software, they then worked out the structure of the tangles (image copyright LMB). The resulting image is pretty meaningless to an untrained eye, but to scientists this could be one of the most important recent discoveries in tackling dementia. Attempts to develop a drug to slow the pace of dementia have been met by repeated failure. But it is hard to come up with a drug when you do not know the precise chemical structure of what you are targeting. Dr Sjors Scheres, one of the researchers, told the BBC News website: "It's like shooting in the dark - you can still hit something but you are much more likely to hit if you know what the structure is. "We are excited - it opens up a whole new era in this field, it really does." © 2017 BBC.

Keyword: Alzheimers
Link ID: 23807 - Posted: 07.06.2017