Chapter 7. Life-Span Development of the Brain and Behavior
By Will Boggs MD NEW YORK (Reuters Health) - Adolescents with a history of childhood trauma show different neural responses to subjective anxiety and craving, researchers report. "I think the finding of increased activation of insula, anterior cingulate, and prefrontal cortex in response to stress cues in the high- relative to low-trauma group, while arguably not necessarily unexpected, is important as it suggests that youth exposed to higher levels of trauma may experience different brain responses to similar stressors," Dr. Marc N. Potenza from Yale University, New Haven, Connecticut, told Reuters Health by email. Childhood trauma has been associated with anxiety and depression, as well as obesity, risky sexual behavior, and substance use. Previous imaging studies have not investigated neural responses to personalized stimuli, Dr. Potenza and his colleagues write in Neuropsychopharmacology, online January 8. The team used functional MRI to assess regional brain activations to personalized appetitive (favorite food), aversive (stress), and neutral/relaxing cues in 64 adolescents, including 33 in the low-trauma group and 31 in the high-trauma group. Two-thirds of the adolescents had been exposed to cocaine prenatally, with prenatal cocaine exposure being significantly over-represented in the high-trauma group. Compared with the low-trauma group, the high-trauma group showed increased responsivity in several cortical regions in response to stress, as well as decreased activation in the cerebellar vermis and right cerebellum in response to neutral/relaxing cues. But the two groups did not differ significantly in their responses to favorite-food cues, the researchers found. © 2015 Scientific American
Children who attend school in heavy-traffic areas may show slower cognitive development and lower memory test scores, Spanish researchers have found. About 21,000 premature deaths are attributed to air pollution in Canada each year, according to the Canadian Medical Association. The detrimental effects of air pollution on cardiovascular health and on the lungs are well documented, and now researchers are looking at its effects on the brain. To that end, Dr. Jordi Sunyer and his colleagues from the Centre for Research in Environmental Epidemiology in Barcelona measured three aspects of memory and attentiveness in more than 2,700 primary school children every three months over 12 months. "What was surprising for us is among our children, we see very robust, consistent effects," Sunyer said Tuesday from Rome. The associations between slower cognitive development and higher levels of air pollutants remained after the researchers took factors such as parents’ education, commuting time, smoking in the home and green spaces at school into account. The researchers measured air pollutants from traffic twice, in the school courtyard and inside the classroom, for schools with high and low traffic-related air pollution. Pollutants from burning fossil fuels (carbon, nitrogen dioxide and ultrafine particles) were measured. For example, working memory improved 7.4 per cent among children in highly polluted schools compared with 11.5 per cent among those in less polluted schools. ©2015 CBC/Radio-Canada.
By Nicholas Bakalar Gout, a form of arthritis, is extremely painful and associated with an increased risk for cardiovascular problems. But there is a bright side: It may be linked to a reduced risk for Alzheimer’s disease. Researchers compared 59,204 British men and women with gout to 238,805 without the ailment, with an average age of 65. Patients were matched for sex, B.M.I., smoking, alcohol consumption and other characteristics. The study, in The Annals of the Rheumatic Diseases, followed the patients for five years. They found 309 cases of Alzheimer’s among those with gout and 1,942 among those without. Those with gout, whether they were being treated for the condition or not, had a 24 percent lower risk of Alzheimer’s disease. The reason for the connection is unclear. But gout is caused by excessive levels of uric acid in the blood, and previous studies have suggested that uric acid protects against oxidative stress. This may play a role in limiting neuron degeneration. “This is a dilemma, because uric acid is thought to be bad, associated with heart disease and stroke,” said the senior author, Dr. Hyon K. Choi, a professor of medicine at Harvard. “This is the first piece of data suggesting that uric acid isn’t all bad. Maybe there is some benefit. It has to be confirmed in randomized trials, but that’s the interesting twist in this story.” © 2015 The New York Times Company
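The article's raw counts allow a quick unadjusted cross-check. Here is a minimal Python sketch (the function name is ours for illustration); note that the crude ratio it computes differs from the study's reported 24 percent reduction, which is a hazard ratio adjusted for matching and covariates:

```python
# Crude five-year incidence and relative risk from the figures quoted above:
# 309 Alzheimer's cases among 59,204 gout patients vs. 1,942 cases among
# 238,805 people without gout. These are back-of-the-envelope numbers only;
# the study's 24% figure comes from an adjusted analysis, not this ratio.

def crude_risk(cases: int, cohort: int) -> float:
    """Unadjusted risk over follow-up: cases divided by cohort size."""
    return cases / cohort

risk_gout = crude_risk(309, 59_204)        # roughly 5.2 cases per 1,000
risk_no_gout = crude_risk(1_942, 238_805)  # roughly 8.1 cases per 1,000
relative_risk = risk_gout / risk_no_gout   # unadjusted ratio, about 0.64

print(f"crude RR = {relative_risk:.2f}")
```

The crude ratio of about 0.64 suggests an even larger apparent reduction than the adjusted 24 percent figure, a reminder that raw counts and covariate-adjusted hazard ratios answer different questions.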
By Roni Caryn Rabin When my mother, Pauline, was 70, she lost her sense of balance. She started walking with an odd shuffling gait, taking short steps and barely lifting her feet off the ground. She often took my hand, holding it and squeezing my fingers. Her decline was precipitous. She fell repeatedly. She stopped driving, and she could no longer ride her bike in a straight line along the C&O Canal. The woman who taught me the sidestroke couldn’t even stand in the shallow end of the pool. “I feel like I’m drowning,” she’d say. A retired psychiatrist, my mother had numerous advantages — education, resources and insurance — but, still, getting the right diagnosis took nearly 10 years. Each expert saw the problem through the narrow prism of a single specialty. Surgeons recommended surgery. Neurologists screened for common incurable conditions. The answer was under their noses, in my mother’s hunches and her family history. But it took a long time before someone connected the dots. My mother was using a walker by the time she was told she had a rare condition that causes gait problems and cognitive loss, and is one of the few treatable forms of dementia. The bad news was that it had taken so long to get the diagnosis that some of the damage might not be reversible. “This should be one of the first things physicians look for in an older person,” my mother said recently. “You can actually do something about it.”
By ROBERT PEAR WASHINGTON — Federal investigators say they have found evidence of widespread overuse of psychiatric drugs by older Americans with Alzheimer’s disease, and are recommending that Medicare officials take immediate action to reduce unnecessary prescriptions. The findings will be released Monday by the Government Accountability Office, an arm of Congress, and come as the Obama administration has already been working with nursing homes to reduce the inappropriate use of antipsychotic medications like Abilify, Risperdal, Zyprexa and clozapine. But in the study, investigators said officials also needed to focus on overuse of such drugs by people with dementia who live at home or in assisted living facilities. The Department of Health and Human Services “has taken little action” to reduce the use of antipsychotic drugs by older adults living outside nursing homes, the report said. Doctors sometimes prescribe antipsychotic drugs to calm patients with dementia who display disruptive behavior like hitting, yelling or screaming, the report said. Researchers said this was often the case in nursing homes that had inadequate numbers of employees. Dementia is most commonly associated with a decline in memory, but doctors say it can also cause changes in mood or personality and, at times, agitation or aggression. Experts have raised concern about the use of antipsychotic drugs to address behavioral symptoms of Alzheimer’s and other forms of dementia. The Food and Drug Administration says antipsychotic drugs are often associated with an increased risk of death when used to treat older adults with dementia who also have psychosis. © 2015 The New York Times Company
By Elizabeth Pennisi Last week, researchers expanded the size of the mouse brain by giving rodents a piece of human DNA. Now another team has topped that feat, pinpointing a human gene that not only grows the mouse brain but also gives it the distinctive folds found in primate brains. The work suggests that scientists are finally beginning to unravel some of the evolutionary steps that boosted the cognitive powers of our species. “This study represents a major milestone in our understanding of the developmental emergence of human uniqueness,” says Victor Borrell Franco, a neurobiologist at the Institute of Neurosciences in Alicante, Spain, who was not involved with the work. The new study began when Wieland Huttner, a developmental neurobiologist at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany, and his colleagues started closely examining aborted human fetal tissue and embryonic mice. “We specifically wanted to figure out which genes are active during the development of the cortex, the part of the brain that is greatly expanded in humans and other primates compared to rodents,” says Marta Florio, the Huttner graduate student who carried out the main part of the work. That was harder than it sounded. Building a cortex requires several kinds of starting cells, or stem cells. The stem cells divide and sometimes specialize into other types of “intermediate” stem cells that in turn divide and form the neurons that make up brain tissue. To learn what genes are active in the two species, the team first had to develop a way to separate out the various types of cortical stem cells. © 2015 American Association for the Advancement of Science
By DENISE GRADY Faced with mounting evidence that general anesthesia may impair brain development in babies and young children, experts said Wednesday that more research is urgently needed and that when planning surgery for a child, parents and doctors should consider how urgently it is required, particularly in children younger than 3 years. In the United States each year, about a million children younger than 4 have surgery with general anesthesia, according to the Food and Drug Administration. So far, the threat is only a potential one; there is no proof that children have been harmed. The concern is based on two types of research. Experiments in young monkeys and other animals have shown that commonly used anesthetics and sedatives can kill brain cells, diminish learning and memory and cause behavior problems. And studies in children have found an association between learning problems and multiple exposures to anesthesia early in life — though not single exposures. But monkeys are not humans, and association does not prove cause and effect. Research now underway is expected to be more definitive, but results will not be available for several years. Anesthesiologists and surgeons are struggling with how — and sometimes whether — to explain a theoretical hazard to parents who are already worried about the real risks of their child’s medical problem and the surgery needed to correct it. If there is a problem with anesthesia, in many cases it may be unavoidable because there are no substitute drugs. The last thing doctors want to do is frighten parents for no reason or prompt them to delay or cancel an operation that their child needs. “On the one hand, we don’t want to overstate the risk, because we don’t know what the risk is, if there is a risk,” said Dr. Randall P. Flick, a pediatric anesthesiologist and director of Mayo Clinic Children’s Center in Rochester, Minn., who has conducted some of the studies in children suggesting a link to learning problems. 
“On the other hand, we want to make people aware of the risk because we feel we have a duty to do so.” © 2015 The New York Times Company
By Michelle Roberts Health editor, BBC News online Scientists have proposed a new idea for detecting brain conditions including Alzheimer's - a skin test. Their work, which is at an early stage, found the same abnormal proteins that accumulate in the brain in such disorders can also be found in skin. Early diagnosis is key to preventing the loss of brain tissue in dementia, which can go undetected for years. But experts said even more advanced tests, including ones of spinal fluid, were still not ready for clinic. If they were, then doctors could begin treatment at the earliest stages, before irreversible brain damage or mental decline has taken place.
Brain biomarker
Investigators have been hunting for suitable biomarkers in the body - molecules in blood or exhaled breath, for example, that can be measured to accurately and reliably signal if a disease or disorder is present. Dr Ildefonso Rodriguez-Leyva and colleagues from the University of San Luis Potosi, Mexico, believe skin is a good candidate for uncovering hidden brain disorders. Skin has the same origin as brain tissue in the developing embryo and might, therefore, be a good window to what's going on in the mind in later life - at least at a molecular level - they reasoned. Post-mortem studies of people with Parkinson's also reveal that the same protein deposits which occur in the brain with this condition also accumulate in the skin. To test if the same was true in life as after death, the researchers recruited 65 volunteers - 12 who were healthy controls and the remaining 53 who had either Parkinson's disease, Alzheimer's or another type of dementia. They took a small skin biopsy from behind the ear of each volunteer to test in their laboratory for any telltale signs of disease. Specifically, they looked for the presence of two proteins - tau and alpha-synuclein. © 2015 BBC.
By Emily Underwood Infants born prematurely are more than twice as likely to have difficulty hearing and processing words than those carried to full-term, likely because brain regions that process sounds aren’t sufficiently developed at the time of delivery. Now, an unusual study with 40 preemies suggests that recreating a womblike environment with recordings of a mother's heartbeat and voice could potentially correct these deficits. "This is the kind of study where you think ‘Yes, I can believe these results,’ " because they fit well with what scientists know about fetal brain development, says cognitive scientist Karin Stromswold of Rutgers University, New Brunswick, in New Jersey. A fetus starts to hear at about 24 weeks of gestation, as neurons migrate to—and form connections in—the auditory cortex, a brain region that processes sound, Stromswold explains. Once the auditory cortex starts to function, a fetus normally hears mostly low-frequency sounds—its mother’s heartbeat, for example, and the melody and rhythm of her voice. Higher frequency tones made outside of the mother's body, such as consonants, are largely drowned out. Researchers believe that this introduction to the melody and rhythm of speech, prior to hearing individual words, may be a key part of early language acquisition that gets disrupted when a baby is born too soon. In addition to being bombarded with the bright lights, chemical smells, and shrill sounds of a hospital’s intensive care unit, preemies are largely deprived of the sensations they'd get in the womb, such as their mother's heartbeat and voice, says Amir Lahav, a neuroscientist at Harvard Medical School in Boston. Although mothers are sometimes allowed to hold premature newborns for short periods of time, the infants are often considered too fragile to leave their temperature- and humidity-controlled incubators, he says. 
Preemies often have their eyes covered to block out light, and previous studies have shown that reducing overall levels of high-frequency noise in a neonatal intensive care unit—by lowering the number of incubators in a unit, for example, or giving preemies earplugs—can improve premature babies' outcomes. Few studies have actively simulated a womblike environment, however, he says. © 2015 American Association for the Advancement of Science.
By Elizabeth Pennisi Researchers have increased the size of mouse brains by giving the rodents a piece of human DNA that controls gene activity. The work provides some of the strongest genetic evidence yet for how the human intellect surpassed that of all other apes. "[The DNA] could easily be a huge component in how the human brain expanded," says Mary Ann Raghanti, a biological anthropologist at Kent State University in Ohio, who was not involved with the work. "It opens up a whole world of possibilities about brain evolution." For centuries, biologists have wondered what made humans human. Once the human and chimp genomes were deciphered about a decade ago, they realized they could now begin to pinpoint the molecular underpinnings of our big brain, bipedalism, varied diet, and other traits that have made our species so successful. By 2008, almost two dozen computerized comparisons of human and ape genomes had come up with hundreds of pieces of DNA that might be important. But rarely have researchers taken the next steps to try to prove that a piece of DNA really made a difference in human evolution. "You could imagine [their roles], but they were just sort of 'just so' stories,” says Greg Wray, an evolutionary biologist at Duke University in Durham, North Carolina. Wray is particularly interested in DNA segments called enhancers, which control the activity of genes nearby. He and Duke graduate student Lomax Boyd scanned the genomic databases and combed the scientific literature for enhancers that were different between humans and chimps and that were near genes that play a role in the brain. Out of more than 100 candidates, they and Duke developmental neurobiologist Debra Silver tested a half-dozen. They first inserted each enhancer into embryonic mice to learn whether it really did turn genes on. 
Then for HARE5, the most active enhancer in an area of the brain called the cortex, they made minigenes containing either the chimp or human version of the enhancer linked to a “reporter” gene that caused the developing mouse embryo to turn blue wherever the enhancer turned the gene on. Embryos’ developing brains turned blue sooner and over a broader expanse if they carried the human version of the enhancer, Silver, Wray, and their colleagues report online today in Current Biology. © 2015 American Association for the Advancement of Science
by Sarah Zielinski No one would be shocked to find play behavior in a mammal species. Humans love to play — as do our cats and dogs. It’s not such a leap to believe that, say, a red kangaroo would engage in mock fights. But somehow that behavior seems unlikely in animals other than mammals. It shouldn’t, though. Researchers have documented play behavior in an astonishing range of animals, from insects to birds to mammals. The purpose of such activities isn’t always clear — and not all scientists are convinced that play even exists — but play may help creatures establish social bonds or learn new skills. Here are five non-mammals you may be surprised to find hard at play:
Crocodilians
Alligators and crocodiles might seem more interested in lurking near the water and chomping on their latest meal, but these frightening reptiles engage in play, Vladimir Dinets of the University of Tennessee in Knoxville reports in the February Animal Behavior and Cognition. Dinets combined 3,000 hours of observations of wild and captive crocodilians with published reports and information gathered from other people who work with the animals. He found examples of all three types of play:
Locomotor play: This is movement without any apparent reason or stimulus. Young, captive American alligators, for instance, have been spotted sliding down slopes of water over and over. And a 2.5-meter-long crocodile was seen surfing the waves near a beach in Australia.
Object play: Animals like toys, too. A Cuban crocodile at a Miami zoo picked up and pushed around flowers floating in its pool for several days of observation. And like a cat playing with a mouse, a Nile crocodile was photographed as it repeatedly threw a dead hippo into the air. Object play is recognized as so important to crocodilian life “that many zoo caretakers now provide various objects as toys for crocodilians as part of habitat enrichment programs,” Dinets notes. © Society for Science & the Public 2000 - 2015.
Catherine Brahic THE nature versus nurture debate is getting a facelift this week, with the publication of a genetic map that promises to tell us which bits of us are set in stone by our DNA, and which bits we can affect by how we live our lives. The new "epigenomic" map doesn't just look at genes, but also the instructions that govern them. Compiled by a consortium of biologists and computer scientists, this information will allow doctors to pinpoint precisely which cells in the body are responsible for various diseases. It might also reveal how to adjust your lifestyle to counter a genetic predisposition to a particular disease. "The epigenome is the additional information our cells have on top of genetic information," says lead researcher Manolis Kellis of the Massachusetts Institute of Technology. It is made of chemical tags that are attached to DNA and its packaging. These tags act like genetic controllers, influencing whether a gene is switched on or off, and play an instrumental role in shaping our bodies and disease. Researchers are still figuring out exactly how and when epigenetic tags are added to our DNA, but the process appears to depend on environmental cues. We inherit some tags from our parents, but what a mother eats during pregnancy, for instance, might also change her baby's epigenome. Other tags relate to the environment we are exposed to as children and adults. "The epigenome sits in a very special place between nature and nurture," says Kellis. Each cell type in our body has a different epigenome – in fact, the DNA tags are the reason why our cells come in such different shapes and sizes despite having exactly the same DNA. So for its map, the Roadmap Epigenomics Consortium collected thousands of cells from different adult and embryonic tissues, and meticulously analysed all the tags. © Copyright Reed Business Information Ltd.
By Abigail Zuger, M.D. I had intended to discuss President Obama’s plans for personalized precision medicine with my patient Barbara last week, but she missed her appointment. Or, more accurately, she arrived two hours late, made the usual giant fuss at the reception desk and had to be rescheduled. I was disappointed. Barbara has some insight into the vortex of her own complications, and I thought she might help organize my thoughts. Mr. Obama announced last month that his new budget included $215 million toward the creation of a national databank of medical information, intended to associate specific gene patterns with various diseases and to predict what genetic, lifestyle and environmental factors correlate with successful treatment. Once all those relationships are clarified, the path will open to drugs or other interventions that firm up the good links and interrupt the bad ones. This step up the scientific ladder of medicine has many advocates. Researchers who sequence the genome are enthusiastic, as are those with a financial interest in the technology. Also celebrating are doctors and patients in the cancer community, where genetic data already informs some treatment choices and where the initial thrust of the initiative and much of its funding will be directed. Skeptics point out that genetic medicine, for all its promise, has delivered relatively few clinical benefits. And straightforward analyses of lifestyle and environment effects on health may occasionally lead to clear-cut advice (don’t smoke), but more often sow confusion, as anyone curious about the best way to lose weight or the optimal quantity of dietary salt knows. Without Barbara’s presence, I was left to ponder her medical record, a 20-year saga that might be titled “Genes, Lifestyle and Environment” and published as a cautionary tale. © 2015 The New York Times Company
By Kate Baggaley A buildup of rare versions of genes that control the activity of nerve cells in the brain increases a person’s risk for bipolar disorder, researchers suggest in a paper posted online the week of February 16 in Proceedings of the National Academy of Sciences. “There are many different variants in many different genes that contribute to the genetic risk,” says coauthor Jared Roach, a geneticist at the Institute for Systems Biology in Seattle. “We think that most people with bipolar disorder will have inherited several of these…risk variants.” The bulk of a person’s risk for bipolar disorder comes from genetics, but only a quarter of that risk can be explained by common variations in genes. Roach’s team sequenced the genomes of 200 people from 41 families with a history of bipolar disorder. They then identified 164 rare forms of genes that show up more often in people with the condition. People with bipolar disorder had, on average, six of these rare forms, compared with just one, on average, found in their healthy relatives and the general population. The identified genes control the ability of ions, or charged particles, to enter or leave nerve cells, or neurons. This affects neurons’ ability to pass information through the brain. Some of the gene variants probably increase how much neurons fire while others decrease it, the researchers say. Future research will need to explain what role these brain changes play in bipolar disorder.
Citations
S.A. Ament et al. Rare variants in neuronal excitability genes influence risk for bipolar disorder. Proceedings of the National Academy of Sciences. Published online the week of February 16, 2015. doi:10.1073/pnas.1424958112. © Society for Science & the Public 2000 - 2015
By PAULA SPAN The word “benzodiazepines” and the phrase “widely prescribed for anxiety and insomnia” appear together so frequently that they may remind you of the apparently unbreakable connection between “powerful” and “House Ways and Means Committee.” But now we have a better sense of just how widely prescribed these medications are. A study in this month’s JAMA Psychiatry reports that among 65- to 80-year-old Americans, close to 9 percent use one of these sedative-hypnotics, drugs like Valium, Xanax, Ativan and Klonopin. Among older women, nearly 11 percent take them. “That’s an extraordinarily high rate of use for any class of medications,” said Michael Schoenbaum, a senior adviser at the National Institute of Mental Health and a co-author of the new report. “It seemed particularly striking given the identified clinical concerns associated with benzodiazepine use in anybody, but especially in older adults.” He was referring to decades of warnings about the potentially unhappy consequences of benzodiazepines for older users. The drugs still are recommended for a handful of specific disorders, including acute alcohol withdrawal and, sometimes, seizures and panic attacks. But concerns about the overuse of benzodiazepines have been aired again and again: in the landmark nursing home reform law of 1987, in the American Geriatrics Society’s Choosing Wisely list of questionable practices in 2013, in last year’s study in the journal BMJ suggesting an association with Alzheimer’s disease. Benzodiazepine users face increased risks of falls and fractures, of auto accidents, of reduced cognition. “Even after one or two doses, you have impaired cognitive performance on memory and other neuropsychological tests, compared to a placebo,” said Dr. D.P. Devanand, director of geriatric psychiatry at Columbia University Medical Center. © 2015 The New York Times Company
By Esther Landhuis Whereas cholesterol levels measured in a routine blood test can serve as a red flag for heart disease, there’s no simple screen for impending Alzheimer’s. A new Silicon Valley health start-up hopes to change that. A half million Americans die of Alzheimer’s disease each year. Most are diagnosed after a detailed medical workup and extensive neurological and psychological tests that gauge mental function and rule out other causes of dementia. Yet things begin going awry some 10 to 15 years before symptoms show. Spinal fluid analyses and positron emission tomography (PET) scans can detect a key warning sign—buildup of amyloid-beta protein in the brain. Studies suggest that adults with high brain amyloid have elevated risk for Alzheimer’s and stand the best chance of benefiting from treatments should they become available. Getting Alzheimer’s drugs to market requires long and costly clinical studies, which some experts say have failed thus far because experimental drugs were tested too late in the disease process. By the time people show signs of dementia, their brains have lost neurons and no current therapy can revive dead cells. That is why drug trials are looking to recruit seniors with preclinical Alzheimer’s who are on the verge of decline but otherwise look healthy. This poses a tall order. Spinal taps are cumbersome and PET costs $3,000 per scan. “There’s no cheap, fast, noninvasive test that can accurately identify people at risk of Alzheimer’s,” says Brad Dolin, chief technology officer of Neurotrack. The company is developing a computerized visual test that might fit the bill. © 2015 Scientific American
By Siri Carpenter “I don’t look like I have a disability, do I?” Jonas Moore asks me. I shake my head. No, I say — he does not. Bundled up in a puffy green coat in a drafty Starbucks, Moore, 35 and sandy-haired, doesn’t stand out in the crowd seeking refuge from the Wisconsin cold. His handshake is firm and his blue eyes meet mine as we talk. He comes across as intelligent and thoughtful, if perhaps a bit reserved. His disability — autism — is invisible. That’s part of the problem, says Moore. Like most people with autism spectrum disorders, he finds relationships challenging. In the past, he has been quick to anger and has had what he calls “meltdowns.” Those who don’t know he has autism can easily misinterpret his actions. “People think that when I do misbehave I’m somehow intentionally trying to be a jerk,” Moore says. “That’s just not the case.” His difficulty managing emotions has gotten him into some trouble, and he’s had a hard time holding onto jobs — an outcome he might have avoided, he says, if his coworkers and bosses had better understood his intentions. Over time, things have gotten better. Moore has held the same job for five years, vacuuming commercial buildings on a night cleaning crew. He attributes his success to getting the right amount of medication and therapy, to time maturing him and to the fact that he now works mostly alone. Moore is fortunate. His parents help support him financially. He has access to good mental health care. And with the help of the state’s division of vocational rehabilitation, he has found a job that suits him. Many adults with autism are not so lucky. © Society for Science & the Public 2000 - 2015.
By Michelle Roberts Health editor, BBC News online Women trying for a baby and those in the first three months of pregnancy should not drink any alcohol, updated UK guidelines say. The Royal College of Obstetricians and Gynaecologists (RCOG) had previously said a couple of glasses of wine a week was acceptable. It now says abstinence is the only way to be certain that the baby is not harmed. There is no proven safe amount that women can drink during pregnancy. The updated advice now chimes with guidelines from the National Institute for Health and Care Excellence (NICE). In the US, experts say there is no safe time to drink during pregnancy. But the RCOG highlights around the time of conception and the first three months of pregnancy as the most risky. Drinking alcohol may affect the unborn baby as some will pass through the placenta. Around conception and during the first three months, it may increase the chance of miscarriage, says the RCOG. After this time, women are advised to not drink more than one to two units, more than once or twice a week, it says. Drinking more than this could affect the development of the baby, in particular the way the baby's brain develops and the way the baby grows in the womb, which can lead to foetal growth restriction and increase the risk of stillbirth and premature labour, says the advice. © 2015 BBC
Alison Abbott Fabienne never found out why she went into labour three months too early. But on a quiet afternoon in June 2007, she was hit by accelerating contractions and was rushed to the nearest hospital in rural Switzerland, near Lausanne. When her son, Hugo, was born at 26 weeks of gestation rather than the typical 40, he weighed just 950 grams and was immediately placed in intensive care. Three days later, doctors told Fabienne that ultrasound pictures of Hugo's brain indicated that he had had a severe haemorrhage from his immature blood vessels. “I just exploded into tears,” she says. Both she and her husband understood that the prognosis for Hugo was grim: he had a very high risk of cerebral palsy, a neurological condition that can lead to a life of severe disability. The couple agreed that they did not want to subject their child to that. “We immediately told the doctors that we did not want fierce medical intervention to keep him alive — and saw the relief on the doctors' faces,” recalls Fabienne, who requested that her surname not be used. That night was the most tortured of her life. The next day, however, before any change had been made to Hugo's treatment, his doctors proposed a new option to confirm the diagnosis: a brain scan using magnetic resonance imaging (MRI). This technique, which had been newly adapted for premature babies, would allow the doctors to predict the risk of cerebral palsy more accurately than with ultrasound alone, which has a high false-positive rate. Hugo's MRI scan showed that the damage caused by the brain haemorrhage was limited, and his risk of severe cerebral palsy was likely to be relatively low. So just 24 hours after their decision to let his life end, Hugo's parents did an about-turn. They agreed that the doctors should try to save him. © 2015 Nature Publishing Group
Keyword: Development of the Brain
Link ID: 20555 - Posted: 02.05.2015
By Amanda Baker While we all may vary in just how much time we like spending with other people, humans are, overall, very social beings. Scientists have already found this reflected in our health and well-being, with social isolation being associated with more depression, worse health, and a shorter life. Looking even deeper, they find evidence of our social nature reflected in the very structure of our brains. Just thinking through your daily interactions with your friends or siblings probably gives you dozens of examples of times when it was important to interpret or predict the feelings and behaviors of other people. Our brains agree. Over time, parts of our brains have developed specifically for those tasks, but apparently not all social interaction is created equal. When researchers study the brains of people trying to predict the thoughts and feelings of others, they can actually see a difference in brain activity depending on whether that person is trying to understand a friend versus a stranger. Even at the level of blood flowing through your brain, you treat people you know well differently from people you don’t. These social interactions also extend into another important area of the brain: the nucleus accumbens. This structure is key in the brain's reward system, with its activity being associated with things that leave you feeling good. Curious whether this could have a direct connection with behavior, one group of scientists studied a very current part of our behavior as modern social beings: Facebook use. © 2015 Scientific American
Keyword: Development of the Brain
Link ID: 20554 - Posted: 02.05.2015