Chapter 7. Life-Span Development of the Brain and Behavior
By SARAH MASLIN NIR The day after the funeral of Avonte Oquendo, the boy with autism whose remains were found this month after he disappeared at age 14 from his school in October, his mother and grandmother stood with Senator Charles E. Schumer as he announced a proposal for a new law. Called “Avonte’s law,” it would finance a program to provide optional electronic tracking devices to be worn by children with autism. “Avonte’s running away was not an isolated incident,” Mr. Schumer, Democrat of New York, said at a news conference on Sunday morning in his office on the East Side of Manhattan. “This is a high-tech solution to an age-old problem.” Citing research that suggests nearly 50 percent of children with autism wander off, often to escape the overstimulation of sounds and noise, Mr. Schumer said the new legislation would expand an existing Department of Justice program that grants money to law enforcement agencies and other groups to provide trackers for people who have Alzheimer’s disease. Mr. Schumer said he had contacted the department months ago about including children with autism in the program. There was receptiveness, he said, but money was needed to provide children with the devices, which cost $80 to $90 and a few dollars a month to operate. The legislation would allocate $10 million for the program, giving interested parents free access to the equipment, which can be worn like a watch or even sewn into clothing. Whether to use such a monitor would be up to the parents, and the exact system of employing the devices would be up to individual municipalities, Mr. Schumer said. There are different variants that could be selected, including one that alerts authorities automatically when a child has stepped across a given perimeter — for example, outside school grounds — and another that becomes activated only after authorities are called. © 2014 The New York Times Company
Link ID: 19173 - Posted: 01.27.2014
By CARL ZIMMER The term “X chromosome” has an air of mystery to it, and rightly so. It got its name in 1891 from a baffled biologist named Hermann Henking. To investigate the nature of chromosomes, Henking examined cells under a simple microscope. All the chromosomes in the cells came in pairs. All except one. Henking labeled this outlier chromosome the “X element.” No one knows for sure what he meant by the letter. Maybe he saw it as an extra chromosome. Or perhaps he thought it was an ex-chromosome. Maybe he used X the way mathematicians do, to refer to something unknown. Today, scientists know the X chromosome much better. It’s part of the system that determines whether we become male or female. If an egg inherits an X chromosome from both parents, it becomes female. If it gets an X from its mother and a Y from its father, it becomes male. But the X chromosome remains mysterious. For one thing, females shut down an X chromosome in every cell, leaving only one active. That’s a drastic step to take, given that the X chromosome has more than 1,000 genes. In some cells, the father’s goes dormant, and in others, the mother’s does. While scientists have known about this so-called X-chromosome inactivation for more than five decades, they still know little about the rules it follows, or even how it evolved. In the journal Neuron, a team of scientists has unveiled an unprecedented view of X-chromosome inactivation in the body. They found a remarkable complexity to the pattern in which the chromosomes were switched on and off. © 2014 The New York Times Company
by Laura Sanders Growing up, I loved it when my parents read aloud the stories of the Berenstain Bears living in their treehouse. So while I was pregnant with my daughter, I imagined lots of cuddly quiet time with her in a comfy chair, reading about the latest adventures of Brother and Sister. Of course, reality soon let me know just how ridiculous that idea was. My newborn couldn’t see more than a foot away, cried robustly and frequently for mysterious reasons, and didn’t really understand words yet. Baby V was simply not interested in the latest dispatch from Bear County. When I started reading child development expert Elaine Reese’s new book Tell Me a Story, I realized that I was not the only one with idyllic story time dreams. Babies and toddlers are squirmy, active people with short attention spans. “Why, then, do we cling to this soft-focus view of storytelling when we know it is unrealistic?” she writes. These days, as Baby V closes in on the 1-year mark, she has turned into a most definite book lover. But it’s not the stories that enchant her. It’s holding the book, turning its pages back to front to back again, flipping it over and generally showing it who’s in charge. Every so often I can entice Baby V to sit on my lap with a book, but we never read through a full story. Instead, we linger on the page with all the junk food that the Hungry Caterpillar chomps through, sticking our fingers in the little holes in the pages. And we make Froggy pop in and out of the bucket. And we study the little goats as they climb up and up and up on the hay bales. © Society for Science & the Public 2000 - 2014
By Melissa Healy Adolescents treated with the antidepressant fluoxetine -- better known by its commercial name, Prozac -- appear to undergo changes in brain signaling that result in changed behavior well into adulthood, says a new study. Adult mice and rats who were administered Prozac for a stretch of mid-adolescence responded to daunting social and physical challenges with less despair than animals who passed their teen years unmedicated, a team of researchers found. But, even as adults long separated from their antidepressant days, the Prozac veterans reacted to stressful situations with greater anxiety than did the adult Prozac virgins. The latest research, published Wednesday in the Journal of Neuroscience, offers evidence that treatment with a selective serotonin reuptake inhibitor -- an SSRI antidepressant -- has long-lived effects on the developing brain. It also zeroes in on how and where fluoxetine effects those lasting changes: by modifying the cascade of chemical signals issued by the brain's ventral tegmentum -- a region active in mood regulation -- in stressful situations. Yet, the new research raises more questions than it answers, since the changes in adults who were treated with Prozac as adolescents seem contradictory. Sensitivity to stress appears to predispose one to developing depression. So how does a medication that treats depression in children and teens -- and that continues to protect them from depression as adults -- also heighten their sensitivity to stress?
Dan Hurley Forget mindfulness meditation, computerized working-memory training, and learning a musical instrument: all methods recently shown by scientists to increase intelligence. There could be an easier answer. It turns out that sex might actually make you smarter. Researchers in Maryland and South Korea recently found that sexual activity in mice and rats improves mental performance and increases neurogenesis (the production of new neurons) in the hippocampus, where long-term memories are formed. In April, a team from the University of Maryland reported that middle-aged rats permitted to engage in sex showed signs of improved cognitive function and hippocampal function. In November, a group from Konkuk University in Seoul concluded that sexual activity counteracts the memory-robbing effects of chronic stress in mice. “Sexual interaction could be helpful,” they wrote, “for buffering adult hippocampal neurogenesis and recognition memory function against the suppressive actions of chronic stress.” So growing brain cells through sex does appear to have some basis in scientific fact. But there’s some debate over whether fake sex—pornography—could be harmful. Neuroscientists from the University of Texas recently argued that excessive porn viewing, like other addictions, can result in permanent “anatomical and pathological” changes to the brain. That view, however, was quickly challenged in a rebuttal from researchers at the University of California, Los Angeles, who said that the Texans "offered little, if any, convincing evidence to support their perspectives. Instead, excessive liberties and misleading interpretations of neuroscience research are used to assert that excessive pornography consumption causes brain damage." © 2014 by The Atlantic Monthly Group
Injuries to the head can leave victims susceptible to early death even years later through impaired judgement, a major analysis of survivors shows. Those with a history of psychiatric disorders before the injury are most at risk of dying prematurely. The study, in JAMA Psychiatry, of 40 years of data on more than two million people, showed that overall a brain injury trebled the risk. Suicide and fatal injuries were among the commonest causes of early death. More than one million people in Europe are taken to hospital with a traumatic brain injury each year. The study, by researchers at the University of Oxford and the Karolinska Institute in Stockholm, looked at Swedish medical records between 1969 and 2009. They followed patients who survived the initial six-month danger period after injury. The data showed that without injury 0.2% of people were dying prematurely - before the age of 56. However, the premature-death rate was three-fold higher in patients who had previously suffered traumatic brain injury. In those who also had a psychiatric disorder the rate soared to 4%. Dr Seena Fazel, one of the researchers in Oxford, said: "There are these subgroups with really high rates, and these are potentially treatable illnesses, so this is something we can do something about." BBC © 2014
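The rates reported above can be lined up in a quick back-of-the-envelope comparison. This is illustrative arithmetic based only on the figures quoted in the article (0.2% baseline, a threefold increase after injury, 4% with a prior psychiatric disorder), not the study's own analysis:

```python
# Back-of-the-envelope comparison of the premature-death rates
# quoted in the article (illustrative only, not the study's analysis).

baseline_rate = 0.2 / 100        # uninjured population: 0.2% died before age 56
tbi_rate = 3 * baseline_rate     # after traumatic brain injury: threefold higher
tbi_psych_rate = 4.0 / 100       # TBI plus prior psychiatric disorder: 4%

print(f"Baseline:          {baseline_rate:.1%}")
print(f"After TBI:         {tbi_rate:.1%}")
print(f"TBI + psychiatric: {tbi_psych_rate:.1%}")
print(f"Relative risk vs baseline: {tbi_psych_rate / baseline_rate:.0f}x")
```

On these figures, survivors with both a brain injury and a psychiatric history die prematurely at roughly twenty times the baseline rate, which is why the researchers single out that subgroup.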
A new website that helps determine whether someone might have Alzheimer's disease or dementia is so popular that the site crashed temporarily. Ohio State University's website says its Self-Administered Gerocognitive Exam (SAGE) is a test that can be done in your own home with a paper and pencil. When researchers visited 45 community events where they asked people to take the simple test, they found that of the 1,047 who did it, 28 per cent were identified with cognitive impairment, test developer Dr. Douglas Scharre of Ohio State and his team reported Monday in The Journal of Neuropsychiatry and Clinical Neurosciences. Participants were told the test represented their baseline level, which doctors could use for future comparisons during re-screening. "What we found was that this SAGE self-administered test correlated very well with detailed cognitive testing," Scharre said in a release. "If we catch this cognitive change really early, then we can start potential treatments much earlier than without having this test." The Alzheimer Society of Canada says early diagnosis can help with planning, care and support. © CBC 2014
Link ID: 19137 - Posted: 01.16.2014
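The screening figures above imply a concrete head count. A small sketch of the arithmetic (illustrative only, using just the 1,047 participants and 28 per cent quoted in the article):

```python
# Rough count implied by the screening figures in the article:
# 1,047 people took the SAGE test and 28 per cent screened positive
# for cognitive impairment (illustrative arithmetic only).

participants = 1047
positive_fraction = 0.28

flagged = round(participants * positive_fraction)
print(f"{flagged} of {participants} participants screened positive")
```

That works out to roughly 293 people flagged at community events alone, which helps explain the surge of interest in the online version of the test.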
By ANDREW POLLACK Lisa Tremblay still recalls in horror the time her daughter Kristin pulled a hot dog crawling with ants from the garbage at a cookout and prepared to swallow it. Kristin has a rare genetic abnormality that gives her an incessant, uncontrollable hunger. Some people with the condition, called Prader-Willi syndrome, will eat until their stomach ruptures and they die. And, not surprisingly, many are obese. “She’s eaten dog food. She’s eaten cat food,” said Ms. Tremblay, who lives in Nokomis, Fla. When Kristin, now 28, was a child, neighbors once called social welfare authorities, thinking Kristin was not being fed because she complained of being hungry so much. Once an obscure and neglected disease, Prader-Willi is starting to attract more attention from scientists and pharmaceutical companies for a simple reason: It may shed some light on the much broader public health problems of overeating and obesity. “These are remarkable human models of severe obesity,” said Dr. Steven B. Heymsfield, a professor and former executive director of the Pennington Biomedical Research Center in Baton Rouge, La. “When we discover the underlying mechanism of these very rare disorders, they will shed light on garden-variety obesity.” One drug being developed to help obese people lose weight has shown some preliminary signs of success in patients with Prader-Willi. The drug, beloranib, is believed to work by reducing fat synthesis and increasing fat use. In a small trial, it reduced weight and body fat and lowered the food-seeking urge, according to the drug’s developer, Zafgen. © 2014 The New York Times Company
Training to improve cognitive abilities in older people lasted to some degree 10 years after the training program was completed, according to results of a randomized clinical trial supported by the National Institutes of Health. The findings showed training gains for aspects of cognition involved in the ability to think and learn, but researchers said memory training did not have an effect after 10 years. The report, from the Advanced Cognitive Training for Independent and Vital Elderly (ACTIVE) study, appears in the January 2014 issue of the Journal of the American Geriatrics Society. The project was funded by the National Institute on Aging (NIA) and the National Institute of Nursing Research (NINR), components of the NIH. “Previous data from this clinical trial demonstrated that the effects of the training lasted for five years,” said NIA Director Richard J. Hodes, M.D. “Now, these longer term results indicate that particular types of cognitive training can provide a lasting benefit a decade later. They suggest that we should continue to pursue cognitive training as an intervention that might help maintain the mental abilities of older people so that they may remain independent and in the community.” “ACTIVE is an important example of intervention research aimed at enabling older people to maintain their cognitive abilities as they age,” said NINR Director Patricia Grady, Ph.D. “The average age of the individuals who have been followed over the last 10 years is now 82. Given our nation’s aging population, this type of research is an increasingly high priority.”
By Gary Stix The blood-brain barrier is the Berlin Wall of human anatomy and physiology. Its closely packed cells shield neurons and the like from toxins and pathogens, while letting pass glucose and other essential chemicals for brain metabolism (caffeine?). For years, pharmaceutical companies and academic researchers have engaged in halting efforts to traverse this imposing blockade in order to deliver some of the big molecules that might potentially help slow the progression of devastating neurological diseases. Like would-be refugees from the former East Germany, many medications get snagged by border guards during the crossing—a molecular security force that either impedes or digests any invader. There have been many attempts to secure safe passage—deploying chemicals that make brain-barrier “endothelial” cells shrivel up, or wielding tiny catheters or minute bubbles that slip through minuscule breaches. Success has been mixed at best—none of these molecular cargo carriers have made their way as far as human trials. Roche, the Swiss-based drugmaker, reported in the Jan. 8 Neuron a bit of progress toward overcoming the lingering technical impediments. The study described a new technique that tricks one of the BBB’s natural checkpoints to let through an elaborately engineered drug that attacks the amyloid-beta protein fragments that may be the primary culprit inflicting the damage wrought by Alzheimer’s. The subterfuge involves the transferrin receptor, a docking site used to transport iron into the brain. Roche took a fragment of an antibody that binds the transferrin receptor and latched it onto another antibody that, once on the other side of the BBB, attaches to and then removes amyloid. © 2014 Scientific American
Link ID: 19121 - Posted: 01.13.2014
by Bethany Brookshire When most people think of the quintessential lab mouse, they think of a little white mouse with red eyes. Soft fur. A timid nature. But scientists think of something very different. This mouse is black, small and fast, with pink ears and a pinkish tail. It’s got black eyes to match. The fur may be soft, but the temper sure isn’t. This is the C57 Black 6 mouse. Each Black 6 mouse should be almost identical to every other Black 6 mouse. They have been bred to their own siblings for hundreds of generations, so there should be very few genetic differences left. But even supposedly identical mouse strains have their differences. These take the form of mutations in single DNA base pairs that accumulate in different populations. Recently, researchers showed that one of these tiny changes in a single gene was enough to produce a huge difference in how two groups of Black 6 mice respond to drugs. And the authors identified a surprising number of other small DNA differences still waiting to be explored. On one level, the new work offers scientists a novel tool for identifying genes that could relate to behaviors. But it also serves as a warning. “Identical” mouse populations aren’t as alike as many scientists had assumed. The Black 6, the most common lab mouse in the United States, is used for everything from drug abuse studies to cancer research. The Black 6 is also the reference strain for the Mouse Genome Sequencing Consortium. Whenever scientists discover a new genetic change in a mouse strain, they compare it first against the Black 6. And it’s the mouse used by the International Knockout Mouse Consortium (now the International Mouse Phenotyping Consortium), which keeps a library of mouse embryos with different deleted genes. The Allen Brain Atlas, a database of neuroanatomy and gene activity throughout the mouse brain, relies on the Black 6 as well. © Society for Science & the Public 2000 - 2014
By PAM BELLUCK Does vitamin E help people with Alzheimer’s disease? For years, scientists have been trying to find out, guessing that the vitamin’s antioxidant properties might be beneficial. But the results from clinical trials have been mixed and — following a report that high doses of vitamin E may increase the risk of death — cautionary. Now a study suggests that vitamin E supplements may be good for some Alzheimer’s patients after all. The benefit was not huge, but for a devastating disease that has proved almost impervious to treatment, it was notable. The study, published in Wednesday’s issue of JAMA, The Journal of the American Medical Association, found that over a little more than two years, high-dose vitamin E slowed the decline of people with mild to moderate Alzheimer’s by about six months on average. Vitamin E did not delay cognitive or memory deterioration, however. Instead, it seemed to temporarily protect something many patients consider especially valuable: their ability to perform daily activities like putting on clothes and feeding themselves. Compared with other study participants, people who took vitamin E also required about two fewer hours of help from caregivers per day, the researchers said. “Is it really going to dramatically alter the lives of Alzheimer’s patients? That’s unclear,” said Dr. Scott Small, director of Columbia University’s Alzheimer’s Disease Research Center, who was not involved in the study. “But it might improve patients’ ability to bathe themselves and dress themselves.” © 2014 The New York Times Company
Link ID: 19086 - Posted: 01.02.2014
Stephen S. Hall Hochelaga was the original Iroquoian name for the village that ultimately became Montreal, but it is also the name of a rough-hewn French–Canadian neighbourhood located east of — and a world away from — the cosmopolitan city centre. The district's tidy two- and three-storey brick duplexes, adorned with Montreal's characteristic wrought-iron staircases, predominantly house families that have, because of poverty and lack of education, never quite attained thriving middle-class status. During the 1980s, public-school officials identified Hochelaga and many other impoverished neighbourhoods in the eastern part of Montreal as places where kindergarten children disproportionately displayed severe behavioural problems, such as physical aggression. The school system asked a young University of Montreal psychologist named Richard Tremblay for help. “Their parents didn't have a high-school diploma, and many of the mothers had their first child before the age of 20,” Tremblay says of the families he began to study, as he walks along Rue Ontario in Hochelaga on a sunny afternoon in September. Those were the women, he adds, “most at risk of having children who have problems”. Over the past three decades, Hochelaga and similar neighbourhoods have served as living laboratories in the study of the roots of aggression. Since 1984, Tremblay and his collaborators have followed more than 1,000 children from 53 schools in the city from childhood into adulthood. And in 1985, he initiated a ground-breaking experiment in which some families of at-risk children were given support and counselling to help curb bad behaviour. His research overturned ideas about when aggressive behaviour first emerges, and showed that early intervention can deflect children away from adult criminality. © 2013 Nature Publishing Group
By CARL ZIMMER There are many things that make humans a unique species, but a couple stand out. One is our mind, the other our brain. The human mind can carry out cognitive tasks that other animals cannot, like using language, envisioning the distant future and inferring what other people are thinking. The human brain is exceptional, too. At three pounds, it is gigantic relative to our body size. Our closest living relatives, chimpanzees, have brains that are only a third as big. Scientists have long suspected that our big brain and powerful mind are intimately connected. Starting about three million years ago, fossils of our ancient relatives record a huge increase in brain size. Once that cranial growth was underway, our forerunners started leaving behind signs of increasingly sophisticated minds, like stone tools and cave paintings. But scientists have long struggled to understand how a simple increase in size could lead to the evolution of those faculties. Now, two Harvard neuroscientists, Randy L. Buckner and Fenna M. Krienen, have offered a powerful yet simple explanation. In our smaller-brained ancestors, the researchers argue, neurons were tightly tethered in a relatively simple pattern of connections. When our ancestors’ brains expanded, those tethers ripped apart, enabling our neurons to form new circuits. Dr. Buckner and Dr. Krienen call their idea the tether hypothesis, and present it in a paper in the December issue of the journal Trends in Cognitive Sciences. “I think it presents some pretty exciting ideas,” said Chet C. Sherwood, an expert on human brain evolution at George Washington University who was not involved in the research. Dr. Buckner and Dr. Krienen developed their hypothesis after making detailed maps of the connections in the human brain using f.M.R.I. scanners. When they compared their maps with those of other species’ brains, they saw some striking differences. © 2013 The New York Times Company
By Regina Harrell, Pulse. I am a primary-care doctor who makes house calls in and around Tuscaloosa, Ala. Today my rounds start at a house located down a dirt road a few miles outside town. Gingerly, I cross the front walk; Mrs. Edgars told me that she killed a rattlesnake in her flowerbed last year. She is at the door, expecting my visit. Mr. Edgars sits on the couch, unable to recall that I am his doctor, or even that I am a doctor, but happy to see me nonetheless. We chat about the spring garden and the rain, then we move on to Mr. Edgars’s arthritis. Earlier in his dementia, he wandered the woods, and his wife was afraid he would get lost and die, although the entire family agreed that this was how he would want it. Now, in a strange twist, his knee arthritis has worsened enough that it has curtailed his wanderings. I suspect that Mrs. Edgars is undertreating the pain to decrease the chance that he’ll wander off again. We talk about how anxious he grows whenever she’s out of his sight and how one of his children comes to sit with him so that she can run errands. She shows me a quilt remnant found in a log cabin on their property; it likely belonged to her husband’s grandfather, making the rough-edged fabric about a century old. I leave carrying a parting gift from her — a jar of homegrown pickled okra. When I get back to the office, I turn on the computer to write a progress note in Mr. Edgars’s electronic health record, or EHR. In addition to recording the details of our visit, I must try to meet the new federal criteria for “meaningful use,” criteria that have been adopted by my office with threats that I won’t get paid for my work if I don’t. © 1996-2013 The Washington Post
Link ID: 19067 - Posted: 12.24.2013
By Alexandra Sifferlin It’s always been conventional wisdom that girls reach maturity more quickly than boys, but now scientists have provided some proof. In new research published in the journal Cerebral Cortex, an international group of researchers led by a team from Newcastle University in England found that girls’ brains march through the reorganization and pruning typical of normal brain development earlier than boys’ brains. In the study, in which 121 people between ages 4 and 40 were scanned using MRIs, the scientists documented the ebb and flow of new neural connections, and found that some brain fibers that bridged far-flung regions of the brain tended to remain stable, while shorter connections, many of which were redundant, were edited away. And the entire reorganization seemed to occur sooner in girls’ brains than in boys’ brains. Females also tended to have more connections across the two hemispheres of the brain. The researchers believe that the earlier reorganization in girls makes the brain work more efficiently, and therefore reach a more mature state for processing the environment. What drives the gender-based difference in timing isn’t clear from the current study, but the results suggest that may be a question worth investigating. © 2013 Time Inc.
Amanda Mascarelli In children with certain gene variants, symptoms similar to common learning disabilities could be omens of serious psychiatric conditions. People who carry high-risk genetic variants for schizophrenia and autism have impairments reminiscent of disorders such as dyslexia, even when they do not yet have a mental illness, a new study has found. The findings offer a window into the brain changes that precede severe mental illness and hold promise for early intervention and even prevention, researchers say. Rare genetic alterations called copy number variants (CNVs), in which certain segments of the genome have an abnormal number of copies, play an important part in psychiatric disorders: Individuals who carry certain CNVs have a several-fold increased risk of developing schizophrenia or autism. But previous studies were based on individuals who already have a psychiatric disorder, and until now, no one had looked at what effects these CNVs have in the general population. In a study published today in Nature, researchers report that people with these variants but no diagnosis of autism or a mental illness still show subtle brain changes and impairments in cognitive function. “In psychiatry we always have the problem that disorders are defined by symptoms that patients experience or tell us about, or that we observe,” says study co-author Andreas Meyer-Lindenberg, a psychiatrist and the director of the Central Institute of Mental Health in Mannheim, Germany, affiliated with the University of Heidelberg. This work, on the other hand, provides a glimpse into the biological underpinnings of people who are at risk of psychiatric disorders, he says. The team searched a genealogical database of more than 100,000 Icelanders, focusing on 26 genetic variants that have been shown to increase the risk of schizophrenia or autism. They found that 1,178 people in the database, or 1.16% of the sample, carried one or more of these CNVs. © 2013 Nature Publishing Group
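The two figures quoted for the Icelandic database can be checked against each other. A small consistency sketch (illustrative arithmetic only, using just the 1,178 carriers and 1.16% quoted above):

```python
# Quick consistency check on the figures in the article: 1,178 carriers
# reported as 1.16% of the genealogical sample implies a sample of
# roughly 100,000 people (illustrative arithmetic only).

carriers = 1178
fraction = 0.0116   # 1.16%

implied_sample = carriers / fraction
print(f"Implied sample size: ~{implied_sample:,.0f}")
```

The implied sample of roughly 101,600 people is consistent with the article's description of "more than 100,000 Icelanders."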
By DANNY HAKIM LONDON — European food regulators said on Tuesday that a class of pesticides linked to the deaths of large numbers of honey bees might also harm human health, and they recommended that the European Commission further restrict their use. The commission, which requested the review, has already taken a tougher stance than regulators in other parts of the world against neonicotinoids, a relatively new nicotine-derived class of pesticide. Earlier this year, some were temporarily banned for use on many flowering crops in Europe that attract honey bees, an action that the pesticides’ makers are opposing in court. Now European Union regulators say the same class of pesticides “may affect the developing human nervous system” of children. They focused on two specific versions of the pesticide, acetamiprid and imidacloprid, saying they were safe to use only in smaller amounts than currently allowed. Imidacloprid was one of the pesticides placed under a two-year ban this year. The review was prompted by a Japanese study that raised similar concerns last year. Imidacloprid is one of the most popular insecticides, and is used in agricultural and consumer products. It was developed by Bayer, the German chemicals giant, and is the active ingredient in products like Bayer Advanced Fruit, Citrus & Vegetable Insect Control, which can be purchased at stores internationally, including Home Depot in the United States. Acetamiprid is sold by Nisso Chemical, a German branch of a Japanese company, though it was developed with Bayer’s help. It is used in consumer products like Ortho Flower, Fruit & Vegetable Insect Killer. The action by European regulators could affect the entire category of neonicotinoid pesticides, however. 
James Ramsay, a spokesman for the European Food Safety Authority, which conducted the review, said the agency was recommending a mandatory submission of studies related to developmental neurotoxicity “as part of the authorization process in the E.U.” © 2013 The New York Times Company
By DONALD G. McNEIL Jr. A long-awaited study has confirmed the fears of Somali residents in Minneapolis that their children suffer from higher rates of a disabling form of autism compared with other children there. The study — by the University of Minnesota, the Centers for Disease Control and Prevention, and the research and advocacy group Autism Speaks — found high rates of autism in two populations: About one Somali child in 32 and one white child in 36 in Minneapolis were on the autism spectrum. The national average is one child in 88, according to Coleen A. Boyle, who directs the C.D.C.’s Center on Birth Defects and Developmental Disabilities. But the Somali children were less likely than the whites to be “high-functioning” and more likely to have I.Q.s below 70. (The average I.Q. score is 100.) The study offered no explanation of the statistics. “We do not know why more Somali and white children were identified,” said Amy S. Hewitt, the project’s primary investigator and director of the University of Minnesota’s Research and Training Center on Community Living. “This project was not designed to answer these questions.” The results echoed those of a Swedish study published last year finding that children from immigrant families in Stockholm — many of them Somali — were more likely to have autism with intellectual disabilities. The Minneapolis study also found that Somali children with autism received their diagnoses late. Age 5 was the average, while autism and learning disabilities can be diagnosed as early as age 2, and children get the most benefit from behavioral treatment when it is started early. Black American-born children and Hispanic children in Minneapolis had much lower autism rates: one in 62 for the former and one in 80 for the latter. © 2013 The New York Times Company
Link ID: 19044 - Posted: 12.17.2013
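The prevalence figures above are easier to compare when the "one in N" ratios are converted to a common denominator. A small sketch (illustrative arithmetic only; the group labels paraphrase the article):

```python
# Converting the "one in N" autism prevalence figures from the article
# into rates per 1,000 children (illustrative arithmetic only).

one_in_n = {
    "Somali (Minneapolis)": 32,
    "White (Minneapolis)": 36,
    "Black, American-born": 62,
    "Hispanic": 80,
    "National average": 88,
}

for group, n in one_in_n.items():
    print(f"{group:22s} 1 in {n:3d}  ~{1000 / n:.1f} per 1,000")
```

Seen this way, the Somali and white Minneapolis rates (about 31 and 28 per 1,000) are nearly triple the national average of about 11 per 1,000, while the Black American-born and Hispanic rates fall in between.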
by Bethany Brookshire “You are what you eat.” We’ve all heard that one. What we eat can affect our growth, life span and whether we develop disease. These days, we know that we also are what our mother eats. Or rather, what our mothers ate while we were in the womb. But are we also what our father eats? A new study shows that in mice, a dietary deficiency in dad can be a big downer for baby. The dietary staple in the study was folic acid, or folate. Folate is one of the B vitamins and is found in dark leafy greens (eat your kale!) and has even been added to some foods like cereals. It is particularly essential to get in the diet because we cannot synthesize it on our own. And it plays roles in DNA repair and DNA synthesis, as well as methylation of DNA. It’s particularly important during development. Without adequate folate, developing fetuses are prone to neural tube disorders, such as spina bifida. Some of the neural tube disorders caused by folate deficiency could result from breaks in the DNA itself. But folic acid is also important in the epigenome. Epigenetics is a mechanism that allows cells to change how genes are used without changing the genes themselves. Instead of altering the DNA itself, epigenetic alterations put chemical “marks” or “notes” —methyl or acetyl groups — on the DNA and the proteins associated with it. The marks can either make a gene more accessible (acetylation) or less accessible (methylation), making it more or less likely to be made into a protein. This means that each cell type can have a different epigenome, allowing a neuron to function differently than a muscle cell, even though they contain the same DNA. Folate affects DNA synthesis, but it can also affect DNA methylation. In fact, DNA methylation requires the presence of folate. So low folate could affect whether genes are turned off or on and by how much. In a developing fetus, that could contribute to developmental problems. © Society for Science & the Public 2000 - 2013.