Chapter 7. Life-Span Development of the Brain and Behavior
By Elizabeth Pennisi Last week, researchers expanded the size of the mouse brain by giving rodents a piece of human DNA. Now another team has topped that feat, pinpointing a human gene that not only grows the mouse brain but also gives it the distinctive folds found in primate brains. The work suggests that scientists are finally beginning to unravel some of the evolutionary steps that boosted the cognitive powers of our species. “This study represents a major milestone in our understanding of the developmental emergence of human uniqueness,” says Victor Borrell Franco, a neurobiologist at the Institute of Neurosciences in Alicante, Spain, who was not involved with the work. The new study began when Wieland Huttner, a developmental neurobiologist at the Max Planck Institute of Molecular Cell Biology and Genetics in Dresden, Germany, and his colleagues started closely examining aborted human fetal tissue and embryonic mice. “We specifically wanted to figure out which genes are active during the development of the cortex, the part of the brain that is greatly expanded in humans and other primates compared to rodents,” says Marta Florio, the Huttner graduate student who carried out the main part of the work. That was harder than it sounded. Building a cortex requires several kinds of starting cells, or stem cells. The stem cells divide and sometimes specialize into other types of “intermediate” stem cells that in turn divide and form the neurons that make up brain tissue. To learn what genes are active in the two species, the team first had to develop a way to separate out the various types of cortical stem cells. © 2015 American Association for the Advancement of Science
By DENISE GRADY Faced with mounting evidence that general anesthesia may impair brain development in babies and young children, experts said Wednesday that more research is greatly needed and that when planning surgery for a child, parents and doctors should consider how urgently it is required, particularly in children younger than 3 years. In the United States each year, about a million children younger than 4 have surgery with general anesthesia, according to the Food and Drug Administration. So far, the threat is only a potential one; there is no proof that children have been harmed. The concern is based on two types of research. Experiments in young monkeys and other animals have shown that commonly used anesthetics and sedatives can kill brain cells, diminish learning and memory and cause behavior problems. And studies in children have found an association between learning problems and multiple exposures to anesthesia early in life — though not single exposures. But monkeys are not humans, and association does not prove cause and effect. Research now underway is expected to be more definitive, but results will not be available for several years. Anesthesiologists and surgeons are struggling with how — and sometimes whether — to explain a theoretical hazard to parents who are already worried about the real risks of their child’s medical problem and the surgery needed to correct it. If there is a problem with anesthesia, in many cases it may be unavoidable because there are no substitute drugs. The last thing doctors want to do is frighten parents for no reason or prompt them to delay or cancel an operation that their child needs. “On the one hand, we don’t want to overstate the risk, because we don’t know what the risk is, if there is a risk,” said Dr. Randall P. Flick, a pediatric anesthesiologist and director of Mayo Clinic Children’s Center in Rochester, Minn., who has conducted some of the studies in children suggesting a link to learning problems. “On the other hand, we want to make people aware of the risk because we feel we have a duty to do so.” © 2015 The New York Times Company
By Michelle Roberts Health editor, BBC News online Scientists have proposed a new idea for detecting brain conditions including Alzheimer's - a skin test. Their work, which is at an early stage, found that the same abnormal proteins that accumulate in the brain in such disorders can also be found in skin. Early diagnosis is key to preventing the loss of brain tissue in dementia, which can go undetected for years. But experts said even more advanced tests, including ones of spinal fluid, were still not ready for the clinic. If they were, then doctors could begin treatment at the earliest stages, before irreversible brain damage or mental decline has taken place. Investigators have been hunting for suitable biomarkers in the body - molecules in blood or exhaled breath, for example, that can be measured to accurately and reliably signal if a disease or disorder is present. Dr Ildefonso Rodriguez-Leyva and colleagues from the University of San Luis Potosi, Mexico, believe skin is a good candidate for uncovering hidden brain disorders. Skin has the same origin as brain tissue in the developing embryo and might, therefore, be a good window to what's going on in the mind in later life - at least at a molecular level - they reasoned. Post-mortem studies of people with Parkinson's also reveal that the same protein deposits which occur in the brain with this condition also accumulate in the skin. To test if the same was true in life as after death, the researchers recruited 65 volunteers - 12 who were healthy controls and the remaining 53 who had either Parkinson's disease, Alzheimer's or another type of dementia. They took a small skin biopsy from behind the ear of each volunteer to test in their laboratory for any telltale signs of disease. Specifically, they looked for the presence of two proteins - tau and alpha-synuclein. © 2015 BBC.
By Emily Underwood Infants born prematurely are more than twice as likely to have difficulty hearing and processing words as those carried to full term, likely because brain regions that process sounds aren’t sufficiently developed at the time of delivery. Now, an unusual study with 40 preemies suggests that recreating a womblike environment with recordings of a mother's heartbeat and voice could potentially correct these deficits. "This is the kind of study where you think ‘Yes, I can believe these results,’ " because they fit well with what scientists know about fetal brain development, says cognitive scientist Karin Stromswold of Rutgers University in New Brunswick, New Jersey. A fetus starts to hear at about 24 weeks of gestation, as neurons migrate to—and form connections in—the auditory cortex, a brain region that processes sound, Stromswold explains. Once the auditory cortex starts to function, a fetus normally hears mostly low-frequency sounds—its mother’s heartbeat, for example, and the melody and rhythm of her voice. Higher frequency tones made outside of the mother's body, such as consonants, are largely drowned out. Researchers believe that this introduction to the melody and rhythm of speech, prior to hearing individual words, may be a key part of early language acquisition that gets disrupted when a baby is born too soon. In addition to being bombarded with the bright lights, chemical smells, and shrill sounds of a hospital’s intensive care unit, preemies are largely deprived of the sensations they'd get in the womb, such as their mother's heartbeat and voice, says Amir Lahav, a neuroscientist at Harvard Medical School in Boston. Although mothers are sometimes allowed to hold premature newborns for short periods of time, the infants are often considered too fragile to leave their temperature- and humidity-controlled incubators, he says. Preemies often have their eyes covered to block out light, and previous studies have shown that reducing overall levels of high-frequency noise in a neonatal intensive care unit—by lowering the number of incubators in a unit, for example, or giving preemies earplugs—can improve premature babies' outcomes. Few studies have actively simulated a womblike environment, however, he says. © 2015 American Association for the Advancement of Science.
By Elizabeth Pennisi Researchers have increased the size of mouse brains by giving the rodents a piece of human DNA that controls gene activity. The work provides some of the strongest genetic evidence yet for how the human intellect surpassed that of all other apes. "[The DNA] could easily be a huge component in how the human brain expanded," says Mary Ann Raghanti, a biological anthropologist at Kent State University in Ohio, who was not involved with the work. "It opens up a whole world of possibilities about brain evolution." For centuries, biologists have wondered what made humans human. Once the human and chimp genomes were deciphered about a decade ago, they realized they could now begin to pinpoint the molecular underpinnings of our big brain, bipedalism, varied diet, and other traits that have made our species so successful. By 2008, almost two dozen computerized comparisons of human and ape genomes had come up with hundreds of pieces of DNA that might be important. But rarely have researchers taken the next steps to try to prove that a piece of DNA really made a difference in human evolution. "You could imagine [their roles], but they were just sort of 'just so' stories,” says Greg Wray, an evolutionary biologist at Duke University in Durham, North Carolina. Wray is particularly interested in DNA segments called enhancers, which control the activity of genes nearby. He and Duke graduate student Lomax Boyd scanned the genomic databases and combed the scientific literature for enhancers that were different between humans and chimps and that were near genes that play a role in the brain. Out of more than 100 candidates, they and Duke developmental neurobiologist Debra Silver tested a half-dozen. They first inserted each enhancer into embryonic mice to learn whether it really did turn genes on. Then for HARE5, the most active enhancer in an area of the brain called the cortex, they made minigenes containing either the chimp or human version of the enhancer linked to a “reporter” gene that caused the developing mouse embryo to turn blue wherever the enhancer turned the gene on. Embryos’ developing brains turned blue sooner and over a broader expanse if they carried the human version of the enhancer, Silver, Wray, and their colleagues report online today in Current Biology. © 2015 American Association for the Advancement of Science
by Sarah Zielinski No one would be shocked to find play behavior in a mammal species. Humans love to play — as do our cats and dogs. It’s not such a leap to believe that, say, a red kangaroo would engage in mock fights. But somehow that behavior seems unlikely in animals other than mammals. It shouldn’t, though. Researchers have documented play behavior in an astonishing range of animals, from insects to birds to mammals. The purpose of such activities isn’t always clear — and not all scientists are convinced that play even exists — but play may help creatures establish social bonds or learn new skills. Here are five non-mammals you may be surprised to find hard at play:
Crocodilians
Alligators and crocodiles might seem more interested in lurking near the water and chomping on their latest meal, but these frightening reptiles engage in play, Vladimir Dinets of the University of Tennessee in Knoxville reports in the February Animal Behavior and Cognition. Dinets combined 3,000 hours of observations of wild and captive crocodilians with published reports and information gathered from other people who work with the animals. He found examples of all three types of play:
Locomotor play: This is movement without any apparent reason or stimulus. Young, captive American alligators, for instance, have been spotted sliding down slopes of water over and over. And a 2.5-meter-long crocodile was seen surfing the waves near a beach in Australia.
Object play: Animals like toys, too. A Cuban crocodile at a Miami zoo picked up and pushed around flowers floating in its pool for several days of observation. And like a cat playing with a mouse, a Nile crocodile was photographed as it repeatedly threw a dead hippo into the air. Object play is recognized as so important to crocodilian life “that many zoo caretakers now provide various objects as toys for crocodilians as part of habitat enrichment programs,” Dinets notes. © Society for Science & the Public 2000 - 2015.
Keyword: Development of the Brain
Link ID: 20597 - Posted: 02.21.2015
Catherine Brahic The nature versus nurture debate is getting a facelift this week, with the publication of a genetic map that promises to tell us which bits of us are set in stone by our DNA, and which bits we can affect by how we live our lives. The new "epigenomic" map doesn't just look at genes, but also the instructions that govern them. Compiled by a consortium of biologists and computer scientists, this information will allow doctors to pinpoint precisely which cells in the body are responsible for various diseases. It might also reveal how to adjust your lifestyle to counter a genetic predisposition to a particular disease. "The epigenome is the additional information our cells have on top of genetic information," says lead researcher Manolis Kellis of the Massachusetts Institute of Technology. It is made of chemical tags that are attached to DNA and its packaging. These tags act like genetic controllers, influencing whether a gene is switched on or off, and play an instrumental role in shaping our bodies and disease. Researchers are still figuring out exactly how and when epigenetic tags are added to our DNA, but the process appears to depend on environmental cues. We inherit some tags from our parents, but what a mother eats during pregnancy, for instance, might also change her baby's epigenome. Other tags relate to the environment we are exposed to as children and adults. "The epigenome sits in a very special place between nature and nurture," says Kellis. Each cell type in our body has a different epigenome – in fact, the DNA tags are the reason why our cells come in such different shapes and sizes despite having exactly the same DNA. So for its map, the Roadmap Epigenomics Consortium collected thousands of cells from different adult and embryonic tissues, and meticulously analysed all the tags. © Copyright Reed Business Information Ltd.
By Abigail Zuger, M.D. I had intended to discuss President Obama’s plans for personalized precision medicine with my patient Barbara last week, but she missed her appointment. Or, more accurately, she arrived two hours late, made the usual giant fuss at the reception desk and had to be rescheduled. I was disappointed. Barbara has some insight into the vortex of her own complications, and I thought she might help organize my thoughts. Mr. Obama announced last month that his new budget included $215 million toward the creation of a national databank of medical information, intended to associate specific gene patterns with various diseases and to predict what genetic, lifestyle and environmental factors correlate with successful treatment. Once all those relationships are clarified, the path will open to drugs or other interventions that firm up the good links and interrupt the bad ones. This step up the scientific ladder of medicine has many advocates. Researchers who sequence the genome are enthusiastic, as are those with a financial interest in the technology. Also celebrating are doctors and patients in the cancer community, where genetic data already informs some treatment choices and where the initial thrust of the initiative and much of its funding will be directed. Skeptics point out that genetic medicine, for all its promise, has delivered relatively few clinical benefits. And straightforward analyses of lifestyle and environment effects on health may occasionally lead to clear-cut advice (don’t smoke), but more often sow confusion, as anyone curious about the best way to lose weight or the optimal quantity of dietary salt knows. Without Barbara’s presence, I was left to ponder her medical record, a 20-year saga that might be titled “Genes, Lifestyle and Environment” and published as a cautionary tale. © 2015 The New York Times Company
By Kate Baggaley A buildup of rare versions of genes that control the activity of nerve cells in the brain increases a person’s risk for bipolar disorder, researchers suggest in a paper posted online the week of February 16 in Proceedings of the National Academy of Sciences. “There are many different variants in many different genes that contribute to the genetic risk,” says coauthor Jared Roach, a geneticist at the Institute for Systems Biology in Seattle. “We think that most people with bipolar disorder will have inherited several of these…risk variants.” The bulk of a person’s risk for bipolar disorder comes from genetics, but only a quarter of that risk can be explained by common variations in genes. Roach’s team sequenced the genomes of 200 people from 41 families with a history of bipolar disorder. They then identified 164 rare forms of genes that show up more often in people with the condition. People with bipolar disorder had, on average, six of these rare forms, compared with just one, on average, found in their healthy relatives and the general population. The identified genes control the ability of ions, or charged particles, to enter or leave nerve cells, or neurons. This affects neurons’ ability to pass information through the brain. Some of the gene variants probably increase how much neurons fire while others decrease it, the researchers say. Future research will need to explain what role these brain changes play in bipolar disorder.
Citation: S.A. Ament et al. Rare variants in neuronal excitability genes influence risk for bipolar disorder. Proceedings of the National Academy of Sciences. Published online the week of February 16, 2015. doi:10.1073/pnas.1424958112.
© Society for Science & the Public 2000 - 2015
By PAULA SPAN The word “benzodiazepines” and the phrase “widely prescribed for anxiety and insomnia” appear together so frequently that they may remind you of the apparently unbreakable connection between “powerful” and “House Ways and Means Committee.” But now we have a better sense of just how widely prescribed these medications are. A study in this month’s JAMA Psychiatry reports that among 65- to 80-year-old Americans, close to 9 percent use one of these sedative-hypnotics, drugs like Valium, Xanax, Ativan and Klonopin. Among older women, nearly 11 percent take them. “That’s an extraordinarily high rate of use for any class of medications,” said Michael Schoenbaum, a senior adviser at the National Institute of Mental Health and a co-author of the new report. “It seemed particularly striking given the identified clinical concerns associated with benzodiazepine use in anybody, but especially in older adults.” He was referring to decades of warnings about the potentially unhappy consequences of benzodiazepines for older users. The drugs still are recommended for a handful of specific disorders, including acute alcohol withdrawal and, sometimes, seizures and panic attacks. But concerns about the overuse of benzodiazepines have been aired again and again: in the landmark nursing home reform law of 1987, in the American Geriatrics Society’s Choosing Wisely list of questionable practices in 2013, in last year’s study in the journal BMJ suggesting an association with Alzheimer’s disease. Benzodiazepine users face increased risks of falls and fractures, of auto accidents, of reduced cognition. “Even after one or two doses, you have impaired cognitive performance on memory and other neuropsychological tests, compared to a placebo,” said Dr. D.P. Devanand, director of geriatric psychiatry at Columbia University Medical Center. © 2015 The New York Times Company
By Esther Landhuis Whereas cholesterol levels measured in a routine blood test can serve as a red flag for heart disease, there’s no simple screen for impending Alzheimer’s. A new Silicon Valley health start-up hopes to change that. A half million Americans die of Alzheimer’s disease each year. Most are diagnosed after a detailed medical workup and extensive neurological and psychological tests that gauge mental function and rule out other causes of dementia. Yet things begin going awry some 10 to 15 years before symptoms show. Spinal fluid analyses and positron emission tomography (PET) scans can detect a key warning sign—buildup of amyloid-beta protein in the brain. Studies suggest that adults with high brain amyloid have elevated risk for Alzheimer’s and stand the best chance of benefiting from treatments should they become available. Getting Alzheimer’s drugs to market requires long and costly clinical studies, which some experts say have failed thus far because experimental drugs were tested too late in the disease process. By the time people show signs of dementia, their brains have lost neurons and no current therapy can revive dead cells. That is why drug trials are looking to recruit seniors with preclinical Alzheimer’s who are on the verge of decline but otherwise look healthy. This is a tall order. Spinal taps are cumbersome and PET costs $3,000 per scan. “There’s no cheap, fast, noninvasive test that can accurately identify people at risk of Alzheimer’s,” says Brad Dolin, chief technology officer of Neurotrack. The company is developing a computerized visual test that might fit the bill. © 2015 Scientific American
By Siri Carpenter “I don’t look like I have a disability, do I?” Jonas Moore asks me. I shake my head. No, I say — he does not. Bundled up in a puffy green coat in a drafty Starbucks, Moore, 35 and sandy-haired, doesn’t stand out in the crowd seeking refuge from the Wisconsin cold. His handshake is firm and his blue eyes meet mine as we talk. He comes across as intelligent and thoughtful, if perhaps a bit reserved. His disability — autism — is invisible. That’s part of the problem, says Moore. Like most people with autism spectrum disorders, he finds relationships challenging. In the past, he has been quick to anger and has had what he calls “meltdowns.” Those who don’t know he has autism can easily misinterpret his actions. “People think that when I do misbehave I’m somehow intentionally trying to be a jerk,” Moore says. “That’s just not the case.” His difficulty managing emotions has gotten him into some trouble, and he’s had a hard time holding onto jobs — an outcome he might have avoided, he says, if his coworkers and bosses had better understood his intentions. Over time, things have gotten better. Moore has held the same job for five years, vacuuming commercial buildings on a night cleaning crew. He attributes his success to getting the right amount of medication and therapy, to time maturing him and to the fact that he now works mostly alone. Moore is fortunate. His parents help support him financially. He has access to good mental health care. And with the help of the state’s division of vocational rehabilitation, he has found a job that suits him. Many adults with autism are not so lucky. © Society for Science & the Public 2000 - 2015.
Link ID: 20574 - Posted: 02.13.2015
By Michelle Roberts Health editor, BBC News online Women trying for a baby and those in the first three months of pregnancy should not drink any alcohol, updated UK guidelines say. The Royal College of Obstetricians and Gynaecologists (RCOG) had previously said a couple of glasses of wine a week was acceptable. It now says abstinence is the only way to be certain that the baby is not harmed. There is no proven safe amount that women can drink during pregnancy. The updated advice now chimes with guidelines from the National Institute for Health and Care Excellence (NICE). In the US, experts say there is no safe time to drink during pregnancy. But the RCOG highlights around the time of conception and the first three months of pregnancy as the most risky. Drinking alcohol may affect the unborn baby as some will pass through the placenta. Around conception and during the first three months, it may increase the chance of miscarriage, says the RCOG. After this time, women are advised to drink no more than one to two units, and no more than once or twice a week, it says. Drinking more than this could affect the development of the baby, in particular the way the baby's brain develops and the way the baby grows in the womb, which can lead to foetal growth restriction and increase the risk of stillbirth and premature labour, says the advice. © 2015 BBC
Alison Abbott Fabienne never found out why she went into labour three months too early. But on a quiet afternoon in June 2007, she was hit by accelerating contractions and was rushed to the nearest hospital in rural Switzerland, near Lausanne. When her son, Hugo, was born at 26 weeks of gestation rather than the typical 40, he weighed just 950 grams and was immediately placed in intensive care. Three days later, doctors told Fabienne that ultrasound pictures of Hugo's brain indicated that he had had a severe haemorrhage from his immature blood vessels. “I just exploded into tears,” she says. Both she and her husband understood that the prognosis for Hugo was grim: he had a very high risk of cerebral palsy, a neurological condition that can lead to a life of severe disability. The couple agreed that they did not want to subject their child to that. “We immediately told the doctors that we did not want fierce medical intervention to keep him alive — and saw the relief on the doctors' faces,” recalls Fabienne, who requested that her surname not be used. That night was the most tortured of her life. The next day, however, before any change had been made to Hugo's treatment, his doctors proposed a new option to confirm the diagnosis: a brain scan using magnetic resonance imaging (MRI). This technique, which had been newly adapted for premature babies, would allow the doctors to predict the risk of cerebral palsy more accurately than with ultrasound alone, which has a high false-positive rate. Hugo's MRI scan showed that the damage caused by the brain haemorrhage was limited, and his risk of severe cerebral palsy was likely to be relatively low. So just 24 hours after their decision to let his life end, Hugo's parents did an about-turn. They agreed that the doctors should try to save him. © 2015 Nature Publishing Group
Keyword: Development of the Brain
Link ID: 20555 - Posted: 02.05.2015
By Amanda Baker While we all may vary in just how much time we like spending with other people, humans are overall very social beings. Scientists have already found this to be reflected in our health and well-being – with social isolation being associated with more depression, worse health, and a shorter life. Looking even deeper, they find evidence of our social nature reflected in the very structure of our brains. Just thinking through your daily interactions with your friends or siblings probably gives you dozens of examples of times when it was important to interpret or predict the feelings and behaviors of other people. Our brains agree. Over time, parts of our brains have developed specifically for those tasks, but apparently not all social interaction was created equal. When researchers study the brains of people trying to predict the thoughts and feelings of others, they can actually see a difference in the brain activity depending on whether that person is trying to understand a friend versus a stranger. Even at the level of blood flowing through your brain, you treat people you know well differently than people you don’t. These social interactions also extend into another important area of the brain: the nucleus accumbens. This structure is key in the reward system of the brain, with activity being associated with things that leave you feeling good. Curious if this could have a direct connection with behavior, one group of scientists studied a very current part of our behavior as modern social beings: Facebook use. © 2015 Scientific American
Keyword: Development of the Brain
Link ID: 20554 - Posted: 02.05.2015
By ALAN SCHWARZ High numbers of students are beginning college having felt depressed and overwhelmed during the previous year, according to an annual survey released on Thursday, reinforcing some experts’ concern about the emotional health of college freshmen. The survey of more than 150,000 students nationwide, “The American Freshman: National Norms Fall 2014,” found that 9.5 percent of respondents had frequently “felt depressed” during the past year, a significant rise over the 6.1 percent reported five years ago. Those who “felt overwhelmed” by schoolwork and other commitments rose to 34.6 percent from 27.1 percent. Conducted by the Cooperative Institutional Research Program at the University of California, Los Angeles’s Higher Education Research Institute for almost 50 years, the survey assesses hundreds of matters ranging from political views to exercise habits. It is considered one of the most comprehensive snapshots of trends among recent high school seniors and is of particular interest to people involved in mental well-being. “It’s a public health issue,” said Dr. Anthony L. Rostain, a psychiatrist and co-chairman of a University of Pennsylvania task force on students’ emotional health. “We’re expecting more of students: There’s a sense of having to compete in a global economy, and they think they have to be on top of their game all the time. It’s no wonder they feel overwhelmed.” Other survey results indicated that students were spending more time on academics and socializing less — trends that would normally be lauded. But the lead author of the study, Kevin Eagan, cautioned that the shift could result in higher levels of stress. © 2015 The New York Times Company
By Andrea Anderson and Victoria Stern Blood type may affect brain function as we age, according to a new large, long-term study. People with the rare AB blood type, present in less than 10 percent of the population, have a higher than usual risk of cognitive problems as they age. University of Vermont hematologist Mary Cushman and her colleagues used data from a national study called REGARDS, which has been following 30,239 African-American and Caucasian individuals older than 45 since 2007. The aim of the study is to understand the heavy stroke toll seen in the southeastern U.S., particularly among African-Americans. Cushman's team focused on information collected twice yearly via phone surveys that evaluate cognitive skills such as learning, short-term memory and executive function. The researchers zeroed in on 495 individuals who showed significant declines on at least two of the three phone survey tests. When they compared that cognitively declining group with 587 participants whose mental muster remained robust, researchers found that impairment in thinking was roughly 82 percent more likely in individuals with AB blood type than in those with A, B or O blood types, even after taking their race, sex and geography into account. The finding was published online last September in Neurology. The seemingly surprising result has some precedent: past studies suggest non-O blood types are linked to elevated incidence of heart disease, stroke and blood clots—vascular conditions that could affect brain function. Yet these cardiovascular consequences are believed to be linked to the way non-O blood types coagulate, which did not seem to contribute to the cognitive effects described in the new study. The researchers speculate that other blood-group differences, such as how likely cells are to stick to one another or to blood vessel walls, might affect brain function. © 2015 Scientific American
Link ID: 20552 - Posted: 02.05.2015
By Katherine Ellison Dr. Mark Bertin is no A.D.H.D. pill-pusher. The Pleasantville, N.Y., developmental pediatrician won’t allow drug marketers in his office, and says he doesn’t always prescribe medication for children diagnosed with attention deficit hyperactivity disorder. Yet Dr. Bertin has recently changed the way he talks about medication, offering parents a powerful argument. Recent research, he says, suggests the pills may “normalize” the child’s brain over time, rewiring neural connections so that a child would feel more focused and in control, long after the last pill was taken. “There might be quite a profound neurological benefit,” he said in an interview. A growing number of doctors who treat the estimated 6.4 million American children diagnosed with A.D.H.D. are hearing that stimulant medications not only help treat the disorder but may actually be good for their patients’ brains. In an interview last spring with Psych Congress Network (http://www.psychcongress.com/video/are-A.D.H.D.-medications-neurotoxic-or-neuroprotective-16223), an Internet news site for mental health professionals, Dr. Timothy Wilens, chief of child and adolescent psychiatry at Massachusetts General Hospital, said “we have enough data to say they’re actually neuroprotective.” The pills, he said, help “normalize” the function and structure of brains in children with A.D.H.D., so that, “over years, they turn out to look more like non-A.D.H.D. kids.” Medication is already by far the most common treatment for A.D.H.D., with roughly 4 million American children taking the pills — mostly stimulants, such as amphetamines and methylphenidate. Yet the decision can be anguishing for parents who worry about both short-term and long-term side effects. If the pills can truly produce long-lasting benefits, more parents might be encouraged to start their children on these medications early and continue them for longer. Leading A.D.H.D. experts, however, warn the jury is still out. © 2015 The New York Times Company
The longer a teenager spends using electronic devices such as tablets and smartphones, the worse their sleep will be, a study of nearly 10,000 16- to 19-year-olds suggests. More than two hours of screen time after school was strongly linked to both delayed and shorter sleep. Almost all the teens from Norway said they used the devices shortly before going to bed. Many said they often got less than five hours sleep a night, BMJ Open reports. The teens were asked questions about their sleep routine on weekdays and at weekends, as well as how much screen time they clocked up outside school hours. On average, girls said they spent around five and a half hours a day watching TV or using computers, smartphones or other electronic devices. And boys spent slightly more time in front of a screen - around six and a half hours a day, on average. Playing computer games was more popular among the boys, whereas girls were more likely to spend their time chatting online. Any type of screen use during the day and in the hour before bedtime appeared to disrupt sleep - making it more difficult for teenagers to nod off. And the more hours they spent on gadgets, the more disturbed their sleep became. When daytime screen use totalled four or more hours, teens had a 49% greater risk of taking longer than an hour to fall asleep. These teens also tended to get less than five hours of sleep per night. Sleep duration went steadily down as gadget use increased. © 2015 BBC
By Esther Landhuis One in nine Americans aged 65 and older has Alzheimer's disease, a fatal brain disorder with no cure or effective treatment. Therapy could come in the form of new drugs, but some experts suspect drug trials have failed so far because compounds were tested too late in the disease's progression. By the time people show signs of dementia, their brains have lost neurons. No therapy can revive dead cells, and little can be done to create new ones. So researchers running trials now seek participants who still pass as cognitively normal but are on the verge of decline. These “preclinical” Alzheimer's patients may represent a window of opportunity for therapeutic intervention. How to identify such individuals before they have symptoms presents a challenge, however. Today most Alzheimer's patients are diagnosed after a detailed medical workup and extensive tests that gauge mental function. Other tests, such as spinal fluid analyses and positron-emission tomography (PET) scans, can detect signs of approaching disease and help pinpoint the preclinical window but are cumbersome or expensive. “There's no cheap, fast, noninvasive test that can identify people at risk of Alzheimer's,” says Brad Dolin, chief technology officer of Neurotrack in Palo Alto, Calif.—a company developing a computerized visual screening test for Alzheimer's. Unlike other cognitive batteries, the Neurotrack test requires no language or motor skills. Participants view images on a monitor while a camera tracks their eye movements. The test draws on research by co-founder Stuart Zola of Emory University, who studies learning and memory in monkeys. When presented with a pair of images—one novel, the other familiar—primates fixate longer on the novel one. But if the hippocampus is damaged, as it is in people with Alzheimer's, the subject does not show a clear preference for the novel images. © 2015 Scientific American