Chapter 7. Life-Span Development of the Brain and Behavior
By Robert Lavine
Just the briefest eye contact can heighten empathetic feelings, giving people a sense of being drawn together. But patients who suffer from autism, even in its most high-functioning forms, often have trouble establishing this sort of social connection with other people. Researchers are delving into what’s going on behind the eyes when these magical moments occur, and the hormones and neural substrates involved may offer hope of helping people with autism.
University of Cambridge neuroscientist Bonnie Auyeung and colleagues gave oxytocin—a compound commonly referred to as the “love hormone,” as it’s been found to play roles in maternal and romantic bonding—to both normal men and those with a high-functioning form of autism, also called Asperger’s syndrome. The scientists then tracked the eye movements of the study subjects and found that, compared with controls, those who received oxytocin via nasal spray showed increases in the number of fixations—pauses of about 300 milliseconds—on the eye region of an interviewer’s face and in the fraction of time spent looking at this region during a brief interview (Translational Psychiatry, doi:10.1038/tp.2014.146, 2015).
Oxytocin, a neuropeptide hormone secreted by the pituitary gland, has long been known to activate receptors in the uterus and mammary glands, facilitating labor and milk letdown. But research on the neural effects of oxytocin has been accelerated by the availability of a nasal spray formulation of the hormone, which can deliver it more directly to the brain, which is also rich in oxytocin receptors. Auyeung adds that her study used a unique experimental setup. “Other studies have shown that [oxytocin] increases looking at the eye region when presented with a picture of a face,” Auyeung says. “The new part is that we are using a live interaction.”
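A note on method for curious readers: gaze measures like these are computed by mapping eye-tracker fixations onto a region of interest and tallying counts and dwell time. The Python sketch below illustrates that bookkeeping; the data structures, coordinates, and region bounds are hypothetical illustrations, not taken from Auyeung's study.

from dataclasses import dataclass

@dataclass
class Fixation:
    x: float            # gaze position in screen pixels
    y: float
    duration_ms: float  # fixations typically last ~200-400 ms

def in_region(fix, region):
    """region is (x_min, y_min, x_max, y_max), e.g. a box around the eyes."""
    x0, y0, x1, y1 = region
    return x0 <= fix.x <= x1 and y0 <= fix.y <= y1

def eye_region_metrics(fixations, eye_region):
    """Count fixations on the eye region and the fraction of time spent there."""
    eye_fixes = [f for f in fixations if in_region(f, eye_region)]
    total = sum(f.duration_ms for f in fixations)
    on_eyes = sum(f.duration_ms for f in eye_fixes)
    return len(eye_fixes), (on_eyes / total if total else 0.0)

# Hypothetical example: three fixations, two landing inside the eye region
fixes = [Fixation(410, 220, 310), Fixation(430, 235, 290), Fixation(400, 500, 350)]
print(eye_region_metrics(fixes, (350, 180, 500, 280)))  # (2, ~0.63)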
Laura Sanders
A busy protein known for its role in aging may also have a hand in depression, a study on mice hints. Under certain circumstances, the aging-related SIRT1 protein seems to make mice despondent, scientists report August 10 in the Journal of Neuroscience. The results are preliminary, but they might ultimately help find new depression treatments. Today’s treatments aren’t always effective, and new approaches are sorely needed. “This is one potential new avenue,” says study coauthor Deveroux Ferguson of the University of Arizona College of Medicine in Phoenix.
Ferguson and colleagues subjected mice to 10 days of stressful encounters with other mice. After their demoralizing ordeal, the mice showed signs of depression, such as eschewing sugar water and giving up attempts to swim. Along with these signs of rodent despair, the mice had more SIRT1 gene activity in the nucleus accumbens, a brain area that has been linked to motivation and depression. Resveratrol, a compound found in red grapes, supercharges the SIRT1 protein, making it more efficient at its job. When Ferguson and colleagues delivered resveratrol directly to the nucleus accumbens, mice displayed more signs of depression and anxiety. When the researchers used a different compound to hinder SIRT1 activity, the mice showed the opposite effect, appearing bolder in some tests than mice that didn’t receive the compound. © Society for Science & the Public 2000–2016.
By Ann Griswold
Autism shares genetic roots with obsessive-compulsive disorder (OCD) and attention deficit hyperactivity disorder (ADHD). The three conditions have features in common, such as impulsivity. New findings suggest that they also share a brain signature: the first comparison of brain architecture across these conditions has found that all are associated with disruptions in the structure of the corpus callosum, the bundle of nerve fibers that links the brain’s left and right hemispheres. The results appeared July 1 in the American Journal of Psychiatry.
Clinicians may find it difficult to distinguish autism from ADHD based on symptoms alone. But if the conditions are marked by similar structural problems in the brain, the same interventions might be useful no matter what the diagnosis is, says lead researcher Stephanie Ameis, assistant professor of psychiatry at the University of Toronto. The unique aspects of each condition might arise from other brain attributes, such as differences in the connections between neurons, says Thomas Frazier, director of research at the Cleveland Clinic Foundation. “A reasonable conclusion is that autism and ADHD don’t differ dramatically in a structural way, but could differ in connectivity,” says Frazier, who was not involved in the study.
Ameis’ team examined the brains of 71 children with autism, 31 with ADHD, 36 with OCD and 62 typical children using diffusion tensor imaging. This method provides a picture of the brain’s white matter, the long fibers that connect nerve cells, by measuring the diffusion of water along these fibers. © 2016 Scientific American
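Some methodological context: a DTI scan fits a 3x3 diffusion tensor to each voxel, and group comparisons are typically run on scalar summaries of that tensor. The most common summary is fractional anisotropy (FA), near zero where water diffuses equally in all directions and approaching 1 where diffusion is channeled along a fiber bundle such as the corpus callosum. A minimal sketch of the standard FA formula, with illustrative values:

import numpy as np

def fractional_anisotropy(tensor):
    """FA from one voxel's diffusion tensor: 0 = isotropic, toward 1 = highly directional."""
    l1, l2, l3 = np.linalg.eigvalsh(tensor)  # eigenvalues = diffusivities along principal axes
    num = (l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    return float(np.sqrt(0.5 * num / den)) if den > 0 else 0.0

# Illustrative voxel: diffusion much faster along the fiber axis than across it
voxel = np.diag([1.7e-3, 0.3e-3, 0.3e-3])  # diffusivities in mm^2/s
print(round(fractional_anisotropy(voxel), 2))  # ~0.8, typical of dense white matter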
Tina Hesman Saey
Alcoholism may stem from using genes incorrectly, a study of hard-drinking rats suggests. Rats bred either to drink heavily or to shun alcohol have revealed 930 genes linked to a preference for drinking alcohol, researchers in Indiana report August 4 in PLOS Genetics. Human genetic studies have not found most of the genetic variants that put people at risk for alcoholism, says Michael Miles, a neurogenomicist at Virginia Commonwealth University in Richmond. The new study takes a “significant and somewhat novel approach” to finding the genetic differences that separate those who will become addicted to alcohol from those who drink in moderation.
It took decades to craft the experiment, says study coauthor William Muir, a population geneticist at Purdue University in West Lafayette, Ind. Starting in the 1980s, rats bred at Indiana University School of Medicine in Indianapolis were given a choice to drink pure water or water mixed with 10 percent ethanol, about the same amount of alcohol as in a weak wine. For more than 40 generations, researchers selected the rats from each generation that voluntarily drank the most alcohol and bred them to create a line of rats that consume the rat equivalent of 25 cans of beer a day. Simultaneously, the researchers selected the rats that drank the least alcohol and bred them to make a line of low-drinking rats. A concurrent breeding program produced another pair of high-drinking and teetotaling lines.
For the new study, Muir and colleagues collected DNA from 10 rats from each of the high- and low-drinking lines. Comparing complete sets of genetic instructions from all the rats identified 930 genes that differ between the two lines. © Society for Science & the Public 2000–2016.
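The excerpt doesn't describe the team's analysis pipeline, but the core logic of a divergent selection experiment can be illustrated simply: after 40-plus generations of selective breeding, variants whose allele frequencies have drifted far apart between the high- and low-drinking lines are candidates for sitting in or near genes influencing alcohol preference. A toy Python sketch, with made-up genotypes and an arbitrary threshold:

def allele_freq(genotypes):
    """genotypes: per-rat counts (0, 1 or 2) of the alternate allele."""
    return sum(genotypes) / (2 * len(genotypes))

def diverged_variants(high_line, low_line, threshold=0.8):
    """Flag variants whose allele frequency differs sharply between the lines."""
    hits = []
    for vid, high_geno in high_line.items():
        diff = abs(allele_freq(high_geno) - allele_freq(low_line[vid]))
        if diff >= threshold:
            hits.append((vid, round(diff, 2)))
    return hits

# Made-up genotypes for two variants across five rats per line
high = {"chr1:101200": [2, 2, 2, 1, 2], "chr2:88400": [1, 0, 1, 1, 0]}
low  = {"chr1:101200": [0, 0, 0, 1, 0], "chr2:88400": [1, 1, 0, 1, 1]}
print(diverged_variants(high, low))  # [('chr1:101200', 0.8)]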
By Nicholas Bakalar
A drug used to treat rheumatoid arthritis may have benefits against Alzheimer’s disease, researchers report. Rheumatoid arthritis is an autoimmune disease believed to be driven in part by tumor necrosis factor, or T.N.F., a protein that promotes inflammation. Drugs that block T.N.F., including an injectable drug called etanercept, have been used to treat rheumatoid arthritis for many years. T.N.F. is also elevated in the cerebrospinal fluid of Alzheimer’s patients.
Researchers identified 41,109 men and women with a diagnosis of rheumatoid arthritis and 325 with both rheumatoid arthritis and Alzheimer’s disease. In people over 65, the prevalence of Alzheimer’s disease was more than twice as high in people with rheumatoid arthritis as in those without it. The study is in CNS Drugs. But unlike patients treated with five other rheumatoid arthritis drugs, those who had been treated with etanercept showed a significantly reduced risk for Alzheimer’s disease.
Still, the lead author, Dr. Richard C. Chou, an assistant professor of medicine at Dartmouth, said that it is too early to think of using etanercept as a treatment for Alzheimer’s. “We’ve identified a process in the brain, and if you can control this process with etanercept, you may be able to control Alzheimer’s,” he said. “But we need clinical trials to prove and confirm it.” © 2016 The New York Times Company
Link ID: 22520 - Posted: 08.06.2016
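For readers who want the arithmetic behind the etanercept study's "more than twice as high" comparison, it is a prevalence ratio. The rheumatoid arthritis counts below come from the article; the non-RA prevalence is a hypothetical placeholder, since the excerpt doesn't report that group's numbers.

# RA figures from the article; the non-RA prevalence is a placeholder.
ra_total = 41_109      # people with a rheumatoid arthritis diagnosis
ra_with_ad = 325       # of those, people also diagnosed with Alzheimer's
prev_ra = ra_with_ad / ra_total
print(f"AD prevalence with RA: {prev_ra:.2%}")            # about 0.79%

prev_no_ra = 0.0035    # hypothetical prevalence without RA
print(f"Prevalence ratio: {prev_ra / prev_no_ra:.1f}x")   # > 2x, matching the claim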
By Megan Scudellari
In late 2013, psychologist Raphael Bernier welcomed a 12-year-old girl and her parents into his office at the University of Washington (UW) in Seattle. The girl had been diagnosed with autism spectrum disorder, and Bernier had invited the family in to discuss the results of a genetic analysis his collaborator, geneticist Evan Eichler, had performed in search of the cause. As they chatted, Bernier noticed the girl’s wide-set eyes, which had a slight downward slant. Her head was unusually large, featuring a prominent forehead. The mother described how her daughter had gastrointestinal issues and sometimes wouldn’t sleep for two to three days at a time.
The girl’s presentation was interesting, Bernier recalls, but he didn’t think too much of it—until a week later, when he met an eight-year-old boy with similarly wide-set eyes and a large head. Bernier did a double take. The “kiddos,” as he calls children who come to see him, could have been siblings. According to the boy’s parents, he also suffered from gastrointestinal and sleep problems.
The similarities between the unrelated children were remarkable, especially for a disorder so notoriously complex that it has been said, “If you’ve met one child with autism, you’ve met one child with autism.” But Bernier knew that the patients shared another similarity that might explain the apparent coincidence: both harbored a mutation in a gene known as chromodomain helicase DNA binding protein 8 (CHD8). © 1986-2016 The Scientist
Nicola Davis
Scientists have discovered 17 separate genetic variations that increase the risk of a person developing depression. The findings, which came from analysing DNA data collected from more than 300,000 people, are the first genetic links to the disease found in people of European ancestry. The scientists say the research will contribute to a better understanding of the disease and could eventually lead to new treatments. They also hope it will reduce the stigma that can accompany depression.
According to Nice, the UK’s National Institute for Health and Care Excellence, up to 10% of people seen by practitioners in primary care have clinical depression, with symptoms including a continuously low mood, low self-esteem, difficulties making decisions and lack of energy. Both environmental and genetic factors are thought to be behind depression, with the interaction between the two also thought to be important. But with a large number of genetic variants each thought to make a tiny contribution to the risk of developing the condition, unravelling their identities has proved challenging. While previous studies have turned up a couple of regions in the genome of Chinese women that might increase the risk of depression, those variants didn’t appear to play a role in depression for people of European ancestry. © 2016 Guardian News and Media Limited
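Mechanically, a study like this is a genome-wide association study: each variant is tested for a difference in allele frequency between cases and controls, and only associations strong enough to survive millions of simultaneous tests are reported. A minimal sketch of one such test, using invented counts:

from scipy.stats import chi2_contingency

# Invented allele counts (two alleles per person): [risk allele, other allele]
cases    = [5_400, 14_600]   # people with depression
controls = [4_900, 15_100]

chi2, p, _, _ = chi2_contingency([cases, controls])
print(f"chi2 = {chi2:.1f}, p = {p:.2e}")

# Testing millions of variants demands a genome-wide threshold of p < 5e-8,
# which is why very large samples are needed to detect each tiny effect.
print("genome-wide significant:", p < 5e-8)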
By Andy Coghlan
Mysterious shrunken cells have been spotted in the human brain for the first time, and appear to be associated with Alzheimer’s disease. “We don’t know yet if they’re a cause or consequence,” says Marie-Ève Tremblay of Laval University in Québec, Canada, who presented her discovery at the Translational Neuroimmunology conference in Big Sky, Montana, last week.
The cells appear to be withered forms of microglia – the cells that keep the brain tidy and free of infection, normally by pruning unwanted brain connections or destroying abnormal and infected brain cells. But the cells discovered by Tremblay appear much darker when viewed using an electron microscope, and they seem to be more destructive. “It took a long time for us to identify them,” says Tremblay, who adds that these shrunken microglia do not show up with the same staining chemicals that normally make microglia visible under the microscope. Compared with normal microglia, the dark cells appear to wrap much more tightly around neurons and the connections between them, called synapses. “It seems they’re hyperactive at synapses,” says Tremblay. Where these microglia are present, synapses often seem shrunken and in the process of being degraded.
Tremblay first discovered these dark microglia in mice, finding that they increase in number as mice age, and appear to be linked to a number of things, including stress, the neurodegenerative condition Huntington’s disease and a mouse model of Alzheimer’s disease. “There were 10 times as many dark microglia in Alzheimer’s mice as in control mice,” says Tremblay. © Copyright Reed Business Information Ltd.
By Katherine S. Pollard
When the first human genome sequence was published in 2001, I was a graduate student working as the statistics expert on a team of scientists. Hailing from academia and biotechnology, we aimed to discover differences in gene expression levels between tumors and healthy cells. Like many others, I had high hopes for what we could do with this enormous text file of more than 3 billion As, Cs, Ts, and Gs. Ambitious visions of a precise wiring diagram for human cells and imminent cures for disease were commonplace among my classmates and professors. But I was most excited about a different use of the data, and I found myself counting the months until the genome of a chimpanzee would be sequenced.
Chimps are our closest living relatives on the tree of life. While their biology is largely similar to ours, we have many striking differences, ranging from digestive enzymes to spoken language. Humans also suffer from an array of diseases that do not afflict chimpanzees or are less severe in them, including autism, schizophrenia, Alzheimer’s disease, diabetes, atherosclerosis, AIDS, rheumatoid arthritis, and certain cancers.
I had long been fascinated with hominin fossils and the way the bones morphed into different forms over evolutionary time. But those skeletons cannot tell us much about the history of our immune system or our cognitive abilities. So I started brainstorming about how to extend the statistical approaches we were using for cancer research to compare human and chimpanzee DNA. My immodest goal was to identify the genetic basis for all the traits that make humans unique. © 1986-2016 The Scientist
Aaron E. Carroll
I remember thinking, anguishing really, over one thought as we drove to the hospital after my pregnant wife’s water broke, minutes after I went to bed: “I’m never going to be well rested again.” If there’s one thing all new parents wish for, it’s a good night’s sleep. Unfortunately, infants sometimes make that impossible. They wake up repeatedly, needing to be fed, changed and comforted. Eventually, they reach an age when they should sleep through the night. Some don’t, though. What to do with them continues to be a topic of heated debate in parenting circles.
One camp believes that babies should be left to cry it out. These people place babies in their cribs at a certain time, after a certain routine, and don’t interfere until the next morning. No matter how much the babies scream or cry, parents ignore them. After all, if babies learn that tantrums lead to the appearance of a loved one, they will continue that behavior in the future. The official name for this intervention is “Extinction.” The downside, of course, is that it’s unbelievably stressful for parents. Many can’t do it. And not holding fast to the plan can make everything worse. Responding to an infant’s crying after an extended period of time makes the behavior harder to extinguish. To a baby, it’s like a slot machine that hits just as you’re ready to walk away; it makes you want to play more.
A modification of this strategy is known as “Graduated Extinction.” Parents allow their infant to cry it out for a longer period each night, until infants eventually put themselves to sleep. On the first night, for instance, parents might commit to not entering the baby’s room for five minutes. The next night, 10 minutes. Then 15, and so on. Or they could lengthen the interval with each successive check within a single night. When they do go in the room, it’s only to check and make sure the baby is O.K. – no picking up or comforting. This isn’t meant to be a reward for crying, but to allow parents to be assured that nothing is wrong. © 2016 The New York Times Company
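For concreteness, the two schedules described above differ only in how the waiting interval grows. The sketch below generates both; the specific minutes are illustrative, since the article offers five, 10 and 15 minutes only as an example.

def graduated_extinction(nights, start=5, step=5):
    """Minutes to wait before entering the room, one value per night."""
    return [start + step * n for n in range(nights)]

def progressive_checks(nights, checks_per_night=3, start=3, step=2):
    """Variant where the wait also lengthens with each check within a night."""
    return [[start + step * (n + c) for c in range(checks_per_night)]
            for n in range(nights)]

print(graduated_extinction(4))   # [5, 10, 15, 20]
print(progressive_checks(2))     # [[3, 5, 7], [5, 7, 9]]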
By Tanya Lewis
The tangled buildup of tau protein in brain cells is a hallmark of the cognitive decline linked with Alzheimer’s disease. Antibodies have been shown to block tau’s spread, but some scientists worry they could also fuel inflammation. Now, researchers from Genentech in San Francisco and colleagues have found that an antibody’s ability to recruit immune cells—known as its effector function—is not necessary for stopping tau’s spread, the team reported today (July 28) in Cell Reports. “Our results suggest that, given that effector function is not required for efficacy [in treating tau accumulation], going without it could offer a safer approach for immunotherapy,” study coauthor Gai Ayalon of Genentech told The Scientist.
Alzheimer’s disease causes a characteristic constellation of pathologies: accumulation of amyloid-β plaques outside neurons, neurofibrillary tangles of tau inside brain cells, and chronic inflammation. Clinical research has mostly focused on targeting amyloid-β with antibody therapies, and several treatments based on this approach are currently in clinical trials. But recent efforts have zeroed in on tau as a new potential target. Antibodies are known to spur the brain’s defense system, microglia, to absorb and degrade tau, but their recruitment of immune cells may also worsen inflammation. Ayalon and colleagues wondered whether effector function was necessary for stopping tau’s spread. © 1986-2016 The Scientist
By ANDREW POLLACK
A new type of drug for Alzheimer’s disease failed to slow the rate of decline in mental ability and daily functioning in its first large clinical trial. There was a hint, though, that it might be effective for certain patients. The drug, called LMTX, is the first one with its mode of action — trying to undo so-called tau tangles in the brain — to reach the final stage of clinical trials. So the results of the study were eagerly awaited. The initial reaction to the outcome was disappointment, with perhaps a glimmer of hopefulness.
Over all, the patients who received LMTX, which was developed by TauRx Therapeutics, did not have a slower rate of decline in mental ability or daily functioning than those in the control group. However, the drug did seem to work for the subset of patients — about 15 percent of those in the study — who took LMTX as their only therapy. The other 85 percent of patients took an existing Alzheimer’s drug in addition to either LMTX or a placebo. “There were highly significant, clinically meaningful, large effects in patients taking the drug as monotherapy, and no effect in patients taking it as an add-on,” Claude Wischik, a founder and the chief executive of TauRx, said in an interview. He spoke from Toronto, where the results were being presented at the Alzheimer’s Association International Conference.
Dr. Wischik said a second clinical trial sponsored by the company, whose results will be announced later, found the same phenomenon. He said the company planned to apply for approval of LMTX to be used by itself. But some experts not involved in the study were skeptical about drawing conclusions from a small subset of patients, especially since there was no obvious explanation why LMTX would be expected to work only in patients not getting other drugs. © 2016 The New York Times Company
Link ID: 22488 - Posted: 07.28.2016
Ian Sample and Nicky Woolf
When Bill Gates pulled on a red and white-striped cord to upturn a bucket of iced water positioned delicately over his head, the most immediate thought for many was not, perhaps, of motor neurone disease. But the ice bucket challenge, the charity campaign that went viral in the summer of 2014 and left scores of notable persons from Gates and Mark Zuckerberg to George W. Bush and Anna Wintour shivering and drenched, has paid off in the most spectacular way. Dismissed by some at the time as “slacktivism” - an exercise that appears to do good while achieving very little - the ice bucket challenge raised more than $115m (£88m) for motor neurone disease in a single month. Now, scientists funded with the proceeds have discovered a gene variant associated with the condition.
In the near term the NEK1 gene variant, described in the journal Nature Genetics this week, will help scientists understand how the incurable disorder, known also as amyotrophic lateral sclerosis (ALS) or Lou Gehrig’s disease, takes hold. Once the mechanisms are more clearly elucidated, it may steer researchers on a path towards much-needed treatments.
The work may never have happened were it not for the curious appeal of the frozen water drenchings. The research grants scientists are typically awarded do not come close to the €4m the study required. Instead, Project MinE, which aims to unravel the genetic basis of the disease and ultimately find a cure, was funded by the ALS Association through ice bucket challenge donations. © 2016 Guardian News and Media Limited
Jon Hamilton
Two studies released at an international Alzheimer's meeting Tuesday suggest doctors may eventually be able to screen people for this form of dementia by testing the ability to identify familiar odors, like smoke, coffee and raspberry. In both studies, people who were in their 60s and older took a standard odor detection test. And in both cases, those who did poorly on the test were more likely to already have — or go on to develop — problems with memory and thinking. "The whole idea is to create tests that a general clinician can use in an office setting," says Dr. William Kreisl, a neurologist at Columbia University, where both studies were done. The research was presented at the Alzheimer's Association International Conference in Toronto.
Currently, any tests that are able to spot people in the earliest stages of Alzheimer's are costly and difficult. They include PET scans, which can detect sticky plaques in the brain, and spinal taps that measure the levels of certain proteins in spinal fluid. The idea of an odor detection test arose, in part, from something doctors have observed for many years in patients with Alzheimer's, Kreisl says. "Patients will tell us that food does not taste as good," he says. The reason is often that these patients have lost the ability to smell what they eat. That's not surprising, Kreisl says, given that odor signals from the nose have to be processed in areas of the brain that are among the first to be affected by Alzheimer's disease. But it's been tricky to develop a reliable screening test using odor detection. © 2016 npr
By PAM BELLUCK
“Has the person become agitated, aggressive, irritable, or temperamental?” the questionnaire asks. “Does she/he have unrealistic beliefs about her/his power, wealth or skills?” Or maybe another kind of personality change has happened: “Does she/he no longer care about anything?”
If the answer is yes to one of these questions — or others on a new checklist — and the personality or behavior change has lasted for months, it could indicate a very early stage of dementia, according to a group of neuropsychiatrists and Alzheimer’s experts. They are proposing the creation of a new diagnosis: mild behavioral impairment. The idea is to recognize and measure something that some experts say is often overlooked: Sharp changes in mood and behavior may precede the memory and thinking problems of dementia. The group made the proposal on Sunday at the Alzheimer’s Association International Conference in Toronto, and presented a 38-question checklist that may one day be used to identify people at greater risk for Alzheimer’s.
“I think we do need something like this,” said Nina Silverberg, the director of the Alzheimer’s Disease Centers program at the National Institute on Aging, who was not involved in creating the checklist or the proposed new diagnosis. “Most people think of Alzheimer’s as primarily a memory disorder, but we do know from years of research that it also can start as a behavioral issue.” Under the proposal, mild behavioral impairment (M.B.I.) would be a clinical designation preceding mild cognitive impairment (M.C.I.), a diagnosis created more than a decade ago to describe people experiencing some cognitive problems but who can still perform most daily functions. © 2016 The New York Times Company
Link ID: 22480 - Posted: 07.26.2016
By Sharon Begley, STAT
For the first time ever, researchers have managed to reduce people’s risk for dementia — not through a medicine, special diet, or exercise, but by having healthy older adults play a computer-based brain-training game. The training nearly halved the incidence of Alzheimer’s disease and other devastating forms of cognitive and memory loss in older adults a decade after they completed it, scientists reported on Sunday. If the surprising finding holds up, the intervention would be the first of any kind — including drugs, diet, and exercise — to do that. “I think these results are highly, highly promising,” said George Rebok of the Johns Hopkins Bloomberg School of Public Health, an expert on cognitive aging who was not involved in the study. “It’s exciting that this intervention pays dividends so far down the line.”
The results, presented at the Alzheimer’s Association International Conference in Toronto, come from the government-funded ACTIVE (Advanced Cognitive Training for Independent and Vital Elderly) study. Starting in 1998, ACTIVE’s 2,832 healthy older adults (average age at the start: 74) received one of three forms of cognitive training, or none, and were evaluated periodically in the years after.
In actual numbers, 14 percent of ACTIVE participants who received no training had dementia 10 years later, said psychologist Jerri Edwards of the University of South Florida, who led the study. Among those who completed up to ten 60- to 75-minute sessions of computer-based training in speed of processing — basically, how quickly and accurately they can pay attention to, process, and remember brief images on a computer screen — 12.1 percent developed dementia. Of those who completed all 10 initial training sessions plus four booster sessions a few years later, 8.2 percent developed dementia. © 2016 Scientific American
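The "nearly halved" headline is a relative risk reduction, which follows directly from the incidences reported above:

# Dementia incidence ten years on, as reported from the ACTIVE trial
no_training   = 0.14    # no cognitive training
ten_sessions  = 0.121   # up to ten speed-of-processing sessions
with_boosters = 0.082   # all ten sessions plus four boosters

for label, risk in [("training only", ten_sessions), ("training + boosters", with_boosters)]:
    rrr = (no_training - risk) / no_training
    print(f"{label}: relative risk reduction = {rrr:.0%}")
# training only: 14%; training + boosters: 41%, the figure behind "nearly halved"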
Dean Burnett
On July 31st 2016, this blog will have been in existence for four years exactly. A huge thanks to everyone who’s made the effort to read it in that time (an alarming number of you). Normally there’d be a post on the day to mark the occasion, but this year the 31st is a) a Sunday, and b) my birthday, so even if I could be bothered to work that day, it’s unlikely anyone would want to read it. However, today also marks the ridiculously-unlikely-but-here-we-are American release of my book.
How did it get to this point? I’ve been a “professional” science writer now for four years, and I’ve been involved in neuroscience, in one guise or another, since 2000, the year I started my undergraduate degree. In that time, I’ve heard/encountered some seriously bizarre claims about how the brain works. Oftentimes it was me not understanding what was being said, or misinterpreting a paper, or just my own lack of competence. Sometimes, it was just a media exaggeration. However, there have been occasions when a claim made about the brain thwarts all my efforts to find published evidence or even a rational basis for it, leaving me scratching my head and wondering “where the hell did THAT come from?” Here are some of my favourites.
In the past, one terabyte of storage capacity would have seemed incredibly impressive. But Moore’s law has put paid to that. My home desktop PC presently has 1.5 TB of storage space, and that’s over seven years old. Could my own clunky desktop be, in terms of information capacity, smarter than me? Apparently. Some estimates put the capacity of the human brain as low as 1 TB. A lifetime’s worth of memories wouldn’t fill a modern-day hard drive? That seems far-fetched, at least at an intuitive level.
Keyword: Development of the Brain
Link ID: 22477 - Posted: 07.26.2016
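The 1 TB claim above is worth a back-of-envelope check, because the answer is driven almost entirely by the assumptions. The figures below are commonly cited ballpark numbers (roughly 86 billion neurons; 1,000 to 10,000 synapses per neuron; one 2015 estimate of about 4.7 bits of information stored per synapse), not measurements:

NEURONS = 86e9  # widely cited estimate of the human neuron count

for synapses_per_neuron in (1_000, 10_000):
    for bits_per_synapse in (1, 4.7):
        total_bytes = NEURONS * synapses_per_neuron * bits_per_synapse / 8
        print(f"{synapses_per_neuron:>6} synapses x {bits_per_synapse} bits "
              f"-> {total_bytes / 1e12:,.0f} TB")
# Estimates span roughly 11 TB to 500 TB, so the "1 TB brain" sits well
# below even the most pessimistic of these assumptions.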
By Dave Dormer
Transporting babies deprived of oxygen at birth to a neonatal intensive care unit in Calgary will soon be safer thanks to a new portable cooling device. The Foothills hospital is one of the first facilities in Canada to acquire one, and doctors hope it will help prevent brain injuries: reducing a baby's temperature, a technique called therapeutic hypothermia, can prevent damage to brain tissue and promote healing.
"The period immediately following birth is critical. We have about a six-hour window to lower these babies' temperatures to prevent neurological damage," said Dr. Khorshid Mohammad, the neonatal neurocritical care project lead who spearheaded the initiative. "The sooner we can do so, and the more consistent we can make the temperature, the more protective it is and the better their chances of surviving without injury."
Since about 2008, doctors have used cooling blankets and gel packs to lower a baby's temperature to 33.5 C from the normal 37 C for 72 hours in order to prevent brain damage. "With those methods, it can be difficult to maintain a stable temperature," said Mohammad. ©2016 CBC/Radio-Canada.
Keyword: Development of the Brain
Link ID: 22476 - Posted: 07.26.2016
By Andy Coghlan
The final brain edit before adulthood has been observed for the first time. MRI scans of 300 adolescents and young adults have shown how the teenage brain upgrades itself to become quicker – but that errors in this process may lead to schizophrenia in later life. The editing process that takes place in teen years seems to select the brain’s best connections and networks, says Kirstie Whitaker at the University of Cambridge. “The result is a brain that’s sleeker and more efficient.”
When Whitaker and her team scanned brains from people between the ages of 14 and 24, they found that two major changes take place in the outer layer of the brain – the cortex – at this time. As adolescence progresses, this layer of grey matter gets thinner – probably because unwanted or unused connections between neurons – called synapses – are pruned back. At the same time, important neurons are upgraded. The parts of these cells that carry signals down towards synapses are given a sheath that helps them transmit signals more quickly – a process called myelination.
“It may be that pruning and myelination are part of the maturation of the brain,” says Steven McCarroll at Harvard Medical School. “Pruning involves removing the connections that are not used, and myelination takes the ones that are left and makes them faster,” he says. McCarroll describes this as a trade-off – by pruning connections, we lose some flexibility in the brain, but the proficiency of signal transmission improves. © Copyright Reed Business Information Ltd.
Keyword: Development of the Brain
Link ID: 22474 - Posted: 07.26.2016
By Lizzie Wade
Neandertals and modern humans had a lot in common—at least enough to have babies together fairly often. But what about their brains? To answer that question, scientists have looked at how Neandertal and modern human brains developed during the crucial time of early childhood. In the first year of life, modern human infants go through a growth spurt in several parts of the brain: the cerebellum, the parietal lobes, and the temporal lobes—key regions for language and social interaction. Past studies suggested baby Neandertal brains developed more like the brains of chimpanzees, without concentrated growth in any particular area. But a new study casts doubt on that idea.
Scientists examined 15 Neandertal skulls, including one newborn and a pair of children under the age of 2. By carefully imaging the skulls, the team determined that Neandertal temporal lobes, frontal lobes, and cerebellums did, in fact, grow faster than the rest of the brain in early life, a pattern very similar to that of modern humans, they report today in Current Biology. Scientists had overlooked that possibility, the researchers say, because Neandertals and Homo sapiens have such differently shaped skulls. Modern humans’ rounded skull is a telltale marker of the growth spurt, for example, whereas Neandertals’ skulls were relatively flat on top.
If Neandertals did, in fact, have fast-developing cerebellums and temporal and frontal lobes, they might have been more skilled at language and socializing than assumed, scientists say. This could in turn explain how the children of Neandertal–modern human pairings fared well enough to pass down their genes to so many of us living today. © 2016 American Association for the Advancement of Science