Chapter 7. Life-Span Development of the Brain and Behavior
By Nicholas Bakalar Stress in childhood may be linked to hardening of the arteries in adulthood, new research suggests. Finnish researchers studied 311 children 12 to 18 years old, scoring their levels of stress according to a variety of components, including the family’s economic circumstances, the emotional environment in the home, whether parents engaged in healthy behaviors, stressful events (such as divorce, moves or death of a family member) and parental concerns about the child’s social adjustment. Using these criteria, they calculated a stress score. When the members of the group were 40 to 46 years old, the researchers used computed tomography to measure coronary artery calcification, a marker of atherosclerosis and a risk factor for cardiovascular disease. The study, in JAMA Pediatrics, controlled for sex, cholesterol, body mass index and other factors, but still found that the higher the childhood stress score, the greater the risk for coronary artery calcification. The study is observational, and the data are based largely on parental reports, which can be biased. Still, its long follow-up time and careful control of other variables give it considerable strength. There are plausible mechanisms for the connection, including stress-induced increases in inflammation, which in animal models have been linked to a variety of ailments. “I think that economic conditions are important here,” said the lead author, Dr. Markus Juonala, a professor of internal medicine at the University of Turku in Finland. “Public health interventions should focus on how to intervene in better ways with people with higher stress and lower socioeconomic status.” © 2016 The New York Times Company
By Matthew Hutson Earlier this month, a computer program called AlphaGo defeated a (human) world champion of the board game Go, years before most experts expected computers to rival the best flesh-and-bone players. But then last week, Microsoft was forced to silence its millennial-imitating chatbot Tay for blithely parroting Nazi propaganda and misogynistic attacks after just one day online, her failure a testimony to the often underestimated role of human sensibility in intelligent behavior. Why are we so compelled to pit human against machine, and why are we so bad at predicting the outcome? As the number of jobs susceptible to automation rises, and as Stephen Hawking, Elon Musk, and Bill Gates warn that artificial intelligence poses an existential threat to humanity, it’s natural to wonder how humans measure up to our future robot overlords. But even those tracking technology’s progress in taking on human skills have a hard time setting an accurate date for the uprising. That’s in part because one prediction strategy popular among both scientists and journalists—benchmarking the human brain with digital metrics such as bits, hertz, and million instructions per second, or MIPS—is severely misguided. And doing so could warp our expectations of what technology can do for us and to us. Since their development, digital computers have become a standard metaphor for the mind and brain. The comparison makes sense, in that brains and computers both transform input into output. Most human brains, like computers, can also manipulate abstract symbols. (Think arithmetic or language processing.) But like any metaphor, this one has limitations.
By David Z. Hambrick Nearly a century after James Truslow Adams coined the phrase, the “American dream” has become a staple of presidential campaign speeches. Kicking off her 2016 campaign, Hillary Clinton told supporters that “we need to do a better job of getting our economy growing again and producing results and renewing the American dream.” Marco Rubio lamented that “too many Americans are starting to doubt” that it is still possible to achieve the American dream, and Ted Cruz asked his supporters to “imagine a legal immigration system that welcomes and celebrates those who come to achieve the American dream.” Donald Trump claimed that “the American dream is dead” and Bernie Sanders quipped that for many “the American dream has become a nightmare.” But the American dream is not just a pie-in-the-sky notion—it’s a scientifically testable proposition. The American dream, Adams wrote, “is not a dream of motor cars and high wages merely, but a dream of social order in which each man and each woman shall be able to attain to the fullest stature of which they are innately capable…regardless of the fortuitous circumstances of birth or position.” In the parlance of behavioral genetics—the scientific study of genetic influences on individual differences in behavior—Adams’ idea was that all Americans should have an equal opportunity to realize their genetic potential. A study just published in Psychological Science by psychologists Elliot Tucker-Drob and Timothy Bates reveals that this version of the American dream is in serious trouble. Tucker-Drob and Bates set out to evaluate evidence for the influence of genetic factors on IQ-type measures (aptitude and achievement) that predict success in school, work, and everyday life. Their specific question was how the contribution of genes to these measures would compare at low versus high levels of socioeconomic status (or SES), and whether the results would differ across countries. 
The results reveal, ironically, that the American dream is more of a reality for other countries than it is for America: genetic influences on IQ were uniform across levels of SES in Western Europe and Australia, but, in the United States, were much higher for the rich than for the poor. © 2016 Scientific American
By Patrick Monahan Yesterday, mountaineer Richard Parks set out for Kathmandu to begin some highly unusual data-gathering. As part of Project Everest Cynllun, he will climb Mount Everest without supplemental oxygen and perform—on himself—a series of blood draws, muscle biopsies, and cognitive tests. If he makes it to the summit, these will be the highest-elevation blood and tissue samples ever collected. Damian Bailey, a physiologist at the University of South Wales, Pontypridd, in the United Kingdom and the project’s lead scientist, hopes the risky experiment will yield new information about how the human body responds to low-oxygen conditions, and how similar mechanisms might drive cognitive decline with aging. As Parks began the acclimatization process with warm-up climbs on two smaller peaks, Bailey told ScienceInsider about his ambitions for the project. This interview has been edited for clarity and brevity. Q: Parks is an extreme athlete who has climbed Everest before. What can his performance tell us about regular people? A: What we’re trying to understand is, what is it about Richard’s brain that is potentially different from other people’s brains, and can that provide us with some clues to accelerated cognitive decline, which occurs with aging [and] dementia. We know that sedentary aging is associated with a progressive decline in blood flow to the brain. … And the main challenge for sedentary aging is we have to wait so long to see the changes occurring. So this is almost a snapshot, a day in the life of a patient with cognitive decline. © 2016 American Association for the Advancement of Science.
By Esther Hsieh Spinal implants have suffered problems similar to those of brain implants—they tend to abrade tissue, causing inflammation and ultimately rejection by the body. Now an interdisciplinary research collaboration based in Switzerland has made a stretchable implant that appears to solve this problem. Like Lieber's new brain implant, it matches the physical qualities of the tissue where it is embedded. The “e-dura” implant is made from a silicone rubber that has the same elasticity as dura mater, the protective membrane that surrounds the spinal cord and brain, explains Stéphanie Lacour, a professor at the school of engineering at the Swiss Federal Institute of Technology in Lausanne. This feature allows the implant to mimic the movement of the surrounding tissues. Embedded in the e-dura are electrodes for stimulation and microchannels for drug therapy. Ultrathin gold wires are made with microscopic cracks that allow them to stretch. Also, the electrodes are coated with a special platinum-silicone mixture that is stretchable. In an experiment that lasted two months, the scientists found that healthy rats with an e-dura spinal implant could walk across a ladder as well as a control group with no implant. Yet rats with a traditional plastic implant (which is flexible but not stretchable) started stumbling and missing rungs a few weeks after surgery. The researchers removed the implants and found that rats with a traditional implant had flattened, damaged spinal cords—but the e-dura implants had left spinal cords intact. Cellular testing also showed a strong immune response to the traditional implant, which was minimal in rats with the e-dura implant. © 2016 Scientific American
New York's Tribeca Film Festival will not show Vaxxed, a controversial film about the MMR vaccine, its founder Robert De Niro says. As recently as Friday, Mr De Niro stood by his decision to include the film by anti-vaccination activist Andrew Wakefield in next month's festival. The link the film makes between the measles, mumps and rubella vaccine and autism has been widely discredited. "We have concerns with certain things in this film," said Mr De Niro. Mr De Niro, who has a child with autism, said he had hoped the film would provide the opportunity for discussion of the issue. But after reviewing the film with festival organisers and scientists, he said: "We do not believe it contributes to or furthers the discussion I had hoped for." Vaxxed was directed and co-written by Mr Wakefield, who described it as a "whistle-blower documentary". In a statement issued following the Tribeca Film Festival's decision, he and the film's producer Del Bigtree said that "we have just witnessed yet another example of the power of corporate interests censoring free speech, art and truth". The British doctor was the lead author of a controversial study published in 1998, which argued there might be a link between MMR and autism and bowel disease. Mr Wakefield suggested that parents should opt for single jabs against mumps, measles and rubella instead of the three-in-one vaccine. His comments and the subsequent media furore led to a sharp drop in the number of children being vaccinated against these diseases. But the study, first published in The Lancet, was later retracted by the medical journal. Mr Wakefield's research methods were subsequently investigated by the General Medical Council and he was struck off the medical register.
Link ID: 22037 - Posted: 03.28.2016
John Consentino After multiple doctors gave me conflicting opinions about ADHD, I decided to move away from psychiatry and seek out a neuropsychologist. I thought that autism made sense, but what ultimately led me to seek help was my focus problem. When I was 8 years old, it would take me HOURS to do homework. On Wednesdays, we got out of school at noon, and I wouldn't finish homework until about 8 p.m. No one understood why this was happening, and with all of the screaming and punishments I withstood, nothing improved. I still had GPAs near the high 90s, so all was OK, supposedly. I struggled with eye contact during that time, and this is very much apparent now. I struggled speaking to waiters/waitresses, to teachers, to family members. Speaking to members of the opposite sex was a near-impossible task. I never understood social groups. I went through all of high school in the same fashion. However, my family felt that everything was OK. I still had a mid-90 GPA, and I had made numerous friends. Unfortunately, my GPA had dropped by about 15-plus points by my senior year. I struggled badly during my first two years of college. I was constantly unhappy, and I made little to no friends. My GPA was horrid, and my time at the university was dwindling. I dropped out of school twice, and my future felt bleak. After transferring schools, I did great. So, everything was OK yet again. © 2016 npr
Link ID: 22036 - Posted: 03.28.2016
Kristin Gourlay Swaddled in soft hospital blankets, Lexi is 2 weeks old and weighs 6 pounds. She's been at Women and Infants Hospital in Providence, R.I., since she was born, and is experiencing symptoms of opioid withdrawal. Her mother took methadone to wean herself from heroin when she got pregnant, just as doctors advised. But now the hospital team has to wean newborn Lexi from the methadone. As rates of opioid addiction have continued to climb in the U.S., the number of babies born with neonatal abstinence syndrome has gone up, too — fivefold from 2000 to 2012, according to the National Institute on Drug Abuse. It can be a painful way to enter the world, abruptly cut off from the powerful drug in the mother's system. The baby is usually born with some level of circulating opioids. As drug levels decline in the first 72 hours, various withdrawal symptoms may appear — such as trembling, vomiting, diarrhea or seizures. At some point, if symptoms mount in number or severity, doctors will begin giving medication to help ease them. The idea is to give the baby just enough opioid to reduce those symptoms, and then slowly, over days or weeks, decrease that dose to zero. A doctor comes to check on Lexi and her mother, Carrie. To protect her family's privacy, Carrie asked us not to use the family name. "So, hi, Peanut!" the doctor says to the baby. "Any concerns?" she asks Carrie. "Coming down has been catching up with her," says Carrie. © 2016 npr
Healthy body, healthy mind. Elderly people who are physically active seem to be able to stave off memory loss – but only if they start exercising before symptoms appear. At the end of a five-year period, the brains of non-exercisers look 10 years older than those of people who did moderate exercise. That’s what Clinton Wright at the University of Miami in Florida and his colleagues found when they followed 876 people, starting at an average age of 71, for five years. At the start of the study, each participant underwent a number of memory and cognition tests, and had the health of their brain assessed during an MRI scan. Each person was also asked how much exercise they had done in recent weeks, ranging from “no/light”, such as walking or gardening, to “moderate/heavy”, which included running and swimming. Five years later, the volunteers were called back to repeat all the tests. The participants generally performed less well than they had five years earlier. But their scores were linked to their level of exercise – those who reported no or low levels of exercise scored lower in all tests, the team found. The 10 per cent of people who said they had been engaged in moderate-to-heavy exercise not only started with higher scores in the first round of tests, but showed less of a decline five years later. Those who did little or no exercise also seemed to have worse vascular health – they had higher blood pressure, and their MRI scans showed evidence of undetected strokes. © Copyright Reed Business Information Ltd.
Nicola Davis The same genes involved in predisposing people to autism appear to influence social skills in the wider population, suggesting that the autism spectrum has no clear cut-off point, scientists have discovered. Researchers have previously shown that autism is linked not just to one or two powerful genes, but to the combined effect of many small genetic changes. The latest findings, published in Nature Genetics, suggest that social charm, empathy and the ability to make friends are about more than just practice and upbringing; they are also affected by how many of these autism risk gene variants we possess. Dr Elise Robinson, from Harvard University and a lead author on the paper, said: “This is the first study that specifically shows that ... factors that we have unambiguously associated with autism are also very clearly associated with social communication differences in the general population.” Rather than viewing a person as either having or not having such a disorder, Robinson believes our social skills are better viewed as sitting on a sliding scale across the whole population. “The primary implication is that the line at which we say people are affected or unaffected is arbitrary,” said Robinson. “There is no clear objective point either in terms of genetic risk or in terms of behavioural traits, where you can say quite simply or categorically that you’re affected or unaffected. It’s like trying to pick a point where you say someone is tall or not.” © 2016 Guardian News and Media Limited
Link ID: 22014 - Posted: 03.22.2016
By Emily Underwood When pharmaceutical company Eli Lilly in Indianapolis last week announced a major change to its closely watched clinical trial for the Alzheimer’s drug solanezumab, some in the scientific community and drug development industry cried foul. To critics, the company’s decision to eliminate changes in a person’s daily ability to function as a primary measure of solanezumab’s efficacy and focus solely on a cognitive test seemed like a last-ditch attempt to keep a doomed drug from failing its third trial. Lilly’s stock plunged by nearly 5%, apparently reflecting that sentiment. Largely lost in the online “chatter,” however, was that Lilly’s move reflects a growing scientific consensus about how the early stages of Alzheimer’s disease progress, says Dennis Selkoe, a neurologist at Brigham and Women’s Hospital in Boston, who is not involved in the Lilly trial. “From the point of view of a neurologist who’s seen hundreds of patients, [Lilly’s decision] makes clinical sense,” he says. Solanezumab is an antibody designed to bind to and promote the clearance of the β-amyloid protein, which forms plaques around the neurons of people with Alzheimer’s. Not everyone agrees that these plaques are at the root of the disease—a concept called the amyloid hypothesis, of which Selkoe is a major proponent—but fighting them is the foundation of nearly all current efforts in Alzheimer’s drug development. By helping destroy the plaques in people with early stages of Alzheimer’s, Lilly hopes solanezumab can slow the disease’s progression. © 2016 American Association for the Advancement of Science.
Link ID: 22011 - Posted: 03.22.2016
By Emily Underwood People with autism spectrum disorder (ASD) die on average 18 years before the general population, according to a report released today by Autistica, a philanthropic group based in the United Kingdom. People with both ASD and an intellectual disability die even younger, on average 30 years earlier than those without the conditions. Fatal accidents—often by drowning, when a child or adult with ASD wanders away from caregivers—are one of the classic causes of premature death in people who have both ASD and an intellectual disability, says Sven Bölte, a clinical psychologist at the Karolinska Institute in Stockholm, whose research is cited in the Autistica report. Epilepsy, along with several other neurological disorders, is another common cause of death among people with both ASD and learning difficulties, suggesting that early disruption of neurodevelopment is to blame. These “classic” causes of premature death in autism, however, do not fully account for a decades-long life span gap between autistic and nonautistic people, or the difference in mortality between autistic people with and without an intellectual disability, Bölte says. To explore these gaps, in 2015 Bölte’s group published a large epidemiological study of more than 27,000 Swedish people with ASD, 6500 of whom had an intellectual disability. They found that risk of premature death was about 2.5 times higher for the entire group, a gap largely due to increased prevalence of common health problems such as diabetes and respiratory disease. Patients may be diagnosed too late because they do not know how to express health concerns to their doctors, Bölte says, making it “extremely important” for general practitioners to thoroughly explore autistic patients’ symptoms and histories. © 2016 American Association for the Advancement of Science.
Link ID: 22010 - Posted: 03.19.2016
By John Elder Robison What happens to your relationships when your emotional perception changes overnight? Because I’m autistic, I have always been oblivious to unspoken cues from other people. My wife, my son and my friends liked my unflappable demeanor and my predictable behavior. They told me I was great the way I was, but I never really agreed. For 50 years I made the best of how I was, because there was nothing else I could do. Then I was offered a chance to participate in a study at Beth Israel Deaconess Medical Center, a teaching hospital of Harvard Medical School. Investigators at the Berenson-Allen Center there were studying transcranial magnetic stimulation, or T.M.S., a noninvasive procedure that applies magnetic pulses to stimulate the brain. It offers promise for many brain disorders. Several T.M.S. devices have been approved by the Food and Drug Administration for the treatment of severe depression, and others are under study for different conditions. (It’s still in the experimental phase for autism.) The doctors wondered if changing activity in a particular part of the autistic brain could change the way we sense emotions. That sounded exciting. I hoped it would help me read people a little better. They say, be careful what you wish for. The intervention succeeded beyond my wildest dreams — and it turned my life upside down. After one of my first T.M.S. sessions, in 2008, I thought nothing had happened. But when I got home and closed my eyes, I felt as if I were on a ship at sea. And there were dreams — so real they felt like hallucinations. It sounds like a fairy tale, but the next morning when I went to work, everything was different. Emotions came at me from all directions, so fast that I didn’t have a moment to process them. © 2016 The New York Times Company
By Daisy Yuhas Something was wrong with Brayson Thibodeaux. At 15 months old, he still was not walking; his parents and grandparents were certain that his development was slower than normal. After pushing doctors for answers, they finally got him to a neurologist who recommended a genetic test. Brayson had fragile X syndrome, the leading heritable cause of intellectual disability and of autism. The discovery sent ripples through the extended family, who live outside New Orleans. Brayson’s great-grandmother, Cheryl, recalled having heard of fragile X and discovered a cousin whose grandson had the same condition. She soon learned that many members of her family were confirmed carriers of a genetic condition—the fragile X pre-mutation—that put them at risk of having children with this syndrome. “Fragile X” refers to a mutation that alters the X chromosome in such a way that, viewed under a microscope, it would look like a piece was about to break off. That is because one gene contains multiple repetitions of noncoding DNA—specifically CGG (cytosine, guanine, guanine). The exact number of CGG repetitions is variable, but when it reaches more than 200, it is considered to be the full mutation, which causes the syndrome. People with between 55 and 200 repeats are said to have a partial or pre-mutation, an unstable gene that can expand into the full mutation in future generations. © 2016 Scientific American
Alison Abbott In the 25 years that John Collinge has studied neurology, he has seen hundreds of human brains. But the ones he was looking at under the microscope in January 2015 were like nothing he had seen before. He and his team of pathologists were examining the autopsied brains of four people who had once received injections of growth hormone derived from human cadavers. It turned out that some of the preparations were contaminated with a misfolded protein — a prion — that causes a rare and deadly condition called Creutzfeldt–Jakob disease (CJD), and all four had died in their 40s or 50s as a result. But for Collinge, the reason that these brains looked extraordinary was not the damage wrought by prion disease; it was that they were scarred in another way. “It was very clear that something was there beyond what you'd expect,” he says. The brains were spotted with the whitish plaques typical of people with Alzheimer's disease. They looked, in other words, like young people with an old person's disease. For Collinge, this led to a worrying conclusion: that the plaques might have been transmitted, alongside the prions, in the injections of growth hormone — the first evidence that Alzheimer's could be transmitted from one person to another. If true, that could have far-reaching implications: the possibility that 'seeds' of the amyloid-β protein involved in Alzheimer's could be transferred during other procedures in which fluid or tissues from one person are introduced into another, such as blood transfusions, organ transplants and other common medical procedures. © 2016 Nature Publishing Group
Laura Sanders Using flashes of blue light, scientists have pulled forgotten memories out of the foggy brains of mice engineered to have signs of early Alzheimer’s disease. This memory rehab feat, described online March 16 in Nature, offers new clues about how the brain handles memories, and how that process can go awry. The result “provides a theoretical mechanism for reviving old, forgotten memories,” says Yale School of Medicine neurologist Arash Salardini. Memory manipulations, such as the retrieval of lost memories and the creation of false memories, were “once the realm of science fiction,” he says. But this experiment and other recent work have now accomplished these feats, at least in rodents (SN: 12/27/14, p. 19), he says. To recover a lost memory, scientists first had to mark it. Neuroscientist Susumu Tonegawa of MIT and colleagues devised a system that tagged the specific nerve cells that stored a memory — in this case, an association between a particular cage and a shock. A virus delivered a gene for a protein that allowed researchers to control this collection of memory-holding nerve cells. The genetic tweak caused these cells to fire off signals in response to blue laser light, letting Tonegawa and colleagues call up the memory with light delivered by an optic fiber implanted in the brain. A day after receiving a shock in a particular cage, mice carrying two genes associated with Alzheimer’s seemed to have forgotten their ordeal; when put back in that cage, these mice didn’t seem as frightened as mice without the Alzheimer’s-related genes. But when the researchers used light to restore this frightening memory, it caused the mice to freeze in place in a different cage. (Freezing in a new venue showed that laser activation of the memory cells, and not environmental cues, caused the fear reaction.) © Society for Science & the Public 2000 - 2016. All rights reserved.
THERE they are! Newborn neurons vital for memory have been viewed in a live brain for the first time. The work could aid treatments for anxiety and stress disorders. Attila Losonczy at Columbia University Medical Center in New York and his team implanted a tiny microscope into the brains of live mice whose brain cells had been modified to make newly formed neurons glow. The mice then ran on a treadmill as the team tweaked the surrounding sights, smells and sounds. The researchers paired a small electric shock with some cues, so the mice learned to associate these with an unpleasant experience. They then deactivated the newborn neurons – present in areas of the brain responsible for learning and memory – using optogenetics, which switches off specific cells with light. After this, the mice were unable to tell the difference between the scary and safe cues, becoming fearful of them all (Neuron, doi.org/bc7v). “It suggests that newborn cells do something special that allows animals to tell apart and separate memories,” says Losonczy. An inability to discriminate between similar sensory information triggered by different events – such as the sound of a gunshot and a car backfiring – is often seen in panic and anxiety disorders, such as PTSD. This suggests that new neurons, or a lack of them, plays a part in such conditions and could guide novel treatments. © Copyright Reed Business Information Ltd.
Barbara Bradley Hagerty Faced with her own forgetfulness, former NPR correspondent and author Barbara Bradley Hagerty tried to do something about it. She's written about her efforts in her book on midlife, called Life Reimagined. To her surprise, she discovered that an older dog can learn new tricks. A confession: I loathe standardized tests, and one of the perks of reaching midlife is that I thought I'd never have to take another. But lately I've noticed that in my 50s, my memory isn't the same as it once was. And so I decided to take a radical leap into the world of brain training. At the memory laboratory at the University of Maryland, manager Ally Stegman slides a sheet of paper in front of me. It has a series of boxes containing different patterns and one blank space. My job is to figure out the missing pattern. The test measures a sort of raw intelligence, the ability to figure out novel problems. Time races by. It takes me two minutes to crack the first question. I am stumped by the second and third. Finally, I begin to guess. After 25 minutes, the test is over, and to my relief, Stegman walks in. This test was really, really hard. The reason I am here, voluntarily reliving my nightmare, is simple: I want to tune up my 50-something brain. So over the next month, I will do brain-training exercises, then come back, take the test again and see if I made myself smarter. © 2016 npr
Linda Geddes The health effects of a bad diet can carry over to offspring through eggs and sperm cells without DNA mutations, researchers have found. The mouse study, published in Nature Genetics, provides some of the strongest evidence yet for the non-genetic inheritance of traits acquired during an organism’s lifetime. And although previous work has suggested that sperm cells can carry 'epigenetic' factors, this is the first time that such an effect has been observed with egg cells. Researchers have suspected for some time that parents' lifestyle and behaviour choices can affect their children's health through epigenetics. These are chemical modifications to DNA or the proteins in chromosomes that affect how genes are expressed, but that do not alter the gene sequences themselves. Whether those changes can be inherited is still controversial. In particular, there have been suggestions that parental eating habits might shape the offspring's risk of obesity and diabetes. However, it has been difficult to disentangle the possibility that the parents’ behaviour during pregnancy or during the offspring's early childhood was to blame, rather than epigenetic changes that had occurred before conception. To get around this issue, endocrinologist Peter Huypens at the German Research Center for Environmental Health in Neuherberg, Germany, and his colleagues gave genetically identical mice one of three diets — high fat, low fat or standard laboratory chow — for six weeks. As expected, those fed the high-fat diet became obese and had impaired tolerance to glucose, an early sign of type 2 diabetes. © 2016 Nature Publishing Group
By Perri Klass, M.D. I got my good sleeper second. My oldest child, my first darling baby, did not reliably sleep through the night till he was well past 2. Since he is now an adult, I can skip right over all the questions of whether we could have trained him to self-soothe and stop calling for us in the night — we tried; we failed; we eventually gave up. The good sleeper was a good sleeper right from the beginning. She followed the timeline in the books, slept longer and longer between feedings, till she was reliably giving us a real night while she was still an infant, and she never looked back. Had we matured as parents, become less anxious, more willing to let her learn how to soothe herself? Were our lives calmer? Well, no. In fact, kind of the opposite. We just got dealt two very different babies. I supervise pediatric residents as they learn to provide primary care, to offer guidance to parents as they struggle with all the complexities of baby and toddler sleep, eating, potty training, discipline and tantrums — all of the stuff that shapes your daily life with a small child, and I’m talking about an essentially healthy, normally developing small child. And the hardest thing to teach, especially to people who haven’t yet done any child-rearing, is how different those healthy, normal babies can be, right from the beginning. So we review our sensible pediatric rubrics that deal with these questions, from establishing good sleep patterns to setting limits to encouraging a healthy varied diet. But sometimes it seems that these rubrics work best with the children and families who need them least. Every child is a different assignment — and we can all pay lip service to that cheerfully enough. But the hard thing to believe is how different the assignments can be. Within the range of developmentally normal children, some parents have a much, much harder job than others: more drudge work, less gratification, more public shaming. 
It sometimes feels like the great undiscussed secret of pediatrics — and of parenting. © 2016 The New York Times Company
Keyword: Development of the Brain
Link ID: 21988 - Posted: 03.15.2016