Chapter 7. Life-Span Development of the Brain and Behavior
By Daisy Yuhas Something was wrong with Brayson Thibodeaux. At 15 months old, he still was not walking; his parents and grandparents were certain that his development was slower than normal. After pushing doctors for answers, they finally got him to a neurologist who recommended a genetic test. Brayson had fragile X syndrome, the leading heritable cause of intellectual disability and of autism. The discovery sent ripples through the extended family, who live outside New Orleans. Brayson’s great-grandmother, Cheryl, recalled having heard of fragile X and discovered a cousin whose grandson had the same condition. She soon learned that many members of her family were confirmed carriers of a genetic condition—the fragile X pre-mutation—that put them at risk of having children with this syndrome. “Fragile X” refers to a mutation that alters the X chromosome in such a way that, viewed under a microscope, it would look like a piece was about to break off. That is because one gene contains multiple repetitions of noncoding DNA—specifically CGG (cytosine, guanine, guanine). The exact number of CGG repetitions is variable, but when it reaches more than 200, it is considered to be the full mutation, which causes the syndrome. People with between 55 and 200 repeats are said to have a partial or pre-mutation, an unstable gene that can expand into the full mutation in future generations. © 2016 Scientific American
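The repeat-count thresholds described above amount to a simple classification rule. A minimal sketch, using only the categories the article names (the function name and the "typical" label are illustrative; clinical interpretation also recognizes an intermediate range of roughly 45–54 repeats that the article does not cover):

```python
def classify_cgg_repeats(n_repeats: int) -> str:
    """Classify an FMR1 CGG repeat count using the article's thresholds."""
    if n_repeats > 200:
        return "full mutation"   # causes fragile X syndrome
    elif 55 <= n_repeats <= 200:
        return "pre-mutation"    # unstable; can expand in later generations
    else:
        return "typical"         # below the pre-mutation range

print(classify_cgg_repeats(30))   # typical
print(classify_cgg_repeats(120))  # pre-mutation
print(classify_cgg_repeats(250))  # full mutation
```

The pre-mutation matters for genetic counseling precisely because of the boundary behavior sketched here: a carrier's repeat count can cross the 200 threshold in the next generation.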
Alison Abbott In the 25 years that John Collinge has studied neurology, he has seen hundreds of human brains. But the ones he was looking at under the microscope in January 2015 were like nothing he had seen before. He and his team of pathologists were examining the autopsied brains of four people who had once received injections of growth hormone derived from human cadavers. It turned out that some of the preparations were contaminated with a misfolded protein — a prion — that causes a rare and deadly condition called Creutzfeldt–Jakob disease (CJD), and all four had died in their 40s or 50s as a result. But for Collinge, the reason that these brains looked extraordinary was not the damage wrought by prion disease; it was that they were scarred in another way. “It was very clear that something was there beyond what you'd expect,” he says. The brains were spotted with the whitish plaques typical of people with Alzheimer's disease. They looked, in other words, like young people with an old person's disease. For Collinge, this led to a worrying conclusion: that the plaques might have been transmitted, alongside the prions, in the injections of growth hormone — the first evidence that Alzheimer's could be transmitted from one person to another. If true, that could have far-reaching implications: the possibility that 'seeds' of the amyloid-β protein involved in Alzheimer's could be transferred during other procedures in which fluid or tissues from one person are introduced into another, such as blood transfusions, organ transplants and other common medical procedures. © 2016 Nature Publishing Group
Laura Sanders Using flashes of blue light, scientists have pulled forgotten memories out of the foggy brains of mice engineered to have signs of early Alzheimer’s disease. This memory rehab feat, described online March 16 in Nature, offers new clues about how the brain handles memories, and how that process can go awry. The result “provides a theoretical mechanism for reviving old, forgotten memories,” says Yale School of Medicine neurologist Arash Salardini. Memory manipulations, such as the retrieval of lost memories and the creation of false memories, were “once the realm of science fiction,” he says. But this experiment and other recent work have now accomplished these feats, at least in rodents (SN: 12/27/14, p. 19), he says. To recover a lost memory, scientists first had to mark it. Neuroscientist Susumu Tonegawa of MIT and colleagues devised a system that tagged the specific nerve cells that stored a memory — in this case, an association between a particular cage and a shock. A virus delivered a gene for a protein that allowed researchers to control this collection of memory-holding nerve cells. The genetic tweak caused these cells to fire off signals in response to blue laser light, letting Tonegawa and colleagues call up the memory with light delivered by an optic fiber implanted in the brain. A day after receiving a shock in a particular cage, mice carrying two genes associated with Alzheimer’s seemed to have forgotten their ordeal; when put back in that cage, these mice didn’t seem as frightened as mice without the Alzheimer’s-related genes. But when the researchers used light to restore this frightening memory, it caused the mice to freeze in place in a different cage. (Freezing in a new venue showed that laser activation of the memory cells, and not environmental cues, caused the fear reaction.) © Society for Science & the Public 2000 - 2016. All rights reserved.
THERE they are! Newborn neurons vital for memory have been viewed in a live brain for the first time. The work could aid treatments for anxiety and stress disorders. Attila Losonczy at Columbia University Medical Center in New York and his team implanted a tiny microscope into the brains of live mice, the brain cells of which had been modified to make newly made neurons glow. The mice then ran on a treadmill as the team tweaked the surrounding sights, smells and sounds. The researchers paired a small electric shock with some cues, so the mice learned to associate these with an unpleasant experience. They then deactivated the newborn neurons – present in areas of the brain responsible for learning and memory – using optogenetics, which switches off specific cells with light. After this, the mice were unable to tell the difference between the scary and safe cues, becoming fearful of them all (Neuron, doi.org/bc7v). “It suggests that newborn cells do something special that allows animals to tell apart and separate memories,” says Losonczy. An inability to discriminate between similar sensory information triggered by different events – such as the sound of a gunshot and a car backfiring – is often seen in panic and anxiety disorders, such as PTSD. This suggests that new neurons, or a lack of them, plays a part in such conditions and could guide novel treatments. © Copyright Reed Business Information Ltd.
Barbara Bradley Hagerty Faced with her own forgetfulness, former NPR correspondent and author Barbara Bradley Hagerty tried to do something about it. She's written about her efforts in her book on midlife, called Life Reimagined. To her surprise, she discovered that an older dog can learn new tricks. A confession: I loathe standardized tests, and one of the perks of reaching midlife is that I thought I'd never have to take another. But lately I've noticed that in my 50s, my memory isn't the same as it once was. And so I decided to take a radical leap into the world of brain training. At the memory laboratory at the University of Maryland, manager Ally Stegman slides a sheet of paper in front of me. It has a series of boxes containing different patterns and one blank space. My job is to figure out the missing pattern. The test measures a sort of raw intelligence, the ability to figure out novel problems. Time races by. It takes me two minutes to crack the first question. I am stumped by the second and third. Finally, I begin to guess. After 25 minutes, the test is over, and to my relief, Stegman walks in. This test was really, really hard. The reason I am here, voluntarily reliving my nightmare, is simple: I want to tune up my 50-something brain. So over the next month, I will do brain-training exercises, then come back, take the test again and see if I made myself smarter. © 2016 npr
Linda Geddes The health effects of a bad diet can carry over to offspring through eggs and sperm cells without DNA mutations, researchers have found. The mouse study, published in Nature Genetics, provides some of the strongest evidence yet for the non-genetic inheritance of traits acquired during an organism’s lifetime. And although previous work has suggested that sperm cells can carry 'epigenetic' factors, this is the first time that such an effect has been observed with egg cells. Researchers have suspected for some time that parents' lifestyle and behaviour choices can affect their children's health through epigenetics. These are chemical modifications to DNA or the proteins in chromosomes that affect how genes are expressed, but that do not alter the gene sequences themselves. Whether those changes can be inherited is still controversial. In particular, there have been suggestions that parental eating habits might shape the offspring's risk of obesity and diabetes. However, it has been difficult to disentangle the possibility that the parents’ behaviour during pregnancy or during the offspring's early childhood was to blame, rather than epigenetic changes that had occurred before conception. To get around this issue, endocrinologist Peter Huypens at the German Research Center for Environmental Health in Neuherberg, Germany, and his colleagues gave genetically identical mice one of three diets — high fat, low fat or standard laboratory chow — for six weeks. As expected, those fed the high-fat diet became obese and had impaired tolerance to glucose, an early sign of type 2 diabetes. © 2016 Nature Publishing Group
By Perri Klass, M.D. I got my good sleeper second. My oldest child, my first darling baby, did not reliably sleep through the night till he was well past 2. Since he is now an adult, I can skip right over all the questions of whether we could have trained him to self-soothe and stop calling for us in the night — we tried; we failed; we eventually gave up. The good sleeper was a good sleeper right from the beginning. She followed the timeline in the books, slept longer and longer between feedings, till she was reliably giving us a real night while she was still an infant and she never looked back. Had we matured as parents, become less anxious, more willing to let her learn how to soothe herself? Were our lives calmer? Well, no. In fact, kind of the opposite. We just got dealt two very different babies. I supervise pediatric residents as they learn to provide primary care, to offer guidance to parents as they struggle with all the complexities of baby and toddler sleep, eating, potty training, discipline and tantrums. All of the stuff that shapes your daily life with a small child, and I’m talking about an essentially healthy, normally developing small child. And the hardest thing to teach, especially to people who haven’t yet done any child-rearing, is how different those healthy, normal babies can be, right from the beginning. So we review our sensible pediatric rubrics that deal with these questions, from establishing good sleep patterns to setting limits to encouraging a healthy varied diet. But sometimes it seems that these rubrics work best with the children and families who need them least. Every child is a different assignment — and we can all pay lip service to that cheerfully enough. But the hard thing to believe is how different the assignments can be. Within the range of developmentally normal children, some parents have a much, much harder job than others: more drudge work, less gratification, more public shaming. 
It sometimes feels like the great undiscussed secret of pediatrics — and of parenting. © 2016 The New York Times Company
Keyword: Development of the Brain
Link ID: 21988 - Posted: 03.15.2016
Deborah Orr The psychologist Oliver James has for many years been a part of the cultural landscape, writing best-selling books, making television programmes, contributing articles to newspapers and generally offering his views. As a practicing psychotherapist of many years’ standing, he has good reason to believe that he has important insights to offer. James is particularly exercised by the damage caused by casual emotional abuse – the explosive parent who shouts and swears at their kids, displays resentment against them or tries to coerce them into doing things instead of employing reason. No sensible person disagrees with him on this, and only a harsh critic would deny that James has played a strong and positive part in popularising these simple, important wisdoms. That’s why it’s so very odd that James has chosen now to perpetrate casual emotional abuse on a grand scale. His latest book, Not in Your Genes: The Real Reason Parents Are Like Their Children, expands on an argument he’s been making for years: that there is no scientific basis for belief in the idea that there is any genetic element to any psychological trait. Even illnesses such as schizophrenia and bipolar disorder are completely down to the environment in which you grew up, not the complex interplay between nature and nurture that mainstream science espouses. Even if James had conclusive evidence to back up his absolutist claim – which he does not – I would suggest that such news should be broken gently. © 2016 Guardian News and Media Limited
A senior British doctor, who has been an expert defence witness for parents accused of killing their children, has been found guilty of multiple charges that include giving misleading evidence in court. The Medical Practitioners Tribunal Service said that Waney Squier, a consultant pathologist at John Radcliffe Hospital in Oxford, UK, had failed to work within the limits of her competence, failed to be objective and unbiased, and failed to heed the views of other experts. In many of the cases investigated, the tribunal found, her actions were deliberately misleading and irresponsible. The MPTS had considered Squier’s work as an expert witness in six child abuse cases and one appeal in which parents faced charges of non-accidental head injury, formerly known as shaken-baby syndrome. Squier is prominent among several researchers worldwide who have challenged a long-standing belief that a trio of symptoms of head injury provide unequivocal evidence of abusive behaviour. Squier has argued in the scientific literature and in court that the symptoms in question – haemorrhages on the surface of the brain, haemorrhages in the retinas, and a swollen brain – can have innocent causes, such as choking or other difficulties in breathing. These symptoms, these researchers say, can also arise from the birthing process itself. Michele Codd, chair of the tribunal, gave examples of where the panel felt Squier’s court evidence had strayed outside her field of expertise. These included offering opinions on biomechanics in relation to injuries from falling, pathology of the eyes, and paediatric medicine. © Copyright Reed Business Information Ltd.
By Emily Underwood Nestled deep within a brain region that processes memory is a sliver of tissue that continually sprouts brand-new neurons, at least into late adulthood. A study in mice now provides the first glimpse at how these newborn neurons behave in animals as they learn, and hints at the purpose of the new arrivals: to keep closely related but separate memories distinct. A number of previous studies have suggested that the birth of new neurons is key to memory formation. In particular, scientists believe the new cell production—known as neurogenesis—plays a role in pattern separation, the ability to discriminate between similar experiences, events, or contexts based on sensory cues such as a certain smell or visual landmark. Pattern separation helps us use cues such as the presence of a particular tree or cars nearby, for example, to distinguish which parking space we chose today, as opposed to yesterday or the day before. This ability appears to be particularly diminished in people with anxiety and mood disorders. Scientists can produce deficits in pattern separation in animals by blocking neurogenesis, using x-ray radiation to kill targeted populations of cells in the dentate gyrus. Because such studies have not established the precise identity of which cells are being recorded from, however, no one has been able to address the “burning question” in the field: "how young, adult-born neurons and mature dentate granule neurons differ in their activity," says Amar Sahay, a neuroscientist at the Massachusetts General Hospital and Harvard Medical School. © 2016 American Association for the Advancement of Science
By Kj Dell’Antonia New research shows that the youngest students in a classroom are more likely to be given a diagnosis of attention deficit hyperactivity disorder than the oldest. The findings raise questions about how we regard those wiggly children who just can’t seem to sit still – and who also happen to be the youngest in their class. Researchers in Taiwan looked at data from 378,881 children ages 4 to 17 and found that students born in August, the cut-off month for school entry in that country, were more likely to be given diagnoses of A.D.H.D. than students born in September. The children born in September would have missed the previous year’s cut-off date for school entry, and thus had nearly a full extra year to mature before entering school. The findings were published Thursday in The Journal of Pediatrics. While few dispute that A.D.H.D. is a legitimate disability that can impede a child’s personal and school success and that treatment can be effective, “our findings emphasize the importance of considering the age of a child within a grade when diagnosing A.D.H.D. and prescribing medication for treating A.D.H.D.,” the authors concluded. Dr. Mu-Hong Chen, a member of the department of psychiatry at Taipei Veterans General Hospital in Taiwan and the lead author of the study, hopes that a better understanding of the data linking relative age at school entry to an A.D.H.D. diagnosis will encourage parents, teachers and clinicians to give the youngest children in a grade enough time and help to allow them to prove their ability. Other research has shown similar results. An earlier study in the United States, for example, found that roughly 8.4 percent of children born in the month before their state’s cutoff date for kindergarten eligibility are given A.D.H.D. diagnoses, compared to 5.1 percent of children born in the month immediately afterward. © 2016 The New York Times Company
By Dominic Howell BBC News Gum disease has been linked to a greater rate of cognitive decline in people with Alzheimer's disease, early stage research has suggested. The small study, published in PLOS ONE, looked at 59 people who were all deemed to have mild to moderate dementia. It is thought the body's response to gum inflammation may be hastening the brain's decline. The Alzheimer's Society said if the link was proven to be true, then good oral health may help slow dementia. The body's response to inflammatory conditions was cited as a possible reason for the quicker decline. Inflammation causes immune cells to swell and has long been associated with Alzheimer's. Researchers believe their findings add weight to evidence that inflammation in the brain is what drives the disease. The study, jointly led by the University of Southampton and King's College London, cognitively assessed the participants, and took blood samples to measure inflammatory markers in their blood. Their oral health was also assessed by a dental hygienist who was unaware of the cognitive outcomes. Of the sample group, 22 were found to have considerable gum disease while for the remaining 37 patients the disease was much less apparent. The average age of the group with gum disease was 75, and in the other group it was 79. A majority of participants - 52 - were followed up at six months, and all assessments were repeated. The presence of gum disease - or periodontitis as it is known - was associated with a six-fold increase in the rate of cognitive decline, the study suggested. © 2016 BBC
Link ID: 21976 - Posted: 03.12.2016
Susan Gaidos Most people would be happy to get rid of excess body fat. Even better: Trade the spare tire for something useful — say, better-functioning knees or hips, or a fix for an ailing heart or a broken bone. The idea is not far-fetched, some scientists say. Researchers worldwide are repurposing discarded fat to repair body parts damaged by injury, disease or age. Recent studies in lab animals and humans show that the much-maligned material can be a source of cells useful for treating a wide range of ills. At the University of Pittsburgh, bioengineer Rocky Tuan and colleagues extract buckets full of yellow fat from volunteers’ bellies and thighs and turn the liposuctioned material into tissue that resembles shock-absorbing cartilage. If the cartilage works as well in people as it has in animals, Tuan’s approach might someday offer a kind of self-repair for osteoarthritis, the painful degeneration of cartilage in the joints. He’s also using fat cells to grow replacement parts for the tendons and ligaments that support the joints. Foremost among fat’s virtues is its richness of stem cells, which have the ability to divide and grow into a wide variety of tissue types. Fat stem cells — also known as adipose-derived stem cells — can be coerced to grow into bone, cartilage, muscle tissue or, of course, more fat. Cells from fat are being tested to mend tissues found in damaged joints, hearts and muscle, and to regrow bone and heal wounds. © Society for Science & the Public 2000 - 2016
Rich Stanton In 1976, the driving simulation Death Race was removed from an Illinois amusement park. There had, according to a news story at the time, been complaints that it encouraged players to run over pedestrians to score points. Through a series of subsequent newspaper reports, the US National Safety Council labelled the game “gross” and motoring groups demanded its removal from distribution. The first moral panic over video game violence had begun. This January, a group of four scholars published a paper analysing the links between playing violent video games at a young age and aggressive behaviour in later life. The titles mentioned in the report are around 15 years old – one of several troubling ambiguities to be found in the research. Nevertheless, the quality and quantity of the data make this an uncommonly valuable study. Given that game violence remains a favoured bogeyman for politicians, press and pressure groups, it should be shocking that such a robust study of the phenomenon is rare. But it is, and it’s important to ask why. With the arrival of Pong in 1972, video games became a commercial reality, but now, in 2016, they are still on the rocky path to mass acceptance that all new media must traverse. The truth is that the big targets of moral concern – Doom, Grand Theft Auto, Call of Duty – are undeniably about killing and they are undeniably popular among male teenagers. An industry report estimates that 80% of the audience for the Call of Duty series is male, and 21% is aged 10-14. Going by the 18 rating on the last three entries, that means at least a fifth of the game’s vast audience shouldn’t be playing. © 2016 Guardian News and Media Limited
By GINA KOLATA Marty and Matt Reiswig, two brothers in Denver, knew that Alzheimer’s disease ran in their family, but neither of them understood why. Then a cousin, Gary Reiswig, whom they barely knew, wrote a book about their family, “The Thousand Mile Stare.” When the brothers read it, they realized what they were facing. In the extended Reiswig family, Alzheimer’s disease is not just a random occurrence. It results from a mutated gene that is passed down from parent to child. If you inherit the mutated gene, Alzheimer’s will emerge at around age 50 — with absolute certainty. Your child has a 50-50 chance of suffering the same fate. The revelation came as a shock. And so did the next one: The brothers learned that there is a blood test that can reveal whether one carries the mutated gene. They could decide to know if they had it. Or not. It’s a dilemma more people are facing as scientists discover more genetic mutations linked to diseases. Often the newly discovered gene increases risk, but does not guarantee it. Sometimes knowing can be useful: If you have a gene mutation that makes colon cancer much more likely, for example, then frequent colonoscopies may help doctors stave off trouble. But then there are genes that make a dreaded disease a certainty: There is no way to prevent it, and no way to treat it. Marty Reiswig, 37, saw his father, now in the final stages of Alzheimer’s, slowly lose his ability to think, to remember, to care for himself, or even to recognize his wife and sons. Mr. Reiswig knows that if he has the gene, he has perhaps a bit more than a decade before the first symptoms appear. If he has it, his two young children may have it, too. He wavers about getting tested. © 2016 The New York Times Company
By DONALD G. McNEIL Jr. and CATHERINE SAINT LOUIS The Zika virus damages many fetuses carried by infected and symptomatic mothers, regardless of when in pregnancy the infection occurs, according to a small but frightening study released on Friday by Brazilian and American researchers. In a separate report published on Friday, other scientists suggested a mechanism for the damage, showing in laboratory experiments that the virus targets and destroys fetal cells that eventually form the brain’s cortex. The reports are far from conclusive, but the studies help shed light on a mysterious epidemic that has swept across more than two dozen countries in the Western Hemisphere, alarming citizens and unnerving public health officials. In the first study, published in The New England Journal of Medicine, researchers found that 29 percent of women who had ultrasound examinations after testing positive for infection with the Zika virus had fetuses that suffered “grave outcomes.” They included fetal death, tiny heads, shrunken placentas and nerve damage that suggested blindness. “This is going to have a chilling effect,” said Dr. Anthony S. Fauci, the director of the National Institute of Allergy and Infectious Diseases. “Now there’s almost no doubt that Zika is the cause.” The small size of the study, which looked at 88 women at one clinic in Rio de Janeiro, was a limitation, Dr. Fauci added. From such a small sample, it is impossible to be certain how often fetal damage may occur in a much larger population. © 2016 The New York Times Company
Keyword: Development of the Brain
Link ID: 21957 - Posted: 03.05.2016
Ewan Birney The Daily Mail recently ran an article about how alcohol abuse could harm future generations, via the (exciting-sounding) mechanism of trans-generational epigenetics. This is an emotive topic, combining a commonplace habit (drinking beer and wine) with a scary outcome (harming your children, grandchildren and future generations) and adding a twist of science for gravitas. It’s not surprising that this research has been handed a megaphone by the mainstream press – but does the science stack up? To start with, the research was carried out in rats, as multi-generational experiments on humans are both grossly unethical and logistically extremely hard. This crucial bit of information is missing from both the Daily Mail headline and the paper’s title. Secondly, the big effects of alcohol consumption were mainly seen on the rats’ children and grandchildren – the effects on their great grandchildren were smaller. That is really important, because if there’s no effect on great grandchildren, it’s probably not due to epigenetics. Drinking large amounts of alcohol (for a rat) whilst pregnant would be expected to have an effect on the children and even the grandchildren. This is because the eggs of female mammals are made early on in foetal development, whilst a daughter is developing in the womb. So if that cell (the egg) also gives rise to a daughter, she will have directly experienced exposures that occurred during her maternal grandmother’s pregnancy. © 2016 Guardian News and Media Limited or its affiliated companies.
By Meeri Kim Teenagers tend to have a bad reputation in our society, and perhaps rightly so. When compared to children or adults, adolescents are more likely to engage in binge drinking, drug use, unprotected sex, criminal activity, and reckless driving. Risk-taking is like second nature to youth of a certain age, leading health experts to cite preventable and self-inflicted causes as the biggest threats to adolescent well-being in industrialized societies. But before going off on a tirade about groups of reckless young hooligans, consider that a recent study may have revealed a silver lining to all that misbehavior. While adolescents will take more risks in the presence of their peers than when alone, it turns out that peers can also encourage them to learn faster and engage in more exploratory acts. A group of 101 late adolescent males were randomly assigned to play the Iowa Gambling Task, a psychological game used to assess decision making, either alone or observed by their peers. The task involves four decks of cards: two are “lucky” decks that will generate long-term gain if the player continues to draw from them, while the other two are “unlucky” decks that have the opposite effect. The player chooses to play or pass cards drawn from one of these decks, eventually catching on to which of the decks are lucky or unlucky — and subsequently only playing from the lucky ones.
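The deck structure of the Iowa Gambling Task described above can be sketched in a few lines. The payoff values and loss probability below are invented for illustration (the actual task uses fixed reward-and-penalty schedules, not random draws), but they reproduce the key property: "unlucky" decks offer bigger single wins yet lose out over many draws.

```python
import random

# Illustrative deck parameters (assumptions, not the study's actual values):
# every draw pays a gain, and with probability 0.5 also incurs a penalty.
LUCKY = {"gain": 50, "loss": -25}      # positive expected value per draw
UNLUCKY = {"gain": 100, "loss": -250}  # tempting wins, negative expected value

def draw(deck, p_loss=0.5):
    """One card draw: the deck's gain, plus its penalty with probability p_loss."""
    payoff = deck["gain"]
    if random.random() < p_loss:
        payoff += deck["loss"]
    return payoff

random.seed(0)  # deterministic demo
lucky_total = sum(draw(LUCKY) for _ in range(100))
unlucky_total = sum(draw(UNLUCKY) for _ in range(100))
print(lucky_total > unlucky_total)  # lucky decks should come out ahead
```

A player who learns the statistics, as the adolescents in the study eventually did, shifts toward the decks with the higher expected value rather than the bigger headline wins.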
Laura Sanders In a multivirus competition, a newcomer came out on top for its ability to transport genetic cargo to a mouse’s brain cells. The engineered virus AAV-PHP.B was best at delivering a gene that instructed Purkinje cells, the dots in the micrograph above, to take on a whitish glow. Unaffected surrounding cells in the mouse cerebellum look blue. Cargo carried by viruses like AAV-PHP.B could one day replace faulty genes in the brains of people. AAV-PHP.B beat out other viruses including a similar one called AAV9, which is already used to get genes into the brains of mice. Genes delivered by AAV-PHP.B also showed up in the spinal cord, retina and elsewhere in the body, Benjamin Deverman of Caltech and colleagues report in the February Nature Biotechnology. Similar competitions could uncover viruses with the ability to deliver genes to specific types of cells, the researchers write. Selective viruses that can also get into the brain would enable deeper studies of the brain and might improve gene therapy techniques in people. © Society for Science & the Public 2000 - 2016
By DONALD G. McNEIL Jr. A baby with a shrunken, misshapen head is surely a heartbreaking sight. But reproductive health experts are warning that microcephaly may be only the most obvious consequence of the spread of the Zika virus. Even infants who appear normal at birth may be at higher risk for mental illnesses later in life if their mothers were infected during pregnancy, many researchers fear. The Zika virus, they say, closely resembles some infectious agents that have been linked to the development of autism, bipolar disorder and schizophrenia. Schizophrenia and other debilitating mental illnesses have no single cause, experts emphasized in interviews. The conditions are thought to arise from a combination of factors, including genetic predisposition and traumas later in life, such as sexual or physical abuse, abandonment or heavy drug use. But illnesses in utero, including viral infections, are thought to be a trigger. “The consequences of this go way beyond microcephaly,” said Dr. W. Ian Lipkin, who directs The Center for Infection and Immunity at Columbia University. Among children in Latin America and the Caribbean, “I wouldn’t be surprised if we saw a big upswing in A.D.H.D., autism, epilepsy and schizophrenia,” he added. “We’re looking at a large group of individuals who may not be able to function in the world.” © 2016 The New York Times Company