Chapter 7. Life-Span Development of the Brain and Behavior
By Simon Makin Other species are capable of displaying dazzling feats of intelligence. Crows can solve multistep problems. Apes display numerical skills and empathy. Yet neither species has the capacity to conduct scientific investigations into other species' cognitive abilities. This type of behavior provides solid evidence that humans are by far the smartest species on the planet. Besides just elevated IQs, however, humans set themselves apart in another way: their offspring are among the most helpless of any species. A new study, published recently in Proceedings of the National Academy of Sciences (PNAS), draws a link between human smarts and an infant’s dependency, suggesting one thing led to the other in a spiraling evolutionary feedback loop. The study, from psychologists Celeste Kidd and Steven Piantadosi at the University of Rochester, represents a new theory about how humans came to possess such extraordinary smarts. Like a lot of evolutionary theories, this one can be couched in the form of a story—and like a lot of evolutionary stories, this one is contested by some scientists. Kidd and Piantadosi note that, according to a previous theory, early humans faced selection pressures for both large brains and the capacity to walk upright as they moved from forest to grassland. Larger brains require a wider pelvis to give birth, whereas being bipedal limits the size of the pelvis. These opposing pressures—biological anthropologists call them the “obstetric dilemma”—could have led to giving birth earlier, when infants’ skulls were still small. Thus, newborns arrive more immature and helpless than those of most other species. Kidd and Piantadosi propose that, as a consequence, the cognitive demands of child care increased and created evolutionary pressure to develop higher intelligence. © 2016 Scientific American
By Gary Stix Scientists will never find a single gene for depression—nor two, nor 20. But among the 20,000 human genes and the hundreds of thousands of proteins and molecules that switch on those genes or regulate their activity in some way, there are clues that point to the roots of depression. Tools to identify biological pathways that are instrumental in either inducing depression or protecting against it have recently debuted—and hold the promise of providing leads for new drug therapies for psychiatric and neurological diseases. A recent paper in the journal Neuron illustrates both the dazzling complexity of this approach and the ability of these techniques to pinpoint key genes that may play a role in governing depression. Scientific American talked with the senior author on the paper—neuroscientist Eric Nestler from the Icahn School of Medicine at Mt. Sinai in New York. Nestler spoke about the potential of this research to break the logjam in pharmaceutical research that has impeded development of drugs to treat brain disorders. Scientific American: The first years in the war on cancer met with a tremendous amount of frustration. Things look like they're improving somewhat now for cancer. Do you anticipate a similar trajectory may occur in neuroscience for psychiatric disorders? Eric Nestler: I do. I just think it will take longer. I was in medical school 35 years ago when the idea that identifying a person's specific pathophysiology was put forward as a means of directing treatment of cancer. We're now three decades later finally seeing the day when that’s happening. I definitely think the same will occur for major brain disorders. The brain is just more complicated and the disorders are more complicated so it will take longer. © 2016 Scientific American
By Anil Ananthaswamy and Alice Klein Our brain’s defence against invading microbes could cause Alzheimer’s disease – which suggests that vaccination could prevent the condition. Alzheimer’s disease has long been linked to the accumulation of sticky plaques of beta-amyloid proteins in the brain, but the function of the plaques has remained unclear. “Does it play a role in the brain, or is it just garbage that accumulates?” asks Rudolph Tanzi of Harvard Medical School. Now he has shown that these plaques could be defences for trapping invading pathogens. Working with Robert Moir at the Massachusetts General Hospital in Boston, Tanzi’s team has shown that beta-amyloid can act as an anti-microbial compound, and may form part of our immune system. To test whether beta-amyloid defends us against microbes that manage to get into the brain, the team injected bacteria into the brains of mice that had been bred to develop plaques like humans do. Plaques formed straight away. “When you look in the plaques, each one had a single bacterium in it,” says Tanzi. “A single bacterium can induce an entire plaque overnight.” This suggests that infections could be triggering the formation of plaques. These sticky plaques may trap and kill bacteria, viruses or other pathogens, but if they aren’t cleared away fast enough, they may lead to inflammation and tangles of another protein, called tau, causing neurons to die and driving the progression towards Alzheimer’s disease. © Copyright Reed Business Information Ltd.
By Robert Plomin Scientists have investigated this question for more than a century, and the answer is clear: the differences between people on intelligence tests are substantially the result of genetic differences. But let's unpack that sentence. We are talking about average differences among people and not about individuals. Any one person's intelligence might be blown off course from its genetic potential by, for example, an illness in childhood. By genetic, we mean differences passed from one generation to the next via DNA. But we all share 99.5 percent of our three billion DNA base pairs, so only 15 million DNA differences separate us genetically. And we should note that intelligence tests include diverse examinations of cognitive ability and skills learned in school. Intelligence, more appropriately called general cognitive ability, reflects someone's performance across a broad range of varying tests. Genes make a substantial difference, but they are not the whole story. They account for about half of all differences in intelligence among people, so the other half is not caused by genetic differences, which provides strong support for the importance of environmental factors. This estimate of 50 percent reflects the results of twin, adoption and DNA studies. From them, we know, for example, that later in life, children adopted away from their biological parents at birth are just as similar to their biological parents as are children reared by their biological parents. Similarly, we know that adoptive parents and their adopted children do not typically resemble one another in intelligence. © 2016 Scientific American
By Roland Pease BBC Radio Science Unit Researchers have invented a DNA "tape recorder" that can trace the family history of every cell in an organism. The technique is being hailed as a breakthrough in understanding how the trillions of complex cells in a body are descended from a single egg. "It has the potential to provide profound insights into how normal, diseased or damaged tissues are constructed and maintained," one UK biologist told the BBC. The work appears in the journal Science. The human body has around 40 trillion cells, each with a highly specialised function. Yet each can trace its history back to the same starting point - a fertilised egg. Developmental biology is the business of unravelling how the genetic code unfolds at each cycle of cell division, how the body plan develops, and how tissues become specialised. But much of what it has revealed has depended on inference rather than a complete cell-by-cell history. "I actually started working on this problem as a graduate student in 2000," confessed Jay Shendure, lead researcher on the new scientific paper. "Could we find a way to record these relationships between cells in some compact form we could later read out in adult organisms?" The project failed then because there was no mechanism to record events in a cell's history. That changed with recent developments in so-called CRISPR gene editing, a technique that allows researchers to make much more precise alterations to the DNA in living organisms. The molecular tape recorder developed by Prof Shendure's team at the University of Washington in Seattle, US, is a length of DNA inserted into the genome that contains a series of edit points which can be changed throughout an organism's life. © 2016 BBC.
By Jordana Cepelewicz General consensus among Alzheimer’s researchers has it that the disease’s main culprit, a protein called amyloid beta, is an unfortunate waste product that is not known to play any useful role in the body—and one that can have devastating consequences. When not properly cleared from the brain it builds up into plaques that destroy synapses, the junctions between nerve cells, resulting in cognitive decline and memory loss. The protein has thus become a major drug target in the search for a cure for Alzheimer’s. Now a team of researchers at Harvard Medical School and Massachusetts General Hospital is proposing a very different story. In a study published this week in Science Translational Medicine, neurologists Rudolph Tanzi and Robert Moir report evidence that amyloid beta serves a crucial purpose: protecting the brain from invading microbes. “The original idea goes back to 2010 or so when Rob had a few too many Coronas,” Tanzi jokes. Moir had come across surprising similarities between amyloid beta and LL37, a protein that acts as a foot soldier in the brain’s innate immune system, killing potentially harmful bugs and alerting other cells to their presence. “These types of proteins, although small, are very sophisticated in what they do,” Moir says. “And they’re very ancient, going back to the dawn of multicellular life.” © 2016 Scientific American
By GINA KOLATA Could it be that Alzheimer’s disease stems from the toxic remnants of the brain’s attempt to fight off infection? Provocative new research by a team of investigators at Harvard leads to this startling hypothesis, which could explain the origins of plaque, the mysterious hard little balls that pockmark the brains of people with Alzheimer’s. It is still early days, but Alzheimer’s experts not associated with the work are captivated by the idea that infections, including ones that are too mild to elicit symptoms, may produce a fierce reaction that leaves debris in the brain, causing Alzheimer’s. The idea is surprising, but it makes sense, and the Harvard group’s data, published Wednesday in the journal Science Translational Medicine, supports it. If it holds up, the hypothesis has major implications for preventing and treating this degenerative brain disease. The Harvard researchers report a scenario seemingly out of science fiction. A virus, fungus or bacterium gets into the brain, passing through a membrane — the blood-brain barrier — that becomes leaky as people age. The brain’s defense system rushes in to stop the invader by making a sticky cage out of proteins, called beta amyloid. The microbe, like a fly in a spider web, becomes trapped in the cage and dies. What is left behind is the cage — a plaque that is the hallmark of Alzheimer’s. So far, the group has confirmed this hypothesis in neurons growing in petri dishes as well as in yeast, roundworms, fruit flies and mice. There is much more work to be done to determine if a similar sequence happens in humans, but plans — and funding — are in place to start those studies, involving a multicenter project that will examine human brains. “It’s interesting and provocative,” said Dr. Michael W. Weiner, a radiology professor at the University of California, San Francisco, and a principal investigator of the Alzheimer’s Disease Neuroimaging Initiative, a large national effort to track the progression of the disease and look for biomarkers like blood proteins and brain imaging to signal the disease’s presence. © 2016 The New York Times Company
Sara Reardon Children from impoverished families are more prone to mental illness, and alterations in DNA structure could be to blame, according to a study published on 24 May in Molecular Psychiatry. Poverty brings with it a number of different stressors, such as poor nutrition, increased prevalence of smoking and the general struggle of trying to get by. All of these can affect a child’s development, particularly in the brain, where the structure of areas involved in response to stress and decision-making has been linked to low socioeconomic status. Poor children are more prone to mental illnesses such as depression than their peers from wealthier families, but they are also more likely to have cognitive problems. Some of these differences are clearly visible in brain structure and seem to appear at birth, which suggests that prenatal exposure to these stressors may be involved. But neurodevelopment does not stop at birth. Neuroscientist Ahmad Hariri of Duke University in Durham, North Carolina, suspected that continual exposure to stressors might affect older children as well. He decided to test this idea by studying chemical tags known as methyl groups, which alter DNA structure to regulate how genes are expressed. There is some evidence that methylation patterns can be passed down through generations, but they are also altered by environmental factors, such as smoking. © 2016 Nature Publishing Group
by Bruce Bower For a landmark 1977 paper, psychologist Andrew Meltzoff stuck his tongue out at 2- to 3-week-old babies. Someone had to do it. After watching Meltzoff razz them for 15 seconds, babies often stuck out their own tongues within the next 2½ minutes. Newborns also tended to respond in kind when the young researcher opened his mouth wide, pushed out his lips like a duck and opened and closed the fingers of one hand. Meltzoff, now at the University of Washington in Seattle, and a colleague were the first to report that babies copy adults’ simple physical deeds within weeks of birth. Until then, most scientists assumed that imitation began at around 9 months of age. Newborns don’t care that imitation is the sincerest form of flattery. For them, it may be a key to interacting with (and figuring out) those large, smiley people who come to be known as mommy and daddy. And that’s job number one for tykes hoping to learn how to talk and hang out with a circle of friends. Meltzoff suspected that babies enter the world able to compare their own movements — even those they can feel but not see, such as a projecting tongue — to corresponding adult actions. Meltzoff’s report has inspired dozens of papers on infant imitation. Some have supported his results, some haven’t. A new report, published May 5 in Current Biology, falls in the latter group. The study of 106 Australian babies tracked from 1 to 9 weeks of age concludes that infants don’t imitate anyone. © Society for Science & the Public
Keyword: Development of the Brain
Link ID: 22246 - Posted: 05.25.2016
By Lisa Rapaport (Reuters Health) - Attention deficit hyperactivity disorder (ADHD), usually diagnosed in children, may show up for the first time in adulthood, two recent studies suggest. And not only can ADHD appear for the first time after childhood, but the symptoms for adult-onset ADHD may be different from symptoms experienced by kids, the researchers found. “Although the nature of symptoms differs somewhat between children and adults, all age groups show impairments in multiple domains – school, family and friendships for kids and school, occupation, marriage and driving for adults,” said Stephen Faraone, a psychiatry researcher at SUNY Upstate Medical University in Syracuse, New York and author of an editorial accompanying the two studies in JAMA Psychiatry. Faraone cautions, however, that some newly diagnosed adults might have had undetected ADHD as children. Support from parents and teachers or high intelligence, for example, might prevent ADHD symptoms from emerging earlier in life. It’s not clear whether study participants “were completely free of psychopathology prior to adulthood,” Faraone said in an email. One of the studies, from Brazil, tracked more than 5,200 people born in 1993 until they were 18 or 19 years old. © 2016 Scientific American
By Diana Kwon More than one in 10 Americans older than 12 takes antidepressants, according to a 2011 report by the National Center for Health Statistics. A significant but unknown number of children younger than 12 take them, too. Although most such drugs are not approved for young children, doctors have prescribed them off-label for years because they have been thought to have relatively mild side effects. Yet recent reports have revealed that important data about the safety of these drugs—especially their risks for children and adolescents—have been withheld from the medical community and the public. In the latest and most comprehensive analysis, published in January in the BMJ, researchers at the Nordic Cochrane Center in Copenhagen showed that pharmaceutical companies have not been revealing the full extent of serious harm in clinical study reports, which are detailed documents sent to regulatory authorities such as the U.S. Food and Drug Administration and the European Medicines Agency (EMA) when applying for approval of a new drug. The researchers examined reports from 70 double-blind, placebo-controlled trials of two common categories of antidepressants—selective serotonin reuptake inhibitors (SSRIs) and serotonin and norepinephrine reuptake inhibitors (SNRIs)—and found that the occurrence of suicidal thoughts and aggressive behavior doubled in children and adolescents who used these drugs. The investigators discovered that some of the most revealing information was buried in appendices where individual patient outcomes are listed. For example, they found clear instances of suicidal thinking that had been passed off as “emotional lability” or “worsening depression” in the report itself. This information, however, was available for only 32 out of the 70 trials. “We found that a lot of the appendices were often only available on request to the authorities, and the authorities had never requested them,” says Tarang Sharma, a Ph.D. student at Cochrane and lead author of the study. “I'm actually kind of scared about how bad the actual situation would be if we had the complete data.” © 2016 Scientific American
By Sarah Kaplan You probably wouldn't be surprised if a scientist told you that your genes influence when you hit puberty, how tall you are, what your BMI will be and whether you're likely to develop male pattern baldness. But what if he said that the same gene could hold sway over all four things? That finding comes from a study published Monday in the journal Nature Genetics. Using data from dozens of genome-wide association studies (big scans of complete sets of DNA from many thousands of people), researchers at the New York Genome Center and the genetic analysis company 23andMe found examples of single "multitasking" genes that influence diverse and sometimes seemingly disparate traits. The scientists say that the links they uncovered could help researchers understand how certain genes work, and figure out better ways of treating some of the health problems they might control. "Most studies tend to go one disease at a time," said Joseph Pickrell, a professor at Columbia University and the New York Genome Center's lead investigator on the project. "But if we can try to make these sorts of connections between what you might think of as unrelated traits ... that gives us another angle of attack to understand the connections between these different diseases." To start, Pickrell and his team sought out genome-wide association studies (GWAS) identifying particular genetic variants associated with 42 different traits. Many had to do with diseases (for example, studies that linked certain genes to the risk of developing Alzheimer's or type 2 diabetes) and other personal health traits (body mass index, blood type, cholesterol levels).
Keyword: Genes & Behavior
Link ID: 22225 - Posted: 05.18.2016
By Emily Underwood One of the telltale signs of Alzheimer’s disease (AD) is sticky plaques of β-amyloid protein, which form around neurons and are thought by many scientists to bog down information processing and kill cells. For more than a decade, however, other researchers have fingered a second protein called tau, found inside brain cells, as a possible culprit. Now, a new imaging study of 10 people with mild AD suggests that tau deposits—not amyloid—are closely linked to symptoms such as memory loss and dementia. Although this evidence won’t itself resolve the amyloid-tau debate, the finding could spur more research into new, tau-targeting treatments and lead to better diagnostic tools, researchers say. Scientists have long used an imaging technique called positron emission tomography (PET) to visualize β-amyloid deposits marked by radioactive chemical tags in the brains of people with AD. Combined with postmortem analyses of brain tissue, these studies have demonstrated that people with AD have far more β-amyloid plaques in their brains than healthy people, at least as a general rule. But they have also revealed a puzzle: Roughly 30% of people without any signs of dementia have brains “chock-full” of β-amyloid at autopsy, says neurologist Beau Ances at Washington University in St. Louis in Missouri. That mystery has inspired many in the AD field to ask whether a second misfolded protein, tau, is the real driver of the condition’s neurodegeneration and symptoms, or at least an important accomplice. Until recently, the only ways to test that hypothesis were to measure tau in brain tissue after a person died, or in a sample of cerebrospinal fluid (CSF) extracted from a living person by needle. But in the past several years, researchers have developed PET imaging agents that can harmlessly bind to tau in the living brain. 
The more tau deposits found in the temporal lobe, a brain region associated with memory, the more likely a person was to show deficits on a battery of memory and attention tests, the team reports today in Science Translational Medicine. © 2016 American Association for the Advancement of Science.
Erika Check Hayden The largest-ever genetics study in the social sciences has turned up dozens of DNA markers that are linked to the number of years of formal education an individual completes. The work, reported this week in Nature, analysed genetic material from around 300,000 people. “This is good news,” says Stephen Hsu, a theoretical physicist at Michigan State University in East Lansing, who studies the genetics of intelligence. “It shows that if you have enough statistical power you can find genetic variants that are associated with cognitive ability.” Yet the study’s authors estimate that the 74 genetic markers they uncovered account for just 0.43% of the total genetic contribution to educational achievement (A. Okbay et al. Nature http://dx.doi.org/10.1038/nature17671; 2016). By themselves, the markers cannot predict a person’s performance at school. And because the work examined only people of European ancestry, it is unclear whether the results apply to those with roots in other regions, such as Africa or Asia. The findings have proved divisive. Some researchers hope that the work will aid studies of biology, medicine and social policy, but others say that the emphasis on genetics obscures factors that have a much larger impact on individual attainment, such as health, parenting and quality of schooling. © 2016 Nature Publishing Group
Sara Reardon As a medical student in Paris in the 1980s, Eric Vilain found himself pondering the differences between men and women. What causes them to develop differently, and what happens when the process goes awry? At the time, he was encountering babies that defied simple classification as a boy or girl. Born with disorders of sex development (DSDs), many had intermediate genitalia — an overlarge clitoris, an undersized penis or features of both sexes. Then, as now, the usual practice was to operate. And the decision of whether a child would be left with male or female genitalia was often made not on scientific evidence, says Vilain, but on practicality: an oft-repeated, if insensitive, line has it that “it's easier to dig a hole than build a pole”. Vilain found the approach disturbing. “I was fascinated and shocked by how the medical team was making decisions.” Vilain has spent the better part of his career studying the ambiguities of sex. Now a paediatrician and geneticist at the University of California, Los Angeles (UCLA), he is one of the world's foremost experts on the genetic determinants of DSDs. He has worked closely with intersex advocacy groups that campaign for recognition and better medical treatment — a movement that has recently gained momentum. And in 2011, he established a major longitudinal study to track the psychological and medical well-being of hundreds of children with DSDs. © 2016 Nature Publishing Group
By PAM BELLUCK BALTIMORE — Leave it to the youngest person in the lab to think of the Big Idea. Xuyu Qian, 23, a third-year graduate student at Johns Hopkins, was chatting in late January with Hongjun Song, a neurologist. Dr. Song was wondering how to test their three-dimensional model of a brain — well, not a brain, exactly, but an “organoid,” essentially a tiny ball of brain cells, grown from stem cells and mimicking early brain development. “We need a disease,” Dr. Song said. Mr. Qian tossed out something he’d seen in the headlines: “Why don’t we check out this Zika virus?” Within a few weeks — a nanosecond compared with typical scientific research time — that suggestion led to one of the most significant findings in efforts to answer a central question: How does the Zika virus cause brain damage, including the abnormally small heads in babies born to infected mothers? The answer could spur discoveries to prevent such devastating neurological problems. And time is of the essence. One year after the virus was first confirmed in Latin America, with the raging crisis likely to reach the United States this summer, no treatment or vaccine exists. “We can’t wait,” said Dr. Song, at the university’s Institute for Cell Engineering, where he and his wife and research partner, Dr. Guo-Li Ming, provided a pipette-and-petri-dish-level tour. “To translate our work for the clinic, to the public, normally it takes years. This is a case where we can make a difference right away.” The laboratory’s initial breakthrough, published in March with researchers at two other universities, showed that the Zika virus attacked and killed so-called neural progenitor cells, which form early in fetal development and generate neurons in the brain. © 2016 The New York Times Company
By Geraldine Dawson There’s a popular saying in the autism community: “If you’ve met one person with autism, you’ve met one person with autism.” Although this phrase is meant to convey the remarkable variation in abilities and disabilities among people with autism spectrum disorder (ASD), we’re learning that it also applies to the extraordinary variability in how ASD develops. When I first began doing research on autism decades ago, we thought of it as one condition and aimed to discover its “cause.” Now we know ASD is actually a group of lifelong conditions that can arise from a complex combination of multiple genetic and environmental factors. In the same way that each person with ASD has a unique personality and profile of talents and disabilities, each also has a distinct developmental history shaped by a specific combination of genetic and environmental factors. More evidence of this extraordinary variety will be presented this week in Baltimore, where nearly 2,000 of the world’s leading autism researchers will gather for the International Meeting for Autism Research (IMFAR). As president of the International Society for Autism Research, which sponsors the conference, I am more impressed than ever with the progress we are making. New findings being presented at the conference will highlight the importance of the prenatal period in understanding how various environmental factors such as exposure to alcohol, smoking and certain chemical compounds can increase risk for ASD. The impact of many environmental factors depends, however, on an individual’s genetic background and the timing of the exposure. Other research links inflammation—detected in blood spot tests taken at birth—with a higher likelihood of an ASD diagnosis later on. Researchers suggest that maternal infection and other factors during pregnancy may influence an infant’s immune system and contribute to risk. 
As our knowledge of these risk factors grows, so do the opportunities for promoting healthy pregnancies and better outcomes. © 2016 Scientific American
Chris Woolston A story about epigenetics in the 2 May issue of The New Yorker has been sharply criticized for inaccurately describing how genes are regulated. The article by Siddhartha Mukherjee — a physician, cancer researcher and award-winning author at Columbia University in New York — examines how environmental factors can change the activity of genes without altering the DNA sequence. Jerry Coyne, an evolutionary ecologist at the University of Chicago in Illinois, posted two widely discussed blog posts calling the piece “superficial and misleading”, largely because it ignored key aspects of gene regulation. Other researchers quoted in the blog posts called the piece “horribly damaging” and “a truly painful read”. Mukherjee responded by publishing a point-by-point rebuttal online. Speaking to Nature, he says he now realizes that he erred by omitting key areas of the science, but that he didn’t mean to mislead. “I sincerely thought that I had done it justice,” he says. Mukherjee’s article, ‘Same But Different’, takes a personal view of epigenetics — a term whose definition is highly contentious in the field. The story features his mother and aunt, identical twins who have distinct personalities. Mukherjee, who won a Pulitzer Prize in 2011 for his best-selling book The Emperor of All Maladies: A Biography of Cancer (Scribner, 2010), writes that identical twins differ because: “Chance events — injuries, infections, infatuations; the haunting trill of that particular nocturne — impinge on one twin and not on the other. Genes are turned on and off in response to these events, as epigenetic marks are gradually layered above genes, etching the genome with its own scars, calluses, and freckles.” The article is drawn from a book by Mukherjee that is due out later this month, called The Gene: An Intimate History (Scribner, 2016). © 2016 Nature Publishing Group
By DAN BARRY IDIOT. Imbecile. Cretin. Feebleminded. Moron. Retarded. Offensive now but once quite acceptable, these terms figured in the research for a lengthy article I wrote in 2014 about 32 men who spent decades eviscerating turkeys in a meat-processing plant in Iowa — all for $65 a month, along with food and lodging in an ancient former schoolhouse on a hill. These were men with intellectual disability, which meant they had significant limitations in reasoning, learning and problem solving, as well as in adaptive behavior. But even though “intellectual disability” has been the preferred term for more than a decade, it gave my editors and me pause. We wondered whether readers would instantly understand what the phrase meant. What’s more, advocates and academicians were recommending that I suppress my journalistic instinct to tighten the language. I was told that it was improper to call these men “intellectually disabled,” instead of “men with intellectual disability.” Their disability does not define them; they are human beings with a disability. This linguistic preference is part of society’s long struggle to find the proper terminology for people with intellectual disability, and reflects the discomfort the subject creates among many in the so-called non-disabled world. It speaks to a continuing sense of otherness; to perceptions of what is normal, and not. “It often doesn’t matter what the word is,” said Michael Wehmeyer, the director and senior scientist at the Beach Center on Disability at the University of Kansas. “It’s that people associate that word with what their perceptions of these people are — as broken, or as defective, or as something else.” For many years, the preferred term was, simply, idiot. When Massachusetts established a commission on idiocy in the mid-1840s, it appointed Dr. Samuel G. Howe, an abolitionist and early disability rights advocate, as its chairman. 
The commission argued for the establishment of schools to help this segment of society, but made clear that it regarded idiocy “as an outward sign of an inward malady.” © 2016 The New York Times Company
Keyword: Development of the Brain
Link ID: 22195 - Posted: 05.09.2016
By Aleszu Bajak In its May 2 issue, The New Yorker magazine published a report titled “Same But Different,” with the subhead: “How epigenetics can blur the line between nature and nurture.” The piece was written by Siddhartha Mukherjee, a physician and author of the Pulitzer Prize-winning book “The Emperor of All Maladies: A Biography of Cancer.” In his New Yorker story, Mukherjee, with deft language and colorful anecdotes, examines a topic that is very much du jour in science writing: epigenetics. Google defines epigenetics as “the study of changes in organisms caused by modification of gene expression, rather than alteration of the genetic code itself.” Merriam-Webster’s definition is similar — but not exactly the same: “The study of heritable changes in gene function that do not involve changes in DNA sequence.” The slight variation in definition is telling in itself — and it’s really that “heritable” part that has sparked intense interest not just among scientists, but in the popular mind. It’s the idea that external factors like diet, or stress or even lifestyle choices can impact not just your own genes, but the genetic information you pass down to all of your descendants. Spend your life smoking cigarettes and eating fatty foods, the thinking goes, and you’ll not just make yourself sick, you’ll predispose your offspring — and their offspring, and their offspring — to associated diseases as well. It’s heady stuff, but much of it remains speculative and poorly supported, which is where Mukherjee may have run into trouble. The publication of his story — an excerpt from his forthcoming book “The Gene: An Intimate History” — was met with swift criticism from biologists working in epigenetics and the broader field of gene regulation. They argue that Mukherjee played fast and loose with his description of epigenetic processes and misled readers by casting aside decades of research into how genes are regulated during development. Steven Henikoff, a molecular biologist at the Fred Hutchinson Cancer Research Center in Seattle, called Mukherjee’s lyrical take on epigenetics “baloney.” Copyright 2016 Undark
Link ID: 22194 - Posted: 05.09.2016