Chapter 13. Memory, Learning, and Development
By Gary Stix Scientists will never find a single gene for depression—nor two, nor 20. But among the 20,000 human genes and the hundreds of thousands of proteins and molecules that switch on those genes or regulate their activity in some way, there are clues that point to the roots of depression. Tools to identify biological pathways that are instrumental in either inducing depression or protecting against it have recently debuted—and hold the promise of providing leads for new drug therapies for psychiatric and neurological diseases. A recent paper in the journal Neuron illustrates both the dazzling complexity of this approach and the ability of these techniques to pinpoint key genes that may play a role in governing depression. Scientific American talked with the senior author on the paper—neuroscientist Eric Nestler from the Icahn School of Medicine at Mount Sinai in New York. Nestler spoke about the potential of this research to break the logjam in pharmaceutical research that has impeded development of drugs to treat brain disorders. Scientific American: The first years in the war on cancer met with a tremendous amount of frustration. Things look like they're improving somewhat now for cancer. Do you anticipate a similar trajectory may occur in neuroscience for psychiatric disorders? Eric Nestler: I do. I just think it will take longer. I was in medical school 35 years ago when the idea of identifying a person's specific pathophysiology was first put forward as a means of directing cancer treatment. We're now, three decades later, finally seeing the day when that's happening. I definitely think the same will occur for major brain disorders. The brain is just more complicated and the disorders are more complicated, so it will take longer. © 2016 Scientific American
By Anil Ananthaswamy and Alice Klein Our brain’s defence against invading microbes could cause Alzheimer’s disease – which suggests that vaccination could prevent the condition. Alzheimer’s disease has long been linked to the accumulation of sticky plaques of beta-amyloid proteins in the brain, but the function of plaques has remained unclear. “Does it play a role in the brain, or is it just garbage that accumulates?” asks Rudolph Tanzi of Harvard Medical School. Now he has shown that these plaques could be defences for trapping invading pathogens. Working with Robert Moir at the Massachusetts General Hospital in Boston, Tanzi’s team has shown that beta-amyloid can act as an anti-microbial compound, and may form part of our immune system. To test whether beta-amyloid defends us against microbes that manage to get into the brain, the team injected bacteria into the brains of mice that had been bred to develop plaques like humans do. Plaques formed straight away. “When you look in the plaques, each one had a single bacterium in it,” says Tanzi. “A single bacterium can induce an entire plaque overnight.” Double-edged sword This suggests that infections could be triggering the formation of plaques. These sticky plaques may trap and kill bacteria, viruses or other pathogens, but if they aren’t cleared away fast enough, they may lead to inflammation and tangles of another protein, called tau, causing neurons to die and driving the progression towards Alzheimer’s disease. © Copyright Reed Business Information Ltd.
Robert Plomin Scientists have investigated this question for more than a century, and the answer is clear: the differences between people on intelligence tests are substantially the result of genetic differences. But let's unpack that sentence. We are talking about average differences among people and not about individuals. Any one person's intelligence might be blown off course from its genetic potential by, for example, an illness in childhood. By genetic, we mean differences passed from one generation to the next via DNA. But we all share 99.5 percent of our three billion DNA base pairs, so only 15 million DNA differences separate us genetically. And we should note that intelligence tests include diverse examinations of cognitive ability and skills learned in school. Intelligence, more appropriately called general cognitive ability, reflects someone's performance across a broad range of varying tests. Genes make a substantial difference, but they are not the whole story. They account for about half of all differences in intelligence among people, so half is not caused by genetic differences, which provides strong support for the importance of environmental factors. This estimate of 50 percent reflects the results of twin, adoption and DNA studies. From them, we know, for example, that later in life, children adopted away from their biological parents at birth are just as similar to their biological parents as are children reared by their biological parents. Similarly, we know that adoptive parents and their adopted children do not typically resemble one another in intelligence. © 2016 Scientific American
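[Editor's illustration] The "about half" figure from twin studies is classically derived with the Falconer method, which compares identical (MZ) twins, who share essentially all segregating DNA, with fraternal (DZ) twins, who share about half. This sketch is not from the article; the correlation values are hypothetical round numbers in the ballpark of published twin studies of adult cognitive ability:

```python
def falconer_heritability(r_mz: float, r_dz: float) -> float:
    """Classic Falconer estimate of heritability: h^2 = 2 * (r_MZ - r_DZ).

    Assumes MZ twins share ~100% and DZ twins ~50% of segregating DNA,
    and that both twin types share environments to the same degree.
    """
    return 2.0 * (r_mz - r_dz)

# Hypothetical round-number correlations, not data from the article:
h2 = falconer_heritability(r_mz=0.75, r_dz=0.50)
print(h2)  # 0.5 -> consistent with genes accounting for "about half"
```

The remaining variance (1 - h^2) is then attributed to environmental factors and measurement error, which is the logic behind the article's "half is not caused by genetic differences."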
By Roland Pease BBC Radio Science Unit Researchers have invented a DNA "tape recorder" that can trace the family history of every cell in an organism. The technique is being hailed as a breakthrough in understanding how the trillions of complex cells in a body are descended from a single egg. "It has the potential to provide profound insights into how normal, diseased or damaged tissues are constructed and maintained," one UK biologist told the BBC. The work appears in the journal Science. The human body has around 40 trillion cells, each with a highly specialised function. Yet each can trace its history back to the same starting point - a fertilised egg. Developmental biology is the business of unravelling how the genetic code unfolds at each cycle of cell division, how the body plan develops, and how tissues become specialised. But much of what it has revealed has depended on inference rather than a complete cell-by-cell history. "I actually started working on this problem as a graduate student in 2000," confessed Jay Shendure, lead researcher on the new scientific paper. "Could we find a way to record these relationships between cells in some compact form we could later read out in adult organisms?" The project failed then because there was no mechanism to record events in a cell's history. That changed with recent developments in so-called CRISPR gene editing, a technique that allows researchers to make much more precise alterations to the DNA in living organisms. The molecular tape recorder developed by Prof Shendure's team at the University of Washington in Seattle, US, is a length of DNA inserted into the genome that contains a series of edit points which can be changed throughout an organism's life. © 2016 BBC.
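[Editor's illustration] The recorder's logic can be sketched with a toy model; this is my own simplified illustration of the general idea, not the team's actual construct or code. Each cell carries a "tape" of editable sites; a division may scar a random unedited site, and because scars are copied into daughters, cells sharing a scar share an ancestor:

```python
import random

random.seed(0)
TAPE_LEN = 10  # number of editable sites in the synthetic tape

def divide(tape):
    """Copy the tape into a daughter cell, possibly scarring one unedited site."""
    daughter = list(tape)
    open_sites = [i for i, s in enumerate(daughter) if s is None]
    if open_sites and random.random() < 0.5:  # assumed per-division edit rate
        daughter[random.choice(open_sites)] = random.randint(1, 9)  # scar ID
    return daughter

root = [None] * TAPE_LEN                           # the fertilised egg's blank tape
gen1 = [divide(root) for _ in range(2)]            # two daughter cells
gen2 = [divide(t) for t in gen1 for _ in range(2)] # four granddaughters

# Scars are heritable: any scar in a gen1 cell reappears in both of its
# gen2 daughters, so reading the tapes later groups cells by lineage.
```

In the real system the "scars" are CRISPR-induced edits accumulated over many divisions, and the lineage tree is reconstructed by sequencing the tapes of adult cells.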
By BENEDICT CAREY Suzanne Corkin, whose painstaking work with a famous amnesiac known as H.M. helped clarify the biology of memory and its disorders, died on Tuesday in Danvers, Mass. She was 79. Her daughter, Jocelyn Corkin, said the cause was liver cancer. Dr. Corkin met the man who would become a lifelong subject and collaborator in 1964, when she was a graduate student in Montreal at the McGill University laboratory of the neuroscientist Brenda Milner. Henry Molaison — known in published reports as H.M., to protect his privacy — was a modest, middle-aged former motor repairman who had lost the ability to form new memories after having two slivers of his brain removed to treat severe seizures when he was 27. In a series of experiments, Dr. Milner had shown that a part of the brain called the hippocampus was critical to the consolidation of long-term memories. Most scientists had previously thought that memory was not dependent on any one cortical area. Mr. Molaison lived in Hartford, and Dr. Milner had to take the train down to Boston and drive from there to Connecticut to see him. It was a long trip, and transporting him to Montreal proved to be so complicated, largely because of his condition, that Dr. Milner did it just once. Yet rigorous study of H.M., she knew, would require proximity and a devoted facility — with hospital beds — to accommodate extended experiments. The psychology department at the Massachusetts Institute of Technology offered both, and with her mentor’s help, Dr. Corkin landed a position there. Thus began a decades-long collaboration between Dr. Corkin and Mr. Molaison that would extend the work of Dr. Milner, focus intense interest on the hippocampus, and make H.M. the most famous patient in the history of modern brain science. © 2016 The New York Times Company
Keyword: Learning & Memory
Link ID: 22258 - Posted: 05.28.2016
By Jordana Cepelewicz General consensus among Alzheimer’s researchers has it that the disease’s main culprit, a protein called amyloid beta, is an unfortunate waste product that is not known to play any useful role in the body—and one that can have devastating consequences. When not properly cleared from the brain it builds up into plaques that destroy synapses, the junctions between nerve cells, resulting in cognitive decline and memory loss. The protein has thus become a major drug target in the search for a cure for Alzheimer’s. Now a team of researchers at Harvard Medical School and Massachusetts General Hospital is proposing a very different story. In a study published this week in Science Translational Medicine, neurologists Rudolph Tanzi and Robert Moir report evidence that amyloid beta serves a crucial purpose: protecting the brain from invading microbes. “The original idea goes back to 2010 or so when Rob had a few too many Coronas,” Tanzi jokes. Moir had come across surprising similarities between amyloid beta and LL37, a protein that acts as a foot soldier in the brain’s innate immune system, killing potentially harmful bugs and alerting other cells to their presence. “These types of proteins, although small, are very sophisticated in what they do,” Moir says. “And they’re very ancient, going back to the dawn of multicellular life.” © 2016 Scientific American
By GINA KOLATA Could it be that Alzheimer’s disease stems from the toxic remnants of the brain’s attempt to fight off infection? Provocative new research by a team of investigators at Harvard leads to this startling hypothesis, which could explain the origins of plaque, the mysterious hard little balls that pockmark the brains of people with Alzheimer’s. It is still early days, but Alzheimer’s experts not associated with the work are captivated by the idea that infections, including ones that are too mild to elicit symptoms, may produce a fierce reaction that leaves debris in the brain, causing Alzheimer’s. The idea is surprising, but it makes sense, and the Harvard group’s data, published Wednesday in the journal Science Translational Medicine, supports it. If it holds up, the hypothesis has major implications for preventing and treating this degenerative brain disease. The Harvard researchers report a scenario seemingly out of science fiction. A virus, fungus or bacterium gets into the brain, passing through a membrane — the blood-brain barrier — that becomes leaky as people age. The brain’s defense system rushes in to stop the invader by making a sticky cage out of proteins, called beta amyloid. The microbe, like a fly in a spider web, becomes trapped in the cage and dies. What is left behind is the cage — a plaque that is the hallmark of Alzheimer’s. So far, the group has confirmed this hypothesis in neurons growing in petri dishes as well as in yeast, roundworms, fruit flies and mice. There is much more work to be done to determine if a similar sequence happens in humans, but plans — and funding — are in place to start those studies, involving a multicenter project that will examine human brains. “It’s interesting and provocative,” said Dr. Michael W. 
Weiner, a radiology professor at the University of California, San Francisco, and a principal investigator of the Alzheimer’s Disease Neuroimaging Initiative, a large national effort to track the progression of the disease and look for biomarkers like blood proteins and brain imaging to signal the disease’s presence. © 2016 The New York Times Company
Sara Reardon Children from impoverished families are more prone to mental illness, and alterations in DNA structure could be to blame, according to a study published on 24 May in Molecular Psychiatry1. Poverty brings with it a number of different stressors, such as poor nutrition, increased prevalence of smoking and the general struggle of trying to get by. All of these can affect a child’s development, particularly in the brain, where the structure of areas involved in stress response and decision-making has been linked to low socioeconomic status. Poor children are more prone to mental illnesses such as depression than their peers from wealthier families, but they are also more likely to have cognitive problems. Some of these differences are clearly visible in brain structure and seem to appear at birth, which suggests that prenatal exposure to these stressors can be involved2. But neurodevelopment does not stop at birth. Neuroscientist Ahmad Hariri of Duke University in Durham, North Carolina, suspected that continual exposure to stressors might affect older children as well. He decided to test this idea by studying chemical tags known as methyl groups, which alter DNA structure to regulate how genes are expressed. There is some evidence that methylation patterns can be passed down through generations, but they are also altered by environmental factors, such as smoking. © 2016 Nature Publishing Group
by Bruce Bower For a landmark 1977 paper, psychologist Andrew Meltzoff stuck his tongue out at 2- to 3-week-old babies. Someone had to do it. After watching Meltzoff razz them for 15 seconds, babies often stuck out their own tongues within the next 2½ minutes. Newborns also tended to respond in kind when the young researcher opened his mouth wide, pushed out his lips like a duck and opened and closed the fingers of one hand. Meltzoff, now at the University of Washington in Seattle, and a colleague were the first to report that babies copy adults’ simple physical deeds within weeks of birth. Until then, most scientists assumed that imitation began at around 9 months of age. Newborns don’t care that imitation is the sincerest form of flattery. For them, it may be a key to interacting with (and figuring out) those large, smiley people who come to be known as mommy and daddy. And that’s job number one for tykes hoping to learn how to talk and hang out with a circle of friends. Meltzoff suspected that babies enter the world able to compare their own movements — even those they can feel but not see, such as a projecting tongue — to corresponding adult actions. Meltzoff’s report has inspired dozens of papers on infant imitation. Some have supported his results, some haven’t. A new report, published May 5 in Current Biology, falls in the latter group. The study of 106 Australian babies tracked from 1 to 9 weeks of age concludes that infants don’t imitate anyone. © Society for Science & the Public 2000 - 2016
Keyword: Development of the Brain
Link ID: 22246 - Posted: 05.25.2016
By Lisa Rapaport (Reuters Health) - Attention deficit hyperactivity disorder (ADHD), usually diagnosed in children, may show up for the first time in adulthood, two recent studies suggest. And not only can ADHD appear for the first time after childhood, but the symptoms for adult-onset ADHD may be different from symptoms experienced by kids, the researchers found. “Although the nature of symptoms differs somewhat between children and adults, all age groups show impairments in multiple domains – school, family and friendships for kids and school, occupation, marriage and driving for adults,” said Stephen Faraone, a psychiatry researcher at SUNY Upstate Medical University in Syracuse, New York, and author of an editorial accompanying the two studies in JAMA Psychiatry. Faraone cautions, however, that some newly diagnosed adults might have had undetected ADHD as children. Support from parents and teachers or high intelligence, for example, might prevent ADHD symptoms from emerging earlier in life. It’s not clear whether study participants “were completely free of psychopathology prior to adulthood,” Faraone said in an email. One of the studies, from Brazil, tracked more than 5,200 people born in 1993 until they were 18 or 19 years old. © 2016 Scientific American
By Diana Kwon More than one in 10 Americans older than 12 takes antidepressants, according to a 2011 report by the National Center for Health Statistics. A significant but unknown number of children younger than 12 take them, too. Although most such drugs are not approved for young children, doctors have prescribed them off-label for years because they have been thought to have relatively mild side effects. Yet recent reports have revealed that important data about the safety of these drugs—especially their risks for children and adolescents—have been withheld from the medical community and the public. In the latest and most comprehensive analysis, published in January in the BMJ, researchers at the Nordic Cochrane Center in Copenhagen showed that pharmaceutical companies have not been revealing the full extent of serious harm in clinical study reports, which are detailed documents sent to regulatory authorities such as the U.S. Food and Drug Administration and the European Medicines Agency (EMA) when applying for approval of a new drug. The researchers examined reports from 70 double-blind, placebo-controlled trials of two common categories of antidepressants—selective serotonin reuptake inhibitors (SSRIs) and serotonin and norepinephrine reuptake inhibitors (SNRIs)—and found that the occurrence of suicidal thoughts and aggressive behavior doubled in children and adolescents who used these drugs. The investigators discovered that some of the most revealing information was buried in appendices where individual patient outcomes are listed. For example, they found clear instances of suicidal thinking that had been passed off as “emotional lability” or “worsening depression” in the report itself. This information, however, was available for only 32 out of the 70 trials. “We found that a lot of the appendices were often only available on request to the authorities, and the authorities had never requested them,” says Tarang Sharma, a Ph.D. 
student at Cochrane and lead author of the study. “I'm actually kind of scared about how bad the actual situation would be if we had the complete data.” © 2016 Scientific American
Laura Sanders In mice, a long course of antibiotics that wiped out gut bacteria slowed the birth of new brain cells and impaired memory, scientists write May 19 in Cell Reports. The results reinforce evidence for a powerful connection between bacteria in the gut and the brain (SN: 4/2/16, p. 23). After seven weeks of drinking water spiked with a cocktail of antibiotics, mice had fewer newborn nerve cells in a part of the hippocampus, a brain structure important for memory. The mice’s ability to remember previously seen objects also suffered. Further experiments revealed one way bacteria can influence brain cell growth and memory. Injections of immune cells called Ly6Chi monocytes boosted the number of new nerve cells. The monocytes appear to carry messages from gut to brain, Susanne Wolf of the Max Delbrück Center for Molecular Medicine in Berlin and colleagues found. Exercise and probiotic treatment with eight types of live bacteria also increased the number of newborn nerve cells and improved memory in mice treated with antibiotics. The results help clarify the toll of prolonged antibiotic treatment, and hint at ways to fight back, the authors write. L. Möhle et al. Ly6Chi monocytes provide a link between antibiotic-induced changes in gut microbiota and adult hippocampal neurogenesis. Cell Reports. Vol. 15, May 31, 2016. doi: 10.1016/j.celrep.2016.04.074. © Society for Science & the Public 2000 - 2016
By Sarah Kaplan You probably wouldn't be surprised if a scientist told you that your genes influence when you hit puberty, how tall you are, what your BMI will be and whether you're likely to develop male pattern baldness. But what if he said that the same gene could hold sway over all four things? That finding comes from a study published Monday in the journal Nature Genetics. Using data from dozens of genome-wide association studies (big scans of complete sets of DNA from many thousands of people), researchers at the New York Genome Center and the genetic analysis company 23andMe found examples of single "multitasking" genes that influence diverse and sometimes seemingly disparate traits. The scientists say that the links they uncovered could help researchers understand how certain genes work, and figure out better ways of treating some of the health problems they might control. "Most studies tend to go one disease at a time," said Joseph Pickrell, a professor at Columbia University and the New York Genome Center's lead investigator on the project. "But if we can try to make these sorts of connections between what you might think of as unrelated traits ... that gives us another angle of attack to understand the connections between these different diseases." To start, Pickrell and his team sought out genome-wide association studies (GWAS) identifying particular genetic variants associated with 42 different traits. Many had to do with diseases (for example, studies that linked certain genes to the risk of developing Alzheimer's or type 2 diabetes) and other personal health traits (body mass index, blood type, cholesterol levels).
Keyword: Genes & Behavior
Link ID: 22225 - Posted: 05.18.2016
By JONATHAN BALCOMBE Washington — In March, two marine biologists published a study of giant manta rays responding to their reflections in a large mirror installed in their aquarium in the Bahamas. The two captive rays circled in front of the mirror, blew bubbles and performed unusual body movements as if checking their reflection. They made no obvious attempt to interact socially with their reflections, suggesting that they did not mistake what they saw as other rays. The scientists concluded that the mantas seemed to be recognizing their reflections as themselves. Mirror self-recognition is a big deal. It indicates self-awareness, a mental attribute previously known only among creatures of noted intelligence like great apes, dolphins, elephants and magpies. We don’t usually think of fishes as smart, let alone self-aware. As a biologist who specializes in animal behavior and emotions, I’ve spent the past four years exploring the science on the inner lives of fishes. What I’ve uncovered indicates that we grossly underestimate these fabulously diverse marine vertebrates. The accumulating evidence leads to an inescapable conclusion: Fishes think and feel. Because fishes inhabit vast, obscure habitats, science has only begun to explore below the surface of their private lives. They are not instinct-driven or machinelike. Their minds respond flexibly to different situations. They are not just things; they are sentient beings with lives that matter to them. A fish has a biography, not just a biology. Those giant manta rays have the largest brains of any fish, and their relative brain-to-body size is comparable to that of some mammals. So, an exception? Then you haven’t met the frillfin goby. © 2016 The New York Times Company
By Julia Shaw You see a crime take place. You are interviewed about it. You give a statement about what you saw. Do you think that at a later date you would be able to detect whether someone had tampered with your statement? Or re-written parts of it? This is currently a hot topic in the UK, where a very recently published inquiry into the so-called Hillsborough disaster, in which 96 people were crushed to death during a soccer match in 1989, found that testimonies had been deliberately altered by police. Research published earlier this year by the false memory dream team at the University of California looked directly into the implications of such police (mis)conduct. They found that it is possible that changed statements can go unnoticed by the person who gave the original testimony, and may even develop into a false memory that accommodates the false account. To describe this effect, the researchers came up with the term "memory blindness"—the phenomenon of failing to recognize our own memories. The term was intended to mirror the ‘choice blindness’ literature. Choice blindness is forgetting choices that we have made. The researchers wanted to know “Can choice blindness have lasting effects on eyewitness memory?” To examine this, PhD student Kevin Cochran and his colleagues conducted two experiments. © 2016 Scientific American
Keyword: Learning & Memory
Link ID: 22218 - Posted: 05.16.2016
Bret Stetka Last year, in an operating room at the University of Toronto, a 63-year-old woman with Alzheimer's disease experienced something she hadn't for 55 years: a memory of her 8-year-old self playing with her siblings on their family farm in Scotland. The woman is a patient of Dr. Andres Lozano, a neurosurgeon who is among a growing number of researchers studying the potential of deep brain stimulation to treat Alzheimer's and other forms of dementia. If the approach pans out it could provide options for patients with fading cognition and retrieve vanished memories. Right now, deep brain stimulation is used primarily to treat Parkinson's disease and tremor, for which it's approved by the Food and Drug Administration. DBS involves delivering electrical impulses to specific areas of the brain through implanted electrodes. The technique is also approved for obsessive-compulsive disorder and is being looked at for a number of other brain disorders, including depression, chronic pain and, as in Lozano's work, dementia. In 2008 Lozano's group published a study in which an obese patient was treated with deep brain stimulation of the hypothalamus. Though no bigger than a pea, the hypothalamus is a crucial bit of brain involved in appetite regulation and other bodily essentials such as temperature control, sleep and circadian rhythms. It seemed like a reasonable target in trying to suppress excessive hunger. To the researchers' surprise, following stimulation the patient reported a sensation of deja vu. He also perceived feeling 20 years younger and recalled a memory of being in a park with friends, including an old girlfriend. With increasing voltages his memories became more vivid. He remembered their clothes. © 2016 npr
Keyword: Learning & Memory
Link ID: 22213 - Posted: 05.14.2016
Laura Sanders Brain waves during REM sleep solidify memories in mice, scientists report in the May 13 Science. Scientists suspected that the eye-twitchy, dream-packed slumber known as rapid eye movement sleep was important for memory. But REM sleep’s influence on memory has been hard to study, in part because scientists often resorted to waking people or animals up — a stressful experience that might influence memory in different ways. Richard Boyce of McGill University in Montreal and colleagues interrupted REM sleep in mice in a more delicate way. Using a technique called optogenetics, the researchers blocked a brain oscillation called theta waves in the hippocampus, a brain structure involved in memory, during REM sleep. This light touch meant that the mice stayed asleep but had fewer REM-related theta waves in their hippocampi. Usually, post-learning sleep helps strengthen memories. But mice with disturbed REM sleep had memory trouble, the researchers found. Curious mice will spend more time checking out an object that’s been moved to a new spot than an unmoved object. But after the sleep treatment, the mice seemed to not remember objects’ earlier positions, spending equal time exploring an unmoved object as one in a new place. These mice also showed fewer signs of fear in a place where they had previously suffered shocks. Interfering with theta waves during other stages of sleep didn’t seem to cause memory trouble, suggesting that something special happens during REM sleep. R. Boyce et al. Causal evidence for the role of REM sleep theta rhythm in contextual memory consolidation. Science. Vol. 352, p. 812, May 13, 2016. doi: 10.1126/science.aad5252. © Society for Science & the Public 2000 - 2016.
By Emily Underwood One of the telltale signs of Alzheimer’s disease (AD) is sticky plaques of β-amyloid protein, which form around neurons and are thought by a large number of scientists to bog down information processing and kill cells. For more than a decade, however, other researchers have fingered a second protein called tau, found inside brain cells, as a possible culprit. Now, a new imaging study of 10 people with mild AD suggests that tau deposits—not amyloid—are closely linked to symptoms such as memory loss and dementia. Although this evidence won’t itself resolve the amyloid-tau debate, the finding could spur more research into new, tau-targeting treatments and lead to better diagnostic tools, researchers say. Scientists have long used an imaging technique called positron emission tomography (PET) to visualize β-amyloid deposits marked by radioactive chemical tags in the brains of people with AD. Combined with postmortem analyses of brain tissue, these studies have demonstrated that people with AD have far more β-amyloid plaques in their brains than healthy people, at least as a general rule. But they have also revealed a puzzle: Roughly 30% of people without any signs of dementia have brains “chock-full” of β-amyloid at autopsy, says neurologist Beau Ances at Washington University in St. Louis in Missouri. That mystery has inspired many in the AD field to ask whether a second misfolded protein, tau, is the real driver of the condition’s neurodegeneration and symptoms, or at least an important accomplice. Until recently, the only ways to test that hypothesis were to measure tau in brain tissue after a person died, or in a sample of cerebrospinal fluid (CSF) extracted from a living person by needle. But in the past several years, researchers have developed PET imaging agents that can harmlessly bind to tau in the living brain.
The more tau deposits found in the temporal lobe, a brain region associated with memory, the more likely a person was to show deficits on a battery of memory and attention tests, the team reports today in Science Translational Medicine. © 2016 American Association for the Advancement of Science.
Erika Check Hayden The largest-ever genetics study in the social sciences has turned up dozens of DNA markers that are linked to the number of years of formal education an individual completes. The work, reported this week in Nature, analysed genetic material from around 300,000 people. “This is good news,” says Stephen Hsu, a theoretical physicist at Michigan State University in East Lansing, who studies the genetics of intelligence. “It shows that if you have enough statistical power you can find genetic variants that are associated with cognitive ability.” Yet the study’s authors estimate that the 74 genetic markers they uncovered account for just 0.43% of the total genetic contribution to educational achievement (A. Okbay et al. Nature http://dx.doi.org/10.1038/nature17671; 2016). By themselves, the markers cannot predict a person’s performance at school. And because the work examined only people of European ancestry, it is unclear whether the results apply to those with roots in other regions, such as Africa or Asia. The findings have proved divisive. Some researchers hope that the work will aid studies of biology, medicine and social policy, but others say that the emphasis on genetics obscures factors that have a much larger impact on individual attainment, such as health, parenting and quality of schooling. © 2016 Nature Publishing Group
Sara Reardon As a medical student in Paris in the 1980s, Eric Vilain found himself pondering the differences between men and women. What causes them to develop differently, and what happens when the process goes awry? At the time, he was encountering babies that defied simple classification as a boy or girl. Born with disorders of sex development (DSDs), many had intermediate genitalia — an overlarge clitoris, an undersized penis or features of both sexes. Then, as now, the usual practice was to operate. And the decision of whether a child would be left with male or female genitalia was often made not on scientific evidence, says Vilain, but on practicality: an oft-repeated, if insensitive, line has it that “it's easier to dig a hole than build a pole”. Vilain found the approach disturbing. “I was fascinated and shocked by how the medical team was making decisions.” Vilain has spent the better part of his career studying the ambiguities of sex. Now a paediatrician and geneticist at the University of California, Los Angeles (UCLA), he is one of the world's foremost experts on the genetic determinants of DSDs. He has worked closely with intersex advocacy groups that campaign for recognition and better medical treatment — a movement that has recently gained momentum. And in 2011, he established a major longitudinal study to track the psychological and medical well-being of hundreds of children with DSDs. © 2016 Nature Publishing Group