Chapter 7. Life-Span Development of the Brain and Behavior
By Candy Schulman My mother’s greatest fear was Alzheimer’s. She got Lewy body dementia, or LBD, instead. This little-known, oddly named, debilitating illness afflicts an estimated 1.3 million Americans, the actor and comedian Robin Williams possibly among them. It is often misdiagnosed because its signs, such as hallucinations and body rigidity, do not seem like those of dementia, but in the end it robs people of themselves even more painfully. I first noticed my mother’s cognitive difficulties when she was 88. Until then, she’d led an extraordinarily active life: She was a competitive golfer with a bureau full of trophies, a painter and a sculptor. Every Hanukkah she hosted a lively feast for her eight grandchildren and nine great-grandchildren. This time, though, she needed my help planning, shopping and cooking. She was having difficulty with the guest list, trying to write every family member’s name on a piece of paper and adding up the numbers to see how many potatoes to buy for latkes. Her concentration became frayed, and she kept ripping up the list and starting again, close to tears. Several months before that, she had sent me a Mother’s Day card decorated with childlike prose, colorful illustrations and glitter hearts. The poem on the cover was printed in a playful purple font: “For you, Mom. For kissing my boo-boos, for wiping my face. . . . For calming my fears with your loving embrace.” On Mother’s Day and the rest of the year, Mom added in a shaky script, “thanks.”
By Emilie Reas If you carried a gene that doubled your likelihood of getting Alzheimer's disease, would you want to know? What if there were a simple lifestyle change that virtually abolished that elevated risk? People with a gene known as APOE e4 have a higher risk of cognitive impairment and dementia in old age. Even before behavioral symptoms appear, their brains show reduced metabolism, altered activity and more deterioration than those without the high-risk gene. Yet accumulating research is showing that carrying this gene is not necessarily a sentence for memory loss and confusion—if you know how to work it to your advantage with exercise. Scientists have long known that exercise can help stave off cognitive decline. Over the past decade evidence has mounted suggesting that this benefit is even greater for those at higher genetic risk for Alzheimer's. For example, two studies by a team in Finland and Sweden found that exercising at least twice a week in midlife lowers one's chance of getting dementia more than 20 years later, and this protective effect is stronger in people with the APOE e4 gene. Several others reported that frequent exercise—at least three times a week in some studies; up to more than an hour a day in others—can slow cognitive decline only in those carrying the high-risk gene. Furthermore, for those who carry the gene, being sedentary is associated with increased brain accumulation of the toxic protein beta-amyloid, a hallmark of Alzheimer's. More recent studies, including a 2012 paper published in Alzheimer's & Dementia and a 2011 paper in NeuroImage, found that high-risk individuals who exercise have greater brain activity and glucose uptake during a memory task compared with their less active counterparts or with those at low genetic risk. © 2014 Scientific American
By Nicholas Bakalar Poor sleep in older adults may be linked to brain changes associated with dementia, a new study has found. Researchers studied 167 men who underwent sleep tests in 1999 and died by 2010. The study, in Neurology, recorded sleep duration, periods of waking up and episodes of apnea, and used pulse oximetry to measure oxygen saturation of their blood. On autopsy, they found that those in the highest one-quarter for time spent asleep with oxygen saturation below 95 percent were almost four times as likely to have higher levels of microinfarcts, small areas of dead tissue caused by deprivation of blood supply, as those in the lowest one-quarter. Compared with those in the lowest 25 percent for duration of slow-wave (deep) sleep, those in the highest one-quarter were about a third as likely to have moderate or high levels of generalized brain atrophy. “Prior studies have shown an association between certain types of sleep disturbance and dementia,” said the lead author, Dr. Rebecca P. Gelber, an epidemiologist with the Veterans Administration in Hawaii. “These lesions may help explain the association.” © 2014 The New York Times Company
By Gail Sullivan Chemicals found in food and common household products have been linked to lower IQ in kids exposed to high levels during pregnancy. Previous research linked higher exposure to chemicals called "phthalates" to poor mental and motor development in preschoolers. This study was said to be the first to report a link between prenatal exposure to the chemicals and childhood IQ. Researchers from Columbia University’s Mailman School of Public Health studied exposure to five types of phthalates, which are sometimes referred to as “hormone disruptors” or “endocrine disruptors.” Among these, di-n-butyl phthalate (DnBP) is used in shower curtains, raincoats, hairspray, food wraps, vinyl and pill coating, among other things — but according to the EPA, the largest source of exposure may be seafood. Di-isobutyl phthalate (DiBP) and butylbenzyl phthalate (BBzP) are added to plastics to make them flexible. These chemicals may also be used in makeup, nail polish, lacquer and explosives. The researchers linked higher prenatal exposure to phthalates to a more than six-point drop in IQ score compared with children with less exposure. The study, “Persistent Associations between Maternal Prenatal Exposure to Phthalates on Child IQ at Age 7 Years,” was published Wednesday in the journal PLOS ONE. "The magnitude of these IQ differences is troubling," one of the study’s authors, Robin Whyatt, said in a press release. "A six- or seven-point decline in IQ may have substantial consequences for academic achievement and occupational potential."
By Kelly Servick Anesthesiologists and surgeons who operate on children have been dogged by a growing fear—that being under anesthesia can permanently damage the developing brain. Although the few studies of children knocked out for surgeries have been inconclusive, evidence of impaired development in nematodes, zebrafish, rats, guinea pigs, pigs, and monkeys given common anesthetics has piled up in recent years. Now, the alarm is reaching a tipping point. “Anything that goes from [the roundworm] C. elegans to nonhuman primates, I've got to worry about,” Maria Freire, co-chair of the U.S. Food and Drug Administration (FDA) science advisory board, told attendees at a meeting the agency convened here last month to discuss the issue. The gathering came as anesthesia researchers and regulators consider several moves to address the concerns: a clinical trial of anesthetics in children, a consensus statement about their possible risks, and an FDA warning label on certain drugs. But each step stirs debate. Many involved in the issue are reluctant to make recommendations to parents and physicians based on animal data alone. At the same time, more direct studies of anesthesia's risks in children are plagued by confounding factors, lack of funding, and ethical issues. “We have to generate—very quickly—an action item, because I don't think the status quo is acceptable,” Freire said at the 19 November meeting. “Generating an action item without having the data is where things become very, very tricky.” © 2014 American Association for the Advancement of Science
By Andy Coghlan How does this make you feel? Simply asking people to think about emotion-laden actions as their brains are scanned could become one of the first evidence-based tests for psychiatric illness. Assessing people in this way would be a step towards a more scientific approach to diagnosis, away from one based on how someone behaves or how they describe their symptoms. The US National Institute of Mental Health has had such a goal in mind since 2013. Marcel Just of Carnegie Mellon University in Pittsburgh, Pennsylvania, and his colleagues developed the brain-scanning technique and used it to identify people with autism. "This gives us a whole new perspective to understanding psychiatric illnesses and disorders," says Just. "We've discovered a biological thought-marker for autism." The technique builds on work by the group showing that specific thoughts and emotions are represented in the brain by certain patterns of neural activation. The idea is that deviations from these patterns, what Just refers to as thought-markers, can be used to diagnose different psychiatric conditions. The team asked a group of adults to imagine 16 actions, some of which required emotional involvement, such as "hugging", "persuading" or "adoring", while they lay in an fMRI scanner. © Copyright Reed Business Information Ltd.
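(For readers curious about the mechanics, here is a minimal sketch of the pattern-classification idea behind such "thought-markers": each participant is reduced to a vector of activation features recorded while imagining the 16 actions, and a classifier is trained to separate diagnostic groups. Everything below, from the Gaussian naive Bayes choice to the synthetic data, is an illustrative assumption rather than the researchers' actual pipeline.)

```python
# Illustrative sketch only: synthetic data standing in for fMRI features.
# Each participant is a vector of activation features recorded while
# imagining 16 actions; a classifier learns which patterns separate
# the two groups.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
n_per_group = 17        # participants per group (arbitrary)
n_features = 16 * 5     # 16 imagined actions x 5 features each (assumed)

# Synthetic activation vectors; the "case" group gets a small systematic
# shift on the features tied to one emotion-laden action (e.g. "hugging").
controls = rng.normal(0.0, 1.0, size=(n_per_group, n_features))
cases = rng.normal(0.0, 1.0, size=(n_per_group, n_features))
cases[:, :5] += 0.8     # the deviation that acts as the "thought-marker"

X = np.vstack([controls, cases])
y = np.array([0] * n_per_group + [1] * n_per_group)

# Leave-one-participant-out cross-validation, common with small samples.
correct = 0
for train, test in LeaveOneOut().split(X):
    model = GaussianNB().fit(X[train], y[train])
    correct += int(model.predict(X[test])[0] == y[test][0])

print(f"leave-one-out accuracy: {correct / len(y):.2f}")
```

The point of the sketch is the shape of the analysis (features per imagined action, one model fit per held-out participant), not the numbers, which are invented.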
By Aviva Rutkin There is only one real rule to conversing with a baby: talking is better than not talking. But that one rule can make a lifetime of difference. That's the message that the US state of Georgia hopes to send with Talk With Me Baby, a public health programme devoted to the art of baby talk. Starting in January, nurses will be trained in the best way to speak to babies to help them learn language, based on what the latest neuroscience says. Then they, along with teachers and nutritionists, will model this good behaviour for the parents they meet. Georgia hopes to expose every child born in 2015 in the Atlanta area to this speaking style; by 2018, the hope is to reach all 130,000 or so newborns across the state. Talk With Me Baby is the latest and largest attempt to provide "language nutrition" to infants in the US – a rich quantity and variety of words supplied at a critical time in the brain's development. Similar initiatives have popped up in Providence, Rhode Island, where children have been wearing high-tech vests that track every word they hear, and Hollywood, where the Clinton Foundation has encouraged television shows like Parenthood and Orange Is the New Black to feature scenes demonstrating good baby talk. "The idea is that language is as important to the brain as food is to physical growth," says Arianne Weldon, director of Get Georgia Reading, one of several partner organisations involved in Talk With Me Baby. © Copyright Reed Business Information Ltd.
By Amy Ellis Nutt It's long been thought kids are more at risk of nearsightedness, or myopia, if they spend hours and hours in front of computer screens or fiddling with tiny hand-held electronic devices. Not true, say scientists. But now there is research that suggests that children who are genetically predisposed to the visual deficit can improve their chances of avoiding eyeglasses just by stepping outside. Yep, sunshine is all they need (more specifically, the natural light of outdoors), and 14 hours a week of outdoor light should do it. Why this is the case is not exactly clear. "We don't really know what makes outdoor time so special," said Donald Mutti, the lead researcher of the study from Ohio State University College of Optometry, in a press release. "If we knew, we could change how we approach myopia." What is known is that UVB light (invisible ultraviolet B rays) plays a role in the cellular production of vitamin D, which is believed to help the eyes focus light on the retina. However, the Ohio State researchers think there is another possibility. "Between the ages of five and nine, a child's eye is still growing," said Mutti. "Sometimes this growth causes the distance between the lens and the retina to lengthen, leading to nearsightedness. We think these different types of outdoor light may help preserve the proper shape and length of the eye during that growth period."
By Amy Ellis Nutt In a novel use of video game playing, researchers at Ohio State have found that a Pac-Man-like game, when played repetitively, can improve vision in both children and adults who have "lazy eye" or poor depth perception. In the Pac-Man-style game, players wear red-green 3-D glasses that filter images to the right and left eyes. The lazy or weak eye sees two discs containing vertical, horizontal or diagonal lines superimposed on a background of horizontal lines. The dominant eye sees a screen of only horizontal lines. The player controls the larger, Pac-Man-like disc and chases the smaller one. In another game, the player must match discs with rows based on the orientation of their lines. Teng Leng Ooi, professor of optometry at Ohio State University, presented her research findings at last week's annual meeting of the Society for Neuroscience. Only a handful of test subjects were involved in the experimental training, but all saw weak-eye improvement to 20/20 vision or better, lasting at least eight months. Lazy eye, or amblyopia, affects between 2 and 3 percent of the U.S. population. The disorder usually occurs in infancy when the neural pathway between the brain and one eye (or sometimes both) fails to fully develop. Often the cause of lazy eye is strabismus, in which the eyes are misaligned or "crossed." To prevent double vision, the brain simply blocks the fuzzy images from one eye, thereby causing incomplete visual development. The result: lazy eye.
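(To make the dichoptic setup concrete, here is a small sketch of the red-green rendering trick the article describes: anaglyph glasses let each eye see essentially one color channel, so content drawn in a single channel reaches only the eye behind that filter. The dimensions, line spacing and channel assignment below are invented for illustration; this is not the study's software.)

```python
# Illustrative sketch only: red-green (anaglyph) rendering that sends
# different content to each eye. The weak eye, behind the red filter,
# sees a disc of vertical lines; the dominant eye, behind the green
# filter, sees only the horizontal background lines.
import numpy as np

H, W = 240, 320
frame = np.zeros((H, W, 3), dtype=np.uint8)   # RGB frame

# Horizontal background lines in BOTH channels, so both eyes see them.
frame[::8, :, 0] = 255    # red channel
frame[::8, :, 1] = 255    # green channel

# A target disc of vertical lines drawn ONLY in the red channel.
yy, xx = np.ogrid[:H, :W]
disc = (yy - 120) ** 2 + (xx - 160) ** 2 <= 40 ** 2
red = frame[..., 0]                 # view into the red channel
red[disc] = 0                       # clear the background inside the disc
red[disc & (xx % 8 == 0)] = 255     # vertical grating for the weak eye only

# Display with any image library, e.g.:
# from PIL import Image; Image.fromarray(frame).show()
```

The design point is that the task-relevant target lives entirely in the weak eye's channel, forcing that eye to do the work while the shared background keeps the two eyes' images fused.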
By Anna North The idea that poverty can change the brain has gotten significant attention recently, and not just from those lay readers (a minority, according to recent research) who spend a lot of time thinking about neuroscience. Policy makers and others have begun to apply neuroscientific principles to their thinking about poverty — and some say this could end up harming poor people rather than helping. At The Conversation, the sociologist Susan Sered takes issue with “news reports with headlines like this one: ‘Can Brain Science Help Lift People Out Of Poverty?’” She’s referring to a June story by Rachel Zimmerman at WBUR, about a nonprofit called Crittenton Women’s Union that aims to use neuroscience to help get people out of poverty. Elisabeth Babcock, Crittenton’s chief executive, tells Ms. Zimmerman: “What the new brain science says is that the stresses created by living in poverty often work against us, make it harder for our brains to find the best solutions to our problems. This is a part of the reason why poverty is so ‘sticky.’” And, she adds: “If we’ve been raised in poverty under all this stress, our executive functioning wiring, the actual neurology of our brains, is built differently than if we’re not raised in poverty. It is built to react quickly to danger and threats and not built as much to plan or execute strategies for how we want things to be in the future because the future is so uncertain and planning is so pointless that this wiring isn’t as called for.” Dr. Sered, however, says that applying neuroscience to problems like poverty can sometimes lead to trouble: “Studies showing that trauma and poverty change people’s brains can too easily be read as scientific proof that poor people (albeit through no fault of their own) have inferior brains or that women who have been raped are now brain-damaged.” © 2014 The New York Times Company
More than 40 percent of infants in a group who died of sudden infant death syndrome (SIDS) were found to have an abnormality in a key part of the brain, researchers report. The abnormality affects the hippocampus, a brain area that influences such functions as breathing, heart rate, and body temperature via its neurological connections to the brainstem. According to the researchers, supported by the National Institutes of Health, the abnormality was present more often in infants who died of SIDS than in infants whose deaths could be attributed to known causes. The researchers believe the abnormality may destabilize the brain’s control of breathing and heart rate patterns during sleep, or during the periodic brief arousals from sleep that occur throughout the night. “The new finding adds to a growing body of evidence that brain abnormalities may underlie many cases of sudden infant death syndrome,” said Marian Willinger, Ph.D., special assistant for SIDS at NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development, which funded the study. “The hope is that research efforts in this area eventually will provide the means to identify vulnerable infants so that we’ll be able to reduce their risk for SIDS.” SIDS is the sudden death of an infant younger than 1 year of age that is still unexplained after a complete post-mortem investigation by a coroner or medical examiner. This investigation includes an autopsy, a review of the death scene, and a review of family and medical histories. In the United States, SIDS is the leading cause of death between one month and one year of age. The deaths are associated with an infant’s sleep period.
By Michelle Roberts Health editor, BBC News online The brain has a weak spot for Alzheimer's disease and schizophrenia, according to UK scientists who have pinpointed the region using scans. The brain area involved develops late in adolescence and degenerates early during ageing. At the moment, it is difficult for doctors to predict which people might develop either condition. The findings, in the journal PNAS, hint at a potential way to diagnose those at risk earlier, experts say, although they caution that "much more research is needed into how to bring these exciting discoveries into the clinic". The Medical Research Council team who carried out the study did MRI brain scans on 484 healthy volunteers aged between eight and 85 years. The researchers, led by Dr Gwenaëlle Douaud of Oxford University, looked at how the brain naturally changes as people age. The images revealed a common pattern - the parts of the brain that were the last to develop were also the first to show signs of age-related decline. These brain regions - a network of nerve cells, or grey matter - co-ordinate "high order" information coming from the different senses, such as sight and sound. When the researchers looked at scans of patients with Alzheimer's disease and scans of patients with schizophrenia, they found the same brain regions were affected. The findings fit with what other experts have suspected - that although distinct, Alzheimer's and schizophrenia are linked. Prof Hugh Perry of the MRC said: "Early doctors called schizophrenia 'premature dementia' but until now we had no clear evidence that the same parts of the brain might be associated with two such different diseases. This large-scale and detailed study provides an important, and previously missing, link between development, ageing and disease processes in the brain." BBC © 2014
By Chelsea Rice On December 14, 2012, Adam Lanza shot and killed 20 children and six staff members at Sandy Hook Elementary School in Newtown, Connecticut, before turning the gun on himself. Ever since, America has been wondering: Why? Today, after investigating and detailing every available record of the 20-year-old Lanza’s life since birth, the Connecticut Office of the Child Advocate released a report that said: We still don’t know what drove him to commit those terrible acts. But we do know he fell through the cracks of the school system, the health care system, and possibly the awareness of his own parents. Every documented moment of Lanza’s life was evaluated, from mental health records that tracked his social development to school and medical records that outlined his needs—and showed disparities in the services provided to him by the state. The review did not, however, stop at Lanza. It included a review of the laws regarding special education and the confidentiality of personal records in the system, as well as “how these laws implicate professional obligations and practices.” Unredacted state police and law enforcement records were also reviewed, alongside interviews and extensive research with members of the Child Fatality Review Panel who led the initial investigation of that day. From “earliest childhood,” according to the report, Lanza had “significant developmental challenges,” such as communication and sensory problems, delays in socialization, and repetitive behaviors. Lanza was seen and evaluated by the New Hampshire “Birth to Three” early intervention program when he was almost 3 years old, and referred to special education preschool services.
By Kelly Servick Dean Hamer finally feels vindicated. More than 20 years ago, in a study that triggered both scientific and cultural controversy, the molecular biologist offered the first direct evidence of a “gay gene,” by identifying a stretch on the X chromosome likely associated with homosexuality. But several subsequent studies called his finding into question. Now the largest independent replication effort so far, looking at 409 pairs of gay brothers, fingers the same region on the X. “When you first find something out of the entire genome, you’re always wondering if it was just by chance,” says Hamer, who asserts that new research “clarifies the matter absolutely.” But not everyone finds the results convincing. And the kind of DNA analysis used, known as a genetic linkage study, has largely been superseded by other techniques. Due to the limitations of this approach, the new work also fails to provide what behavioral geneticists really crave: specific genes that might underlie homosexuality. Few scientists have ventured into this line of research. When the genetics of being gay comes up at scientific meetings, “sometimes even behavioral geneticists kind of wrinkle up their noses,” says Kenneth Kendler, a psychiatric geneticist at Virginia Commonwealth University in Richmond. That’s partially because the science itself is so complex. Studies comparing identical and fraternal twins suggest there is some heritable component to homosexuality, but no one believes that a single gene or genes can make a person gay. Any genetic predispositions probably interact with environmental factors that influence development of a sexual orientation. © 2014 American Association for the Advancement of Science.
By Alan Schwarz CONCORD, Calif. — Every time Matthias is kicked out of a school or day camp for defying adults and clashing with other children, his mother, Joelle Kendle, inches closer to a decision she dreads. With each morning of arm-twisting and leg-flailing as she tries to get him dressed and out the door for first grade, the temptation intensifies. Ms. Kendle is torn over whether to have Matthias, just 6 and already taking the stimulant Adderall for attention deficit hyperactivity disorder, go on a second and more potent medication: the antipsychotic Risperdal. Her dilemma is shared by a steadily rising number of American families who are using multiple psychotropic drugs — stimulants, antipsychotics, antidepressants and others — to temper their children’s troublesome behavior, even though many doctors who mix such medications acknowledge that little is known about the overall benefits and risks for children. In 2012 about one in 54 youngsters ages 6 through 17 covered by private insurance was taking at least two psychotropic medications — a rise of 44 percent in four years, according to Express Scripts, which processes prescriptions for 85 million Americans. Academic studies of children covered by Medicaid have also found higher rates and growth. Combined, the data suggest that about one million children are currently taking various combinations of psychotropics. Risks of antipsychotics alone, for example, are known to include substantial weight gain and diabetes. Stimulants can cause appetite suppression, insomnia and, far more infrequently, hallucinations. Some combinations of medication classes, like antipsychotics and antidepressants, have shown improved benefits (for psychotic depression) but also heightened risks (for heart rhythm disturbances). But this knowledge has been derived substantially from studies in adults — children are rarely studied because of concerns about safety and ethics — leaving many experts worried that the use of multiple psychotropics in youngsters has not been explored fully. There is also debate over whether the United States Food and Drug Administration’s database of patients’ adverse drug reactions reliably monitors the hazards of psychotropic drug combinations, primarily because only a small fraction of cases are ever reported. Some clinicians are left somewhat queasy about relying mostly on anecdotal reports of benefit and harm. © 2014 The New York Times Company
By Emma Wilkinson Health reporter, BBC News Taking vitamin B12 and folic acid supplements does not seem to cut the risk of developing dementia in healthy people, say Dutch researchers. In one of the largest studies to date, there was no difference in memory test scores between those who had taken the supplements for two years and those who were given a placebo. The research was published in the journal Neurology. Alzheimer's Research UK said longer trials were needed to be sure. B vitamins have been linked to Alzheimer's for some years, and scientists know that higher levels of a body chemical called homocysteine can raise the risk of both strokes and dementia. Vitamin B12 and folic acid are both known to lower levels of homocysteine. That, along with studies linking low vitamin B12 and folic acid intake with poor memory, had prompted scientists to view the supplements as a way to ward off dementia. Yet in the study of almost 3,000 people - with an average age of 74 - who took 400 micrograms of folic acid and 500 micrograms of vitamin B12 or a placebo every day, researchers found no evidence of a protective effect. All those taking part in the trial had high blood levels of homocysteine, which did drop more in those taking the supplements. But on four different tests of memory and thinking skills taken at the start and end of the study, there was no beneficial effect of the supplements on performance. The researchers did note that the supplements might slightly slow the rate of decline but concluded the small difference they detected could just have been down to chance. Study leader Dr Rosalie Dhonukshe-Rutten, from Wageningen University in the Netherlands, said: "Since homocysteine levels can be lowered with folic acid and vitamin B12 supplements, the hope has been that taking these vitamins could also reduce the risk of memory loss and Alzheimer's disease." BBC © 2014
Sara Reardon Companies selling ‘probiotic’ foods have long claimed that cultivating the right gut bacteria can benefit mental well-being, but neuroscientists have generally been sceptical. Now there is hard evidence linking conditions such as autism and depression to the gut’s microbial residents, known as the microbiome. And neuroscientists are taking notice — not just of the clinical implications but also of what the link could mean for experimental design. “The field is going to another level of sophistication,” says Sarkis Mazmanian, a microbiologist at the California Institute of Technology in Pasadena. “Hopefully this will shift this image that there’s too much commercial interest and data from too few labs.” This year, the US National Institute of Mental Health spent more than US$1 million on a new research programme aimed at the microbiome–brain connection. And on 19 November, neuroscientists will present evidence for the link in a symposium at the annual Society for Neuroscience meeting in Washington DC called ‘Gut Microbes and the Brain: Paradigm Shift in Neuroscience’. Although correlations have been noted between the composition of the gut microbiome and behavioural conditions, especially autism, neuroscientists are only now starting to understand how gut bacteria may influence the brain. The immune system almost certainly plays a part, Mazmanian says, as does the vagus nerve, which connects the brain to the digestive tract. Bacterial waste products can also influence the brain — for example, at least two types of intestinal bacterium produce the neurotransmitter γ-aminobutyric acid (GABA). © 2014 Nature Publishing Group
By Helen Thomson Could a futuristic society of humans with the power to control their own biological functions ever become reality? It's not as out there as it sounds, now that the technical foundations have been laid. Researchers have created a link between thoughts and cells, allowing people to switch on genes in mice using just their thoughts. "We wanted to be able to use brainwaves to control genes. It's the first time anyone has linked synthetic biology and the mind," says Martin Fussenegger, a bioengineer at ETH Zurich in Basel, Switzerland, who led the team behind the work. They hope to use the technology to help people who are "locked-in" – that is, fully conscious but unable to move or speak – to do things like self-administer pain medication. It might also be able to help people with epilepsy control their seizures. In theory, the technology could be used for non-medical purposes, too. For example, we could give ourselves a hormone burst on demand, much like in the Culture – Iain M. Banks's utopian society, where people are able to secrete hormones and other chemicals to change their mood. Fussenegger's team started by inserting a light-responsive gene into human kidney cells in a dish. The gene is activated, or expressed, when exposed to infrared light. The cells were engineered so that when the gene activated, it caused a cascade of chemical reactions leading to the expression of another gene – the one the team wanted to switch on. © Copyright Reed Business Information Ltd.
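(The signal chain the article describes, from brainwave reading to infrared LED to light-responsive gene, can be sketched as a simple control loop. The toy simulation below uses made-up thresholds, readings and expression dynamics purely to illustrate the path of the signal; the real system couples an EEG headset to a wirelessly powered implant containing the engineered cells.)

```python
# Illustrative sketch only: the brainwave-to-gene signal path as a toy loop.
import random

LIGHT_THRESHOLD = 0.6   # assumed: normalized EEG "concentration" level
                        # above which the interface switches the LED on

def read_eeg_level():
    """Stand-in for a real EEG reading, normalized to 0.0-1.0."""
    return random.random()

def step_expression(expression, light_on):
    """Toy dynamics: target-gene expression rises while the infrared
    light is on and decays while it is off."""
    delta = 0.2 if light_on else -0.1
    return min(1.0, max(0.0, expression + delta))

expression = 0.0
for t in range(20):
    level = read_eeg_level()
    light_on = level > LIGHT_THRESHOLD   # the brainwave gates the LED
    expression = step_expression(expression, light_on)
    print(f"t={t:2d}  eeg={level:.2f}  led={'on ' if light_on else 'off'}  "
          f"expression={expression:.2f}")
```

The key design feature this mirrors is indirection: the mind never touches the cells directly; it only gates a light source, and the engineered gene circuit does the rest.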
By Abby Phillip If you're confused about what marijuana really does to the people who use it, you're not alone. For years, the scientific research on the health effects of the drug has been all over the map. Earlier this year, one study suggested that even casual marijuana use could cause changes to the brain. Another found that marijuana use was also associated with poor sperm quality, which could lead to infertility in men. But marijuana advocates point to other research indicating that the drug is far less addictive than other drugs, and some studies have found no relationship between IQ and marijuana use in teens. Researchers at the Center for Brain Health at the University of Texas in Dallas sought to clear up some of the confusion with a study that looked at a relatively large group of marijuana users and evaluated their brains for a slew of different indicators. What they found was complex, but the pattern was clear: The brains of marijuana users were different from those of non-users. The area of the brain responsible for establishing the reward system that helps us survive and also keeps us motivated was smaller in users than in non-users. But there was also evidence that the brain compensated for this loss of volume by increasing connectivity and the structural integrity of the brain tissue. Those effects were more pronounced for marijuana users who started young. "The orbitofrontal cortex is one of the primary regions in a network of brain areas called the reward system," explained Francesca Filbey, lead author of the study and an associate professor of the neurogenetics of addictive behavior at the University of Texas in Dallas.
By David Grimm Place a housecat next to its direct ancestor, the Near Eastern wildcat, and it may take you a minute to spot the difference. They’re about the same size and shape, and, well, they both look like cats. But the wildcat is fierce and feral, whereas the housecat, thanks to nearly 10,000 years of domestication, is tame and adaptable enough to have become the world’s most popular pet. Now scientists have begun to pinpoint the genetic changes that drove this remarkable transformation. The findings, based on the first high-quality sequence of the cat genome, could shed light on how other creatures, even humans, become tame. “This is the closest thing to a smoking gun we’ve ever had,” says Greger Larson, an evolutionary biologist at the University of Oxford in the United Kingdom who has studied the domestication of pigs, dogs, and other animals. “We’re much closer to understanding the nitty-gritty of domestication than we were a decade ago.” Cats first entered human society about 9500 years ago, not long after people first took up farming in the Middle East. Drawn to rodents that had invaded grain stores, wildcats slunk out of the deserts and into villages. There, many scientists suspect, they mostly domesticated themselves, with the friendliest ones able to take advantage of human table scraps and protection. Over thousands of years, cats shrank slightly in size, acquired a panoply of coat colors and patterns, and (largely) shed the antisocial tendencies of their past. Domestic animals from cows to dogs have undergone similar transformations, yet scientists know relatively little about the genes involved. Researchers led by Michael Montague, a postdoc at the Washington University School of Medicine in St. Louis, have now pinpointed some of them. The scientists started with the genome of a domestic cat—a female Abyssinian—that had been published in draft form in 2007, then filled in missing sequences and identified genes. They compared the resulting genome with those of cows, tigers, dogs, and humans. © 2014 American Association for the Advancement of Science.