Chapter 13. Memory, Learning, and Development
by Andy Coghlan What would Stuart Little make of it? Mice have been created whose brains are half human. As a result, the animals are smarter than their siblings. The idea is not to mimic fiction, but to advance our understanding of human brain diseases by studying them in whole mouse brains rather than in dishes. The altered mice still have mouse neurons – the "thinking" cells that make up around half of all their brain cells. But practically all the glial cells in their brains, the ones that support the neurons, are human. "It's still a mouse brain, not a human brain," says Steve Goldman of the University of Rochester Medical Center in New York. "But all the non-neuronal cells are human." Goldman's team extracted immature glial cells from donated human fetuses. They injected them into mouse pups, where they developed into astrocytes, a star-shaped type of glial cell. Within a year, the mouse glial cells had been completely usurped by the human interlopers. The 300,000 human cells each mouse received multiplied until they numbered 12 million, displacing the native cells. "We could see the human cells taking over the whole space," says Goldman. "It seemed like the mouse counterparts were fleeing to the margins." Astrocytes are vital for conscious thought because they help to strengthen the connections between neurons, called synapses. Their tendrils are involved in coordinating the transmission of electrical signals across synapses. © Copyright Reed Business Information Ltd.
by Aviva Rutkin THERE is only one real rule to conversing with a baby: talking is better than not talking. But that one rule can make a lifetime of difference. That's the message that the US state of Georgia hopes to send with Talk With Me Baby, a public health programme devoted to the art of baby talk. Starting in January, nurses will be trained in the best way to speak to babies to help them learn language, based on what the latest neuroscience says. Then they, along with teachers and nutritionists, will model this good behaviour for the parents they meet. Georgia hopes to expose every child born in 2015 in the Atlanta area to this speaking style; by 2018, the hope is to reach all 130,000 or so newborns across the state. Talk With Me Baby is the latest and largest attempt to provide "language nutrition" to infants in the US – a rich quantity and variety of words supplied at a critical time in the brain's development. Similar initiatives have popped up in Providence, Rhode Island, where children have been wearing high-tech vests that track every word they hear, and Hollywood, where the Clinton Foundation has encouraged television shows like Parenthood and Orange is the New Black to feature scenes demonstrating good baby talk. "The idea is that language is as important to the brain as food is to physical growth," says Arianne Weldon, director of Get Georgia Reading, one of several partner organisations involved in Talk With Me Baby. © Copyright Reed Business Information Ltd.
By Amy Ellis Nutt Scientists say the "outdoor effect" on nearsighted children is real: natural light is good for the eyes. (Photo by Bill O'Leary/The Washington Post) It's long been thought kids are more at risk of nearsightedness, or myopia, if they spend hours and hours in front of computer screens or fiddling with tiny hand-held electronic devices. Not true, say scientists. But there is now research suggesting that children who are genetically predisposed to the visual deficit can improve their chances of avoiding eyeglasses just by stepping outside. Yep, sunshine is all they need -- more specifically, the natural light of outdoors -- and 14 hours a week of outdoor light should do it. Why this is the case is not exactly clear. "We don't really know what makes outdoor time so special," said Donald Mutti, the lead researcher of the study from Ohio State University College of Optometry, in a press release. "If we knew, we could change how we approach myopia." What is known is that UVB light (invisible ultraviolet B rays) plays a role in the cellular production of vitamin D, which is believed to help the eyes focus light on the retina. However, the Ohio State researchers think there is another possibility. "Between the ages of five and nine, a child's eye is still growing," said Mutti. "Sometimes this growth causes the distance between the lens and the retina to lengthen, leading to nearsightedness. We think these different types of outdoor light may help preserve the proper shape and length of the eye during that growth period."
By BENEDICT CAREY Quick: Which American president served before slavery ended, John Tyler or Rutherford B. Hayes? If you need Google to get the answer, you are not alone. (It is Tyler.) Collective cultural memory — for presidents, for example — works according to the same laws as the individual kind, at least when it comes to recalling historical names and remembering them in a given order, researchers reported on Thursday. The findings suggest that leaders who are well known today, like the elder President George Bush and President Bill Clinton, will be all but lost to public memory in just a few decades. The particulars from the new study, which tested Americans’ ability to recollect the names of past presidents, are hardly jaw-dropping: People tend to recall best the presidents who served recently, as well as the first few in the country’s history. They also remember those who navigated historic events, like the ending of slavery (Abraham Lincoln) and World War II (Franklin D. Roosevelt). But the broader significance of the report — the first to measure forgetfulness over a 40-year period, using a constant list — is that societies collectively forget according to the same formula as, say, a student who has studied a list of words. Culture imitates biology, even though the two systems work in vastly different ways. The new paper was published in the journal Science. “It’s an exciting study, because it mixes history and psychology and finds this one-on-one correspondence” in the way memory functions, said David C. Rubin, a psychologist at Duke University who was not involved in the research. The report is based on four surveys by psychologists now at Washington University in St. Louis, conducted from 1974 to 2014. In the first three, in 1974, 1991 and 2009, Henry L. Roediger III gave college students five minutes to write down as many presidents as they could remember, in order. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20364 - Posted: 11.29.2014
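The study above reports that collective memory fades "according to the same formula" as individual forgetting. As a rough illustration only (not the study's actual fitted model, which is not given here), the classic textbook forgetting curve is an exponential decay; the half-life value below is purely hypothetical:

```python
def recall_probability(years_since_term, half_life=25.0):
    """Exponential forgetting curve: the fraction of respondents expected
    to recall a president, as a function of years since he left office.
    The 25-year half-life is an illustrative assumption, not a result
    from the Science paper."""
    return 0.5 ** (years_since_term / half_life)

# A recently serving president is recalled far more often than a distant one.
recent = recall_probability(5)     # ≈ 0.87
distant = recall_probability(100)  # = 0.0625
```

On this kind of curve, well-known recent figures slide toward obscurity within a few decades, which is the qualitative pattern the researchers describe; their data also show departures from pure decay (a primacy boost for the earliest presidents and spikes for historically pivotal ones).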
By Amy Ellis Nutt In a novel use of video game playing, researchers at Ohio State have found that a Pac-Man-like game, when played repetitively, can improve vision in both children and adults who have "lazy eye" or poor depth perception. In the Pac-Man-style game, players wear red-green 3-D glasses that filter images to the right and left eyes. The lazy or weak eye sees two discs containing vertical, horizontal or diagonal lines superimposed on a background of horizontal lines. The dominant eye sees a screen of only horizontal lines. The player controls the larger, Pac-Man-like disc and chases the smaller one. In another game, the player must match discs with rows based on the orientation of their lines. Teng Leng Ooi, professor of optometry at Ohio State University, presented her research findings at last week's annual meeting of the Society for Neuroscience. Only a handful of test subjects were involved in the experimental training, but all saw weak-eye improvement to 20/20 vision or better, sustained for at least eight months. Lazy eye, or amblyopia, affects between 2 and 3 percent of the U.S. population. The disorder usually occurs in infancy when the neural pathway between the brain and one eye (or sometimes both) fails to fully develop. Often the cause of lazy eye is strabismus, in which the eyes are misaligned or "crossed." To prevent double vision, the brain simply blocks the fuzzy images from one eye, thereby causing incomplete visual development. The result: lazy eye.
By Linda Searing THE QUESTION Keeping your brain active by working is widely believed to protect memory and thinking skills as we age. Does the type of work matter? THIS STUDY involved 1,066 people who, at an average age of 70, took a battery of tests to measure memory, processing speed and cognitive ability. The jobs they had held were rated by the complexity of dealings with people, data and things. Those whose main jobs required complex work, especially in dealings with people — such as social workers, teachers, managers, graphic designers and musicians — had higher cognitive scores than those who had held jobs requiring less-complex dealings, such as construction workers, food servers and painters. Overall, more-complex occupations were tied to higher cognitive scores, regardless of someone’s IQ, education or environment. WHO MAY BE AFFECTED? Older adults. Cognitive abilities change with age, so it can take longer to recall information or remember where you placed your keys. That is normal and not the same thing as dementia, which involves severe memory loss as well as declining ability to function day to day. Commonly suggested ways to maintain memory and thinking skills include staying socially active, eating healthfully and getting adequate sleep as well as such things as doing crossword puzzles, learning to play a musical instrument and taking varied routes to common destinations when driving.
Keyword: Learning & Memory
Link ID: 20359 - Posted: 11.26.2014
By Anna North The idea that poverty can change the brain has gotten significant attention recently, and not just from those lay readers (a minority, according to recent research) who spend a lot of time thinking about neuroscience. Policy makers and others have begun to apply neuroscientific principles to their thinking about poverty — and some say this could end up harming poor people rather than helping. At The Conversation, the sociologist Susan Sered takes issue with “news reports with headlines like this one: ‘Can Brain Science Help Lift People Out Of Poverty?’” She’s referring to a June story by Rachel Zimmerman at WBUR, about a nonprofit called Crittenton Women’s Union that aims to use neuroscience to help get people out of poverty. Elisabeth Babcock, Crittenton’s chief executive, tells Ms. Zimmerman: “What the new brain science says is that the stresses created by living in poverty often work against us, make it harder for our brains to find the best solutions to our problems. This is a part of the reason why poverty is so ‘sticky.’” And, she adds: “If we’ve been raised in poverty under all this stress, our executive functioning wiring, the actual neurology of our brains, is built differently than if we’re not raised in poverty. It is built to react quickly to danger and threats and not built as much to plan or execute strategies for how we want things to be in the future because the future is so uncertain and planning is so pointless that this wiring isn’t as called for.” Dr. Sered, however, says that applying neuroscience to problems like poverty can sometimes lead to trouble: “Studies showing that trauma and poverty change people’s brains can too easily be read as scientific proof that poor people (albeit through no fault of their own) have inferior brains or that women who have been raped are now brain-damaged.” © 2014 The New York Times Company
By Gary Stix One area of brain science that has drawn intense interest in recent years is the study of what psychologists call reconsolidation—a ponderous technical term that, once translated, means giving yourself a second chance. Memories of our daily experience are formed, often during sleep, by inscribing—or “consolidating”—a record of what happened into neural tissue. Joy at the birth of a child or terror in response to a violent personal assault. A bad memory, once fixed, may replay again and again, turning toxic and all-consuming. For the traumatized, the desire to forget becomes an impossible dream. Reconsolidation allows for a do-over by drawing attention to the emotional and factual content of traumatic experience. In the safety of a therapist’s office, the patient lets the demons return, and the goal is then to reshape karma to form a new, more benign memory. The details remain the same, but the power of the old terror to overwhelm and induce psychic paralysis begins to subside. The clinician would say that the memory has undergone a change in “valence”—from negative to neutral and detached. The trick to successful reconsolidation is reviving these memories without provoking the panic and chaos that can only make things worse. Talk therapies and psycho-pharm may not be enough. One new idea just starting to be explored is the toning down of memories while a patient is fast asleep. © 2014 Scientific American
More than 40 percent of infants in a group who died of sudden infant death syndrome (SIDS) were found to have an abnormality in a key part of the brain, researchers report. The abnormality affects the hippocampus, a brain area that influences such functions as breathing, heart rate, and body temperature, via its neurological connections to the brainstem. According to the researchers, supported by the National Institutes of Health, the abnormality was present more often in infants who died of SIDS than in infants whose deaths could be attributed to known causes. The researchers believe the abnormality may destabilize the brain’s control of breathing and heart rate patterns during sleep, or during the periodic brief arousals from sleep that occur throughout the night. “The new finding adds to a growing body of evidence that brain abnormalities may underlie many cases of sudden infant death syndrome,” said Marian Willinger, Ph.D., special assistant for SIDS at NIH’s Eunice Kennedy Shriver National Institute of Child Health and Human Development, which funded the study. “The hope is that research efforts in this area eventually will provide the means to identify vulnerable infants so that we’ll be able to reduce their risk for SIDS.” SIDS is the sudden death of an infant younger than 1 year of age that is still unexplained after a complete post mortem investigation by a coroner or medical examiner. This investigation includes an autopsy, a review of the death scene, and review of family and medical histories. In the United States, SIDS is the leading cause of death between one month and one year of age. The deaths are associated with an infant’s sleep period.
By Michelle Roberts Health editor, BBC News online The brain has a weak spot for Alzheimer's disease and schizophrenia, according to UK scientists who have pinpointed the region using scans. The brain area involved develops late in adolescence and degenerates early during ageing. At the moment, it is difficult for doctors to predict which people might develop either condition. The findings, in the journal PNAS, hint at a potential way to diagnose those at risk earlier, experts say, although they caution that "much more research is needed into how to bring these exciting discoveries into the clinic". The Medical Research Council team who carried out the study did MRI brain scans on 484 healthy volunteers aged between eight and 85 years. The researchers, led by Dr Gwenaëlle Douaud of Oxford University, looked at how the brain naturally changes as people age. The images revealed a common pattern - the parts of the brain that were the last to develop were also the first to show signs of age-related decline. These brain regions - a network of nerve cells or grey matter - co-ordinate "high order" information coming from the different senses, such as sight and sound. When the researchers looked at scans of patients with Alzheimer's disease and scans of patients with schizophrenia they found the same brain regions were affected. The findings fit with what other experts have suspected - that although distinct, Alzheimer's and schizophrenia are linked. Prof Hugh Perry of the MRC said: "Early doctors called schizophrenia 'premature dementia' but until now we had no clear evidence that the same parts of the brain might be associated with two such different diseases. This large-scale and detailed study provides an important, and previously missing, link between development, ageing and disease processes in the brain." BBC © 2014
By Chelsea Rice On December 14, 2012, Adam Lanza shot and killed 20 children and six personnel at Sandy Hook Elementary School in Newtown, Connecticut, before turning the gun on himself. Ever since, America has been wondering: Why? Today, after investigating and detailing every available record of the 20-year-old Lanza’s life since birth, the Connecticut Office of the Child Advocate released a report that said: We still don’t know what drove him to commit those terrible acts. But we do know he fell through the cracks of the school system, the health care system, and possibly the awareness of his own parents. Every documented moment of Lanza’s life was evaluated, from mental health records that tracked his social development to school and medical records that outlined his needs—and showed disparities in the services provided to him by the state. The review did not, however, stop at Lanza. It included a review of the laws regarding special education and the confidentiality of personal records in the system, as well as “how these laws implicate professional obligations and practices.” Unredacted state police and law enforcement records were also reviewed alongside interviews and extensive research with members of the Child Fatality Review Panel who led the initial investigation of that day. From “earliest childhood,” according to the report, Lanza had “significant developmental challenges,” such as communication and sensory problems, delays in socialization, and repetitive behaviors. Lanza was seen and evaluated by the New Hampshire “Birth to Three” early intervention program when he was almost 3 years old, and referred to special education preschool services.
By MAX BEARAK MUMBAI, India — The young man sat cross-legged atop a cushioned divan on an ornately decorated stage, surrounded by other Jain monks draped in white cloth. His lip occasionally twitched, his hands lay limp in his lap, and for the most part his eyes were closed. An announcer repeatedly chastised the crowd for making even the slightest noise. From daybreak until midafternoon, members of the audience approached the stage, one at a time, to show the young monk a random object, pose a math problem, or speak a word or phrase in one of at least six different languages. He absorbed the miscellany silently, letting it slide into his mind, as onlookers in their seats jotted everything down on paper. After six hours, the 500th and last item was uttered — it was the number 100,008. An anxious hush descended over the crowd. And the monk opened his eyes and calmly recalled all 500 items, in order, detouring only once to fill in a blank he had momentarily set aside. When he was done, and the note-keepers in the audience had confirmed his achievement, the tense atmosphere dissolved and the announcer led the crowd in a series of triumphant chants. The opportunity to witness the feat of memory drew a capacity crowd of 6,000 to the Sardar Vallabhbhai Patel stadium in Mumbai on Sunday. The exhibition was part of a campaign to encourage schoolchildren to use meditation to build brainpower, as Jain monks have done for centuries in India, a country drawn both toward ancient religious practices and more recent ambitions. But even by Jain standards, the young monk — Munishri Ajitchandrasagarji, 24 — is something special. His guru, P. P. Acharya Nayachandrasagarji, said no other monk in many years had come close to his ability. © 2014 The New York Times Company
By Gretchen Reynolds Exercise seems to be good for the human brain, with many recent studies suggesting that regular exercise improves memory and thinking skills. But an interesting new study asks whether the apparent cognitive benefits from exercise are real or just a placebo effect — that is, if we think we will be “smarter” after exercise, do our brains respond accordingly? The answer has significant implications for any of us hoping to use exercise to keep our minds sharp throughout our lives. In experimental science, the best, most reliable studies randomly divide participants into two groups, one of which receives the drug or other treatment being studied and the other of which is given a placebo, similar in appearance to the drug, but not containing the active ingredient. Placebos are important, because they help scientists to control for people’s expectations. If people believe that a drug, for example, will lead to certain outcomes, their bodies may produce those results, even if the volunteers are taking a look-alike dummy pill. That’s the placebo effect, and its occurrence suggests that the drug or procedure under consideration isn’t as effective as it might seem to be; some of the work is being done by people’s expectations, not by the medicine. Recently, some scientists have begun to question whether the apparently beneficial effects of exercise on thinking might be a placebo effect. While many studies suggest that exercise may have cognitive benefits, those experiments all have had a notable scientific limitation: They have not used placebos. This issue is not some abstruse scientific debate. If the cognitive benefits from exercise are a result of a placebo effect rather than of actual changes in the brain because of the exercise, then those benefits could be ephemeral and unable in the long term to help us remember how to spell ephemeral. © 2014 The New York Times Company
Keyword: Learning & Memory
Link ID: 20329 - Posted: 11.20.2014
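The placebo logic described in this piece rests on random assignment: when participants are allocated to arms by chance rather than by preference, expectations are spread evenly across groups, so any remaining difference between arms can be attributed to the treatment itself. A minimal sketch of that assignment step, with made-up participant labels rather than data from any real trial:

```python
import random

def randomize(participants, seed=42):
    """Shuffle participants and split them into two equal arms:
    one would receive the treatment (e.g., an exercise programme),
    the other a placebo activity matched for expectations.
    Purely illustrative; not any study's actual protocol."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = list(participants)
    rng.shuffle(shuffled)
    mid = len(shuffled) // 2
    return shuffled[:mid], shuffled[mid:]

participants = [f"p{i:02d}" for i in range(20)]
treatment, placebo = randomize(participants)
```

Each participant lands in exactly one arm, and neither group is chosen by experimenter or volunteer preference, which is precisely what lets the placebo arm absorb expectation effects that would otherwise masquerade as a benefit of exercise.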
By Kelly Servick Dean Hamer finally feels vindicated. More than 20 years ago, in a study that triggered both scientific and cultural controversy, the molecular biologist offered the first direct evidence of a “gay gene,” by identifying a stretch on the X chromosome likely associated with homosexuality. But several subsequent studies called his finding into question. Now the largest independent replication effort so far, looking at 409 pairs of gay brothers, fingers the same region on the X. “When you first find something out of the entire genome, you’re always wondering if it was just by chance,” says Hamer, who asserts that new research “clarifies the matter absolutely.” But not everyone finds the results convincing. And the kind of DNA analysis used, known as a genetic linkage study, has largely been superseded by other techniques. Due to the limitations of this approach, the new work also fails to provide what behavioral geneticists really crave: specific genes that might underlie homosexuality. Few scientists have ventured into this line of research. When the genetics of being gay comes up at scientific meetings, “sometimes even behavioral geneticists kind of wrinkle up their noses,” says Kenneth Kendler, a psychiatric geneticist at Virginia Commonwealth University in Richmond. That’s partially because the science itself is so complex. Studies comparing identical and fraternal twins suggest there is some heritable component to homosexuality, but no one believes that a single gene or genes can make a person gay. Any genetic predispositions probably interact with environmental factors that influence development of a sexual orientation. © 2014 American Association for the Advancement of Science.
By ALAN SCHWARZ CONCORD, Calif. — Every time Matthias is kicked out of a school or day camp for defying adults and clashing with other children, his mother, Joelle Kendle, inches closer to a decision she dreads. With each morning of arm-twisting and leg-flailing as she tries to get him dressed and out the door for first grade, the temptation intensifies. Ms. Kendle is torn over whether to have Matthias, just 6 and already taking the stimulant Adderall for attention deficit hyperactivity disorder, go on a second and more potent medication: the antipsychotic Risperdal. Her dilemma is shared by a steadily rising number of American families who are using multiple psychotropic drugs — stimulants, antipsychotics, antidepressants and others — to temper their children’s troublesome behavior, even though many doctors who mix such medications acknowledge that little is known about the overall benefits and risks for children. In 2012 about one in 54 youngsters ages 6 through 17 covered by private insurance was taking at least two psychotropic medications — a rise of 44 percent in four years, according to Express Scripts, which processes prescriptions for 85 million Americans. Academic studies of children covered by Medicaid have also found higher rates and growth. Combined, the data suggest that about one million children are currently taking various combinations of psychotropics. Risks of antipsychotics alone, for example, are known to include substantial weight gain and diabetes. Stimulants can cause appetite suppression, insomnia and, far more infrequently, hallucinations. Some combinations of medication classes, like antipsychotics and antidepressants, have shown improved benefits (for psychotic depression) but also heightened risks (for heart rhythm disturbances). 
But this knowledge has been derived substantially from studies in adults — children are rarely studied because of concerns about safety and ethics — leaving many experts worried that the use of multiple psychotropics in youngsters has not been explored fully. There is also debate over whether the United States Food and Drug Administration’s database of patients’ adverse drug reactions reliably monitors the hazards of psychotropic drug combinations, primarily because only a small fraction of cases are ever reported. Some clinicians are left somewhat queasy about relying mostly on anecdotal reports of benefit and harm. © 2014 The New York Times Company
By Emma Wilkinson Health reporter, BBC News Taking vitamin B12 and folic acid supplements does not seem to cut the risk of developing dementia in healthy people, say Dutch researchers. In one of the largest studies to date, there was no difference in memory test scores between those who had taken the supplements for two years and those who were given a placebo. The research was published in the journal Neurology. Alzheimer's Research UK said longer trials were needed to be sure. B vitamins have been linked to Alzheimer's for some years, and scientists know that higher levels of a body chemical called homocysteine can raise the risk of both strokes and dementia. Vitamin B12 and folic acid are both known to lower levels of homocysteine. That, along with studies linking low vitamin B12 and folic acid intake with poor memory, had prompted scientists to view the supplements as a way to ward off dementia. Yet in the study of almost 3,000 people - with an average age of 74 - who took 400 micrograms of folic acid and 500 micrograms of vitamin B12 or a placebo every day, researchers found no evidence of a protective effect. All those taking part in the trial had high blood levels of homocysteine, which did drop more in those taking the supplements. But on four different tests of memory and thinking skills taken at the start and end of the study, there was no beneficial effect of the supplements on performance. The researchers did note that the supplements might slightly slow the rate of decline but concluded the small difference they detected could just have been down to chance. Study leader Dr Rosalie Dhonukshe-Rutten, from Wageningen University in the Netherlands, said: "Since homocysteine levels can be lowered with folic acid and vitamin B12 supplements, the hope has been that taking these vitamins could also reduce the risk of memory loss and Alzheimer's disease." BBC © 2014
Link ID: 20313 - Posted: 11.15.2014
Sara Reardon Companies selling ‘probiotic’ foods have long claimed that cultivating the right gut bacteria can benefit mental well-being, but neuroscientists have generally been sceptical. Now there is hard evidence linking conditions such as autism and depression to the gut’s microbial residents, known as the microbiome. And neuroscientists are taking notice — not just of the clinical implications but also of what the link could mean for experimental design. “The field is going to another level of sophistication,” says Sarkis Mazmanian, a microbiologist at the California Institute of Technology in Pasadena. “Hopefully this will shift this image that there’s too much commercial interest and data from too few labs.” This year, the US National Institute of Mental Health spent more than US$1 million on a new research programme aimed at the microbiome–brain connection. And on 19 November, neuroscientists will present evidence for the link in a symposium at the annual Society for Neuroscience meeting in Washington DC called ‘Gut Microbes and the Brain: Paradigm Shift in Neuroscience’. Although correlations have been noted between the composition of the gut microbiome and behavioural conditions, especially autism [1], neuroscientists are only now starting to understand how gut bacteria may influence the brain. The immune system almost certainly plays a part, Mazmanian says, as does the vagus nerve, which connects the brain to the digestive tract. Bacterial waste products can also influence the brain — for example, at least two types of intestinal bacterium produce the neurotransmitter γ-aminobutyric acid (GABA) [2]. © 2014 Nature Publishing Group
by Helen Thomson Could a futuristic society of humans with the power to control their own biological functions ever become reality? It's not as out there as it sounds, now the technical foundations have been laid. Researchers have created a link between thoughts and cells, allowing people to switch on genes in mice using just their thoughts. "We wanted to be able to use brainwaves to control genes. It's the first time anyone has linked synthetic biology and the mind," says Martin Fussenegger, a bioengineer at ETH Zurich in Basel, Switzerland, who led the team behind the work. They hope to use the technology to help people who are "locked-in" – that is, fully conscious but unable to move or speak – to do things like self-administer pain medication. It might also be able to help people with epilepsy control their seizures. In theory, the technology could be used for non-medical purposes, too. For example, we could give ourselves a hormone burst on demand, much like in the Culture – Iain M. Banks's utopian society, where people are able to secrete hormones and other chemicals to change their mood. Fussenegger's team started by inserting a light-responsive gene into human kidney cells in a dish. The gene is activated, or expressed, when exposed to infrared light. The cells were engineered so that when the gene activated, it caused a cascade of chemical reactions leading to the expression of another gene – the one the team wanted to switch on. © Copyright Reed Business Information Ltd.
By Abby Phillip If you're confused about what marijuana use really does to people who use it, you're not alone. For years, the scientific research on the health effects of the drug has been all over the map. Earlier this year, one study suggested that even casual marijuana use could cause changes to the brain. Another found that marijuana use was also associated with poor sperm quality, which could lead to infertility in men. But marijuana advocates point to other research indicating that the drug is far less addictive than other drugs, and some studies have found no relationship between IQ and marijuana use in teens. Researchers at the Center for Brain Health at the University of Texas in Dallas sought to clear up some of the confusion with a study that looked at a relatively large group of marijuana users and evaluated their brains for a slew of different indicators. What they found was complex, but the pattern was clear: The brains of marijuana users were different than those of non-marijuana users. The area of the brain responsible for establishing the reward system that helps us survive and also keeps us motivated was smaller in users than in non-marijuana users. But there was also evidence that the brain compensated for this loss of volume by increasing connectivity and the structural integrity of the brain tissue. Those effects were more pronounced for marijuana users who started young. "The orbitofrontal cortex is one of the primary regions in a network of brain areas called the reward system," explained Francesca Filbey, lead author of the study and an associate professor of the neurogenetics of addictive behavior at the University of Texas in Dallas.
By David Grimm Place a housecat next to its direct ancestor, the Near Eastern wildcat, and it may take you a minute to spot the difference. They’re about the same size and shape, and, well, they both look like cats. But the wildcat is fierce and feral, whereas the housecat, thanks to nearly 10,000 years of domestication, is tame and adaptable enough to have become the world’s most popular pet. Now scientists have begun to pinpoint the genetic changes that drove this remarkable transformation. The findings, based on the first high-quality sequence of the cat genome, could shed light on how other creatures, even humans, become tame. “This is the closest thing to a smoking gun we’ve ever had,” says Greger Larson, an evolutionary biologist at the University of Oxford in the United Kingdom who has studied the domestication of pigs, dogs, and other animals. “We’re much closer to understanding the nitty-gritty of domestication than we were a decade ago.” Cats first entered human society about 9500 years ago, not long after people first took up farming in the Middle East. Drawn to rodents that had invaded grain stores, wildcats slunk out of the deserts and into villages. There, many scientists suspect, they mostly domesticated themselves, with the friendliest ones able to take advantage of human table scraps and protection. Over thousands of years, cats shrank slightly in size, acquired a panoply of coat colors and patterns, and (largely) shed the antisocial tendencies of their past. Domestic animals from cows to dogs have undergone similar transformations, yet scientists know relatively little about the genes involved. Researchers led by Michael Montague, a postdoc at the Washington University School of Medicine in St. Louis, have now pinpointed some of them. The scientists started with the genome of a domestic cat—a female Abyssinian—that had been published in draft form in 2007, then filled in missing sequences and identified genes.
They compared the resulting genome with those of cows, tigers, dogs, and humans. © 2014 American Association for the Advancement of Science.